Sample records for standards based approach

  1. 78 FR 1690 - Semiannual Agenda of Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

... organizations subject to the advanced approaches capital rules, a supplementary leverage ratio that incorporates... risk-based and leverage capital requirements. Regulatory Capital Rules (Part 2): Standardized Approach... ("Standardized Approach NPR") includes proposed changes to the agencies' general risk-based capital requirements...

  2. Density Matters: Review of Approaches to Setting Organism-Based Ballast Water Discharge Standards

    EPA Science Inventory

    As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential appro...

  3. A Risk and Standards Based Approach to Quality Assurance in Australia's Diverse Higher Education Sector

    ERIC Educational Resources Information Center

    Australian Government Tertiary Education Quality and Standards Agency, 2015

    2015-01-01

    The Australian Government Tertiary Education Quality and Standards Agency's (TEQSA's) role is to assure that quality standards are being met by all registered higher education providers. This paper explains how TEQSA's risk-based approach to assuring higher education standards is applied in broad terms to a diverse sector. This explanation is…

  4. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    PubMed

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. A Standards-Based Approach for Reporting Assessment Results in South Africa

    ERIC Educational Resources Information Center

    Kanjee, Anil; Moloi, Qetelo

    2016-01-01

    This article proposes the use of a standards-based approach to reporting results from large-scale assessment surveys in South Africa. The use of this approach is intended to address the key shortcomings observed in the current reporting framework prescribed in the national curriculum documents. Using the Angoff method and data from the Annual…

  6. Density matters: Review of approaches to setting organism-based ballast water discharge standards

    USGS Publications Warehouse

Lee II; Frazier; Ruiz

    2010-01-01

As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential approaches were identified, and the utility and uncertainties of each were evaluated. During the process of reviewing the existing approaches, the WED scientists, in conjunction with scientists at the USGS and Smithsonian Institution, developed a new approach (per capita invasion probability, or "PCIP") that addresses many of the limitations of the previous methodologies. The PCIP approach allows risk managers to generate quantitative discharge standards using historical invasion rates, ballast water discharge volumes, and ballast water organism concentrations. The statistical power of sampling ballast water with the existing methods is limited, both for the validation of ballast water treatment systems and for ship-board compliance monitoring, though it should be possible to obtain sufficient samples during treatment validation. The report will go to a National Academy of Sciences expert panel that will use it in their evaluation of approaches to developing ballast water discharge standards for the Office of Water.
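    A rough, hypothetical sketch of the PCIP arithmetic described above is given below. The function names, variable names, and example numbers are illustrative assumptions for orientation only, not the report's actual formulation.

    ```python
    # Hypothetical sketch of a per capita invasion probability (PCIP) calculation.
    # All names and numbers are illustrative; the report's exact method may differ.

    def per_capita_invasion_probability(invasions_per_year: float,
                                        discharge_m3_per_year: float,
                                        organisms_per_m3: float) -> float:
        """Historical invasions per organism discharged into a region."""
        organisms_discharged_per_year = discharge_m3_per_year * organisms_per_m3
        return invasions_per_year / organisms_discharged_per_year

    def allowable_concentration(target_invasions_per_year: float,
                                pcip: float,
                                projected_discharge_m3_per_year: float) -> float:
        """Back-calculate a discharge standard (organisms per m3) for a target invasion rate."""
        return target_invasions_per_year / (pcip * projected_discharge_m3_per_year)

    # Made-up example: 0.2 historical invasions/year attributed to ballast water,
    # 5e6 m3/year discharged at an average of 100 organisms/m3 (>50-um size class).
    pcip = per_capita_invasion_probability(0.2, 5e6, 100.0)
    standard = allowable_concentration(0.01, pcip, 5e6)
    print(f"PCIP = {pcip:.2e} invasions per organism; standard = {standard:.1f} organisms/m3")
    ```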

  7. Peculiarities of Professional Training Standards Development and Implementation within Competency-Based Approach: Foreign Experience

    ERIC Educational Resources Information Center

    Desyatov, Tymofiy

    2015-01-01

    The article analyzes the development of competency-based professional training standards and their implementation into educational process in foreign countries. It determines that the main idea of competency-based approach is competency-and-active learning, which aims at complex acquirement of diverse skills and ways of practice activities via…

  8. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  9. A new approach in the development of quality management systems for (micro)electronics

    NASA Astrophysics Data System (ADS)

Bacivarov, Ioan C.; Bacivarov, Angelica; Gherghina, Cătălina

    2016-12-01

This paper presents a new approach to the analysis of the Quality Management Systems (QMS) of companies, based on the revised standard ISO 9001:2015. In the first part of the paper, QMS based on ISO 9001 certification are introduced; the changes and updates proposed for the new version, ISO 9001:2015, are critically analyzed based on the documents elaborated by ISO/TC 176. The approach based on ISO 9001:2015 could be considered the beginning of a new era in the development of quality management systems. A comparison between the "old" standard ISO 9001:2008 and the "new" standard ISO 9001:2015 is made. In the second part of the paper, the steps to be followed in a company to implement this new standard are presented. Particular attention is given to the new concept of risk-based thinking, intended to support and improve application of the process-based approach. The authors conclude that, by considering risk throughout the organization, the likelihood of achieving stated objectives is improved, output is more consistent, and customers can be confident that they will receive the expected results. Finally, the benefits of the new approach in the development of quality management systems are outlined, as well as how they are reflected in the management of companies in general and those in the electronics field in particular. As demonstrated in this paper, well understood and properly applied, the new approach based on the revised standard ISO 9001:2015 could offer better quality management for companies operating in electronics and beyond.

  10. A Principles-Based Approach to Teaching International Financial Reporting Standards (IFRS)

    ERIC Educational Resources Information Center

    Persons, Obeua

    2014-01-01

    This article discusses the principles-based approach that emphasizes a "why" question by using the International Accounting Standards Board (IASB) "Conceptual Framework for Financial Reporting" to question and understand the basis for specific differences between IFRS and U.S. generally accepted accounting principles (U.S.…

  11. Comparison of two head-up displays in simulated standard and noise abatement night visual approaches

    NASA Technical Reports Server (NTRS)

    Cronn, F.; Palmer, E. A., III

    1975-01-01

Situation and command head-up displays were evaluated for both standard and two-segment noise abatement night visual approaches in a fixed-base simulation of a DC-8 transport aircraft. The situation display provided glide slope and pitch attitude information. The command display provided glide slope information and flight path commands to capture a 3 deg glide slope. Landing approaches were flown in both zero-wind and wind-shear conditions. For both standard and noise abatement approaches, the situation display provided greater glidepath accuracy in the initial phase of the landing approaches, whereas the command display was more effective in the final approach phase. Glidepath accuracy was greater for the standard approaches than for the noise abatement approaches in all phases of the landing approach. Most of the pilots preferred the command display and the standard approach. Substantial agreement was found between each pilot's judgment of his performance and his actual performance.

  12. LC-MS/MS-based approach for obtaining exposure estimates of metabolites in early clinical trials using radioactive metabolites as reference standards.

    PubMed

    Zhang, Donglu; Raghavan, Nirmala; Chando, Theodore; Gambardella, Janice; Fu, Yunlin; Zhang, Duxi; Unger, Steve E; Humphreys, W Griffith

    2007-12-01

An LC-MS/MS-based approach that employs authentic radioactive metabolites as reference standards was developed to estimate metabolite exposures in early drug development studies. This method is useful to estimate metabolite levels in studies done with non-radiolabeled compounds where metabolite standards are not available to allow standard LC-MS/MS assay development. A metabolite mixture obtained from an in vivo source treated with a radiolabeled compound was partially purified, quantified, and spiked into human plasma to provide metabolite standard curves. Metabolites were analyzed by LC-MS/MS using the specific mass transitions and an internal standard. The metabolite concentrations determined by this approach were found to be comparable to those determined by valid LC-MS/MS assays. This approach does not require synthesis of authentic metabolites or knowledge of the exact structures of metabolites, and therefore should provide a useful method to obtain early estimates of circulating metabolites in early clinical or toxicological studies.
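    The core of the method is reading unknown metabolite concentrations off a standard curve built from plasma spiked with the partially purified, radioactivity-quantified metabolite. A minimal sketch of that back-calculation step is shown below; the concentrations, area ratios, and variable names are hypothetical.

    ```python
    import numpy as np

    # Hypothetical standard curve: peak-area ratios (metabolite / internal standard)
    # measured in plasma spiked with the radiolabeled-metabolite reference at known levels.
    spiked_conc_ng_ml = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
    area_ratio = np.array([0.021, 0.098, 0.205, 0.990, 2.010])

    # Simple linear calibration fit (slope, intercept).
    slope, intercept = np.polyfit(spiked_conc_ng_ml, area_ratio, 1)

    def back_calculate(ratio: float) -> float:
        """Convert an observed area ratio from a study sample into a concentration (ng/mL)."""
        return (ratio - intercept) / slope

    print(f"Estimated metabolite concentration: {back_calculate(0.55):.1f} ng/mL")
    ```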

  13. Packing Up for the Moon: Human Exploration Project Engineering Design Challenge. Design, Build and Evaluate. A Standards-Based Middle School Unit Guide. Engineering By Design: Advancing Technological Literacy--A Standards-Based Program Series

    ERIC Educational Resources Information Center

    NASA Educator Resource Center at Marshall Space Flight Center, 2007

    2007-01-01

    The Human Exploration Project (HEP) units have several common characteristics. All units: (1) Are based upon the Technological Literacy standards (ITEA, 2000/2002); (2) Coordinate with Science (AAAS, 1993) and Mathematics standards (NCTM, 2000); (3) Utilize a standards-based development approach (ITEA, 2005); (4) Stand alone and coordinate with…

  14. Approaches to setting organism-based ballast water discharge standards

    USGS Publications Warehouse

    Lee, Henry; Reusser, Deborah A.; Frazier, Melanie

    2013-01-01

    As a vector by which foreign species invade coastal and freshwater waterbodies, ballast water discharge from ships is recognized as a major environmental threat. The International Maritime Organization (IMO) drafted an international treaty establishing ballast water discharge standards based on the number of viable organisms per volume of ballast discharge for different organism size classes. Concerns that the IMO standards are not sufficiently protective have initiated several state and national efforts in the United States to develop more stringent standards. We evaluated seven approaches to establishing discharge standards for the >50-μm size class: (1) expert opinion/management consensus, (2) zero detectable living organisms, (3) natural invasion rates, (4) reaction–diffusion models, (5) population viability analysis (PVA) models, (6) per capita invasion probabilities (PCIP), and (7) experimental studies. Because of the difficulty in synthesizing scientific knowledge in an unbiased and transparent fashion, we recommend the use of quantitative models instead of expert opinion. The actual organism concentration associated with a “zero detectable organisms” standard is defined by the statistical rigor of its monitoring program; thus it is not clear whether such a standard is as stringent as other standards. For several reasons, the natural invasion rate, reaction–diffusion, and experimental approaches are not considered suitable for generating discharge standards. PVA models can be used to predict the likelihood of establishment of introduced species but are limited by a lack of population vital rates for species characteristic of ballast water discharges. Until such rates become available, PVA models are better suited to evaluate relative efficiency of proposed standards rather than predicting probabilities of invasion. The PCIP approach, which is based on historical invasion rates at a regional scale, appears to circumvent many of the indicated problems, although it may underestimate invasions by asexual and parthenogenic species. Further research is needed to better define propagule dose–responses, densities at which Allee effects occur, approaches to predicting the likelihood of invasion from multi-species introductions, and generation of formal comparisons of approaches using standardized scenarios.

  15. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and on existing XML-based representations of project management. A demonstration example of the configuration of a project management information system is provided.

  16. Mathematical Knowledge for Teaching, Standards-Based Mathematics Teaching Practices, and Student Achievement in the Context of the "Responsive Classroom Approach"

    ERIC Educational Resources Information Center

    Ottmar, Erin R.; Rimm-Kaufman, Sara E.; Larsen, Ross A.; Berry, Robert Q.

    2015-01-01

    This study investigates the effectiveness of the Responsive Classroom (RC) approach, a social and emotional learning intervention, on changing the relations between mathematics teacher and classroom inputs (mathematical knowledge for teaching [MKT] and standards-based mathematics teaching practices) and student mathematics achievement. Work was…

  17. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  18. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    PubMed

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health as a critical component of prosthesis rehabilitation for individuals with lower limb amputation is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  19. Some contingencies of spelling

    PubMed Central

    Lee, Vicki L.; Sanderson, Gwenda M.

    1987-01-01

    This paper presents some speculation about the contingencies that might select standard spellings. The speculation is based on a new development in the teaching of spelling—the process writing approach, which lets standard spellings emerge collateral to a high frequency of reading and writing. The paper discusses this approach, contrasts it with behavior-analytic research on spelling, and suggests some new directions for this latter research based on a behavioral interpretation of the process writing approach to spelling. PMID:22477529

  20. A transversal approach to predict gene product networks from ontology-based similarity

    PubMed Central

    Chabalier, Julie; Mosser, Jean; Burgun, Anita

    2007-01-01

Background: Interpretation of transcriptomic data is usually made through a "standard" approach which consists in clustering the genes according to their expression patterns and exploiting Gene Ontology (GO) annotations within each expression cluster. This approach makes it difficult to highlight functional relationships between gene products that belong to different expression clusters. To address this issue, we propose a transversal analysis that aims to predict functional networks based on a combination of GO processes and expression data. Results: The transversal approach presented in this paper consists in computing the semantic similarity between gene products in a Vector Space Model. Through a weighting scheme over the annotations, we take into account the representativity of the terms that annotate a gene product. Comparing annotation vectors results in a matrix of gene product similarities. Combined with expression data, the matrix is displayed as a set of functional gene networks. The transversal approach was applied to 186 genes related to the enterocyte differentiation stages. This approach resulted in 18 functional networks that proved to be biologically relevant. These results were compared with those obtained through a standard approach and with an approach based on information content similarity. Conclusion: Complementary to the standard approach, the transversal approach offers new insight into cellular mechanisms and reveals new research hypotheses by combining gene product networks based on semantic similarity with expression data. PMID:17605807
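    The abstract describes computing semantic similarity between gene products represented as vectors of weighted GO annotations. A minimal sketch of that step is shown below, using an IDF-style weight and cosine similarity; the toy matrix and the exact weighting scheme are assumptions, since the paper's own scheme is not reproduced here.

    ```python
    import numpy as np

    # Toy annotation matrix: rows = gene products, columns = GO terms (1 = annotated).
    genes = ["geneA", "geneB", "geneC"]
    annotations = np.array([
        [1, 1, 0, 1],
        [1, 0, 1, 1],
        [0, 1, 1, 0],
    ], dtype=float)

    # Down-weight GO terms that annotate many gene products (a representativity-style weight).
    n_genes = annotations.shape[0]
    term_freq = annotations.sum(axis=0)
    idf = np.log((n_genes + 1) / term_freq)
    weighted = annotations * idf

    # Cosine similarity between annotation vectors gives a gene-product similarity matrix,
    # which can then be thresholded and combined with expression data to draw networks.
    unit = weighted / np.linalg.norm(weighted, axis=1, keepdims=True)
    similarity = unit @ unit.T
    print(np.round(similarity, 2))
    ```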

  1. Quantitative photoacoustic imaging in the acoustic regime using SPIM

    NASA Astrophysics Data System (ADS)

    Beigl, Alexander; Elbau, Peter; Sadiq, Kamran; Scherzer, Otmar

    2018-05-01

While in standard photoacoustic imaging the propagation of sound waves is modeled by the standard wave equation, our approach is based on a generalized wave equation with variable sound speed and material density. In this paper we present an approach for photoacoustic imaging which, in addition to recovering the absorption density parameter, the imaging parameter of standard photoacoustics, also allows us to reconstruct the spatially varying sound speed and density of the medium. We provide analytical reconstruction formulas for all three parameters, based on a linearized model and on single plane illumination microscopy (SPIM) techniques.
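    For orientation, the "generalized wave equation with variable sound speed and material density" referred to above is usually written in the following standard acoustics form (an assumption about the starting point; the paper's exact operator and linearization may differ):

    \[
    \frac{1}{\rho(x)\,c(x)^{2}}\,\frac{\partial^{2} p}{\partial t^{2}}
      \;-\; \nabla\cdot\!\Big(\frac{1}{\rho(x)}\,\nabla p\Big) \;=\; 0,
    \qquad
    p(0,x)=h(x), \quad \partial_{t}p(0,x)=0,
    \]

    where \(p\) is the acoustic pressure, \(c(x)\) the sound speed, \(\rho(x)\) the mass density, and \(h(x)\) the absorbed energy density that standard photoacoustics reconstructs.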

  2. Situating Standard Setting within Argument-Based Validity

    ERIC Educational Resources Information Center

    Papageorgiou, Spiros; Tannenbaum, Richard J.

    2016-01-01

    Although there has been substantial work on argument-based approaches to validation as well as standard-setting methodologies, it might not always be clear how standard setting fits into argument-based validity. The purpose of this article is to address this lack in the literature, with a specific focus on topics related to argument-based…

  3. Standardization approaches in absolute quantitative proteomics with mass spectrometry.

    PubMed

    Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo

    2017-07-31

Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in recent decades. This development is reflected in better quantitative assessment of protein levels as well as in improved understanding of post-translational modifications and of protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from the relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more thorough characterization of biological models and comprehension of proteome dynamics, as well as for the search for, and validation of, novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, which therefore require the use of specially designed standardization approaches to provide absolute quantifications. The most common such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) the use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift and can be used to tag both analyte and standard samples; (iv) label-free approaches in which the absolute quantitative data are obtained not through any kind of labeling but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to provide direct absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical insight, from the analytical chemistry perspective, into the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) proteomics is provided in this review. © 2017 Wiley Periodicals, Inc.
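    The simplest arithmetic behind approaches (i) and (ii) is one-point isotope dilution, sketched below under the assumption of equal response factors for the labeled and unlabeled species (real workflows use multi-point calibration and response-factor corrections):

    \[
    c_{\text{analyte}} \;=\; \frac{A_{\text{analyte}}}{A_{\text{IS}}}\times c_{\text{IS}},
    \]

    where \(A\) is the measured signal (e.g., peak area) of the proteotypic peptide and of its stable isotope-labeled internal standard (IS), and \(c_{\text{IS}}\) is the known spiked concentration of the standard.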

  4. An Open Metadata Schema for Clinical Pathway (openCP) in China.

    PubMed

    Xu, Wei; Zhu, Yanxin; Wang, Xia

    2017-01-01

China has issued and implemented standard clinical pathways (Chinese standard CPs) since 2009; however, they are still paper-based CPs. The aim of the study is to reorganize Chinese standard CPs based on related Chinese medical standards, using an archetype approach, and to develop an open platform for CPs (openCP) in China.

  5. Comparison of Web-Based and Face-to-Face Standard Setting Using the Angoff Method

    ERIC Educational Resources Information Center

    Katz, Irvin R.; Tannenbaum, Richard J.

    2014-01-01

    Web-based standard setting holds promise for reducing the travel and logistical inconveniences of traditional, face-to-face standard setting meetings. However, because there are few published reports of setting standards via remote meeting technology, little is known about the practical potential of the approach, including technical feasibility of…

  6. Evaluation of portfolio credit risk based on survival analysis for progressive censored data

    NASA Astrophysics Data System (ADS)

    Jaber, Jamil J.; Ismail, Noriszura; Ramli, Siti Norafidah Mohd

    2017-04-01

In credit risk management, the Basel Committee offers financial institutions a choice of three approaches for calculating the required capital: the standardized approach, the Internal Ratings-Based (IRB) approach, and the Advanced IRB approach. The IRB approach is usually preferred over the standardized approach because of its higher accuracy and lower capital charges. This paper uses several parametric models (exponential, log-normal, gamma, Weibull, log-logistic, Gompertz) to evaluate the credit risk of the corporate portfolio of Jordanian banks, based on a monthly sample collected from January 2010 to December 2015. The best model is selected using several goodness-of-fit criteria (MSE, AIC, BIC). The results indicate that the Gompertz distribution is the best-fitting parametric model for the data.
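    A minimal sketch of the model-selection step named in the abstract (maximum-likelihood fits ranked by AIC/BIC) is shown below. The data are made up, SciPy's fisk and gompertz distributions stand in for the log-logistic and Gompertz forms, and the handling of progressive censoring described in the paper is omitted for brevity.

    ```python
    import numpy as np
    from scipy import stats

    # Made-up "time to default" observations (months); real data would come from the portfolio.
    rng = np.random.default_rng(0)
    times = rng.weibull(1.5, size=200) * 24.0

    candidates = {
        "exponential": stats.expon,
        "log-normal": stats.lognorm,
        "gamma": stats.gamma,
        "weibull": stats.weibull_min,
        "log-logistic": stats.fisk,      # SciPy's name for the log-logistic distribution
        "gompertz": stats.gompertz,
    }

    results = []
    for name, dist in candidates.items():
        params = dist.fit(times, floc=0)              # maximum-likelihood fit, location fixed at 0
        loglik = np.sum(dist.logpdf(times, *params))  # log-likelihood at the fitted parameters
        k = len(params) - 1                           # number of free parameters (loc was fixed)
        aic = 2 * k - 2 * loglik
        bic = k * np.log(times.size) - 2 * loglik
        results.append((name, aic, bic))

    for name, aic, bic in sorted(results, key=lambda r: r[1]):
        print(f"{name:12s}  AIC={aic:9.1f}  BIC={bic:9.1f}")
    ```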

  7. Short Shrift to Long Lists: An Alternative Approach to the Development of Performance Standards for School Principals.

    ERIC Educational Resources Information Center

    Louden, William; Wildy, Helen

    1999-01-01

    Describes examples of standards frameworks for principals' work operant in three countries and describes an alternative approach based on interviewing 40 Australian principals. By combining qualitative case studies with probabilistic measurement techniques, the alternative approach provides contextually rich descriptions of growth in performance…

  8. Report on Pairing-based Cryptography.

    PubMed

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST's position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed.

  9. Report on Pairing-based Cryptography

    PubMed Central

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST’s position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed. PMID:26958435

  10. Comparison of validity of mapping between drug indications and ICD-10. Direct and indirect terminology based approaches.

    PubMed

    Choi, Y; Jung, C; Chae, Y; Kang, M; Kim, J; Joung, K; Lim, J; Cho, S; Sung, S; Lee, E; Kim, S

    2014-01-01

Mapping of drug indications to ICD-10 was undertaken in Korea by a public and a private institution for their own purposes. A different mapping approach was used by each institution, which presented a good opportunity to compare the validity of the two approaches. This study was undertaken to compare the validity of a direct mapping approach and an indirect terminology based mapping approach of drug indications against the gold standard drawn from the results of the two mapping processes. Three hundred and seventy-five cardiovascular reference drugs were selected from all listed cardiovascular drugs for the study. In the direct approach, two experienced nurse coders mapped the free text indications directly to ICD-10. In the indirect terminology based approach, the indications were extracted and coded in the Korean Standard Terminology of Medicine. These terminology coded indications were then manually mapped to ICD-10. The results of the two approaches were compared to the gold standard. A kappa statistic was calculated to assess the agreement between the two mapping approaches. Recall, precision and F1 score of each mapping approach were calculated and analyzed using a paired t-test. The mean number of indications for the study drugs was 5.42. The mean number of ICD-10 codes that matched in the direct approach was 46.32 and that of the indirect terminology based approach was 56.94. The agreement of the mapping results between the two approaches was poor (kappa = 0.19). The indirect terminology based approach showed higher recall (86.78%) than the direct approach (p < 0.001). However, there was no difference in precision and F1 score between the two approaches. Considering that there were no differences in the F1 scores, both approaches may be used in practice for mapping drug indications to ICD-10. However, in terms of consistency, time and manpower, better results are expected from the indirect terminology based approach.
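    A minimal sketch of the per-drug evaluation metrics named in the abstract (precision, recall, F1 against the gold standard) is shown below; the ICD-10 code sets are invented for illustration, and the study's kappa and paired t-test steps are not reproduced.

    ```python
    def precision_recall_f1(predicted: set, gold: set) -> tuple:
        """Compare one approach's ICD-10 codes for a drug against the gold standard."""
        true_pos = len(predicted & gold)
        precision = true_pos / len(predicted) if predicted else 0.0
        recall = true_pos / len(gold) if gold else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        return precision, recall, f1

    # Made-up code sets for a single drug; in the study this is repeated over 375 drugs
    # and the per-drug scores are then compared between approaches.
    gold = {"I10", "I20.9", "I50.0", "E78.5"}
    direct = {"I10", "I50.0", "R03.0"}
    indirect = {"I10", "I20.9", "I50.0", "E78.5", "I25.1"}

    for name, codes in [("direct", direct), ("indirect terminology-based", indirect)]:
        p, r, f = precision_recall_f1(codes, gold)
        print(f"{name:28s} precision={p:.2f} recall={r:.2f} F1={f:.2f}")
    ```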

  11. Variability of pesticide detections and concentrations in field replicate water samples collected for the National Water-Quality Assessment Program, 1992-97

    USGS Publications Warehouse

    Martin, Jeffrey D.

    2002-01-01

Correlation analysis indicates that for most pesticides and concentrations, pooled estimates of relative standard deviation rather than pooled estimates of standard deviation should be used to estimate variability because pooled estimates of relative standard deviation are less affected by heteroscedasticity. The median pooled relative standard deviation was calculated for all pesticides to summarize the typical variability for pesticide data collected for the NAWQA Program. The median pooled relative standard deviation was 15 percent at concentrations less than 0.01 micrograms per liter (µg/L), 13 percent at concentrations near 0.01 µg/L, 12 percent at concentrations near 0.1 µg/L, 7.9 percent at concentrations near 1 µg/L, and 2.7 percent at concentrations greater than 5 µg/L. Pooled estimates of standard deviation or relative standard deviation presented in this report are larger than estimates based on averages, medians, smooths, or regression of the individual measurements of standard deviation or relative standard deviation from field replicates. Pooled estimates, however, are the preferred method for characterizing variability because they provide unbiased estimates of the variability of the population. Assessments of variability based on standard deviation (rather than variance) underestimate the true variability of the population. Because pooled estimates of variability are larger than estimates based on other approaches, users of estimates of variability must be cognizant of the approach used to obtain the estimate and must use caution in the comparison of estimates based on different approaches.
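    The pooling described above is typically done by combining the within-set variances of the field replicates; a small sketch of one common estimator is shown below (variable names and replicate values are invented, and this is not necessarily the report's exact procedure).

    ```python
    import numpy as np

    def pooled_rsd(replicate_sets):
        """Pool relative standard deviations across replicate sets:
        sqrt( sum((n_i - 1) * rsd_i**2) / sum(n_i - 1) )."""
        num, den = 0.0, 0
        for reps in replicate_sets:
            reps = np.asarray(reps, dtype=float)
            rsd = reps.std(ddof=1) / reps.mean()
            num += (reps.size - 1) * rsd ** 2
            den += reps.size - 1
        return np.sqrt(num / den)

    # Made-up field-replicate pairs (concentrations in micrograms per liter)
    # falling near the 0.01 ug/L range.
    pairs = [(0.011, 0.013), (0.009, 0.010), (0.012, 0.0105), (0.008, 0.0095)]
    print(f"Pooled RSD = {100 * pooled_rsd(pairs):.0f}%")
    ```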

  12. Standard metrics for a plug-and-play tracker

    NASA Astrophysics Data System (ADS)

    Antonisse, Jim; Young, Darrell

    2012-06-01

The Motion Imagery Standards Board (MISB) has previously established a metadata "micro-architecture" for standards-based tracking. The intent of this work is to facilitate both the collaborative development of competent tracking systems, and the potentially distributed and dispersed execution of tracker system components in real-world execution environments. The approach standardizes a set of five quasi-sequential modules in image-based tracking. However, in order to make the plug-and-play architecture truly useful we need metrics associated with each module (so that, for instance, a researcher who "plugs in" a new component can ascertain whether he/she did better or worse with the component). This paper proposes the choice of a new, unifying set of metrics based on an information-theoretic approach to tracking, which the MISB is nominating as DoD/IC/NATO standards.

  13. Classroom to Community and Back: Using Culturally Responsive, Standards-Based Teaching to Strengthen Family and Community Partnerships and Increase Student Achievement

    ERIC Educational Resources Information Center

    Saifer, Steffen; Edwards, Keisha; Ellis, Debbie; Ko, Lena; Stuczynski, Amy

    2005-01-01

    This document describes how educators can use the knowledge and culture students bring to school in a standards-based curriculum that supports student success. The authors call this approach culturally responsive, standards-based (CRSB) teaching. Unlike multicultural education--which is an important way to incorporate all the world's cultural and…

  14. A Phenomenological Study on the Lived Experience of First and Second Year Teachers in Standards-Based Grading Districts

    ERIC Educational Resources Information Center

    Battistone, William A., Jr.

    2017-01-01

    Problem: There is an existing cycle of questionable grading practices at the K-12 level. As a result, districts continue to search for innovative methods of evaluating and reporting student progress. One result of this effort has been the adoption of a standards-based grading approach. Research concerning standards-based grading implementation has…

  15. Starting Strong: Evidence-­Based Early Literacy Practices

    ERIC Educational Resources Information Center

    Blamey, Katrin; Beauchat, Katherine

    2016-01-01

Four evidence-based instructional approaches create an essential resource for any early literacy teacher or coach. Improve your teaching practices in all areas of early literacy. Use four proven instructional approaches--standards based, evidence based, assessment based, and student based--to improve your teaching practice in all areas of early…

  16. [Poverty and Health: The Living Standard Approach as a Supplementary Concept to Measure Relative Poverty. Results from the German Socio-Economic Panel (GSOEP 2011)].

    PubMed

    Pförtner, T-K

    2016-06-01

A common indicator of the measurement of relative poverty is the disposable income of a household. Current research introduces the living standard approach as an alternative concept for describing and measuring relative poverty. This study compares both approaches with regard to the subjective health status of the German population, and provides theoretical implications for the utilisation of the income and living standard approach in health research. Analyses are based on the German Socio-Economic Panel (GSOEP) from the year 2011, which includes 12,290 private households and 21,106 survey members. Self-rated health was based on a subjective assessment of general health status. Income poverty is based on the equalised disposable income and is applied to a threshold of 60% of the median-based average income. A person will be denoted as deprived (inadequate living standard) if 3 or more out of 11 living standard items are lacking due to financial reasons. To calculate the discriminatory power of both poverty indicators, descriptive analyses and stepwise logistic regression models were applied separately for men and women, adjusted for age, residence, nationality, educational level, occupational status and marital status. The results of the stepwise regression revealed a stronger poverty-health relationship for the living standard indicator. After adjusting for all control variables and the respective poverty indicator, income poverty was not statistically significantly associated with a poor subjective health status among men (OR Men: 1.33; 95% CI: 1.00-1.77) and women (OR Women: 0.98; 95% CI: 0.78-1.22). In contrast, the association between deprivation and subjective health status was statistically significant for men (OR Men: 2.00; 95% CI: 1.57-2.52) and women (OR Women: 2.11; 95% CI: 1.76-2.64). The results of the present study indicate that the income and living standard approaches measure different dimensions of poverty. In comparison to the income approach, the living standard approach captures material deprivation more directly and is relatively robust to gender differences. This study expands the current debate about complementary research on the association between poverty and health. © Georg Thieme Verlag KG Stuttgart · New York.

  17. A model-driven approach to information security compliance

    NASA Astrophysics Data System (ADS)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining the assets that support corporate systems across an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance into the model, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  18. Council for Exceptional Children: Standards for Evidence-Based Practices in Special Education

    ERIC Educational Resources Information Center

    TEACHING Exceptional Children, 2014

    2014-01-01

    In this article, the "Council for Exceptional Children (CEC)" presents Standards for Evidence-Based Practices in Special Education. The statement presents an approach for categorizing the evidence base of practices in special education. The quality indicators and the criteria for categorizing the evidence base of special education…

  19. Improving performance of Zambia Defence Force antiretroviral therapy providers: evaluation of a standards-based approach

    PubMed Central

    Kim, Young Mi; Banda, Joseph; Kanjipite, Webby; Sarkar, Supriya; Bazant, Eva; Hiner, Cyndi; Tholandi, Maya; Reinhardt, Stephanie; Njobvu, Panganani Dalisani; Kols, Adrienne; Benavides, Bruno

    2013-01-01

Background: The Zambia Defence Force (ZDF) has applied the Standards-Based Management and Recognition (SBM-R®) approach, which uses detailed performance standards, at some health facilities to improve HIV-related services offered to military personnel and surrounding civilian communities. This study examines the effectiveness of the SBM-R approach in improving facility readiness and provider performance at ZDF facilities. Methods: We collected data on facility readiness and provider performance before and after the 2010–2012 intervention at 4 intervention sites selected for their relatively poor performance and 4 comparison sites. Assessors observed whether each facility met 16 readiness standards and whether providers met 9 performance standards during consultations with 354 returning antiretroviral therapy (ART) clients. We then calculated the percentages of criteria achieved for each readiness and performance standard and conducted bivariate and multivariate analyses of provider performance data. Results: Facilities' ART readiness scores exceeded 80% before the intervention at both intervention and comparison sites. At endline, scores improved on 4 facility readiness standards in the intervention group but on only 1 standard in the comparison group. Multivariate analysis found that the overall provider performance score increased significantly in the intervention group (from 58% to 84%; P<.01) but not in the comparison group (from 62% to 70%). The before-and-after improvement in scores was significantly greater among intervention sites than among comparison sites for 2 standards: initial assessment of the client's condition and nutrition counseling. Conclusion: The standards-based approach, which involved intensive and mutually reinforcing intervention activities, showed modest improvements in some aspects of providers' performance during ART consultations. Further research is needed to determine whether improvements in provider performance affect client outcomes such as adherence to ART. PMID:25276534

  20. Prediction of Regulation Reserve Requirements in California ISO Control Area based on BAAL Standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Samaan, Nader A.

This paper presents new methodologies developed at Pacific Northwest National Laboratory (PNNL) to estimate regulation capacity requirements in the California ISO control area. Two approaches have been developed: (1) an approach based on statistical analysis of actual historical area control error (ACE) and regulation data, and (2) an approach based on the balancing authority ACE limit (BAAL) control performance standard. The approaches predict regulation reserve requirements on a day-ahead basis, including upward and downward requirements, for each operating hour of the day. California ISO data have been used to test the performance of the proposed algorithms. Results show that the software tool allows saving up to 30% of the regulation procurement cost.

  1. 78 FR 51101 - Regulatory Capital Rules: Regulatory Capital, Enhanced Supplementary Leverage Ratio Standards for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-20

    ... every $100 of current generally applicable leverage exposure based on a group of advanced approaches... approaches adopted by the agencies in July, 2013 (2013 revised capital approaches), the agencies established... organizations subject to the advanced approaches risk-based capital rules. In this notice of proposed rulemaking...

  2. The standard data model approach to patient record transfer.

    PubMed Central

    Canfield, K.; Silva, M.; Petrucci, K.

    1994-01-01

This paper develops an approach to electronic data exchange of patient records from Ambulatory Encounter Systems (AESs). This approach assumes that the AES is based upon a standard data model. The data modeling standard used here is IDEF1X for Entity/Relationship (E/R) modeling. Each site that uses a relational database implementation of this standard data model (or a subset of it) can exchange very detailed patient data with other such sites using industry-standard tools and without excessive programming effort. This design is detailed below for a demonstration project between the research-oriented geriatric clinic at the Baltimore Veterans Affairs Medical Center (BVAMC) and the Laboratory for Healthcare Informatics (LHI) at the University of Maryland. PMID:7949973

  3. NASIS data base management system: IBM 360 TSS implementation. Volume 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

The installation standards for the NASIS data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  4. A Cradle-to-Grave Integrated Approach to Using UNIFORMAT II

    ERIC Educational Resources Information Center

    Schneider, Richard C.; Cain, David A.

    2009-01-01

The ASTM E1557/UNIFORMAT II standard is a three-level, function-oriented classification which links the schematic phase Preliminary Project Descriptions (PPD), based on Construction Specifications Institute (CSI) Practice FF/180, to elemental cost estimates based on R.S. Means Cost Data. With the UNIFORMAT II Standard Classification for Building…

  5. 78 FR 62417 - Regulatory Capital Rules: Regulatory Capital, Implementation of Basel III, Capital Adequacy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ..., Standardized Approach for Risk-Weighted Assets, Market Discipline and Disclosure Requirements, Advanced Approaches Risk-Based Capital Rule, and Market Risk Capital Rule AGENCY: Federal Deposit Insurance... Assets, Market Discipline and Disclosure Requirements, Advanced Approaches Risk-Based Capital Rule, and...

  6. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.

  7. A standards-based approach to quality improvement for HIV services at Zambia Defence Force facilities: results and lessons learned.

    PubMed

    Kols, Adrienne; Kim, Young-Mi; Bazant, Eva; Necochea, Edgar; Banda, Joseph; Stender, Stacie

    2015-07-01

    The Zambia Defence Force adopted the Standards-Based Management and Recognition approach to improve the quality of the HIV-related services at its health facilities. This quality improvement intervention relies on comprehensive, detailed assessment tools to communicate and verify adherence to national standards of care, and to test and implement changes to improve performance. A quasi-experimental evaluation of the intervention was conducted at eight Zambia Defence Force primary health facilities (four facilities implemented the intervention and four did not). Data from three previous analyses are combined to assess the effect of Standards-Based Management and Recognition on three domains: facility readiness to provide services; observed provider performance during antiretroviral therapy (ART) and antenatal care consultations; and provider perceptions of the work environment. Facility readiness scores for ART improved on four of the eight standards at intervention sites, and one standard at comparison sites. Facility readiness scores for prevention of mother-to-child transmission (PMTCT) of HIV increased by 15 percentage points at intervention sites and 7 percentage points at comparison sites. Provider performance improved significantly at intervention sites for both ART services (from 58 to 84%; P < 0.01) and PMTCT services (from 58 to 73%; P = 0.003); there was no significant change at comparison sites. Providers' perceptions of the work environment generally improved at intervention sites and declined at comparison sites; differences in trends between study groups were significant for eight items. A standards-based approach to quality improvement proved effective in supporting healthcare managers and providers to deliver ART and PMTCT services in accordance with evidence-based standards in a health system suffering from staff shortages.

  8. Blueprint. Number 3

    ERIC Educational Resources Information Center

    White, April D., Ed.

    2009-01-01

    States have taken a "standards-based" approach to education during the past two decades; however, as reported in the Hunt Institute-sponsored study by the National Research Council, that approach has fallen short of its lofty and admirable goals. A comprehensive and integrated system of standards, assessments, curriculum, instructional materials,…

  9. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO 15408, Center for Internet Security guidelines, and NSA configuration guidelines, together with the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO 13335 and ITIL/ITMS, and of architectural guidelines such as ISO 7498-2, will be explained. Business process level standards such as ISO 17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO 21827, the NSA INFOSEC Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss standards-based approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all of the mentioned standards. The proposed 3D visual presentation of the approach and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and of the defined risk and security space for modeling and measuring.

  10. 76 FR 42395 - Business Conduct Standards for Security-Based Swap Dealers and Major Security-Based Swap...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-18

    ... received. Table of Contents I. Introduction A. Statutory Framework B. Consultations C. Approach to Drafting.... Generally B. Consistency With CFTC Approach IV. Paperwork Reduction Act A. Summary of Collections of... that may rely on security-based swaps to manage risk and reduce volatility. C. Approach to Drafting the...

  11. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  12. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches to evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654

  13. Potential standards support for activity-based GeoINT

    NASA Astrophysics Data System (ADS)

    Antonisse, Jim

    2012-06-01

    The Motion Imagery Standards Board (MISB) is engaged in multiple initiatives that may provide support for Activity-Based GeoINT (ABG). This paper describes a suite of approaches based on previous MISB work on a standards-based architecture for tracking. It focuses on ABG in the context of standardized tracker results, and shows how the MISB tracker formulation can formalize important components of the ABG problem. The paper proposes a grammar-based formalism for the reporting of activities within a stream of FMV or wide-area surveillance data. Such a grammar would potentially provide an extensible descriptive language for ABG across the community.

  14. Comparison of anchor-based and distributional approaches in estimating important difference in common cold.

    PubMed

    Barrett, Bruce; Brown, Roger; Mundt, Marlon

    2008-02-01

    Evaluative health-related quality-of-life instruments used in clinical trials should be able to detect small but important changes in health status. Several approaches to minimal important difference (MID) and responsiveness have been developed. To compare anchor-based and distributional approaches to important difference and responsiveness for the Wisconsin Upper Respiratory Symptom Survey (WURSS), an illness-specific quality of life outcomes instrument. Participants with community-acquired colds self-reported daily using the WURSS-44. Distribution-based methods calculated standardized effect size (ES) and standard error of measurement (SEM). Anchor-based methods compared daily interval changes to global ratings of change, using: (1) standard MID methods based on correspondence to ratings of "a little better" or "somewhat better," and (2) two-level multivariate regression models. About 150 adults were monitored throughout their colds (1,681 sick days): 88% were white, 69% were women, and 50% had completed college. The mean age was 35.5 years (SD = 14.7). WURSS scores increased 2.2 points from the first to second day, and then dropped by an average of 8.2 points per day from days 2 to 7. The SEM averaged 9.1 during these 7 days. Standard methods yielded a between-day MID of 22 points. Regression models of MID projected 11.3-point daily changes. Dividing these estimates of small-but-important-difference by pooled SDs yielded coefficients of .425 for standard MID, .218 for regression model, .177 for SEM, and .157 for ES. These imply per-group sample sizes of 870 using ES, 616 for SEM, 302 for regression model, and 89 for standard MID, assuming alpha = .05, beta = .20 (80% power), and two-tailed testing. Distribution and anchor-based approaches provide somewhat different estimates of small but important difference, which in turn can have substantial impact on trial design.
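
    A minimal sketch of how the distribution-based quantities described above (ES, SEM) and the resulting per-group sample sizes can be computed. The numeric inputs below are illustrative placeholders, not the WURSS data, and the sample-size expression is the standard two-sample approximation for alpha = .05 and 80% power, not the authors' exact calculation.

        import math

        # Illustrative placeholder values (not the actual WURSS data)
        baseline_sd = 26.6          # pooled SD of daily scores (hypothetical)
        mean_daily_change = 8.2     # average daily improvement, points
        reliability = 0.88          # test-retest reliability (hypothetical)

        # Distribution-based indices
        effect_size = mean_daily_change / baseline_sd        # standardized ES
        sem = baseline_sd * math.sqrt(1.0 - reliability)     # standard error of measurement

        # Per-group sample size, two-sample approximation:
        # n ~= 2 * (z_alpha/2 + z_beta)^2 / d^2, with 1.96 and 0.84 for 80% power
        def n_per_group(d, z_alpha=1.96, z_beta=0.84):
            return math.ceil(2.0 * (z_alpha + z_beta) ** 2 / d ** 2)

        for label, d in [("ES", effect_size), ("SEM-based", sem / baseline_sd)]:
            print(label, round(d, 3), "-> n per group:", n_per_group(d))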

  15. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    ERIC Educational Resources Information Center

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  16. A Comparison of Alkaline Water and Mediterranean Diet vs Proton Pump Inhibition for Treatment of Laryngopharyngeal Reflux.

    PubMed

    Zalvan, Craig H; Hu, Shirley; Greenberg, Barbara; Geliebter, Jan

    2017-10-01

    Laryngopharyngeal reflux (LPR) is a common disorder with protean manifestations in the head and neck. In this retrospective study, we report the efficacy of a wholly dietary approach using alkaline water, a plant-based, Mediterranean-style diet, and standard reflux precautions compared with that of the traditional treatment approach of proton pump inhibition (PPI) and standard reflux precautions. To determine whether treatment with a diet-based approach with standard reflux precautions alone can improve symptoms of LPR compared with treatment with PPI and standard reflux precautions. This was a retrospective medical chart review of 2 treatment cohorts. From 2010 to 2012, 85 patients with LPR that were treated with PPI and standard reflux precautions (PS) were identified. From 2013 to 2015, 99 patients treated with alkaline water (pH >8.0), 90% plant-based, Mediterranean-style diet, and standard reflux precautions (AMS) were identified. The outcome was based on change in Reflux Symptom Index (RSI). Recorded change in the RSI after 6 weeks of treatment. Of the 184 patients identified in the PS and AMS cohorts, the median age of participants in each cohort was 60 years (95% CI, 18-82) and 57 years (95% CI, 18-93), respectively (47 [56.3%] and 61 [61.7%] were women, respectively). The percentage of patients achieving a clinically meaningful (≥6 points) reduction in RSI was 54.1% in PS-treated patients and 62.6% in AMS-treated patients (difference between the groups, 8.05; 95% CI, -5.74 to 22.76). The mean reduction in RSI was 27.2% for the PS group and 39.8% in the AMS group (difference, 12.10; 95% CI, 1.53 to 22.68). Our data suggest that the effect of PPI on the RSI based on proportion reaching a 6-point reduction in RSI is not significantly better than that of alkaline water, a plant-based, Mediterranean-style diet, and standard reflux precautions, although the difference in the 2 treatments could be clinically meaningful in favor of the dietary approach. The percent reduction in RSI was significantly greater with the dietary approach. Because the relationship between percent change and response to treatment has not been studied, the clinical significance of this difference requires further study. Nevertheless, this study suggests that a plant-based diet and alkaline water should be considered in the treatment of LPR. This approach may effectively improve symptoms and could avoid the costs and adverse effects of pharmacological intervention as well as afford the additional health benefits associated with a healthy, plant-based diet.

  17. Effects of a Format-based Second Language Teaching Method in Kindergarten.

    ERIC Educational Resources Information Center

    Uilenburg, Noelle; Plooij, Frans X.; de Glopper, Kees; Damhuis, Resi

    2001-01-01

    Focuses on second language teaching with a format-based method. The differences between a format-based teaching method and a standard approach used as treatments in a quasi-experimental, non-equivalent control group are described in detail. Examines whether the effects of a format-based teaching method and a standard foreign language method differ…

  18. Robotic Anterior and Midline Skull Base Surgery: Preclinical Investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Malley, Bert W.; Weinstein, Gregory S.

    Purpose: To develop a minimally invasive surgical technique to access the midline and anterior skull base using the optical and technical advantages of robotic surgical instrumentation. Methods and Materials: Ten experimental procedures focusing on approaches to the nasopharynx, clivus, sphenoid, pituitary sella, and suprasellar regions were performed on one cadaver and one live mongrel dog. Both the cadaver and canine procedures were performed in an approved training facility using the da Vinci Surgical Robot. For the canine experiments, a transoral robotic surgery (TORS) approach was used, and for the cadaver a newly developed combined cervical-transoral robotic surgery (C-TORS) approach was investigated and compared with standard TORS. The ability to access and dissect tissues within the various areas of the midline and anterior skull base was evaluated, and techniques to enhance visualization and instrumentation were developed. Results: Standard TORS approaches did not provide adequate access to the midline and anterior skull base; however, the newly developed C-TORS approach was successful in providing surgical access to these regions of the skull base. Conclusion: Robotic surgery is an exciting minimally invasive approach to the skull base that warrants continued preclinical investigation and development.

  19. PRA and Risk Informed Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.

    2006-01-01

    The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk based approach into Section XI, which covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk based approach requires the application of probabilistic risk assessments (PRAs). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk based application. The paper describes the PRA standard, Section XI application of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. Important considerations are the special methods (surrogate components) used to overcome the lack of treatment of passive components in PRAs. The approach allows calculations of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of these systems. The paper relates the explicit risk based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of expert panels that review the risk based programs.

  20. Interpreting the Right to an Education as a Norm Referenced Adequacy Standard

    ERIC Educational Resources Information Center

    Pijanowski, John

    2016-01-01

    Our current conceptions of educational adequacy emerged out of an era dominated by equity-based school resource litigation. During that time of transitioning between successful litigation strategies, legal opinions provided clues as to how future courts might view a norm-referenced approach to establishing an adequacy standard--an approach that…

  1. Modeling healthcare authorization and claim submissions using the openEHR dual-model approach

    PubMed Central

    2011-01-01

    Background The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results The archetypes based on the openEHR RM (Reference Model) can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems and the current tools must be adapted to deal with it. Conclusions Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing processes. A complete communication architecture to simulate the exchange of TISS data between systems according to the openEHR approach still needs to be designed and implemented. PMID:21992670

  2. NASIS data base management system - IBM 360/370 OS MVT implementation. 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASA Aerospace Safety Information System (NASIS) data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  3. A New Proof of the Expected Frequency Spectrum under the Standard Neutral Model.

    PubMed

    Hudson, Richard R

    2015-01-01

    The sample frequency spectrum is an informative and frequently employed approach for summarizing DNA variation data. Under the standard neutral model the expectation of the sample frequency spectrum has been derived by at least two distinct approaches. One relies on using results from diffusion approximations to the Wright-Fisher Model. The other is based on Pólya urn models that correspond to the standard coalescent model. A new proof of the expected frequency spectrum is presented here. It is a proof by induction and does not require diffusion results and does not require the somewhat complex sums and combinatorics of the derivations based on urn models.
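
    The expectation discussed above has a well-known closed form under the standard neutral (infinite-sites, constant-size) model: E[xi_i] = theta / i for the unfolded spectrum of a sample of size n. A minimal sketch, with theta and n chosen arbitrarily for illustration:

        # Expected unfolded site-frequency spectrum under the standard neutral model:
        # E[xi_i] = theta / i, for i = 1, ..., n-1, where xi_i is the number of sites
        # at which the derived allele appears i times in a sample of n sequences.
        theta = 5.0   # illustrative population-scaled mutation rate
        n = 10        # sample size

        expected_sfs = [theta / i for i in range(1, n)]
        total = sum(expected_sfs)  # equals theta * sum_{i=1}^{n-1} 1/i

        for i, e in enumerate(expected_sfs, start=1):
            print(f"i = {i}: E[xi_i] = {e:.3f}  (proportion {e / total:.3f})")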

  4. Middle school science curriculum design and 8th grade student achievement in Massachusetts public schools

    NASA Astrophysics Data System (ADS)

    Clifford, Betsey A.

    The Massachusetts Department of Elementary and Secondary Education (DESE) released proposed Science and Technology/Engineering standards in 2013 outlining the concepts that should be taught at each grade level. Previously, standards were in grade spans and each district determined the method of implementation. There are two different methods used to teach middle school science: integrated and discipline-based. In the proposed standards, the Massachusetts DESE uses grade-by-grade standards with an integrated approach. It was not known whether there is a statistically significant difference in student achievement on the 8th grade science MCAS assessment for students taught with an integrated or discipline-based approach. The results on the 8th grade science MCAS test from six public school districts from 2010 -- 2013 were collected and analyzed. The methodology used was quantitative. Results of an ANOVA showed that there was no statistically significant difference in overall student achievement between the two curriculum models. Furthermore, there was no statistically significant difference for the various domains: Earth and Space Science, Life Science, Physical Science, and Technology/Engineering. This information is useful for districts hesitant to make the change from a discipline-based approach to an integrated approach. More research should be conducted on this topic with a larger sample size to better support the results.
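
    A rough illustration of the kind of comparison described above. The district-level scores are hypothetical placeholders (not the Massachusetts MCAS data), and the one-way ANOVA via scipy is simply one common way to run such a test:

        from scipy import stats

        # Hypothetical mean scaled scores for districts using each curriculum model
        integrated_scores = [232.1, 228.4, 235.0, 230.7]     # placeholder values
        discipline_scores = [229.8, 233.2, 231.5, 227.9]     # placeholder values

        f_stat, p_value = stats.f_oneway(integrated_scores, discipline_scores)
        print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
        # A p-value above 0.05 would be consistent with the study's finding of
        # no statistically significant difference between the two approaches.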

  5. Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.

    ERIC Educational Resources Information Center

    Pawlowski, Jan M.

    Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…

  6. Treatment effect heterogeneity for univariate subgroups in clinical trials: Shrinkage, standardization, or else

    PubMed Central

    Varadhan, Ravi; Wang, Sue-Jane

    2016-01-01

    Treatment effect heterogeneity is a well-recognized phenomenon in randomized controlled clinical trials. In this paper, we discuss subgroup analyses with prespecified subgroups of clinical or biological importance. We explore various alternatives to the naive (the traditional univariate) subgroup analyses to address the issues of multiplicity and confounding. Specifically, we consider a model-based Bayesian shrinkage (Bayes-DS) and a nonparametric, empirical Bayes shrinkage approach (Emp-Bayes) to temper the optimism of traditional univariate subgroup analyses; a standardization approach (standardization) that accounts for correlation between baseline covariates; and a model-based maximum likelihood estimation (MLE) approach. The Bayes-DS and Emp-Bayes methods model the variation in subgroup-specific treatment effect rather than testing the null hypothesis of no difference between subgroups. The standardization approach addresses the issue of confounding in subgroup analyses. The MLE approach is considered only for comparison in simulation studies as the “truth” since the data were generated from the same model. Using the characteristics of a hypothetical large outcome trial, we perform simulation studies and articulate the utilities and potential limitations of these estimators. Simulation results indicate that Bayes-DS and Emp-Bayes can protect against optimism present in the naïve approach. Due to its simplicity, the naïve approach should be the reference for reporting univariate subgroup-specific treatment effect estimates from exploratory subgroup analyses. Standardization, although it tends to have a larger variance, is suggested when it is important to address the confounding of univariate subgroup effects due to correlation between baseline covariates. The Bayes-DS approach is available as an R package (DSBayes). PMID:26485117
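
    A minimal sketch of a generic empirical-Bayes shrinkage of subgroup treatment effects toward the overall effect, in the spirit of the Emp-Bayes approach described above; this is not the authors' DSBayes implementation, and the estimates and standard errors below are placeholders:

        import numpy as np

        # Naive subgroup estimates and their standard errors (hypothetical values)
        effects = np.array([0.40, 0.10, 0.55, -0.05])
        ses     = np.array([0.20, 0.18, 0.25, 0.22])

        weights = 1.0 / ses**2
        overall = np.sum(weights * effects) / np.sum(weights)

        # Method-of-moments estimate of the between-subgroup variance
        q = np.sum(weights * (effects - overall) ** 2)
        k = len(effects)
        tau2 = max(0.0, (q - (k - 1)) /
                   (np.sum(weights) - np.sum(weights**2) / np.sum(weights)))

        # Shrinkage: the noisier a subgroup estimate, the more it is pulled toward the overall effect
        shrink = tau2 / (tau2 + ses**2)
        shrunk = overall + shrink * (effects - overall)
        print("overall:", round(overall, 3), "shrunken estimates:", np.round(shrunk, 3))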

  7. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
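
    A minimal sketch of the standard internal-calibrant relation underlying qHNMR quantification, where the analyte amount follows from the integral ratio to the calibrant corrected for proton counts and molar masses. All numeric values are illustrative placeholders, not results from the study:

        # qHNMR internal-calibrant relation (illustrative values only)
        I_a,  I_cal  = 1.85, 1.00     # normalized signal integrals (analyte, calibrant)
        N_a,  N_cal  = 1,    2        # number of protons behind each integrated signal
        M_a,  M_cal  = 458.4, 180.2   # molar masses in g/mol (e.g., EGCG vs a calibrant)
        m_cal        = 2.5            # mg of internal calibrant weighed into the tube
        P_cal        = 0.999          # purity of the calibrant

        # mass of analyte in the tube (mg)
        m_a = (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * m_cal * P_cal
        print(f"analyte mass: {m_a:.3f} mg")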

  8. Clinical Decision Support-based Quality Measurement (CDS-QM) Framework: Prototype Implementation, Evaluation, and Future Directions

    PubMed Central

    Kukhareva, Polina V; Kawamoto, Kensaku; Shields, David E; Barfuss, Darryl T; Halley, Anne M; Tippetts, Tyler J; Warner, Phillip B; Bray, Bruce E; Staes, Catherine J

    2014-01-01

    Electronic quality measurement (QM) and clinical decision support (CDS) are closely related but are typically implemented independently, resulting in significant duplication of effort. While it seems intuitive that technical approaches could be re-used across these two related use cases, such reuse is seldom reported in the literature, especially for standards-based approaches. Therefore, we evaluated the feasibility of using a standards-based CDS framework aligned with anticipated EHR certification criteria to implement electronic QM. The CDS-QM framework was used to automate a complex national quality measure (SCIP-VTE-2) at an academic healthcare system which had previously relied on time-consuming manual chart abstractions. Compared with 305 manually-reviewed reference cases, the recall of automated measurement was 100%. The precision was 96.3% (CI:92.6%-98.5%) for ascertaining the denominator and 96.2% (CI:92.3%-98.4%) for the numerator. We therefore validated that a standards-based CDS-QM framework can successfully enable automated QM, and we identified benefits and challenges with this approach. PMID:25954389
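
    A minimal sketch of how recall, precision and their binomial confidence intervals, as reported above, can be computed; the counts are hypothetical and do not reproduce the SCIP-VTE-2 validation figures:

        from statsmodels.stats.proportion import proportion_confint

        # Hypothetical validation counts against manually reviewed reference cases
        true_pos, false_neg, false_pos = 290, 0, 11

        recall = true_pos / (true_pos + false_neg)
        precision = true_pos / (true_pos + false_pos)

        # Exact (Clopper-Pearson) 95% confidence intervals
        recall_ci = proportion_confint(true_pos, true_pos + false_neg, alpha=0.05, method="beta")
        precision_ci = proportion_confint(true_pos, true_pos + false_pos, alpha=0.05, method="beta")
        print(f"recall = {recall:.3f}, 95% CI {recall_ci}")
        print(f"precision = {precision:.3f}, 95% CI {precision_ci}")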

  9. Overcoming the Challenges of Unstructured Data in Multi-site, Electronic Medical Record-based Abstraction

    PubMed Central

    Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy JH

    2014-01-01

    Background Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to reliably abstract, as this data is often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. Objective As standard abstraction approaches resulted in sub-standard data reliability for unstructured data elements collected as part of a multi-site, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. Research Design We adopted a “fit-for-use” framework to guide the development and evaluation of abstraction methods using a four step, phase-based approach including (1) team building, (2) identification of challenges, (3) adaptation of abstraction methods, and (4) systematic data quality monitoring. Measures Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (e.g., warfarin initiation) and medical follow-up (e.g., timeframe for follow-up). Results After implementation of the phase-based approach, inter-rater reliability for all unstructured data elements demonstrated kappas of ≥ 0.89 -- an average increase of + 0.25 for each unstructured data element. Conclusions As compared to standard abstraction methodologies, this phase-based approach was more time intensive, but did markedly increase abstraction reliability for unstructured data elements within multi-site EMR documentation. PMID:27624585
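
    Inter-rater reliability for a single unstructured element can be checked with Cohen's kappa, the statistic reported above; a minimal sketch with hypothetical abstractor labels:

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical abstractor labels for one unstructured element
        # (e.g., "warfarin initiation documented": 1 = yes, 0 = no)
        abstractor_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
        abstractor_b = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]

        kappa = cohen_kappa_score(abstractor_a, abstractor_b)
        print(f"Cohen's kappa = {kappa:.2f}")   # values >= 0.89 were reached after the phase-based approach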

  10. Meta-Modeling-Based Groundwater Remediation Optimization under Flexibility in Environmental Standard.

    PubMed

    He, Li; Xu, Zongda; Fan, Xing; Li, Jing; Lu, Hongwei

    2017-05-01

    This study develops a meta-modeling-based mathematical programming approach with flexibility in environmental standards. It integrates numerical simulation, meta-modeling analysis, and fuzzy programming within a general framework. A set of meta-models relating remediation strategies to remediation performance substantially reduces the computational effort of the simulation and optimization process. To prevent over-optimistic or over-pessimistic optimization strategies, a satisfaction level, resulting from the implementation of a flexible standard, indicates the degree to which the environmental standard is satisfied. The proposed approach is applied to a naphthalene-contaminated site in China. Results show that a longer remediation period corresponds to a lower total pumping rate and that a stringent risk standard implies a high total pumping rate. The wells located near, or down-gradient of, the contaminant sources have the most significant efficiency among all remediation schemes.

  11. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes with an analysis and demonstration on how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point to point interface, the messages server and the mediator models. Point to point interface and messages server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue of any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by the HIS-DF and supported in HL7 v3 artifacts - is the more promising one promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.

  12. A Stable Whole Building Performance Method for Standard 90.1-Part II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Eley, Charles

    2016-06-01

    In May of 2013 we introduced a new approach for compliance with Standard 90.1 that was under development based on the Performance Rating Method of Appendix G to Standard 90.1. Since then, the approach has been finalized through Addendum BM to Standard 90.1-2013 and will be published in the 2016 edition of the Standard. In the meantime, ASHRAE has published an advanced copy of Appendix G including Addendum BM and several other addenda so that software developers and energy program administrators can get a preview of what is coming in the 2016 edition of the Standard. This article is an update on Addendum BM, summarizes changes made to the original concept as introduced in May of 2013, and provides an approach for developing performance targets for code compliance and beyond-code programs.

  13. Clinical risk management approach for long-duration space missions.

    PubMed

    Gray, Gary W; Sargsyan, Ashot E; Davis, Jeffrey R

    2010-12-01

    In the process of crewmember evaluation and certification for long-duration orbital missions, the International Space Station (ISS) Multilateral Space Medicine Board (MSMB) encounters a surprisingly wide spectrum of clinical problems. Some of these conditions are identified within the ISS Medical Standards as requiring special consideration, or as falling outside the consensus Medical Standards promulgated for the ISS program. To assess the suitability for long-duration missions on ISS for individuals with medical problems that fall outside of standards or are otherwise of significant concern, the MSMB has developed a risk matrix approach to assess the risks to the individual, the mission, and the program. The goal of this risk assessment is to provide a more objective, evidence- and risk-based approach for aeromedical disposition. Using a 4 x 4 risk matrix, the probability of an event is plotted against the potential impact. Event probability is derived from a detailed review of clinical and aerospace literature, and based on the best available evidence. The event impact (consequences) is assessed and assigned within the matrix. The result has been a refinement of MSMB case assessment based on evidence-based data incorporated into a risk stratification process. This has encouraged an objective assessment of risk and, in some cases, has resulted in recertification of crewmembers with medical conditions which hitherto would likely have been disqualifying. This paper describes a risk matrix approach developed for MSMB disposition decisions. Such an approach promotes objective, evidence-based decision-making and is broadly applicable within the aerospace medicine community.
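
    One way a 4 x 4 probability-by-impact matrix of the kind described above can be encoded for disposition decisions. The category labels and risk bands below are illustrative assumptions, not the MSMB's actual definitions:

        # Illustrative 4 x 4 risk matrix: probability (rows) x impact (columns).
        PROB = ["remote", "unlikely", "possible", "probable"]        # increasing likelihood
        IMPACT = ["negligible", "minor", "major", "catastrophic"]    # increasing consequence

        MATRIX = [
            # negligible  minor     major     catastrophic
            ["low",      "low",    "medium", "medium"],   # remote
            ["low",      "medium", "medium", "high"],     # unlikely
            ["medium",   "medium", "high",   "high"],     # possible
            ["medium",   "high",   "high",   "high"],     # probable
        ]

        def risk_level(probability: str, impact: str) -> str:
            return MATRIX[PROB.index(probability)][IMPACT.index(impact)]

        print(risk_level("possible", "major"))   # -> "high"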

  14. Gold-standard for computer-assisted morphological sperm analysis.

    PubMed

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments for comparing sperm head description and classification common techniques. This classification base-line is aimed to be used as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification base-line compares four supervised learning methods (1- Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistical significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in the morphological sperm analysis. Regarding the classification base line, we show that none of the standard descriptors or classification approaches is best suitable for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm heads. By using the Fourier descriptor and SVM, we achieved the best mean correct classification: only 49%. We conclude that the SCIAN-MorphoSpermGS will provide a standard tool for evaluation of characterization and classification approaches for human sperm heads. Indeed, there is a clear need for a specific shape-based descriptor for human sperm heads and a specific classification approach to tackle the problem of high variability within subcategories of abnormal sperm cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
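
    A compact sketch of the best-performing combination reported above (Fourier descriptors of the head contour fed to an SVM), using synthetic elliptical contours rather than the SCIAN-MorphoSpermGS images; the descriptor normalization and class construction are illustrative assumptions:

        import numpy as np
        from sklearn.svm import SVC

        def fourier_descriptors(contour_xy, n_coeffs=10):
            """Translation/scale-normalized Fourier descriptors of a closed 2-D contour."""
            z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # contour as a complex signal
            coeffs = np.fft.fft(z)
            mags = np.abs(coeffs[1:n_coeffs + 1])
            return mags / (np.abs(coeffs[1]) + 1e-12)      # normalize by the first harmonic

        # Synthetic "head outlines": ellipses with class-dependent elongation plus noise
        rng = np.random.default_rng(0)
        t = np.linspace(0, 2 * np.pi, 64, endpoint=False)

        def make_contour(elongation):
            x = np.cos(t) * elongation + rng.normal(0, 0.02, t.size)
            y = np.sin(t) + rng.normal(0, 0.02, t.size)
            return np.column_stack([x, y])

        X = [fourier_descriptors(make_contour(e)) for e in [1.0] * 30 + [1.8] * 30]
        y = [0] * 30 + [1] * 30     # class 0 ~ "normal", class 1 ~ "tapered"

        clf = SVC(kernel="rbf").fit(X, y)
        print("training accuracy:", clf.score(X, y))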

  15. Figuring It Out: Standard-Based Reforms in Urban Middle Grades.

    ERIC Educational Resources Information Center

    Lewis, Anne C.

    Six urban school districts (Chattanooga, Tennessee, Corpus Christi, Texas, Long Beach, California, Louisville, Kentucky, Minneapolis, Minnesota, and San Diego, California) have been pursuing standard-based reform at the middle school level accepting systemic reform as the norm. This report provides descriptions of their approaches, and commentary…

  16. Control-group feature normalization for multivariate pattern analysis of structural MRI data using the support vector machine.

    PubMed

    Linn, Kristin A; Gaonkar, Bilwaj; Satterthwaite, Theodore D; Doshi, Jimit; Davatzikos, Christos; Shinohara, Russell T

    2016-05-15

    Normalization of feature vector values is a common practice in machine learning. Generally, each feature value is standardized to the unit hypercube or by normalizing to zero mean and unit variance. Classification decisions based on support vector machines (SVMs) or by other methods are sensitive to the specific normalization used on the features. In the context of multivariate pattern analysis using neuroimaging data, standardization effectively up- and down-weights features based on their individual variability. Since the standard approach uses the entire data set to guide the normalization, it utilizes the total variability of these features. This total variation is inevitably dependent on the amount of marginal separation between groups. Thus, such a normalization may attenuate the separability of the data in high dimensional space. In this work we propose an alternate approach that uses an estimate of the control-group standard deviation to normalize features before training. We study our proposed approach in the context of group classification using structural MRI data. We show that control-based normalization leads to better reproducibility of estimated multivariate disease patterns and improves the classifier performance in many cases. Copyright © 2016 Elsevier Inc. All rights reserved.
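
    A minimal sketch of the essential difference between whole-sample standardization and the control-based normalization proposed above, on synthetic data with an assumed group separation:

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(1)
        # Synthetic features: 40 controls and 40 patients, 5 features with group separation
        controls = rng.normal(0.0, 1.0, size=(40, 5))
        patients = rng.normal(0.6, 1.2, size=(40, 5))
        X = np.vstack([controls, patients])
        y = np.array([0] * 40 + [1] * 40)

        # Standard approach: z-score using the WHOLE sample (group separation leaks into the scale)
        X_standard = (X - X.mean(axis=0)) / X.std(axis=0)

        # Proposed approach: z-score using the CONTROL group's mean and SD only
        X_control_norm = (X - controls.mean(axis=0)) / controls.std(axis=0)

        for name, Xn in [("whole-sample", X_standard), ("control-based", X_control_norm)]:
            acc = LinearSVC(dual=False).fit(Xn, y).score(Xn, y)
            print(f"{name} normalization: training accuracy = {acc:.2f}")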

  17. Steps in Moving Evidence-Based Health Informatics from Theory to Practice.

    PubMed

    Rigby, Michael; Magrabi, Farah; Scott, Philip; Doupi, Persephone; Hypponen, Hannele; Ammenwerth, Elske

    2016-10-01

    To demonstrate and promote the importance of applying a scientific process to health IT design and implementation, and of basing this on research principles and techniques. A review by international experts linked to the IMIA Working Group on Technology Assessment and Quality Development. Four approaches are presented, linking to the creation of national professional expectations, adherence to research-based standards, quality assurance approaches to ensure safety, and scientific measurement of impact. Solely marketing- and aspiration-based approaches to health informatics applications are no longer ethical or acceptable when scientifically grounded evidence-based approaches are available and in use.

  18. A process-based standard for the Solar Energetic Particle Event Environment

    NASA Astrophysics Data System (ADS)

    Gabriel, Stephen

    For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences, etc., have centred around the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a ‘process-based’ standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which not only could have quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard approach overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent, in that the data set and the modelling techniques used have not only to be clearly and unambiguously defined but also to be subject to peer review. If the model meets all of these requirements then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; however, most importantly it allows something which so far has been impossible without ambiguities and disagreement, and that is a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the various SEPE statistical models has been caused by two things: 1) the data set and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one; hence, when it comes to using these models for engineering purposes to calculate, for example, the radiation dose for a particular mission, the user, who is in all likelihood not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition and a resulting standard event list. The standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results, when compared, will depend only on the statistical model and not on the data set or event definition.

  19. Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.
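
    A minimal sketch of the kind of standards-based request a client issues against a WMS endpoint of the sort described above; the endpoint URL and layer name are hypothetical placeholders, not actual NSIDC service identifiers:

        import requests

        # Hypothetical WMS endpoint, used only to illustrate a standard OGC WMS 1.1.1 GetMap request
        WMS_URL = "https://example.org/ogc/wms"          # placeholder, not a real NSIDC URL
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "sea_ice_extent",                  # hypothetical layer name
            "SRS": "EPSG:4326",
            "BBOX": "-180,-90,180,90",
            "WIDTH": 512,
            "HEIGHT": 256,
            "FORMAT": "image/png",
        }

        response = requests.get(WMS_URL, params=params, timeout=30)
        if response.ok and response.headers.get("Content-Type", "").startswith("image/"):
            with open("map.png", "wb") as f:
                f.write(response.content)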

  20. CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability

    NASA Technical Reports Server (NTRS)

    Claus, Russell; Weitzer, Ilan

    2002-01-01

    Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meet these demands is to use 'Geometry Centric' design. In this approach, design engineers team their efforts through one united representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link to enable 'Geometry Centric' design is called: Cad Services V1.0. This paper discusses the features of this standard and proposed application.

  1. Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries.

    PubMed

    Shafiey, Hassan; Gan, Xinjun; Waxman, David

    2017-11-01

    To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring, within the stochastic differential equation, prevent the process entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme, for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach, so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy to the proposed scheme, because the standard approach requires a significantly smaller time step.
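
    A minimal sketch of the "standard" resetting scheme criticized above, applied to a square-root diffusion with a naturally occurring boundary at zero. The drift and noise parameters are arbitrary illustrative choices, and the paper's corrected scheme is not reproduced here:

        import numpy as np

        # Euler-Maruyama simulation of dX = a*(m - X) dt + sigma*sqrt(X) dW, which has a
        # natural boundary at X = 0. The "standard" approach below resets any trajectory
        # that crosses into the forbidden region X < 0 back to the boundary; the paper
        # shows that this resetting introduces a spurious force near the boundary.
        a, m, sigma = 1.0, 0.5, 0.8
        dt, n_steps, n_paths = 1e-3, 5000, 2000

        rng = np.random.default_rng(42)
        x = np.full(n_paths, 0.05)                      # start close to the boundary

        for _ in range(n_steps):
            dw = rng.normal(0.0, np.sqrt(dt), n_paths)
            x = x + a * (m - x) * dt + sigma * np.sqrt(np.maximum(x, 0.0)) * dw
            x = np.maximum(x, 0.0)                      # "standard" reset to the boundary

        print("mean at final time (standard resetting scheme):", x.mean())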

  2. Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries

    NASA Astrophysics Data System (ADS)

    Shafiey, Hassan; Gan, Xinjun; Waxman, David

    2017-11-01

    To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring, within the stochastic differential equation, prevent the process entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme, for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach, so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy to the proposed scheme, because the standard approach requires a significantly smaller time step.

  3. A robust approach to using of the redundant information in the temperature calibration

    NASA Astrophysics Data System (ADS)

    Strnad, R.; Kňazovická, L.; Šindelář, M.; Kukal, J.

    2013-09-01

    Calibration laboratories use standard procedures for calculating calibration model coefficients based on well-described standards (EN 60751, ITS-90, EN 60584, etc.). In practice, sensors are mostly calibrated at more points than the model requires, and the redundant information is used to validate the model. This paper presents the influence of including all measured points, weighted according to their uncertainties, in the fitted models using standard weighted least squares methods. A special case concerning different levels of uncertainty of the measured points under a robust approach will be discussed. This leads to different minimization criteria and a different uncertainty propagation methodology. The approach also eliminates the influence of outlier measurements on the calibration. In the practical part, three cases of this approach are presented: industrial calibration according to EN 60751, an SPRT according to the ITS-90, and a thermocouple according to EN 60584.
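
    A minimal sketch of a weighted least squares fit of a calibration polynomial in which each point is weighted by the inverse square of its uncertainty, as described above; the temperature, resistance and uncertainty values are illustrative placeholders rather than data from any of the cited standards:

        import numpy as np

        # Calibration points: reference temperatures t (deg C), measured resistances r (ohm),
        # and their standard uncertainties u (ohm). All values are illustrative placeholders.
        t = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
        r = np.array([100.00, 119.40, 138.51, 157.33, 175.86])
        u = np.array([0.005, 0.005, 0.010, 0.010, 0.020])

        # Weighted least squares fit of r = c0 + c1*t + c2*t^2, with weights = 1/u^2
        X = np.column_stack([np.ones_like(t), t, t**2])
        W = np.diag(1.0 / u**2)
        coeffs = np.linalg.solve(X.T @ W @ X, X.T @ W @ r)

        residuals = r - X @ coeffs
        print("coefficients:", coeffs)
        print("weighted residuals:", residuals / u)     # large values flag potential outliers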

  4. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
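
    A minimal sketch contrasting the two estimators discussed above on simulated data: the classical approach fits the response on the standards and inverts the line, while reverse regression fits the standards directly on the response. The simulated standards and noise level are arbitrary illustrative choices:

        import numpy as np

        rng = np.random.default_rng(7)
        # Simulated calibration experiment: known standards x, noisy instrument readings y
        x_std = np.linspace(1.0, 10.0, 20)                      # "gold standard" values
        y_obs = 2.0 + 0.5 * x_std + rng.normal(0.0, 0.1, 20)    # instrument response

        # Classical approach: regress y on x, then invert the fitted line
        b1, b0 = np.polyfit(x_std, y_obs, 1)                    # y ~ b0 + b1*x
        predict_classical = lambda y_new: (y_new - b0) / b1

        # Reverse approach: regress x on y directly
        c1, c0 = np.polyfit(y_obs, x_std, 1)                    # x ~ c0 + c1*y
        predict_reverse = lambda y_new: c0 + c1 * y_new

        y_new = 4.6   # a new instrument reading to be converted into a measurement
        print("classical estimate:", predict_classical(y_new))
        print("reverse estimate:  ", predict_reverse(y_new))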

  5. Modified Redundancy based Technique—a New Approach to Combat Error Propagation Effect of AES

    NASA Astrophysics Data System (ADS)

    Sarkar, B.; Bhunia, C. T.; Maulik, U.

    2012-06-01

    The Advanced Encryption Standard (AES) is a great research challenge. It was developed to replace the Data Encryption Standard (DES). AES suffers from a major limitation: the error propagation effect. To tackle this limitation, two methods are available. One is the redundancy-based technique and the other is the bit-based parity technique. The first has the significant advantage over the second of correcting any error definitively, but at the cost of a higher level of overhead and hence a lower processing speed. In this paper, a new approach based on the redundancy-based technique is proposed that would certainly speed up the process of reliable encryption and hence secure communication.

  6. Robotic and endoscopic transaxillary thyroidectomies may be cost prohibitive when compared to standard cervical thyroidectomy: a cost analysis.

    PubMed

    Cabot, Jennifer C; Lee, Cho Rok; Brunaud, Laurent; Kleiman, David A; Chung, Woong Youn; Fahey, Thomas J; Zarnegar, Rasa

    2012-12-01

    This study presents a cost analysis of the standard cervical, gasless transaxillary endoscopic, and gasless transaxillary robotic thyroidectomy approaches based on medical costs in the United States. A retrospective review of 140 patients who underwent standard cervical, transaxillary endoscopic, or transaxillary robotic thyroidectomy at 2 tertiary centers was conducted. The cost model included operating room charges, anesthesia fee, consumables cost, equipment depreciation, and maintenance cost. Sensitivity analyses assessed individual cost variables. The mean operative times for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were 121 ± 18.9, 185 ± 26.0, and 166 ± 29.4 minutes, respectively. The total cost for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were $9,028 ± $891, $12,505 ± $1,222, and $13,670 ± $1,384, respectively. Transaxillary approaches were significantly more expensive than the standard cervical technique (standard cervical/transaxillary endoscopic, P < .0001; standard cervical/transaxillary robotic, P < .0001; and transaxillary endoscopic/transaxillary robotic, P = .001). The transaxillary and standard cervical techniques became equivalent in cost when transaxillary endoscopic operative time decreased to 111 minutes and transaxillary robotic operative time decreased to 68 minutes. Increasing the case load did not resolve the cost difference. Transaxillary endoscopic and transaxillary robotic thyroidectomies are significantly more expensive than the standard cervical approach. Decreasing operative times reduces this cost difference. The greater expense may be prohibitive in countries with a flat reimbursement schedule. Copyright © 2012 Mosby, Inc. All rights reserved.

  7. Software Engineering Research/Developer Collaborations (C104)

    NASA Technical Reports Server (NTRS)

    Shell, Elaine; Shull, Forrest

    2005-01-01

    The goal of this collaboration was to produce Flight Software Branch (FSB) process standards for software inspections which could be used across three new missions within the FSB. The standard was developed by Dr. Forrest Shull (Fraunhofer Center for Experimental Software Engineering, Maryland) using the Perspective-Based Inspection approach, (PBI research has been funded by SARP) , then tested on a pilot Branch project. Because the short time scale of the collaboration ruled out a quantitative evaluation, it would be decided whether the standard was suitable for roll-out to other Branch projects based on a qualitative measure: whether the standard received high ratings from Branch personnel as to usability and overall satisfaction. The project used for piloting the Perspective-Based Inspection approach was a multi-mission framework designed for reuse. This was a good choice because key representatives from the three new missions would be involved in the inspections. The perspective-based approach was applied to produce inspection procedures tailored for the specific quality needs of the branch. The technical information to do so was largely drawn through a series of interviews with Branch personnel. The framework team used the procedures to review requirements. The inspections were useful for indicating that a restructuring of the requirements document was needed, which led to changes in the development project plan. The standard was sent out to other Branch personnel for review. Branch personnel were very positive. However, important changes were identified because the perspective of Attitude Control System (ACS) developers had not been adequately represented, a result of the specific personnel interviewed. The net result is that with some further work to incorporate the ACS perspective, and in synchrony with the roll out of independent Branch standards, the PBI approach will be implemented in the FSB. Also, the project intends to continue its collaboration with the technology provider (Dr. Forrest Shull) past the end of the grant, to allow a more rigorous quantitative evaluation.

  8. OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.

    PubMed

    Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier

    2016-04-01

    To design a new semantically interoperable clinical repository, based on ontologies, conforming to CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating CEN/ISO 13606, ISO 21090 and SNOMED CT structure. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606 based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, maintains data storage independent of content specification. With this approach, relational technology can be used for storage, maintaining extensibility capabilities. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. The Science of Standards-Based Education

    ERIC Educational Resources Information Center

    Smithson, John

    2017-01-01

    A standards-based model of reform has dominated public education for 30 years. Under the Every Student Succeeds Act (ESSA), it will continue to dominate education policy. Is that model working? State boards of education share an intrinsic interest in this question. While there are many ways to investigate it, one approach that shows promise treats…

  10. Place-Based Pedagogy in the Era of Accountability: An Action Research Study

    ERIC Educational Resources Information Center

    Saracino, Peter C.

    2010-01-01

    Today's most common method of teaching biology--driven by calls for standardization and high-stakes testing--relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep…

  11. Curriculum Standards of Technological and Vocational Education in Taiwan, R.O.C.

    ERIC Educational Resources Information Center

    Lee, Lung-Sheng Steven; Hwang, Jenq-Jye

    In Taiwan, curriculum standards for senior vocational schools and junior colleges are administered and promulgated by the Ministry of Education approximately every 10 years. Curricula for institutes of technology are principally school based. As a result of critiques of the current top-down or administration-based approach system of curriculum…

  12. Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms

    ERIC Educational Resources Information Center

    Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy

    2005-01-01

    Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…

  13. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.

  14. A Standards-Based Approach to Catholic Principal Preparation: A Case Study

    ERIC Educational Resources Information Center

    Morten, Sandria D.; Lawler, Geralyn A.

    2016-01-01

    Illinois' recent redesign of the principal certification program requires the integration of the Interstate School Leaders Licensure Consortium (ISLLC) standards as well as the Southern Regional Education Board Critical Success Factors standards into the coursework and internship, establishing a focus on preparation for instructional leadership. The…

  15. “Gold Standards,” Plurality and Monocultures: The Need for Diversity in Psychotherapy

    PubMed Central

    Leichsenring, Falk; Abbass, Allan; Hilsenroth, Mark J.; Luyten, Patrick; Munder, Thomas; Rabung, Sven; Steinert, Christiane

    2018-01-01

    For psychotherapy of mental disorders, presently several approaches are available, such as interpersonal, humanistic, systemic, psychodynamic or cognitive behavior therapy (CBT). Pointing to the available evidence, proponents of CBT claim that CBT is the gold standard. Some authors even argue for an integrated CBT-based form of psychotherapy as the only form of psychotherapy. CBT undoubtedly has its strengths and CBT researchers have to be credited for developing and testing treatments for many mental disorders. A critical review, however, shows that the available evidence for the theoretical foundations of CBT, assumed mechanisms of change, quality of studies, and efficacy is not as robust as some researchers claim. Most important, there is no consistent evidence that CBT is more efficacious than other evidence-based approaches. These findings do not justify regarding CBT as the gold standard psychotherapy. They even provide less justification for the idea that the future of psychotherapy lies in one integrated CBT-based form of psychotherapy as the only type of psychotherapy. For the different psychotherapeutic approaches a growing body of evidence is available. These approaches have their strengths because of differences in their respective focus on interpersonal relationships, affects, cognitions, systemic perspectives, experiential, or unconscious processes. Different approaches may be suitable to different patients and therapists. As generally assumed, progress in research results from openness to new ideas and learning from diverse perspectives. Thus, different forms of evidence-based psychotherapy are required. Plurality is the future of psychotherapy, not a uniform “one fits all” approach. PMID:29740361

  16. Evidence based herbal drug standardization approach in coping with challenges of holistic management of diabetes: a dreadful lifestyle disorder of 21st century

    PubMed Central

    2013-01-01

    Plants, by virtue of the multiple constituents they develop during growth under various environmental stresses, provide a plethora of chemical families with medicinal utility. Researchers are exploring this wealth and trying to decode its utility for enhancing health standards of human beings. Diabetes is a dreadful lifestyle disorder of the 21st century, caused by a lack of insulin production or by physiological unresponsiveness to insulin. The chronic impact of untreated diabetes significantly affects vital organs. Allopathic medicine offers five classes of drugs (or insulin, in Type I diabetes) targeting insulin secretion, decreasing the effect of glucagon, sensitization of receptors for enhanced glucose uptake, etc. In addition, diet management, increased food fiber intake, resistant starch intake and routine exercise aid in managing such a dangerous metabolic disorder. One of the key factors that limit the commercial utility of herbal drugs is standardization. Standardization poses numerous challenges related to marker identification, active principle(s), lack of defined regulations, non-availability of universally acceptable technical standards for testing, and implementation of quality control/safety standards (toxicological testing). The present study proposes an integrated herbal drug development & standardization model which is an amalgamation of the Classical Approach of Ayurvedic Therapeutics, a Reverse Pharmacological Approach based on Observational Therapeutics, Technical Standards for the complete product cycle, Chemi-informatics, Herbal Qualitative Structure Activity Relationship and Pharmacophore modeling, and Post-Launch Market Analysis. Further studies are warranted to ensure that an effective herbal drug standardization methodology is developed, backed by a regulatory standard, to guide future research endeavors in a more focused manner. PMID:23822656

  17. Evidence based herbal drug standardization approach in coping with challenges of holistic management of diabetes: a dreadful lifestyle disorder of 21st century.

    PubMed

    Chawla, Raman; Thakur, Pallavi; Chowdhry, Ayush; Jaiswal, Sarita; Sharma, Anamika; Goel, Rajeev; Sharma, Jyoti; Priyadarshi, Smruti Sagar; Kumar, Vinod; Sharma, Rakesh Kumar; Arora, Rajesh

    2013-07-04

    Plants, by virtue of the multiple constituents they develop during growth under various environmental stresses, provide a plethora of chemical families with medicinal utility. Researchers are exploring this wealth and trying to decode its utility for enhancing health standards of human beings. Diabetes is a dreadful lifestyle disorder of the 21st century, caused by a lack of insulin production or by physiological unresponsiveness to insulin. The chronic impact of untreated diabetes significantly affects vital organs. Allopathic medicine offers five classes of drugs (or insulin, in Type I diabetes) targeting insulin secretion, decreasing the effect of glucagon, sensitization of receptors for enhanced glucose uptake, etc. In addition, diet management, increased food fiber intake, resistant starch intake and routine exercise aid in managing such a dangerous metabolic disorder. One of the key factors that limit the commercial utility of herbal drugs is standardization. Standardization poses numerous challenges related to marker identification, active principle(s), lack of defined regulations, non-availability of universally acceptable technical standards for testing, and implementation of quality control/safety standards (toxicological testing). The present study proposes an integrated herbal drug development & standardization model which is an amalgamation of the Classical Approach of Ayurvedic Therapeutics, a Reverse Pharmacological Approach based on Observational Therapeutics, Technical Standards for the complete product cycle, Chemi-informatics, Herbal Qualitative Structure Activity Relationship and Pharmacophore modeling, and Post-Launch Market Analysis. Further studies are warranted to ensure that an effective herbal drug standardization methodology is developed, backed by a regulatory standard, to guide future research endeavors in a more focused manner.

  18. A review of consensus test methods for established medical imaging modalities and their implications for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Pfefer, Joshua; Agrawal, Anant

    2012-03-01

    In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.
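
    As a concrete illustration of the third characteristic mentioned above, the sketch below computes a simple percent-integral-uniformity metric over a region of interest of a synthetic phantom image. The metric definition, ROI geometry and image are illustrative assumptions, not values taken from any specific standard reviewed in the paper.

```python
# Illustrative uniformity metric for phantom-based QA (synthetic data).
import numpy as np

def integral_uniformity(image, center, size):
    """Percent integral uniformity over a square ROI:
    100 * (1 - (max - min) / (max + min))."""
    r, c = center
    half = size // 2
    roi = image[r - half:r + half, c - half:c + half]
    lo, hi = roi.min(), roi.max()
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))

# Synthetic "phantom": uniform background with mild shading and noise
rng = np.random.default_rng(0)
phantom = 1000.0 + 5.0 * np.linspace(0, 1, 256)[None, :] + rng.normal(0, 2, (256, 256))

print(f"Integral uniformity: {integral_uniformity(phantom, (128, 128), 64):.2f}%")
```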

  19. Positionalism of Relations and Its Consequences for Fact-Oriented Modelling

    NASA Astrophysics Data System (ADS)

    Keet, C. Maria

    Natural language-based conceptual modelling, as well as the use of diagrams, has been an essential component of fact-oriented modelling from its inception. However, transforming natural language to its corresponding object-role modelling diagram, and vice versa, is not trivial. This is due to the more fundamental problem of the different underlying ontological commitments concerning the positionalism of the fact types. The natural language-based approach adheres to the standard view, whereas the diagram-based approach has a positionalist commitment, which is, from an ontological perspective, incompatible with the former. This hinders seamless transition between the two approaches and affects interoperability with other conceptual modelling languages. One can adopt either the limited standard view or the positionalist commitment, whose fact types may not be easily verbalisable but which facilitates data integration and the reusability of conceptual models with ontological foundations.

  20. Ontology-Based Exchange and Immediate Application of Business Calculation Definitions for Online Analytical Processing

    NASA Astrophysics Data System (ADS)

    Kehlenbeck, Matthias; Breitner, Michael H.

    Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain knowledge about quantitative relations that is necessary for deep analyses and for the production of meaningful reports. The definitions are largely independent of implementation and organization, yet no automated procedures exist to facilitate their exchange across organization and implementation boundaries; each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. The approach facilitates the exchange of business calculation definitions and allows them to be linked automatically to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.

  1. Towards a Framework for Developing Semantic Relatedness Reference Standards

    PubMed Central

    Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.

    2010-01-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
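
    In the spirit of the rater analysis described above, the sketch below clusters raters by the correlation of their relatedness ratings to flag potential outliers. The data are synthetic (13 simulated raters over 101 term pairs), not the published reference standard, and the clustering recipe is only one plausible reading of the analysis.

```python
# Sketch: hierarchical clustering of raters by rating agreement (synthetic data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n_pairs, n_raters = 101, 13
signal = rng.uniform(1, 4, n_pairs)                # shared relatedness signal
# Most raters follow the shared signal; the last two rate almost at random
ratings = np.column_stack(
    [signal + rng.normal(0, 0.4, n_pairs) for _ in range(n_raters - 2)]
    + [rng.uniform(1, 4, n_pairs) for _ in range(2)]
)

corr = np.corrcoef(ratings, rowvar=False)          # rater-by-rater agreement
dist = 1.0 - corr                                  # turn agreement into a distance
np.fill_diagonal(dist, 0.0)
tree = linkage(squareform(dist, checks=False), method="average")
groups = fcluster(tree, t=2, criterion="maxclust")
print("Rater cluster assignments:", groups)        # outliers separate from the majority
```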

  2. ITS evaluation -- phase 3 (2010)

    DOT National Transportation Integrated Search

    2011-05-01

    This report documents the results of applying a previously developed, standardized approach for evaluating intelligent transportation systems (ITS) projects to 17 ITS earmark projects. The evaluation approach was based on a questionnaire to inves...

  3. Approaches for estimating minimal clinically important differences in systemic lupus erythematosus.

    PubMed

    Rai, Sharan K; Yazdany, Jinoos; Fortin, Paul R; Aviña-Zubieta, J Antonio

    2015-06-03

    A minimal clinically important difference (MCID) is an important concept used to determine whether a medical intervention improves perceived outcomes in patients. Prior to the introduction of the concept in 1989, studies focused primarily on statistical significance. As most recent clinical trials in systemic lupus erythematosus (SLE) have failed to show significant effects, determining a clinically relevant threshold for outcome scores (that is, the MCID) of existing instruments may be critical for conducting and interpreting meaningful clinical trials as well as for facilitating the establishment of treatment recommendations for patients. To that effect, methods to determine the MCID can be divided into two well-defined categories: distribution-based and anchor-based approaches. Distribution-based approaches are based on statistical characteristics of the obtained samples. There are various methods within the distribution-based approach, including the standard error of measurement, the standard deviation, the effect size, the minimal detectable change, the reliable change index, and the standardized response mean. Anchor-based approaches compare the change in a patient-reported outcome to a second, external measure of change (that is, one that is more clearly understood, such as a global assessment), which serves as the anchor. Finally, the Delphi technique can be applied as an adjunct to defining a clinically important difference. Despite an abundance of methods reported in the literature, little work in MCID estimation has been done in the context of SLE. As the MCID can help determine the effect of a given therapy on a patient and add meaning to statistical inferences made in clinical research, we believe there ought to be renewed focus on this area. Here, we provide an update on the use of MCIDs in clinical research, review some of the work done in this area in SLE, and propose an agenda for future research.
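
    The distribution-based quantities named above follow directly from sample statistics; the sketch below computes several of them (standard error of measurement, minimal detectable change, the 0.5-SD benchmark, and the standardized response mean) on synthetic scores. The reliability value and score distributions are assumed for illustration only, not values from the SLE literature.

```python
# Distribution-based MCID-type quantities on synthetic outcome scores.
import numpy as np

rng = np.random.default_rng(2)
baseline = rng.normal(50, 10, 200)            # synthetic patient-reported outcome scores
follow_up = baseline + rng.normal(3, 8, 200)

reliability = 0.85                            # assumed test-retest reliability
sd_baseline = baseline.std(ddof=1)

sem = sd_baseline * np.sqrt(1 - reliability)          # standard error of measurement
mdc95 = 1.96 * np.sqrt(2) * sem                       # minimal detectable change (95%)
half_sd = 0.5 * sd_baseline                           # common effect-size benchmark
change = follow_up - baseline
srm = change.mean() / change.std(ddof=1)              # standardized response mean

print(f"SEM={sem:.2f}  MDC95={mdc95:.2f}  0.5*SD={half_sd:.2f}  SRM={srm:.2f}")
```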

  4. Getting to the Core: How Early Implementers Are Approaching the Common Core in California

    ERIC Educational Resources Information Center

    Brown, Brentt; Vargo, Merrill

    2014-01-01

    California has embarked on a major new wave of curriculum reform with the adoption of the Common Core State Standards (CCSS), the new English Language Development (ELD) standards, and the Next Generation Science Standards (NGSS). The adoption of the CCSS builds on a legacy of standards-based education reform in California that began with the…

  5. A mixed-methods research approach to the review of competency standards for orthotist/prosthetists in Australia.

    PubMed

    Ash, Susan; O'Connor, Jackie; Anderson, Sarah; Ridgewell, Emily; Clarke, Leigh

    2015-06-01

    The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, competency based standards, which are up-to-date and evidence-based, are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach. A mixed-methods research design was applied, using a three-step sequential exploratory design, where step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, developing the draft competency standards; and step 3 involved quantitative data collection and analysis - a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups - an expert and a recent graduate group of Australian orthotist/prosthetists - were led by an experienced facilitator, to identify gaps in the current competency standards and then to outline a key purpose, and work roles and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft-competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association, using three rounds of Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages. In stage 1, the expert (n = 10) and the new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft-competency standards. In stage 2, the final draft-competency standards were circulated to 56 members (n = 44 final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities, and 68 performance criteria of the final competency standards. This study outlines a rigorous and evidence-based mixed-methods approach for developing and endorsing professional competency standards, which is representative of the views of the profession of orthotist/prosthetists.

  6. Process-outcome interrelationship and standard setting in medical education: the need for a comprehensive approach.

    PubMed

    Christensen, Leif; Karle, Hans; Nystrup, Jørgen

    2007-09-01

    An outcome-based approach to medical education, as compared to a process/content orientation, is currently being discussed intensively. In this article, the process-outcome interrelationship in medical education is discussed, with specific emphasis on its relation to the definition of standards in basic medical education. Perceptions of outcome have always been an integrated element of curricular planning. The present debate underlines the need for a stronger focus on learning objectives and outcome assessment in many medical schools around the world. The need to maintain an integrated approach of process/content and outcome is underlined in this paper. A worry is expressed about the taxonomy of learning in purely outcome-based medical education, in which student assessment can become a major determinant of the learning process, leaving control of the medical curriculum to medical examiners. Moreover, curricula which favour reductionism by stating everything in terms of instrumental outcomes or competences face a risk of lowering quality and becoming prey to political interference. Standards based on outcome alone raise unresolved problems in relation to the licensure requirements of medical doctors. It is argued that the alleged dichotomy between process/content and outcome is artificial, and that the formulation of standards in medical education must follow a comprehensive line in curricular planning.

  7. Approximations to the Truth: Comparing Survey and Microsimulation Approaches to Measuring Income for Social Indicators

    ERIC Educational Resources Information Center

    Figari, Francesco; Iacovou, Maria; Skew, Alexandra J.; Sutherland, Holly

    2012-01-01

    In this paper, we evaluate income distributions in four European countries (Austria, Italy, Spain and Hungary) using two complementary approaches: a standard approach based on reported incomes in survey data, and a microsimulation approach, where taxes and benefits are simulated. These two approaches may be expected to generate slightly different…

  8. International trade standards for commodities and products derived from animals: the need for a system that integrates food safety and animal disease risk management.

    PubMed

    Thomson, G R; Penrith, M-L; Atkinson, M W; Thalwitzer, S; Mancuso, A; Atkinson, S J; Osofsky, S A

    2013-12-01

    A case is made for greater emphasis to be placed on value chain management as an alternative to geographically based disease risk mitigation for trade in commodities and products derived from animals. The geographic approach is dependent upon achievement of freedom in countries or zones from infectious agents that cause so-called transboundary animal diseases, while value chain-based risk management depends upon mitigation of animal disease hazards potentially associated with specific commodities or products irrespective of the locality of production. This commodity-specific approach is founded on the same principles upon which international food safety standards are based, viz. hazard analysis critical control points (HACCP). Broader acceptance of a value chain approach enables animal disease risk management to be combined with food safety management by the integration of commodity-based trade and HACCP methodologies and thereby facilitates 'farm to fork' quality assurance. The latter is increasingly recognized as indispensable to food safety assurance and is therefore a pre-condition to safe trade. The biological principles upon which HACCP and commodity-based trade are based are essentially identical, potentially simplifying sanitary control in contrast to current separate international sanitary standards for food safety and animal disease risks that are difficult to reconcile. A value chain approach would not only enable more effective integration of food safety and animal disease risk management of foodstuffs derived from animals but would also ameliorate adverse environmental and associated socio-economic consequences of current sanitary standards based on the geographic distribution of animal infections. This is especially the case where vast veterinary cordon fencing systems are relied upon to separate livestock and wildlife as is the case in much of southern Africa. A value chain approach would thus be particularly beneficial to under-developed regions of the world such as southern Africa specifically and sub-Saharan Africa more generally where it would reduce incompatibility between attempts to expand and commercialize livestock production and the need to conserve the subcontinent's unparalleled wildlife and wilderness resources. © 2013 Blackwell Verlag GmbH.

  9. Validating a Standards-Based Classroom Assessment of English Proficiency: A Multitrait-Multimethod Approach

    ERIC Educational Resources Information Center

    Llosa, Lorena

    2007-01-01

    The use of standards-based classroom assessments to test English learners' language proficiency is increasingly prevalent in the United States and many other countries. In a large urban school district in California, for example, a classroom assessment is used to make high-stakes decisions about English learners' progress from one level to the…

  10. Building Curriculum-Based Concerts: Tired of the Same Old Approach to Your Ensemble's Concert and Festival Schedule?

    ERIC Educational Resources Information Center

    Russell, Joshua A.

    2006-01-01

    Since--and even before--the National Standards for Music Education were published, music educators have tried to balance the expectations associated with the traditional performance curriculum and contemporary models of music education. The Standards-based curriculum challenges directors to consider how student experiences and learning can be…

  11. Resampling-based Methods in Single and Multiple Testing for Equality of Covariance/Correlation Matrices

    PubMed Central

    Yang, Yang; DeGruttola, Victor

    2016-01-01

    Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients. PMID:22740584
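
    A much-simplified sketch of the idea described above follows: a Box's M-type statistic is computed on the observed groups, and its null distribution is approximated by permuting standardized residuals, which share second moments by construction. This illustrates the resampling logic only; it is not the authors' robust-estimation implementation, and the data are synthetic.

```python
# Permutation test for equality of two covariance matrices via standardized residuals.
import numpy as np

def box_m(groups):
    """Box's M-type statistic comparing group covariances to the pooled covariance."""
    ns = [len(g) for g in groups]
    covs = [np.cov(g, rowvar=False) for g in groups]
    pooled = sum((n - 1) * c for n, c in zip(ns, covs)) / (sum(ns) - len(groups))
    return sum((n - 1) * (np.linalg.slogdet(pooled)[1] - np.linalg.slogdet(c)[1])
               for n, c in zip(ns, covs))

def standardize(group):
    """Center by the group mean and whiten by the group sample covariance."""
    centered = group - group.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(group, rowvar=False))
    return centered @ (evecs @ np.diag(evals ** -0.5) @ evecs.T)

rng = np.random.default_rng(3)
g1 = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 1.0]], size=60)
g2 = rng.multivariate_normal([1, 1], [[1.0, 0.7], [0.7, 1.0]], size=80)

observed = box_m([g1, g2])
pooled_std = np.vstack([standardize(g1), standardize(g2)])   # exchangeable under H0
n1 = len(g1)

null_stats = []
for _ in range(999):
    perm = rng.permutation(pooled_std)                       # permute rows across groups
    null_stats.append(box_m([perm[:n1], perm[n1:]]))

p_value = (1 + sum(s >= observed for s in null_stats)) / (1 + len(null_stats))
print(f"Box's M-type statistic = {observed:.2f}, permutation p = {p_value:.3f}")
```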

  12. Resampling-based methods in single and multiple testing for equality of covariance/correlation matrices.

    PubMed

    Yang, Yang; DeGruttola, Victor

    2012-06-22

    Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients.

  13. Gold-standard evaluation of a folksonomy-based ontology learning model

    NASA Astrophysics Data System (ADS)

    Djuana, E.

    2018-03-01

    Folksonomy, as one result of the collaborative tagging process, has been acknowledged for its potential to improve the categorization and searching of web resources. However, folksonomies contain ambiguities such as synonymy and polysemy, as well as differences in abstraction or generality. To maximize their potential, methods have been proposed for associating folksonomy tags with semantics and structural relationships, such as ontology learning. This paper evaluates our previous work in ontology learning using a gold-standard evaluation approach, in comparison to a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, as it has previously been validated using a task-based evaluation approach.
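
    A gold-standard evaluation of this kind ultimately reduces to comparing the learned relations against a reference set; the toy sketch below scores a learned set of taxonomic tag relations against a gold-standard set with precision, recall and F1. The tag pairs are made-up examples, not data or metrics from the paper.

```python
# Toy gold-standard scoring of learned taxonomic relations (hypothetical pairs).
gold = {("jaguar_car", "car"), ("jaguar_animal", "animal"), ("python", "language")}
learned = {("jaguar_car", "car"), ("python", "language"), ("python", "animal")}

tp = len(gold & learned)                       # relations found in both sets
precision = tp / len(learned)
recall = tp / len(gold)
f1 = 2 * precision * recall / (precision + recall)
print(f"precision={precision:.2f} recall={recall:.2f} F1={f1:.2f}")
```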

  14. Professional Competence of Teachers in the Age of Globalization

    ERIC Educational Resources Information Center

    Orazbayeva, Kuldarkhan O.

    2016-01-01

    Current challenges of globalization in a democratic post-industrial information society make the competency-based approach a standard in the creation of the global educational environment. This study describes the special aspects of the integration of the competency-based approach into the educational theory and practice of post-Soviet countries,…

  15. The modeler's influence on calculated solubilities for performance assessments at the Aspo Hard-rock Laboratory

    USGS Publications Warehouse

    Ernren, A.T.; Arthur, R.; Glynn, P.D.; McMurry, J.

    1999-01-01

    Four researchers were asked to provide independent modeled estimates of the solubility of a radionuclide solid phase, specifically Pu(OH)4, under five specified sets of conditions. The objectives of the study were to assess the variability in the results obtained and to determine the primary causes for this variability. In the exercise, modelers were supplied with the composition, pH and redox properties of the water and with a description of the mineralogy of the surrounding fracture system. A standard thermodynamic data base was provided to all modelers. Each modeler was encouraged to use other data bases in addition to the standard data base and to try different approaches to solving the problem. In all, about fifty approaches were used, some of which included a large number of solubility calculations. For each of the five test cases, the calculated solubilities from different approaches covered several orders of magnitude. The variability resulting from the use of different thermodynamic data bases was, in most cases, far smaller than that resulting from the use of different approaches to solving the problem.

  16. Laparoscopic extraperitoneal inguinal hernia repair. A safe approach based on the understanding of rectus sheath anatomy.

    PubMed

    Katkhouda, N; Campos, G M; Mavor, E; Trussler, A; Khalil, M; Stoppa, R

    1999-12-01

    We have devised a reproducible approach to the preperitoneal space for laparoscopic repair of inguinal hernias that is based on an understanding of the abdominal wall anatomy. Laparoscopic totally extraperitoneal herniorrhaphy was performed on 99 hernias in 90 patients at the Los Angeles County-University of Southern California Medical Center, using a standardized approach to the preperitoneal space. Operative times, morbidity, and recurrence rates were recorded prospectively. The median operative time was 37 min (range, 28-60) for unilateral hernias and 46 min (range, 35-73) for bilateral hernias. There were no conversions to open repair, and there was only one conversion to a laparoscopic transabdominal approach. Complications were limited to urinary retention in two patients, pneumoscrotum in one patient, and postoperative pain requiring a large dose of analgesics in one patient. All patients were discharged within 23 h. There were no recurrences or neuralgias on follow-up at 2 years. A standardized approach to the preperitoneal space based on a thorough understanding of the abdominal wall anatomy is essential to a satisfactory outcome in hernia repair.

  17. Educational Approaches When Implementing the Next Generation Science Standards

    NASA Astrophysics Data System (ADS)

    Dwyer, Brian

    This paper overviews the history of science education from the launch of Sputnik through reform movements and associated legislation to the most recent Next Generation Science Standards (NGSS). The paper also considers stakeholder groups that would need to be involved if NGSS is to be implemented properly, including teachers, parents and unions. Each group holds a responsibility within a school system that needs to be addressed from a practical standpoint to increase the likelihood of the effective adoption of the Next Generation Science Standards. This paper provides background and program information about the Next Generation Science Standards (NGSS). It also considers the educational, philosophical, and instructional approach known as inquiry which is strongly advocated by NGSS and explores where and how other well-studied instructional approaches might have a place within an inquiry-based classroom.

  18. 77 FR 52887 - Regulatory Capital Rules: Standardized Approach for Risk-Weighted Assets; Market Discipline and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... securitization framework designed to address the credit risk of exposures that involve the tranching of the... creditworthiness standards and risk-based capital requirements have been designed to be consistent with safety and...

  19. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

    Background: Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods: We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results: From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions: Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  20. Revision of the design of a standard for the dimensions of school furniture.

    PubMed

    Molenbroek, J F M; Kroon-Ramaekers, Y M T; Snijders, C J

    2003-06-10

    In this study, an anthropometric design process was followed. The aim was to improve the fit of school furniture sizes for European children. It was demonstrated statistically that the draft of a European standard does not cover the target population. No literature on design criteria for sizes exists, and in practice it is common to calculate the fit for only the mean values (P50). The calculations reported here used body dimensions of Dutch children, measured by the authors' Department, and used data from German and British national standards. A design process was followed that contains several steps, including: Target group, Anthropometric model and Percentage exclusion. The criteria developed in this study are (1) a fit on the basis of 1% exclusion (P1 or P99), and (2) a prescription based on popliteal height. Based on this new approach, it was concluded that the prescription of a set size should be based on popliteal height rather than body height. The drafted standard, prEN 1729, can be improved with this approach. A European standard for school furniture should include the exception that for Dutch children an extra large size is required.
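
    The kind of exclusion calculation advocated above can be sketched as follows: given a distribution of popliteal heights, estimate the fraction of children excluded by one candidate seat-height size. The population parameters and the fit rule in the sketch are illustrative assumptions, not values from prEN 1729 or from the Dutch dataset used in the study.

```python
# Illustrative exclusion calculation for one seat-height size (synthetic population).
import numpy as np

rng = np.random.default_rng(4)
popliteal = rng.normal(38.0, 2.5, 10_000)     # cm, synthetic child popliteal heights

seat_height = 38.0                            # cm, one candidate size
# Hypothetical fit rule: seat height between ~88% and ~103% of popliteal height
fits = (seat_height >= 0.88 * popliteal) & (seat_height <= 1.03 * popliteal)
exclusion = 100.0 * (1.0 - fits.mean())
print(f"Excluded by this size: {exclusion:.1f}% (study criterion: about 1% per size)")
```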

  1. 77 FR 1319 - Regulation of Fuels and Fuel Additives: 2012 Renewable Fuel Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-09

    ... of the Domestic Aggregate Compliance Approach E. Assessment of the Canadian Aggregate Compliance Approach II. Projection of Cellulosic Volume and Assessment of Biomass-Based Diesel and Advanced Biofuel... Price for Cellulosic Biofuel Waiver Credits B. Assessment of the Domestic Aggregate Compliance Approach...

  2. Using standardized patients to evaluate hospital-based intervention outcomes.

    PubMed

    Li, Li; Lin, Chunqing; Guan, Jihui

    2014-06-01

    The standardized patient approach has proved to be an effective training tool for medical educators. This article explains the process of employing standardized patients in an HIV stigma reduction intervention in healthcare settings in China. The study was conducted in 40 hospitals in two provinces of China. One year after the stigma reduction intervention, standardized patients made unannounced visits to participating hospitals, randomly approached service providers on duty and presented symptoms related to HIV and disclosed HIV-positive test results. After each visit, the standardized patients evaluated their providers' attitudes and behaviours using a structured checklist. Standardized patients also took open-ended observation notes about their experience and the evaluation process. Seven standardized patients conducted a total of 217 assessments (108 from 20 hospitals in the intervention condition; 109 from 20 hospitals in the control condition). Based on a comparative analysis, the intervention hospitals received a better rating than the control hospitals in terms of general impression and universal precaution compliance as well as a lower score on stigmatizing attitudes and behaviours toward the standardized patients. Standardized patients are a useful supplement to traditional self-report assessments, particularly for measuring intervention outcomes that are sensitive or prone to social desirability. Published by Oxford University Press on behalf of the International Epidemiological Association © The Author 2013; all rights reserved.

  3. Translating standards into practice - one Semantic Web API for Gene Expression.

    PubMed

    Deus, Helena F; Prud'hommeaux, Eric; Miller, Michael; Zhao, Jun; Malone, James; Adamusiak, Tomasz; McCusker, Jim; Das, Sudeshna; Rocca Serra, Philippe; Fox, Ronan; Marshall, M Scott

    2012-08-01

    Sharing and describing experimental results unambiguously with sufficient detail to enable replication of results is a fundamental tenet of scientific research. In today's cluttered world of "-omics" sciences, data standards and standardized use of terminologies and ontologies for biomedical informatics play an important role in reporting high-throughput experiment results in formats that can be interpreted by both researchers and analytical tools. Increasing adoption of Semantic Web and Linked Data technologies for the integration of heterogeneous and distributed health care and life sciences (HCLSs) datasets has made the reuse of standards even more pressing; dynamic semantic query federation can be used for integrative bioinformatics when ontologies and identifiers are reused across data instances. We present here a methodology to integrate the results and experimental context of three different representations of microarray-based transcriptomic experiments: the Gene Expression Atlas, the W3C BioRDF task force approach to reporting Provenance of Microarray Experiments, and the HSCI blood genomics project. Our approach does not attempt to improve the expressivity of existing standards for genomics but, instead, to enable integration of existing datasets published from microarray-based transcriptomic experiments. SPARQL Construct is used to create a posteriori mappings of concepts and properties and linking rules that match entities based on query constraints. We discuss how our integrative approach can encourage reuse of the Experimental Factor Ontology (EFO) and the Ontology for Biomedical Investigations (OBIs) for the reporting of experimental context and results of gene expression studies. Copyright © 2012 Elsevier Inc. All rights reserved.
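
    The a-posteriori mapping step described above can be pictured with a small rdflib example: a SPARQL CONSTRUCT query re-expresses one dataset's local property as a shared property so that entities can be matched across sources. All namespaces, properties and values below are hypothetical stand-ins, not the actual Gene Expression Atlas, BioRDF or EFO vocabularies.

```python
# Hypothetical SPARQL CONSTRUCT mapping from a local vocabulary to a shared one.
from rdflib import Graph, Namespace, Literal, RDF

LOCAL = Namespace("http://example.org/localStudy#")

g = Graph()
g.add((LOCAL.sample1, RDF.type, LOCAL.Sample))
g.add((LOCAL.sample1, LOCAL.factorValue, Literal("homo sapiens")))

construct = """
PREFIX local:  <http://example.org/localStudy#>
PREFIX shared: <http://example.org/sharedExperimentalFactor#>
CONSTRUCT { ?s shared:organism ?v . }
WHERE     { ?s local:factorValue ?v . }
"""

mapped = Graph()
for triple in g.query(construct):      # CONSTRUCT results iterate as triples
    mapped.add(triple)
print(mapped.serialize(format="turtle"))
```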

  4. An optimized web-based approach for collaborative stereoscopic medical visualization

    PubMed Central

    Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C

    2013-01-01

    Objective: Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods: Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results: We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion: The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions: Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008

  5. ProvenCare-Psoriasis: A disease management model to optimize care.

    PubMed

    Gionfriddo, Michael R; Pulk, Rebecca A; Sahni, Dev R; Vijayanagar, Sonal G; Chronowski, Joseph J; Jones, Laney K; Evans, Michael A; Feldman, Steven R; Pride, Howard

    2018-03-15

    There are a variety of evidence-based treatments available for psoriasis. The transition of this evidence into practice is challenging. In this article, we describe the design of our disease management approach for Psoriasis (ProvenCare®) and present preliminary evidence of the effect of its implementation. In designing our approach, we identified three barriers to optimal care: 1) lack of a standardized and discrete disease activity measure within the electronic health record, 2) lack of a system-wide, standardized approach to care, and 3) non-uniform financial access to appropriate non-pharmacologic treatments. We implemented several solutions, which collectively form our approach. We standardized the documentation of clinical data such as body surface area (BSA), created a disease management algorithm for psoriasis, and aligned incentives to facilitate the implementation of the algorithm. This approach provides more coordinated, cost effective care for psoriasis, while being acceptable to key stakeholders. Future work will examine the effect of the implementation of our approach on important clinical and patient outcomes.

  6. Estimation of Standard Error of Regression Effects in Latent Regression Models Using Binder's Linearization. Research Report. ETS RR-07-09

    ERIC Educational Resources Information Center

    Li, Deping; Oranje, Andreas

    2007-01-01

    Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…

  7. Identifying Children with Intellectual Disabilities in the Tribal Population of Barwani District in State of Madhya Pradesh, India.

    PubMed

    Lakhan, Ram; Mawson, Anthony R

    2016-05-01

    Low- and middle-income countries (LAMI) lack an integrated and systematic approach to identifying people with intellectual disabilities. Screening surveys are considered resource-intensive; therefore, alternative approaches are needed. This study attempted to identify children up to age 18 years with intellectual disabilities through a mixed-method approach involving focus group interviews (FGIs) and door-to-door surveys. Focus groups were conducted with the assistance and involvement of local leaders in four villages of Barwani district of Madhya Pradesh, with a 99% tribal population in all four villages. A formal survey of the community was then conducted to determine the prevalence of intellectual disabilities based on a standardized screening instrument (NIMH-DDS). Thirty focus group interviews were conducted involving 387 participants (males 284, females 103) over a period of 13 days. The entire adult population (N = 8797) was then surveyed for intellectual disabilities using a standardized screening instrument. The data revealed a close similarity in the prevalence rates of intellectual disabilities, as determined by the two approaches (Focus Group Interviews, 5.22/1000 versus Survey, 5.57/1000). A qualitative method using FGIs successfully identified people with intellectual disabilities in an economically deprived tribal area, showing that a community-based approach provides a close estimate of intellectual disabilities based on a formal survey using standard diagnostic criteria. These data suggest that FGI, along with other qualitative data, could be helpful in designing, and in serving as an entree for, community-based interventions. © 2015 John Wiley & Sons Ltd.

  8. Comparing implementations of penalized weighted least-squares sinogram restoration.

    PubMed

    Forthmann, Peter; Koehler, Thomas; Defrise, Michel; La Riviere, Patrick

    2010-11-01

    A CT scanner measures the energy that is deposited in each channel of a detector array by x rays that have been partially absorbed on their way through the object. The measurement process is complex and quantitative measurements are always and inevitably associated with errors, so CT data must be preprocessed prior to reconstruction. In recent years, the authors have formulated CT sinogram preprocessing as a statistical restoration problem in which the goal is to obtain the best estimate of the line integrals needed for reconstruction from the set of noisy, degraded measurements. The authors have explored both penalized Poisson likelihood (PL) and penalized weighted least-squares (PWLS) objective functions. At low doses, the authors found that the PL approach outperforms PWLS in terms of resolution-noise tradeoffs, but at standard doses they perform similarly. The PWLS objective function, being quadratic, is more amenable to computational acceleration than the PL objective. In this work, the authors develop and compare two different methods for implementing PWLS sinogram restoration with the hope of improving computational performance relative to PL in the standard-dose regime. Sinogram restoration is still significant in the standard-dose regime since it can still outperform standard approaches and it allows for correction of effects that are not usually modeled in standard CT preprocessing. The authors have explored and compared two implementation strategies for PWLS sinogram restoration: (1) A direct matrix-inversion strategy based on the closed-form solution to the PWLS optimization problem and (2) an iterative approach based on the conjugate-gradient algorithm. Obtaining optimal performance from each strategy required modifying the naive off-the-shelf implementations of the algorithms to exploit the particular symmetry and sparseness of the sinogram-restoration problem. For the closed-form approach, the authors subdivided the large matrix inversion into smaller coupled problems and exploited sparseness to minimize matrix operations. For the conjugate-gradient approach, the authors exploited sparseness and preconditioned the problem to speed up convergence. All methods produced qualitatively and quantitatively similar images as measured by resolution-variance tradeoffs and difference images. Despite the acceleration strategies, the direct matrix-inversion approach was found to be uncompetitive with iterative approaches, with a computational burden higher by an order of magnitude or more. The iterative conjugate-gradient approach, however, does appear promising, with computation times half that of the authors' previous penalized-likelihood implementation. Iterative conjugate-gradient based PWLS sinogram restoration with careful matrix optimizations has computational advantages over direct matrix PWLS inversion and over penalized-likelihood sinogram restoration and can be considered a good alternative in standard-dose regimes.
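
    To show the structure of the quadratic problem discussed above, the toy sketch below sets up a 1-D penalized weighted least-squares objective, (y - x)'W(y - x) + beta*|Dx|^2, and solves its normal equations with SciPy's conjugate-gradient routine. It illustrates the optimization form only; the signal, weights, penalty strength and first-difference penalty are assumptions, not the authors' CT noise model or their accelerated implementation.

```python
# Toy 1-D PWLS restoration solved with conjugate gradients (synthetic data).
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

rng = np.random.default_rng(5)
n = 256
truth = np.exp(-((np.arange(n) - n / 2) ** 2) / (2 * 30.0 ** 2))   # smooth profile
weights = rng.uniform(0.5, 1.5, n)                                 # statistical weights
y = truth + rng.normal(0, 0.05, n) / np.sqrt(weights)              # noisy measurements

W = diags(weights)
D = diags([np.ones(n - 1), -np.ones(n - 1)], offsets=[0, 1], shape=(n - 1, n))
beta = 5.0
A = W + beta * (D.T @ D)            # normal equations of the quadratic objective
b = W @ y

x_hat, info = cg(A, b, maxiter=500)
print("CG converged" if info == 0 else f"CG info={info}",
      "| max abs error vs truth:", float(np.abs(x_hat - truth).max()))
```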

  9. Towards a framework for developing semantic relatedness reference standards.

    PubMed

    Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G

    2011-04-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. A Virtual Observatory Approach to Planetary Data for Vesta and Ceres

    NASA Astrophysics Data System (ADS)

    Giardino, M.; Fonte, S.; Politi, R.; Ivanovski, S.; Longobardo, A.; Capria, M. T.; Erard, S.; De Sanctis, M. C.

    2018-04-01

    A virtual observatory service for the DAWN/VIR spectral dataset is presented, based upon the IVOA standards adapted to the planetary field. Advantages of such an approach will be discussed, especially concerning interoperability and availability.

  11. Methodological Approaches to Online Scoring of Essays.

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.

    This report examines the feasibility of scoring essays using computer-based techniques. Essays have been incorporated into many standardized testing programs. Issues of validity and reliability must be addressed before automated scoring approaches can be fully deployed. Two approaches that have been used to classify documents, surface- and word-based…

  12. An ordinal classification approach for CTG categorization.

    PubMed

    Georgoulas, George; Karvelis, Petros; Gavrilis, Dimitris; Stylios, Chrysostomos D; Nikolakopoulos, George

    2017-07-01

    Evaluation of the cardiotocogram (CTG) is a standard approach employed during pregnancy and delivery, but its interpretation requires high-level expertise to decide whether a recording is Normal, Suspicious or Pathological. Therefore, a number of attempts have been made over the past three decades to develop automated, sophisticated systems. These systems are usually (multiclass) classification systems that assign a category to the respective CTG. However, most of these systems do not take into consideration the natural ordering of the categories associated with CTG recordings. In this work, an algorithm that explicitly takes the ordering of CTG categories into consideration, based on a binary decomposition method, is investigated. The results achieved, using the C4.5 decision tree as the base classifier, show that the ordinal classification approach is marginally better than the traditional multiclass classification approach using the standard C4.5 algorithm, for several performance criteria.
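
    The binary-decomposition idea mentioned above can be sketched as follows: train one binary classifier per ordinal threshold ("category > k") and combine their outputs into an ordinal prediction. In the sketch, scikit-learn's DecisionTreeClassifier stands in for C4.5, and the features and labels are synthetic rather than real CTG recordings.

```python
# Ordered binary decomposition for a 3-category ordinal problem (synthetic data).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
X = rng.normal(size=(600, 5))
# Ordinal labels 0=Normal, 1=Suspicious, 2=Pathological derived from a latent score
latent = X @ np.array([1.0, -0.5, 0.8, 0.0, 0.3]) + rng.normal(0, 0.5, 600)
y = np.digitize(latent, np.quantile(latent, [0.6, 0.85]))

X_train, X_test, y_train, y_test = X[:450], X[450:], y[:450], y[450:]

# One binary model per ordinal threshold: P(y > 0) and P(y > 1)
models = []
for k in range(2):
    clf = DecisionTreeClassifier(max_depth=4, random_state=0)
    clf.fit(X_train, (y_train > k).astype(int))
    models.append(clf)

# Combine: the predicted category is the number of thresholds exceeded
p_gt = np.column_stack([m.predict_proba(X_test)[:, 1] for m in models])
y_pred = (p_gt > 0.5).sum(axis=1)
print("ordinal accuracy:", float((y_pred == y_test).mean()))
```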

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches.

  14. A combined application of thermal desorber and gas chromatography to the analysis of gaseous carbonyls with the aid of two internal standards.

    PubMed

    Kim, Ki-Hyun; Anthwal, A; Pandey, Sudhir Kumar; Kabir, Ehsanul; Sohn, Jong Ryeul

    2010-11-01

    In this study, a series of GC calibration experiments was conducted to examine the feasibility of the thermal desorption approach for the quantification of five carbonyl compounds (acetaldehyde, propionaldehyde, butyraldehyde, isovaleraldehyde, and valeraldehyde) in conjunction with two internal standard compounds. The gaseous working standards of carbonyls were calibrated with the aid of thermal desorption as a function of standard concentration and of loading volume. The detection properties were then compared against two types of external calibration data sets derived by the fixed standard volume and fixed standard concentration approaches. According to this comparison, the fixed standard volume-based calibration of carbonyls should be more sensitive and reliable than its fixed standard concentration counterpart. Moreover, the use of an internal standard can improve the analytical reliability of aromatics and some carbonyls to a considerable extent. Our preliminary test on real samples, however, indicates that the performance of internal calibration, when tested using samples of varying dilution ranges, can be moderately different from that derivable from standard gases. This suggests that the reliability of calibration approaches should be examined carefully, with consideration of the interactions between compound-specific properties and the operating conditions of the instrumental setup.
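    As a generic illustration of internal-standard quantification (not tied to the authors' thermal desorption setup), a relative response factor derived from a calibration run can be used to back-calculate sample concentrations; all numbers below are hypothetical.

```python
# Illustrative internal-standard calculation (generic, not the authors' TD-GC procedure):
# a relative response factor (RRF) from a calibration standard is used to quantify a sample.
def rrf(area_analyte, area_is, conc_analyte, conc_is):
    """Relative response factor from a calibration run."""
    return (area_analyte / area_is) / (conc_analyte / conc_is)

def quantify(area_analyte, area_is, conc_is, rrf_value):
    """Back-calculate analyte concentration in a sample spiked with the internal standard."""
    return (area_analyte / area_is) * conc_is / rrf_value

# Hypothetical numbers: acetaldehyde against an internal standard, concentrations in ppb
f = rrf(area_analyte=5200, area_is=4800, conc_analyte=10.0, conc_is=10.0)
print(quantify(area_analyte=2600, area_is=4700, conc_is=10.0, rrf_value=f))  # ~5.1 ppb
```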

  15. A new approach to preparation of standard LEDs for luminous intensity and flux measurement of LEDs

    NASA Astrophysics Data System (ADS)

    Park, Seung-Nam; Park, Seongchong; Lee, Dong-Hoon

    2006-09-01

    This work presents an alternative approach for preparing photometric standard LEDs, which is based on a novel functional seasoning method. The main idea of our seasoning method is to simultaneously monitor the light output and the junction voltage to obtain quantitative information on the temperature dependence and the aging effect of the LED emission. We suggested a general model describing the seasoning process by taking junction temperature variation and the aging effect into account and implemented a fully automated seasoning facility, which is capable of seasoning 12 LEDs at the same time. By independent measurements of the temperature dependence, we confirmed the discrepancy of the theoretical model to be less than 0.5 % and evaluated the uncertainty contribution of the functional seasoning to be less than 0.5 % for all the seasoned samples. To demonstrate assigning the reference value to a standard LED, the CIE averaged LED intensity (ALI) of the seasoned LEDs was measured with a spectroradiometer-based instrument and the measurement uncertainty was analyzed. The expanded uncertainty of the standard LED prepared by the new approach amounts to 4 %-5 % (k = 2) depending on color, without correction of spectral stray light in the spectroradiometer.

  16. A computational framework for converting textual clinical diagnostic criteria into the quality data model.

    PubMed

    Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian

    2016-10-01

    Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. To develop and evaluate automated methods for converting textual clinical diagnostic criteria into a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was a precision of 0.84, a recall of 0.86, and an f-measure of 0.85; the CRF-based classification achieved a precision of 0.95, a recall of 0.88 and an f-measure of 0.91. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution for developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
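    The authors' actual rules and cTAKES pipeline are not reproduced in the abstract; the sketch below is a hypothetical illustration of how keyword rules can assign QDM datatypes such as Laboratory Test or Symptom to an individual criterion.

```python
# Hypothetical illustration of rule-based QDM datatype assignment
# (the authors' actual rules and cTAKES pipeline are not reproduced here).
import re

RULES = [
    (re.compile(r"\b(serum|plasma|level|mg/dL|mmol/L|count)\b", re.I), "Laboratory Test"),
    (re.compile(r"\b(pain|fever|nausea|fatigue|headache)\b", re.I), "Symptom"),
]

def assign_qdm_datatypes(criterion: str) -> list[str]:
    """Return every QDM datatype whose rule matches the criterion text."""
    return [dtype for pattern, dtype in RULES if pattern.search(criterion)] or ["Unclassified"]

print(assign_qdm_datatypes("Serum ferritin level > 300 ng/mL"))   # ['Laboratory Test']
print(assign_qdm_datatypes("Persistent fever and fatigue"))       # ['Symptom']
```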

  17. Fast calculation of the line-spread-function by transversal directions decoupling

    NASA Astrophysics Data System (ADS)

    Parravicini, Jacopo; Tartara, Luca; Hasani, Elton; Tomaselli, Alessandra

    2016-07-01

    We propose a simplified method to calculate the optical spread function of a paradigmatic system constituted by a pupil-lens with a line-shaped illumination (‘line-spread-function’). Our approach is based on decoupling the two transversal directions of the beam and treating the propagation by means of the Fourier optics formalism. This requires simpler calculations with respect to the more usual Bessel-function-based method. The model is discussed and compared with standard calculation methods by carrying out computer simulations. The proposed approach is found to be much faster than the Bessel-function-based one (CPU time ≲ 5% of the standard method), while the results of the two methods present a very good mutual agreement.
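    A minimal sketch of the decoupled one-dimensional Fourier-optics calculation (not the authors' code): for a clear 1D pupil, the line-spread-function reduces to the familiar sinc-squared profile obtained from a single FFT.

```python
# Sketch of the decoupled 1D Fourier-optics calculation (not the authors' code):
# for a clear 1D pupil the line-spread-function reduces to the familiar sinc^2 profile.
import numpy as np

N, width = 4096, 256                      # grid size and pupil width in samples
pupil = np.zeros(N)
pupil[N // 2 - width // 2: N // 2 + width // 2] = 1.0   # 1D clear aperture

field = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(pupil)))  # focal-plane amplitude
lsf = np.abs(field) ** 2
lsf /= lsf.max()                          # normalized line-spread-function

print("central value:", lsf[N // 2])                               # 1.0 at the peak
print("value 1.5 zero-spacings out:", lsf[N // 2 + int(1.5 * N / width)])  # ~0.045 for sinc^2
```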

  18. Assessment of technologies to meet a low carbon fuel standard.

    PubMed

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalized to a national U.S. standard and to similar programs in Europe and Canada.

  19. Reuse and Interoperability of Avionics for Space Systems

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.

    2007-01-01

    The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.

  20. Comparison of EPA Method 1615 RT-qPCR Assays in Standard and Kit Format

    EPA Science Inventory

    EPA Method 1615 contains protocols for measuring enterovirus and norovirus by reverse transcription quantitative polymerase chain reaction. A commercial kit based upon these protocols was designed and compared to the method's standard approach. Reagent grade, secondary effluent, ...

  1. Parallelism measurement for base plate of standard artifact with multiple tactile approaches

    NASA Astrophysics Data System (ADS)

    Ye, Xiuling; Zhao, Yan; Wang, Yiwen; Wang, Zhong; Fu, Luhua; Liu, Changjie

    2018-01-01

    Nowadays, as workpieces become more precise and more specialized, resulting in more sophisticated structures and higher accuracy requirements for artifacts, higher demands are placed on measuring accuracy and measuring methods. As an important method for obtaining workpiece dimensions, the coordinate measuring machine (CMM) has been widely used in many industries. In the course of developing a self-made high-precision standard artifact for calibrating a self-developed CMM, the parallelism of the base plate used for fixing the standard artifact was found to be an important factor affecting measurement accuracy. To measure this parallelism, three tactile measurement methods were employed using an existing high-precision CMM, gauge blocks, a dial gauge and a marble platform, and the results were compared. The experiments show that the final accuracy of all three methods reaches the micron level and meets the measurement requirements. Moreover, the three approaches are suitable for different measurement conditions, providing a basis for rapid and high-precision measurement under different equipment conditions.

  2. Standards-based metadata procedures for retrieving data for display or mining utilizing persistent (data-DOI) identifiers.

    PubMed

    Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S

    2015-01-01

    We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.

  3. A Comparative Analysis of Meeting the Whole Child Initiatives through Standardized and Competency-Based Education Systems in Terms of Achievement and Meeting the Whole Child Initiatives: Comparing Professional Perceptions and Identified Measurable Results

    ERIC Educational Resources Information Center

    Ward, Jacqueline M.

    2011-01-01

    Traditional education (TE) largely uses a standardized (SbE) approach while alternatives (nTE) tend to more of a competency (CbE), or student-centered approach. This comparative analysis examines essential aspects of such pedagogies in determining the effectiveness of schooling systems in meeting the Whole Child Initiative (Souza, 1999; Carter et…

  4. An Introduction to Message-Bus Architectures for Space Systems

    NASA Technical Reports Server (NTRS)

    Smith, Danford; Gregory, Brian

    2005-01-01

    This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architecture approaches provide many significant advantages over the more traditional socket-based one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation-layer APIs, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed, along with possible changes to those systems, and time is provided for questions and answers.
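    As an illustration of the publish/subscribe concept the course covers, the following toy in-memory message bus shows topic subscription and delivery; it is not any specific flight or ground software framework, and all names are illustrative.

```python
# Toy in-memory publish/subscribe bus illustrating the pattern described above
# (illustrative only; not any specific flight or ground software framework).
from collections import defaultdict
from typing import Callable

class MessageBus:
    def __init__(self):
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a handler to be called for every message on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        """Deliver a message to all handlers subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(message)

bus = MessageBus()
bus.subscribe("telemetry.power", lambda msg: print("battery monitor got", msg))
bus.subscribe("telemetry.power", lambda msg: print("archiver got", msg))
bus.publish("telemetry.power", {"bus_voltage": 28.1, "t": "2005-01-01T00:00:00Z"})
```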

  5. Semiautomated segmentation of head and neck cancers in 18F-FDG PET scans: A just-enough-interaction approach.

    PubMed

    Beichel, Reinhard R; Van Tol, Markus; Ulrich, Ethan J; Bauer, Christian; Chang, Tangel; Plichta, Kristin A; Smith, Brian J; Sunderland, John J; Graham, Michael M; Sonka, Milan; Buatti, John M

    2016-06-01

    The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus a lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the "just-enough-interaction" principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors' approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction.

  6. Semiautomated segmentation of head and neck cancers in 18F-FDG PET scans: A just-enough-interaction approach

    PubMed Central

    Beichel, Reinhard R.; Van Tol, Markus; Ulrich, Ethan J.; Bauer, Christian; Chang, Tangel; Plichta, Kristin A.; Smith, Brian J.; Sunderland, John J.; Graham, Michael M.; Sonka, Milan; Buatti, John M.

    2016-01-01

    Purpose: The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. Methods: A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus a lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the “just-enough-interaction” principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Results: Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Conclusions: Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors' approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction. PMID:27277044

  7. Semiautomated segmentation of head and neck cancers in 18F-FDG PET scans: A just-enough-interaction approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beichel, Reinhard R., E-mail: reinhard-beichel@uiowa.edu; Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242; Department of Internal Medicine, University of Iowa, Iowa City, Iowa 52242

    Purpose: The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. Methods: A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus a lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the “just-enough-interaction” principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Results: Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Conclusions: Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors' approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction.

  8. Research Problems Associated with Limiting the Applied Force in Vibration Tests and Conducting Base-Drive Modal Vibration Tests

    NASA Technical Reports Server (NTRS)

    Scharton, Terry D.

    1995-01-01

    The intent of this paper is to make a case for developing and conducting vibration tests which are both realistic and practical (a question of tailoring versus standards). Tests are essential for finding things overlooked in the analyses. The best test is often the most realistic test which can be conducted within the cost and budget constraints. Some standards are essential, but the author believes more in the individual's ingenuity to solve a specific problem than in the application of standards which reduce problems (and technology) to their lowest common denominator. Force limited vibration tests and base-drive modal tests are two examples of realistic, but practical testing approaches. Since both of these approaches are relatively new, a number of interesting research problems exist, and these are emphasized herein.

  9. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    PubMed

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, such as the dual model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.

  10. A new proposal for greenhouse gas emissions responsibility allocation: best available technologies approach.

    PubMed

    Berzosa, Álvaro; Barandica, Jesús M; Fernández-Sánchez, Gonzalo

    2014-01-01

    In recent years, several methodologies have been developed for the quantification of greenhouse gas (GHG) emissions. However, determining who is responsible for these emissions is also quite challenging. The most common approach is to assign emissions to the producer (based on the Kyoto Protocol), but proposals also exist for its allocation to the consumer (based on an ecological footprint perspective) and for a hybrid approach called shared responsibility. In this study, the existing proposals and standards regarding the allocation of GHG emissions responsibilities are analyzed, focusing on their main advantages and problems. A new model of shared responsibility that overcomes some of the existing problems is also proposed. This model is based on applying the best available technologies (BATs). This new approach allocates the responsibility between the producers and the final consumers based on the real capacity of each agent to reduce emissions. The proposed approach is demonstrated using a simple case study of a 4-step life cycle of ammonia nitrate (AN) fertilizer production. The proposed model has the characteristics that the standards and publications for assignment of GHG emissions responsibilities demand. This study presents a new way to assign responsibilities that pushes all the actors in the production chain, including consumers, to reduce pollution. © 2013 SETAC.
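    The abstract does not spell out the exact allocation rule; one plausible reading, shown below as a hypothetical sketch with invented numbers, is that each producer answers for its avoidable emissions above the best-available-technology benchmark, while the BAT-level remainder is attributed to the final consumer.

```python
# Hypothetical sketch of a BAT-based shared-responsibility split (one plausible reading
# of the abstract, not the authors' exact formulation): each producer answers for its
# avoidable gap above the best-available-technology benchmark; the BAT-level remainder
# is attributed to the final consumer.
steps = [  # (life-cycle step, actual emissions, BAT benchmark) in t CO2e, hypothetical numbers
    ("ammonia synthesis", 120.0, 90.0),
    ("nitric acid production", 80.0, 50.0),
    ("AN fertilizer plant", 40.0, 35.0),
    ("distribution", 10.0, 8.0),
]

producer_share = {name: actual - bat for name, actual, bat in steps}
consumer_share = sum(bat for _, _, bat in steps)

print("producer responsibilities:", producer_share)             # avoidable emissions per step
print("consumer responsibility:", consumer_share)               # BAT-level emissions, 183.0
print("total:", sum(producer_share.values()) + consumer_share)  # equals total actual, 250.0
```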

  11. Towards improving the NASA standard soil moisture retrieval algorithm and product

    NASA Astrophysics Data System (ADS)

    Mladenova, I. E.; Jackson, T. J.; Njoku, E. G.; Bindlish, R.; Cosh, M. H.; Chan, S.

    2013-12-01

    Soil moisture mapping using passive-based microwave remote sensing techniques has proven to be one of the most effective ways of acquiring reliable global soil moisture information on a routine basis. An important step in this direction was made by the launch of the Advanced Microwave Scanning Radiometer on NASA's Earth Observing System Aqua satellite (AMSR-E). Along with the standard NASA algorithm and operational AMSR-E product, the easy access and availability of the AMSR-E data promoted the development and distribution of alternative retrieval algorithms and products. Several evaluation studies have demonstrated issues with the standard NASA AMSR-E product such as dampened temporal response and limited range of the final retrievals and noted that the available global passive-based algorithms, even though based on the same electromagnetic principles, produce different results in terms of accuracy and temporal dynamics. Our goal is to identify the theoretical causes that determine the reduced sensitivity of the NASA AMSR-E product and outline ways to improve the operational NASA algorithm, if possible. Properly identifying the underlying reasons that cause the above-mentioned features of the NASA AMSR-E product and differences between the alternative algorithms requires a careful examination of the theoretical basis of each approach, specifically the simplifying assumptions and parametrization approaches adopted by each algorithm to reduce the dimensionality of unknowns and to characterize the observing system. Statistically-based error analyses, which are useful and necessary, provide information on the relative accuracy of each product but give very little information on the theoretical causes, knowledge that is essential for algorithm improvement. Thus, we are currently examining the possibility of improving the standard NASA AMSR-E global soil moisture product by conducting a thorough theoretically-based review of and inter-comparisons between several well-established global retrieval techniques. A detailed discussion focused on the theoretical basis of each approach and each algorithm's sensitivity to assumptions and parametrization approaches will be presented. USDA is an equal opportunity provider and employer.

  12. Improved Blood Pressure Control to Reduce Cardiovascular Disease Morbidity and Mortality: The Standardized Hypertension Treatment and Prevention Project

    PubMed Central

    Patel, Pragna; Ordunez, Pedro; DiPette, Donald; Escobar, Maria Cristina; Hassell, Trevor; Wyss, Fernando; Hennis, Anselm; Asma, Samira; Angell, Sonia

    2017-01-01

    Hypertension is the leading remediable risk factor for cardiovascular disease, affecting more than 1 billion people worldwide, and is responsible for more than 10 million preventable deaths globally each year. While hypertension can be successfully diagnosed and treated, only one in seven persons with hypertension have controlled blood pressure. To meet the challenge of improving the control of hypertension, particularly in low- and middle-income countries, the authors developed the Standardized Hypertension Treatment and Prevention Project, which involves a health systems–strengthening approach that advocates for standardized hypertension management using evidence-based interventions. These interventions include the use of standardized treatment protocols, a core set of medications along with improved procurement mechanisms to increase the availability and affordability of these medications, registries for cohort monitoring and evaluation, patient empowerment, team-based care (task shifting), and community engagement. With political will and strong partnerships, this approach provides the groundwork to reduce high blood pressure and cardiovascular disease-related morbidity and mortality. PMID:27378199

  13. [Improved Blood Pressure Control to Reduce Cardiovascular Disease Morbidity and Mortality: The Standardized Hypertension Treatment and Prevention Project].

    PubMed

    Patel, Pragna; Ordunez, Pedro; DiPette, Donald; Escobar, María Cristina; Hassell, Trevor; Wyss, Fernando; Hennis, Anselm; Asma, Samira; Angell, Sonia

    2017-06-08

    Hypertension is the leading remediable risk factor for cardiovascular disease, affecting more than 1 billion people worldwide, and is responsible for more than 10 million preventable deaths globally each year. While hypertension can be successfully diagnosed and treated, only one in seven persons with hypertension have controlled blood pressure. To meet the challenge of improving the control of hypertension, particularly in low- and middle-income countries, the authors developed the Standardized Hypertension Treatment and Prevention Project, which involves a health systems-strengthening approach that advocates for standardized hypertension management using evidence-based interventions. These interventions include the use of standardized treatment protocols, a core set of medications along with improved procurement mechanisms to increase the availability and affordability of these medications, registries for cohort monitoring and evaluation, patient empowerment, team-based care (task shifting), and community engagement. With political will and strong partnerships, this approach provides the groundwork to reduce high blood pressure and cardiovascular disease-related morbidity and mortality.

  14. Teaching Improvisation through Melody and Blues-Based Harmony: A Comprehensive and Sequential Approach

    ERIC Educational Resources Information Center

    Heil, Leila

    2017-01-01

    This article describes a sequential approach to improvisation teaching that can be used with students at various age and ability levels by any educator, regardless of improvisation experience. The 2014 National Core Music Standards include improvisation as a central component in musical learning and promote instructional approaches that are…

  15. An integrated high resolution mass spectrometric and informatics approach for the rapid identification of phenolics in plant extract

    USDA-ARS?s Scientific Manuscript database

    An integrated approach based on high resolution MS analysis (orbitrap), database (db) searching and MS/MS fragmentation prediction for the rapid identification of plant phenols is reported. The approach was firstly validated by using a mixture of phenolic standards (phenolic acids, flavones, flavono...

  16. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.

  17. Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the use of WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal. This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.
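    As an example of the kind of standards-based request such services accept, the sketch below issues an OGC WMS 1.3.0 GetMap call with Python requests; the endpoint URL and layer name are placeholders, not NSIDC's actual service addresses.

```python
# Minimal sketch of a standards-based WMS GetMap request such as a portal client might issue.
# The endpoint URL and layer name below are placeholders, not NSIDC's actual service addresses.
import requests

WMS_ENDPOINT = "https://example.org/wms"   # placeholder OGC WMS endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_ice_extent",            # placeholder layer name
    "CRS": "EPSG:3413",                    # polar stereographic projection, common for Arctic data
    "BBOX": "-4000000,-4000000,4000000,4000000",
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
with open("map.png", "wb") as f:
    f.write(response.content)
```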

  18. Comparing an Inquiry-Based Approach Known as the Science Writing Heuristic to Traditional Science Teaching Practices: Are There Differences?

    ERIC Educational Resources Information Center

    Akkus, Recai; Gunel, Murat; Hand, Brian

    2007-01-01

    Many state and federal governments have mandated in such documents as the National Science Education Standards that inquiry strategies should be the focus of the teaching of science within school classrooms. The difficult part for success is changing teacher practices from perceived traditional ways of teaching to more inquiry-based approaches.…

  19. Overcoming the Challenges of Unstructured Data in Multisite, Electronic Medical Record-based Abstraction.

    PubMed

    Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy J H

    2016-10-01

    Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to reliably abstract, as these data are often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. As standard abstraction approaches resulted in substandard data reliability for unstructured data elements collected as part of a multisite, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. We adopted a "fit-for-use" framework to guide the development and evaluation of abstraction methods using a 4-step, phase-based approach including (1) team building; (2) identification of challenges; (3) adaptation of abstraction methods; and (4) systematic data quality monitoring. Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (e.g., warfarin initiation) and medical follow-up (e.g., timeframe for follow-up). After implementation of the phase-based approach, interrater reliability for all unstructured data elements demonstrated κ values of ≥0.89, an average increase of +0.25 for each unstructured data element. As compared with standard abstraction methodologies, this phase-based approach was more time intensive, but did markedly increase abstraction reliability for unstructured data elements within multisite EMR documentation.

  20. Comparison of Student Achievement Using Didactic, Inquiry-Based, and the Combination of Two Approaches of Science Instruction

    NASA Astrophysics Data System (ADS)

    Foster, Hyacinth Carmen

    Science educators and administrators support the idea that inquiry-based and didactic-based instructional strategies have varying effects on students' acquisition of science concepts. The research problem addressed whether incorporating the two approaches covered the learning requirements of all students in science classes, enabling them to meet state and national standards. The purpose of this quasiexperimental, posttest design research study was to determine if student learning and achievement in high school biology classes differed for each type of instructional method. Constructivism theory suggested that each learner creates knowledge over time because of the learners' interactions with the environment. Identifying the optimal teaching method, whether didactic (teacher-directed), inquiry-based, or a combination of the two approaches, becomes essential if students are to discover ways to learn information. The research question examined which form of instruction had a significant effect on student achievement in biology. The data analysis consisted of single-factor, independent-measures analysis of variance (ANOVA) that tested the hypotheses of the research study. Locally, the results indicated greater and statistically significant differences in standardized laboratory scores for students who were taught using the combination of the two approaches. Based on these results, biology instructors will gain new insights into ways of improving the instructional process. Social change may occur as the science curriculum leadership applies the combination of the two instructional approaches to improve acquisition of science concepts by biology students.

  1. Lateral Load Capacity of Piles: A Comparative Study Between Indian Standards and Theoretical Approach

    NASA Astrophysics Data System (ADS)

    Jayasree, P. K.; Arun, K. V.; Oormila, R.; Sreelakshmi, H.

    2018-05-01

    As per Indian Standards, laterally loaded piles are usually analysed using the method adopted by IS 2911-2010 (Part 1/Section 2). However, practising engineers are of the opinion that the IS method is very conservative in design. This work aims at determining the extent to which the conventional IS design approach is conservative. This is done through a comparative study between the IS approach and the theoretical model based on Vesic's equation. Bore log details for six different bridges were collected from the Kerala Public Works Department. Cast-in-situ fixed-head piles embedded in three soil conditions, both end-bearing and friction piles, were considered and analyzed separately. Piles were also modelled in STAAD.Pro software based on the IS approach, and the results were validated using the Matlock and Reese equation (Proceedings of the Fifth International Conference on Soil Mechanics and Foundation Engineering, 1961). The results were presented as the percentage variation in values of bending moment and deflection obtained by the different methods. The results obtained from the mathematical model based on Vesic's equation and those obtained as per the IS approach were compared, and the IS method was found to be conservative and uneconomical.

  2. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    PubMed Central

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram matching classifiers were evaluated and compared to the standard nearest neighbor to mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram matching classifiers consistently performed better than the one based on the standard nearest neighbor to mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
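    The abstract does not name the matching measures used; histogram intersection is one common choice, and the sketch below (with simulated digital numbers) shows how an object's normalized histogram can be matched against class reference histograms.

```python
# Hedged sketch of histogram-based object classification using histogram intersection
# (the abstract does not name the exact matching measures; intersection is one common choice).
import numpy as np

def normalized_hist(dn_values, bins=64, value_range=(0, 255)):
    """Digital-number histogram of one image object, normalized to sum to 1."""
    h, _ = np.histogram(dn_values, bins=bins, range=value_range)
    return h / h.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical normalized histograms."""
    return np.minimum(h1, h2).sum()

def classify(object_dns, class_reference_hists):
    """Assign the class whose reference histogram best matches the object's histogram."""
    h_obj = normalized_hist(object_dns)
    scores = {c: histogram_intersection(h_obj, h_ref) for c, h_ref in class_reference_hists.items()}
    return max(scores, key=scores.get), scores

# Hypothetical reference histograms built from training objects for two land-cover classes
rng = np.random.default_rng(2)
refs = {
    "vegetation": normalized_hist(rng.normal(60, 15, 5000)),
    "impervious": normalized_hist(rng.normal(170, 25, 5000)),
}
label, scores = classify(rng.normal(65, 18, 800), refs)
print(label, scores)
```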

  3. Image-Based 3d Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measuring tooth parameters and designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets, which are placed on the teeth, and a wire of given shape that is clamped by these brackets to produce the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and difficulties in applying the standard approach to the wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach enables accurate measurement of the tooth parameters needed for adequate planning, design of correct tooth positions and monitoring of the treatment process. The developed technique applies photogrammetric means for 3D model generation of the dental arch, determination of bracket positions and analysis of tooth movement.

  4. Sources of Biased Inference in Alcohol and Drug Services Research: An Instrumental Variable Approach

    PubMed Central

    Schmidt, Laura A.; Tam, Tammy W.; Larson, Mary Jo

    2012-01-01

    Objective: This study examined the potential for biased inference due to endogeneity when using standard approaches for modeling the utilization of alcohol and drug treatment. Method: Results from standard regression analysis were compared with those that controlled for endogeneity using instrumental variables estimation. Comparable models predicted the likelihood of receiving alcohol treatment based on the widely used Aday and Andersen medical care–seeking model. Data were from the National Epidemiologic Survey on Alcohol and Related Conditions and included a representative sample of adults in households and group quarters throughout the contiguous United States. Results: Findings suggested that standard approaches for modeling treatment utilization are prone to bias because of uncontrolled reverse causation and omitted variables. Compared with instrumental variables estimation, standard regression analyses produced downwardly biased estimates of the impact of alcohol problem severity on the likelihood of receiving care. Conclusions: Standard approaches for modeling service utilization are prone to underestimating the true effects of problem severity on service use. Biased inference could lead to inaccurate policy recommendations, for example, by suggesting that people with milder forms of substance use disorder are more likely to receive care than is actually the case. PMID:22152672
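    A generic two-stage least squares sketch on simulated data (not the NESARC specification) illustrates the instrumental variables idea: the endogenous severity measure is replaced by its instrument-predicted values, removing the bias that plain regression exhibits.

```python
# Generic two-stage least squares (2SLS) sketch in numpy, illustrating the instrumental
# variable idea: the endogenous regressor is replaced by its instrument-predicted values.
# Variables are simulated; this is not the NESARC specification used in the study.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
z = rng.normal(size=n)                    # instrument
u = rng.normal(size=n)                    # unobserved confounder
severity = 0.8 * z + u + rng.normal(size=n)                 # endogenous "problem severity"
utilization = 1.5 * severity - 2.0 * u + rng.normal(size=n)  # outcome on a latent scale

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X_naive = np.column_stack([np.ones(n), severity])
print("naive OLS effect:", ols(X_naive, utilization)[1])     # downwardly biased (true effect 1.5)

# Stage 1: project the endogenous regressor on the instrument
Z = np.column_stack([np.ones(n), z])
severity_hat = Z @ ols(Z, severity)
# Stage 2: regress the outcome on the fitted values
X_iv = np.column_stack([np.ones(n), severity_hat])
print("2SLS effect:", ols(X_iv, utilization)[1])              # close to 1.5
```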

  5. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
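    For jointly Gaussian monitoring data, the total system entropy criterion has a closed form; the sketch below evaluates it for candidate station subsets using a hypothetical covariance matrix, a much simpler setting than the study's hierarchical matrix-normal model.

```python
# Sketch of an entropy-based design criterion for jointly Gaussian monitoring data:
# the differential entropy of a k-variate normal is H = 0.5 * ln((2*pi*e)^k * |Sigma|).
# The station covariance below is hypothetical; the study's matrix-normal model is richer.
import itertools
import numpy as np

def gaussian_entropy(cov):
    k = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

# Hypothetical covariance of chlorophyll-a across five candidate stations
Sigma = np.array([
    [1.0, 0.8, 0.6, 0.2, 0.1],
    [0.8, 1.0, 0.7, 0.3, 0.1],
    [0.6, 0.7, 1.0, 0.4, 0.2],
    [0.2, 0.3, 0.4, 1.0, 0.5],
    [0.1, 0.1, 0.2, 0.5, 1.0],
])

# Which 3-station subset retains the most total system entropy (i.e., is most informative)?
best = max(itertools.combinations(range(5), 3),
           key=lambda s: gaussian_entropy(Sigma[np.ix_(s, s)]))
print("most informative 3-station subset:", best)
```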

  6. Neutrino oscillation processes in a quantum-field-theoretical approach

    NASA Astrophysics Data System (ADS)

    Egorov, Vadim O.; Volobuev, Igor P.

    2018-05-01

    It is shown that neutrino oscillation processes can be consistently described in the framework of quantum field theory using only the plane wave states of the particles. Namely, the oscillating electron survival probabilities in experiments with neutrino detection by charged-current and neutral-current interactions are calculated in the quantum field-theoretical approach to neutrino oscillations based on a modification of the Feynman propagator in the momentum representation. The approach is most similar to the standard Feynman diagram technique. It is found that the oscillating distance-dependent probabilities of detecting an electron in experiments with neutrino detection by charged-current and neutral-current interactions exactly coincide with the corresponding probabilities calculated in the standard approach.
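    The full field-theoretical expressions cannot be reconstructed from the abstract; for reference, the standard quantum-mechanical two-flavour survival probability with which the field-theoretical result is reported to coincide is:

```latex
% Standard two-flavour electron-neutrino survival probability (quantum-mechanical result),
% with which the field-theoretical calculation is reported to coincide:
P_{\nu_e \to \nu_e}(L) = 1 - \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)
```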

  7. Invited commentary: G-computation--lost in translation?

    PubMed

    Vansteelandt, Stijn; Keiding, Niels

    2011-04-01

    In this issue of the Journal, Snowden et al. (Am J Epidemiol. 2011;173(7):731-738) give a didactic explanation of G-computation as an approach for estimating the causal effect of a point exposure. The authors of the present commentary reinforce the idea that their use of G-computation is equivalent to a particular form of model-based standardization, whereby reference is made to the observed study population, a technique that epidemiologists have been applying for several decades. They comment on the use of standardized versus conditional effect measures and on the relative predominance of the inverse probability-of-treatment weighting approach as opposed to G-computation. They further propose a compromise approach, doubly robust standardization, that combines the benefits of both of these causal inference techniques and is not more difficult to implement.
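    A minimal sketch of G-computation as model-based standardization, on simulated data rather than the example discussed in the Journal: fit an outcome model, predict every subject's risk under exposure and under no exposure, and average the difference over the observed covariate distribution.

```python
# Minimal G-computation (model-based standardization) sketch on simulated data:
# fit an outcome model, predict each subject's outcome under exposure A=1 and A=0,
# and average the difference over the observed covariate distribution.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 20000
L = rng.normal(size=n)                                   # confounder
A = rng.binomial(1, 1 / (1 + np.exp(-L)))                # exposure depends on L
p_Y = 1 / (1 + np.exp(-(-1.0 + 0.7 * A + 1.2 * L)))      # outcome depends on A and L
Y = rng.binomial(1, p_Y)

model = LogisticRegression().fit(np.column_stack([A, L]), Y)

X1 = np.column_stack([np.ones(n), L])                    # everyone exposed
X0 = np.column_stack([np.zeros(n), L])                   # everyone unexposed
risk1 = model.predict_proba(X1)[:, 1].mean()
risk0 = model.predict_proba(X0)[:, 1].mean()
print("standardized risk difference:", risk1 - risk0)
```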

  8. On standardization of low symmetry crystal fields

    NASA Astrophysics Data System (ADS)

    Gajek, Zbigniew

    2015-07-01

    Standardization methods for low symmetry - orthorhombic, monoclinic and triclinic - crystal fields are formulated and discussed. Two alternative approaches are presented: the conventional one, based on the second-rank parameters, and a standardization based on the fourth-rank parameters. Mainly f-electron systems are considered, but some guidelines for d-electron systems and the spin Hamiltonian describing the zero-field splitting are given. The discussion focuses on premises for choosing the most suitable method, in particular on the inadequacy of the conventional one. A few examples from the literature illustrate this situation.

  9. Risk-based SMA for Cubesats

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse

    2016-01-01

    This presentation conveys an approach for risk-based safety and mission assurance applied to cubesats. This presentation accompanies a NASA Goddard standard in development that provides guidance for building a mission success plan for cubesats based on the risk tolerance and resources available.

  10. Evolving rule-based systems in two medical domains using genetic programming.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia subtypes and the classification of pap-smear examinations. Past data were used, representing (a) successful diagnoses of aphasia subtypes obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells), previously stained using the Papanicolaou method, correctly classified by cyto-technologists. Initially a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is built from a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results demonstrate the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, to an inductive machine learning approach as well as to a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of a higher classification score.

  11. Australian Food Safety Policy Changes from a "Command and Control" to an "Outcomes-Based" Approach: Reflection on the Effectiveness of Its Implementation.

    PubMed

    Smith, James; Ross, Kirstin; Whiley, Harriet

    2016-12-08

    Foodborne illness is a global public health burden. Over the past decade in Australia, despite advances in microbiological detection and control methods, there has been an increase in the incidence of foodborne illness. Therefore, improvements in the regulation and implementation of food safety policy are crucial for protecting public health. In 2000, Australia established a national food safety regulatory system, which included the adoption of a mandatory set of food safety standards. These were in line with international standards and moved away from a "command and control" regulatory approach to an "outcomes-based" approach using risk assessment. The aim was to achieve national consistency and reduce foodborne illness without unnecessarily burdening businesses. Evidence demonstrates that a risk-based approach provides better protection for consumers; however, sixteen years after the adoption of the new approach, the rates of foodborne illness are still increasing. Currently, food businesses are responsible for producing safe food and regulatory bodies are responsible for ensuring legislative controls are met. There is therefore co-regulatory responsibility and liability, and implementation strategies need to reflect this. This analysis explores the challenges facing food regulation in Australia and the rationale and evidence in support of this new regulatory approach.

  12. A new IRT-based standard setting method: application to eCat-listening.

    PubMed

    García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David

    2013-01-01

    Criterion-referenced interpretations of tests are highly necessary, which usually involves the difficult task of establishing cut scores. In contrast with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English Listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of the English language domain, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based test interpretations.
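    The paper's exact ICC transformation is not given in the abstract; as a hypothetical illustration, with a three-parameter logistic ICC the ability at which the probability of success reaches a chosen target value has a closed form, which could serve as a cut score.

```python
# Hypothetical illustration (the paper's exact ICC transformation is not given in the
# abstract): with a 3PL item characteristic curve, the ability theta at which the
# probability of success reaches a target p* has a closed form.
import numpy as np

def icc_3pl(theta, a, b, c):
    """Three-parameter logistic ICC: P(correct | theta)."""
    return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

def theta_at_probability(p_star, a, b, c):
    """Ability at which the 3PL ICC reaches p_star (requires c < p_star < 1)."""
    return b + np.log((p_star - c) / (1 - p_star)) / a

a, b, c = 1.2, 0.3, 0.2                 # hypothetical item parameters
theta_cut = theta_at_probability(0.7, a, b, c)
print("theta cut:", theta_cut)
print("check:", icc_3pl(theta_cut, a, b, c))   # ~0.7
```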

  13. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms in the Netherlands are audited on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select higher-risk farms for audit, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results obtained before the audit. The analysis comprised 28,358 farm audits and all laboratory tests of bulk milk samples conducted in the 12 mo before each audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits were more likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD were risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, based on the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., an audit outcome of rejected), only 8, 20, or 47% of the population, respectively, had to be sampled under a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
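    The general workflow the abstract describes, fit a risk model on routine indicators and audit the highest-risk fraction first, can be sketched as follows. This is a minimal illustration on synthetic data, not the published regression model; the indicator names, coefficients, and the 20% audit fraction are assumptions.

        # Sketch: logistic-regression risk scoring for audit selection (synthetic data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000
        X = np.column_stack([
            rng.normal(200, 80, n),     # mean somatic cell count (x1000 cells/mL)
            rng.normal(15, 6, n),       # max total bacterial count (x1000 cfu/mL)
            rng.normal(-0.52, 0.01, n), # mean freezing point depression (deg C)
        ])
        # synthetic outcome: higher SCC/TBC -> higher probability of a rejected audit
        logit = -6 + 0.012 * X[:, 0] + 0.1 * X[:, 1]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(X, y)
        risk = model.predict_proba(X)[:, 1]

        # audit the 20% of farms with the highest predicted risk
        to_audit = np.argsort(risk)[::-1][: int(0.2 * n)]
        captured = y[to_audit].sum() / max(y.sum(), 1)
        print(f"share of rejected audits captured by auditing 20%: {captured:.2f}")

    The "efficiency curve" mentioned in the abstract is obtained by sweeping the audited fraction and plotting the captured share against it.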

  14. Quality assurance and management in microelectronics companies: ISO 9000 versus Six Sigma

    NASA Astrophysics Data System (ADS)

    Lupan, Razvan; Kobi, Abdessamad; Robledo, Christian; Bacivarov, Ioan; Bacivarov, Angelica

    2009-01-01

    A strategy for the implementation of the Six Sigma method as an improvement solution for the ISO 9000:2000 quality standard is proposed. Our approach focuses on integrating the DMAIC cycle of the Six Sigma method with the PDCA process approach, highly recommended by the ISO 9000:2000 standard. The Six Sigma steps applied to each part of the PDCA cycle are presented in detail, together with some tools and training examples. Based on this analysis, the authors conclude that applying the Six Sigma philosophy to the quality standard implementation process is the best way to achieve optimal results in quality progress and therefore in customer satisfaction.

  15. ALT-114 and ALT-118 Alternative Approaches to NIST ...

    EPA Pesticide Factsheets

    In 2016, US EPA approved two separate alternatives (ALT 114 and ALT 118) for the preparation and certification of hydrogen chloride (HCl) and mercury (Hg) cylinder reference gas standards that can serve as EPA Protocol gases where EPA Protocol gases are required but unavailable. The alternatives were necessary due to the unavailability of NIST reference materials (SRM, NTRM, CRM or RGM) or VSL reference materials (VSL PRM or VSL CRM), the reference materials identified in EPA's Green Book as necessary to establish the traceability of EPA Protocol gases. ALT 114 and ALT 118 provide a pathway for gas vendors to prepare and certify traceable gas cylinder standards for use in certifying Hg and HCl CEMS. In this presentation, EPA will describe the mechanics and requirements of the performance-based approach, provide an update on the availability of these gas standards, and discuss the potential for producing and certifying gas standards for other compounds using this approach. This presentation discusses the importance of NIST-traceable reference gases relative to regulatory source compliance emissions monitoring. Specifically, it discusses two new approaches for making necessary reference gases available in the absence of NIST reference materials. Moreover, these approaches provide a way to rapidly make available new reference gases for additional HAPs regulatory compliance emissions measurement and monitoring.

  16. Ballast water regulations and the move toward concentration-based numeric discharge limits.

    PubMed

    Albert, Ryan J; Lishman, John M; Saxena, Juhi R

    2013-03-01

    Ballast water from shipping is a principal source for the introduction of nonindigenous species. As a result, numerous government bodies have adopted various ballast water management practices and discharge standards to slow or eliminate the future introduction and dispersal of these nonindigenous species. For researchers studying ballast water issues, understanding the regulatory framework is helpful to define the scope of research needed by policy makers to develop effective regulations. However, for most scientists, this information is difficult to obtain because it is outside the standard scientific literature and often difficult to interpret. This paper provides a brief review of the regulatory framework directed toward scientists studying ballast water and aquatic invasive species issues. We describe different approaches to ballast water management in international, U.S. federal and state, and domestic ballast water regulation. Specifically, we discuss standards established by the International Maritime Organization (IMO), the U.S. Coast Guard and U.S. Environmental Protection Agency, and individual states in the United States including California, New York, and Minnesota. Additionally, outside the United States, countries such as Australia, Canada, and New Zealand have well-established domestic ballast water regulatory regimes. Different approaches to regulation have recently resulted in variations between numeric concentration-based ballast water discharge limits, particularly in the United States, as well as reliance on use of ballast water exchange pending development and adoption of rigorous science-based discharge standards. To date, numeric concentration-based discharge limits have not generally been based upon a thorough application of risk-assessment methodologies. Regulators, making decisions based on the available information and methodologies before them, have consequently established varying standards, or not established standards at all. The review and refinement of ballast water discharge standards by regulatory agencies will benefit from activity by the scientific community to improve and develop more precise risk-assessment methodologies.

  17. Comparing implementations of penalized weighted least-squares sinogram restoration

    PubMed Central

    Forthmann, Peter; Koehler, Thomas; Defrise, Michel; La Riviere, Patrick

    2010-01-01

    Purpose: A CT scanner measures the energy that is deposited in each channel of a detector array by x rays that have been partially absorbed on their way through the object. The measurement process is complex and quantitative measurements are always and inevitably associated with errors, so CT data must be preprocessed prior to reconstruction. In recent years, the authors have formulated CT sinogram preprocessing as a statistical restoration problem in which the goal is to obtain the best estimate of the line integrals needed for reconstruction from the set of noisy, degraded measurements. The authors have explored both penalized Poisson likelihood (PL) and penalized weighted least-squares (PWLS) objective functions. At low doses, the authors found that the PL approach outperforms PWLS in terms of resolution-noise tradeoffs, but at standard doses they perform similarly. The PWLS objective function, being quadratic, is more amenable to computational acceleration than the PL objective. In this work, the authors develop and compare two different methods for implementing PWLS sinogram restoration with the hope of improving computational performance relative to PL in the standard-dose regime. Sinogram restoration is still significant in the standard-dose regime since it can still outperform standard approaches and it allows for correction of effects that are not usually modeled in standard CT preprocessing. Methods: The authors have explored and compared two implementation strategies for PWLS sinogram restoration: (1) A direct matrix-inversion strategy based on the closed-form solution to the PWLS optimization problem and (2) an iterative approach based on the conjugate-gradient algorithm. Obtaining optimal performance from each strategy required modifying the naive off-the-shelf implementations of the algorithms to exploit the particular symmetry and sparseness of the sinogram-restoration problem. For the closed-form approach, the authors subdivided the large matrix inversion into smaller coupled problems and exploited sparseness to minimize matrix operations. For the conjugate-gradient approach, the authors exploited sparseness and preconditioned the problem to speed up convergence. Results: All methods produced qualitatively and quantitatively similar images as measured by resolution-variance tradeoffs and difference images. Despite the acceleration strategies, the direct matrix-inversion approach was found to be uncompetitive with iterative approaches, with a computational burden higher by an order of magnitude or more. The iterative conjugate-gradient approach, however, does appear promising, with computation times half that of the authors’ previous penalized-likelihood implementation. Conclusions: Iterative conjugate-gradient based PWLS sinogram restoration with careful matrix optimizations has computational advantages over direct matrix PWLS inversion and over penalized-likelihood sinogram restoration and can be considered a good alternative in standard-dose regimes. PMID:21158306
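    The iterative strategy described above can be illustrated compactly: the PWLS optimum solves the symmetric positive-definite system (W + beta*R) x = W y, which a conjugate-gradient loop solves without forming a matrix inverse. The sketch below uses a 1-D "sinogram", diagonal weights, and a first-difference roughness penalty as toy stand-ins for the real 2-D problem; it is not the authors' implementation.

        # Sketch: conjugate-gradient solution of a 1-D PWLS restoration problem.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        truth = np.sin(np.linspace(0, 3 * np.pi, n))
        y = truth + rng.normal(0, 0.1, n)     # noisy measurements
        w = np.full(n, 1.0)                   # statistical weights (diagonal of W)
        beta = 5.0                            # strength of the roughness penalty

        # R = D^T D with D the first-difference operator (quadratic smoothness penalty)
        D = np.diff(np.eye(n), axis=0)
        A = np.diag(w) + beta * (D.T @ D)     # system matrix (symmetric positive definite)
        b = w * y                             # right-hand side W y

        def conjugate_gradient(A, b, iters=200, tol=1e-8):
            x = np.zeros_like(b)
            r = b - A @ x
            p = r.copy()
            rs = r @ r
            for _ in range(iters):
                Ap = A @ p
                alpha = rs / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return x

        restored = conjugate_gradient(A, b)
        rms = float(np.sqrt(np.mean((restored - truth) ** 2)))
        print("RMS error vs truth:", round(rms, 4))

    In the real problem the matrix is never built densely; sparsity and preconditioning, as the abstract notes, are what make the iterative approach competitive.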

  18. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
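    The standard-addition methodology referenced above can be illustrated numerically: spike known amounts of analyte into aliquots of the biological matrix, regress instrument response on spiked concentration, and estimate the endogenous concentration from the magnitude of the x-intercept. The numbers below are invented for illustration only.

        # Sketch: estimating an endogenous concentration by standard addition.
        import numpy as np

        spiked = np.array([0.0, 5.0, 10.0, 20.0, 40.0])       # added analyte (uM)
        response = np.array([12.1, 17.0, 21.8, 31.9, 52.3])    # e.g., peak area ratio

        slope, intercept = np.polyfit(spiked, response, 1)     # linear calibration fit
        endogenous = intercept / slope                          # |x-intercept| of the line
        print(f"estimated endogenous concentration: {endogenous:.1f} uM")

    Parallelism between the spiked and unspiked responses is what the abstract stresses must be verified before the estimate is trusted.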

  19. User Interface Composition with COTS-UI and Trading Approaches: Application for Web-Based Environmental Information Systems

    NASA Astrophysics Data System (ADS)

    Criado, Javier; Padilla, Nicolás; Iribarne, Luis; Asensio, Jose-Andrés

    Due to the globalization of the information and knowledge society on the Internet, modern Web-based Information Systems (WIS) must be flexible and prepared to be easily accessible and manageable in real time. In recent times, special interest has been paid to the globalization of information through a common vocabulary (i.e., ontologies) and to the standardized way in which information is retrieved on the Web (i.e., powerful search engines and intelligent software agents). These same principles of globalization and standardization should also apply to the user interfaces of WIS, but these are still built on traditional development paradigms. In this paper we present an approach to reduce this globalization/standardization gap in the generation of WIS user interfaces by using a real-time "bottom-up" composition perspective with COTS-interface components (interface widgets) and trading services.

  20. Enhancement of Fast Face Detection Algorithm Based on a Cascade of Decision Trees

    NASA Astrophysics Data System (ADS)

    Khryashchev, V. V.; Lebedev, A. A.; Priorov, A. L.

    2017-05-01

    A face detection algorithm based on a cascade of ensembles of decision trees (CEDT) is presented. The new approach allows faces other than frontal views to be detected through the use of multiple classifiers, each trained for a specific range of head rotation angles. The results showed a high detection rate for CEDT on standard-size images. The algorithm increases the area under the ROC curve by 13% compared to the standard Viola-Jones face detection algorithm. The final realization of the algorithm consists of 5 different cascades for frontal and non-frontal faces. The simulation results also show the low computational complexity of the CEDT algorithm in comparison with the standard Viola-Jones approach. This could prove important in the embedded system and mobile device industries because it can reduce hardware cost and extend battery life.

  1. The Frontlines of Medicine Project: a proposal for the standardized communication of emergency department data for public health uses including syndromic surveillance for biological and chemical terrorism.

    PubMed

    Barthell, Edward N; Cordell, William H; Moorhead, John C; Handler, Jonathan; Feied, Craig; Smith, Mark S; Cochrane, Dennis G; Felton, Christopher W; Collins, Michael A

    2002-04-01

    The Frontlines of Medicine Project is a collaborative effort of emergency medicine (including emergency medical services and clinical toxicology), public health, emergency government, law enforcement, and informatics. This collaboration proposes to develop a nonproprietary, "open systems" approach for reporting emergency department patient data. The common element is a standard approach to sending messages from individual EDs to regional oversight entities that could then analyze the data received. ED encounter data could be used for various public health initiatives, including syndromic surveillance for chemical and biological terrorism. The interlinking of these regional systems could also permit public health surveillance at a national level based on ED patient encounter data. Advancements in the Internet and Web-based technologies could allow the deployment of these standardized tools in a rapid time frame.

  2. Monitoring of Qualifications and Employment in Austria: An Empirical Approach Based on the Labour Force Survey

    ERIC Educational Resources Information Center

    Lassnigg, Lorenz; Vogtenhuber, Stefan

    2011-01-01

    The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…

  3. The Johns Hopkins RTR Consortium: A Collaborative Approach to Advance Translational Science and Standardize Clinical Monitoring of Restorative Transplantation

    DTIC Science & Technology

    2016-10-01

    Keywords: Cell-based Therapy; Large animal models; Allograft; Hand Transplantation; Face Transplantation; Vascularized Composite Allotransplantation; Immunoregulation; Tolerance; Rejection; Ischemia Reperfusion. (Abstract text not recoverable from this record; remaining content consisted of report-form field labels.)

  4. Product line cost estimation: a standard cost approach.

    PubMed

    Cooper, J C; Suver, J D

    1988-04-01

    Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line.

  5. Establishing Benchmarks for Outcome Indicators: A Statistical Approach to Developing Performance Standards.

    ERIC Educational Resources Information Center

    Henry, Gary T.; And Others

    1992-01-01

    A statistical technique is presented for developing performance standards based on benchmark groups. The benchmark groups are selected using a multivariate technique that relies on a squared Euclidean distance method. For each observation unit (a school district in the example), a unique comparison group is selected. (SLD)
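    As a rough sketch of the benchmarking idea described above (not the authors' exact procedure), the snippet below selects, for each district, the k most similar districts by squared Euclidean distance over standardized covariates and uses their mean outcome as the performance benchmark. The covariates, the value of k, and the data are hypothetical.

        # Sketch: benchmark-group selection via squared Euclidean distance.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 50
        covariates = rng.normal(size=(n, 3))    # e.g., enrollment, poverty rate, funding
        outcome = rng.normal(70, 10, n)         # e.g., assessment pass rate

        z = (covariates - covariates.mean(0)) / covariates.std(0)   # standardize

        def benchmark(i, k=5):
            """Mean outcome of the k districts closest to district i (excluding i)."""
            d2 = ((z - z[i]) ** 2).sum(axis=1)  # squared Euclidean distances
            d2[i] = np.inf                       # never compare a district to itself
            peers = np.argsort(d2)[:k]
            return outcome[peers].mean()

        print("district 0 outcome:", round(outcome[0], 1),
              "benchmark:", round(benchmark(0), 1))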

  6. TEQSA and Risk-Based Regulation: Considerations for University Governing Bodies

    ERIC Educational Resources Information Center

    Baird, Jeanette

    2013-01-01

    The advent of a new national regulatory and quality assurance regime in Australia, through the Tertiary Education Quality and Standards Agency (TEQSA), presents additional requirements to university governing bodies, for compliance with standards and for risk management. This paper discusses TEQSA's approach and potential vulnerabilities for TEQSA…

  7. Support vector machines-based modelling of seismic liquefaction potential

    NASA Astrophysics Data System (ADS)

    Pal, Mahesh

    2006-08-01

    This paper investigates the potential of a support vector machine (SVM)-based classification approach to assess liquefaction potential from actual standard penetration test (SPT) and cone penetration test (CPT) field data. SVMs are based on statistical learning theory and have been found to work well in comparison to neural networks in several other applications. Both CPT and SPT field data sets are used with SVMs for predicting the occurrence and non-occurrence of liquefaction based on different combinations of input parameters. With the SPT and CPT data sets, the highest accuracies achieved with SVMs were 96% and 97%, respectively. This suggests that SVMs can effectively be used to model the complex relationship between different soil parameters and liquefaction potential. Several other combinations of input variables were used to assess the influence of different input parameters on liquefaction potential. The proposed approach suggests that neither the normalized cone resistance value (with CPT data) nor the calculation of the standardized SPT value (with SPT data) is required. Further, SVMs require few user-defined parameters and provide better performance than the neural network approach.
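    The classification setup the abstract describes can be sketched with synthetic data: predict liquefaction (1) versus no liquefaction (0) from a few field parameters using an RBF-kernel SVM. The feature names, the synthetic labeling rule, and the hyperparameters below are assumptions for illustration, not the paper's input combinations.

        # Sketch: SVM classification of liquefaction occurrence on synthetic data.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        n = 400
        spt_n = rng.uniform(2, 40, n)          # SPT blow count
        csr = rng.uniform(0.05, 0.5, n)        # cyclic stress ratio
        fines = rng.uniform(0, 60, n)          # fines content (%)
        X = np.column_stack([spt_n, csr, fines])
        # synthetic rule: low blow count plus high stress ratio tends to liquefy
        y = (csr * 60 - spt_n + rng.normal(0, 3, n) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
        clf.fit(X_tr, y_tr)
        print("test accuracy:", round(clf.score(X_te, y_te), 3))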

  8. Novel secret key generation techniques using memristor devices

    NASA Astrophysics Data System (ADS)

    Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi

    2016-02-01

    This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key; session keys are then generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost-effective and power-efficient, since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based Matlab model. It is shown that the generated keys can have dynamic size, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms the Triple Data Encryption Standard (3DES) and the Advanced Encryption Standard (AES) in terms of processing time. The paper also provides characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and to study the impact of process variations. The work proposed in this paper is a milestone towards System on Chip (SOC) memristor-based security.

  9. Feasibility of Piezoelectric Endoscopic Transsphenoidal Craniotomy: A Cadaveric Study

    PubMed Central

    Tomazic, Peter Valentin; Gellner, Verena; Koele, Wolfgang; Hammer, Georg Philipp; Braun, Eva Maria; Gerstenberger, Claus; Clarici, Georg; Holl, Etienne; Braun, Hannes; Stammberger, Heinz; Mokry, Michael

    2014-01-01

    Objective. The endoscopic transsphenoidal approach has become the gold standard for the surgical treatment of pituitary adenomas and other lesions in that area. Opening of the bony skull base has been performed with burrs, chisels, and hammers or with standard instruments such as punches and circular top knives; the creation of primary bone flaps—as in external craniotomies—is difficult. The piezoelectric osteotome used in the present study allows a bone flap to be created for endoscopic transnasal approaches in certain areas. The aim of this study was to prove the feasibility of piezoelectric endoscopic transnasal craniotomies. Study Design. Cadaveric study. Methods. On cadaveric specimens (N = 5), a piezoelectric system with specially designed hardware for endonasal application was applied, and endoscopic transsphenoidal craniotomies at the sellar floor, tuberculum sellae, and planum sphenoidale were performed up to a size of 3–5 cm2. Results. Bone flaps could be created without fracturing with the piezo-osteotome and could be reimplanted. Endoscopic handling was unproblematic, and the time required did not exceed that of standard procedures. Conclusion. In a cadaveric model, the piezoelectric endoscopic transsphenoidal craniotomy (PETC) is technically feasible. This technique allows the surgeon to create a bone flap in endoscopic transnasal approaches similar to existing standard transcranial craniotomies. Future trials will focus on skull base reconstruction using this bone flap. PMID:24689037

  10. Is Dysfunctional Use of the Mobile Phone a Behavioural Addiction? Confronting Symptom-Based Versus Process-Based Approaches.

    PubMed

    Billieux, Joël; Philippot, Pierre; Schmid, Cécile; Maurage, Pierre; De Mol, Jan; Van der Linden, Martial

    2015-01-01

    Dysfunctional use of the mobile phone has often been conceptualized as a 'behavioural addiction' that shares most features with drug addictions. In the current article, we challenge the clinical utility of the addiction model as applied to mobile phone overuse. We describe the case of a woman who overuses her mobile phone from two distinct approaches: (1) a symptom-based categorical approach inspired from the addiction model of dysfunctional mobile phone use and (2) a process-based approach resulting from an idiosyncratic clinical case conceptualization. In the case depicted here, the addiction model was shown to lead to standardized and non-relevant treatment, whereas the clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific, empirically based psychological interventions. This finding highlights that conceptualizing excessive behaviours (e.g., gambling and sex) within the addiction model can be a simplification of an individual's psychological functioning, offering only limited clinical relevance. The addiction model, applied to excessive behaviours (e.g., gambling, sex and Internet-related activities) may lead to non-relevant standardized treatments. Clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific empirically based psychological interventions. The biomedical model might lead to the simplification of an individual's psychological functioning with limited clinical relevance. Copyright © 2014 John Wiley & Sons, Ltd.

  11. Introduction to Message-Bus Architectures for Space Systems

    NASA Technical Reports Server (NTRS)

    Smith, Dan; Gregory, Brian

    2005-01-01

    This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architecture approaches provide many significant advantages over the more traditional socket-based, one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation-layer APIs, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed and possible changes to the system development process are presented. Benefits and lessons learned will be discussed, and time for questions and answers will be provided.
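    The publish/subscribe concept at the heart of a message bus can be conveyed in a few lines: components register interest in message subjects, and the bus delivers each published message to every subscriber, decoupling senders from receivers. The sketch below is a deliberately simplified, in-process illustration; the subject names are invented and no real middleware is implied.

        # Sketch: an in-process publish/subscribe message bus.
        from collections import defaultdict

        class MessageBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, subject, callback):
                """Register a callback for all messages on the given subject."""
                self._subscribers[subject].append(callback)

            def publish(self, subject, message):
                """Deliver the message to every subscriber of the subject."""
                for callback in self._subscribers[subject]:
                    callback(message)

        bus = MessageBus()
        bus.subscribe("telemetry.power", lambda m: print("ground display:", m))
        bus.subscribe("telemetry.power", lambda m: print("archive writer:", m))
        bus.publish("telemetry.power", {"bus_voltage": 28.1})

    A production bus adds network transport, durable queues, and standardized message schemas, which is where the isolation-layer APIs and message standardization mentioned above come in.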

  12. Pediatric Psycho-oncology Care: Standards, Guidelines and Consensus Reports

    PubMed Central

    Wiener, Lori; Viola, Adrienne; Koretski, Julia; Perper, Emily Diana; Patenaude, Andrea Farkas

    2014-01-01

    Objective: To identify existing guidelines, standards, or consensus-based reports for the psychosocial care of children with cancer and their families. Purpose: Psychosocial standards of care for children with cancer can systematize the approach to care and create a replicable model that can be utilized in pediatric hospitals around the world. Determining gaps in existing standards in pediatric psycho-oncology can guide the development of useful evidence- and consensus-based standards. Methods: The MEDLINE and PubMed databases were searched by investigators at two major pediatric oncology centers for existing guidelines, consensus-based reports, or standards for psychosocial care of pediatric cancer patients and their families published in peer-reviewed journals in English between 1980 and 2013. Results: We located 27 articles about psychosocial care that met the inclusion criteria: 5 set forth standards, 19 were guidelines, and 3 were consensus-based reports. None was sufficiently up-to-date, significantly evidence-based, comprehensive, and specific enough to serve as a current standard for psychosocial care for children with cancer and their families. Conclusion: Despite calls by a number of international pediatric oncology and psycho-oncology professional organizations about the urgency of addressing the psychosocial needs of the child with cancer in order to reduce suffering, there remains a need for the development of a widely acceptable, evidence- and consensus-based, comprehensive standard of care to guide the provision of essential psychosocial services to all pediatric cancer patients. PMID:24906202

  13. A Kinect-based system for automatic recording of some pigeon behaviors.

    PubMed

    Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M

    2015-12-01

    Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results allow us to demonstrate that an advantage of the Kinect-based approach is that it can also be reliably used to measure activity other than key pecking.

  14. Approach for Self-Calibrating CO2 Measurements with Linear Membrane-Based Gas Sensors

    PubMed Central

    Lazik, Detlef; Sood, Pramit

    2016-01-01

    Linear membrane-based gas sensors, which can be advantageously applied for the measurement of a single gas component in large heterogeneous systems, e.g., for representative determination of CO2 in the subsurface, can be designed according to the properties of the observation object. A resulting disadvantage is that the permeation-based sensor response depends on operating conditions, the individual site-adapted sensor geometry, the membrane material, and the target gas component. Therefore, calibration is needed, especially of the slope, which can change over several orders of magnitude. A calibration-free approach based on an internal gas standard is developed to overcome the multi-criterial slope dependency. This results in a normalization of the sensor response and enables the sensor to assess the significance of a measurement. The approach was demonstrated using the example of CO2 analysis in dry air with tubular PDMS membranes for various CO2 concentrations of an internal standard. Negligible temperature dependency was found within an 18 K range. The transformation behavior of the measurement signal and the influence of concentration variations of the internal standard on the measurement signal were shown. Offsets that were adjusted based on the stated theory for the given measurement conditions and material data from the literature were in agreement with the experimentally determined offsets. A measurement comparison with an NDIR reference sensor shows an unexpectedly low bias (<1%) of the non-calibrated sensor response, and comparable statistical uncertainty. PMID:27869656

  15. Development of a genus-specific next generation sequencing approach for sensitive and quantitative determination of the Legionella microbiome in freshwater systems.

    PubMed

    Pereira, Rui P A; Peplies, Jörg; Brettar, Ingrid; Höfle, Manfred G

    2017-03-31

    Next Generation Sequencing (NGS) has revolutionized the analysis of natural and man-made microbial communities by using universal primers for bacteria in a PCR-based approach targeting the 16S rRNA gene. In our study we narrowed primer specificity to a single, monophyletic genus, because for many questions in microbiology only a specific part of the whole microbiome is of interest. We chose the genus Legionella, comprising more than 20 pathogenic species, due to its high relevance for water-based respiratory infections. A new NGS-based approach was designed by sequencing 16S rRNA gene amplicons specific for the genus Legionella using the Illumina MiSeq technology. This approach was validated and applied to a set of representative freshwater samples. Our results revealed that the generated libraries presented a low average raw error rate per base (<0.5%) and substantiated the use of high-fidelity enzymes, such as KAPA HiFi, for increased sequence accuracy and quality. The approach also showed high in situ specificity (>95%) and very good repeatability. More than 1% non-Legionella sequences were observed only in samples in which the gammaproteobacterial clade SAR86 was present. Next-generation sequencing read counts did not reveal considerable amplification/sequencing biases and showed a sensitive as well as precise quantification of L. pneumophila along a dilution range using a spiked-in, certified genome standard. The genome standard and a mock community consisting of six different Legionella species demonstrated that the developed NGS approach was quantitative and specific at the level of individual species, including L. pneumophila. The sensitivity of our genus-specific approach was at least one order of magnitude higher compared to the universal NGS approach. Comparison with quantification by real-time PCR showed consistency with the NGS data. Overall, our NGS approach can determine the quantitative abundances of Legionella species, i.e., the complete Legionella microbiome, without the need for species-specific primers. The developed NGS approach provides a new molecular surveillance tool to monitor all Legionella species in qualitative and quantitative terms if a spiked-in genome standard is used to calibrate the method. Overall, the genus-specific NGS approach opens up a new avenue to massively parallel diagnostics in a quantitative, specific and sensitive way.

  16. Application status and its affecting factors of double standard for multinational corporations in Korea.

    PubMed

    Ki, Myung; Choi, Jaewook; Lee, Juneyoung; Park, Heechan; Yoon, Seokjoon; Kim, Namhoon; Heo, Jungyeon

    2004-02-01

    We intended to evaluate the double standard status and to identify factors determining double standard criteria in multinational corporations in Korea, specifically in the occupational health and safety area. A postal questionnaire was sent, between August 2002 and September 2002, to multinational corporations in Korea. A double standard company was defined as one that reported adopting a different standard on more than one of the five items regarding double standard identification. By comparing double standard companies with equivalent standard companies, determinants of double standards were then identified using logistic regression analysis. Of the multinational corporations, 45.1% had adopted a double standard. At the questionnaire's scale level, the factor 'characteristic and size of multinational corporation' was found to have the most potent impact on increasing double standard risk. At the variable level, the factors 'number of affiliated companies' and 'existence of an auditing system with the parent company' showed a strong negative impact on double standard risk. Our study suggests that a distinctive approach is needed to manage occupational safety and health for multinational corporations. This approach should be focused on the specific level of a corporation, not on a country level.

  17. 75 FR 12753 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-17

    ... effective at improving health care quality. While evidence-based approaches for decisionmaking have become standard in healthcare, this has been limited in laboratory medicine. No single- evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...

  18. Identifying and individuating cognitive systems: a task-based distributed cognition alternative to agent-based extended cognition.

    PubMed

    Davies, Jim; Michaelian, Kourken

    2016-08-01

    This article argues for a task-based approach to identifying and individuating cognitive systems. The agent-based extended cognition approach faces a problem of cognitive bloat and has difficulty accommodating both sub-individual cognitive systems ("scaling down") and some supra-individual cognitive systems ("scaling up"). The standard distributed cognition approach can accommodate a wider variety of supra-individual systems but likewise has difficulties with sub-individual systems and faces the problem of cognitive bloat. We develop a task-based variant of distributed cognition designed to scale up and down smoothly while providing a principled means of avoiding cognitive bloat. The advantages of the task-based approach are illustrated by means of two parallel case studies: re-representation in the human visual system and in a biomedical engineering laboratory.

  19. Standardizing terminology for minimally invasive pancreatic resection.

    PubMed

    Montagnini, Andre L; Røsok, Bård I; Asbun, Horacio J; Barkun, Jeffrey; Besselink, Marc G; Boggi, Ugo; Conlon, Kevin C P; Fingerhut, Abe; Han, Ho-Seong; Hansen, Paul D; Hogg, Melissa E; Kendrick, Michael L; Palanivelu, Chinnusamy; Shrikhande, Shailesh V; Wakabayashi, Go; Zeh, Herbert; Vollmer, Charles M; Kooby, David A

    2017-03-01

    There is a growing body of literature pertaining to minimally invasive pancreatic resection (MIPR). Heterogeneity in MIPR terminology leads to confusion and inconsistency. The Organizing Committee of the State of the Art Conference on MIPR collaborated to standardize MIPR terminology. After a formal literature review for the term "minimally invasive pancreatic surgery", key terminology elements were identified. A questionnaire was created assessing the type of resection, the approach, completion, and conversion. A Delphi process was used to identify the level of agreement among the experts. A systematic terminology template was developed by combining the approach and the resection, taking completion into account. For a solitary approach the term should combine "approach + resection" (e.g., "laparoscopic pancreatoduodenectomy"); for combined approaches the term must combine "first approach + resection" with "second approach + reconstruction" (e.g., "laparoscopic central pancreatectomy" with "open pancreaticojejunostomy"); and where conversion has occurred the recommended term is "first approach" + "converted to" + "second approach" + "resection" (e.g., "robot-assisted" "converted to open" "pancreatoduodenectomy"). CONCLUSIONS: The guidelines presented are geared towards standardizing terminology for MIPR, establishing a basis for comparative analyses and registries, and allowing the incorporation of future surgical and technological advances in MIPR. Copyright © 2017 International Hepato-Pancreato-Biliary Association Inc. Published by Elsevier Ltd. All rights reserved.

  20. Identification of species based on DNA barcode using k-mer feature vector and Random forest classifier.

    PubMed

    Meher, Prabina Kumar; Sahu, Tanmaya Kumar; Rao, A R

    2016-11-05

    DNA barcoding is a molecular diagnostic method that allows automated and accurate identification of species based on a short and standardized fragment of DNA. To this end, an attempt has been made in this study to develop a computational approach for identifying a species by comparing its barcode with the barcode sequences of known species present in a reference library. Each barcode sequence was first mapped onto a numeric feature vector based on k-mer frequencies, and Random Forest methodology was then employed on the transformed dataset for species identification. The proposed approach outperformed similarity-based, tree-based, and diagnostic-based approaches and was comparable with existing supervised-learning-based approaches in terms of species identification success rate when compared using real and simulated datasets. Based on the proposed approach, an online web interface, SPIDBAR, has also been developed and made freely available at http://cabgrid.res.in:8080/spidbar/ for species identification by taxonomists. Copyright © 2016 Elsevier B.V. All rights reserved.
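    The pipeline the abstract outlines, k-mer frequency vectors fed to a Random Forest, can be condensed as below. The two-species toy "library" is fabricated and far shorter than real barcodes (e.g., COI fragments of several hundred bases); k = 3 and the forest size are assumptions, not the paper's settings.

        # Sketch: k-mer feature vectors + Random Forest for barcode classification.
        from itertools import product
        from sklearn.ensemble import RandomForestClassifier

        K = 3
        KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

        def kmer_vector(seq):
            """Relative frequency of every k-mer in the sequence."""
            counts = {k: 0 for k in KMERS}
            for i in range(len(seq) - K + 1):
                counts[seq[i:i + K]] += 1
            total = max(sum(counts.values()), 1)
            return [counts[k] / total for k in KMERS]

        library = [("ACGTACGTACGGACGT", "species_A"),
                   ("ACGTACGAACGGACGT", "species_A"),
                   ("TTGCATTGCATTGCAA", "species_B"),
                   ("TTGCATTGGATTGCAA", "species_B")]
        X = [kmer_vector(seq) for seq, _ in library]
        y = [label for _, label in library]

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        query = "ACGTACGTACGGACGA"          # unidentified barcode sequence
        print("predicted species:", clf.predict([kmer_vector(query)])[0])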

  1. Standard cell-based implementation of a digital optoelectronic neural-network hardware.

    PubMed

    Maier, K D; Beckstein, C; Blickhan, R; Erhard, W

    2001-03-10

    A standard cell-based implementation of a digital optoelectronic neural-network architecture is presented. The overall structure of the multilayer perceptron network that was used, the optoelectronic interconnection system between the layers, and all components required in each layer are defined. The design process, from VHDL-based modeling through synthesis and partly automatic placing and routing to the final editing of one layer of the multilayer perceptron circuit, is described. A suitable approach for the standard cell-based design of optoelectronic systems is presented, and shortcomings of the design tool that was used are pointed out. The layout for the microelectronic circuit of one layer in a multilayer perceptron neural network, with a performance potential one order of magnitude higher than that of purely electronic neural networks, has been successfully designed.

  2. When is hub gene selection better than standard meta-analysis?

    PubMed

    Langfelder, Peter; Mischel, Paul S; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to gene expression data and presents novel R functions for carrying out consensus network analysis, network-based screening, and meta-analysis.

  3. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a range of six orders of magnitude, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.

  4. Approaching Cauchy's Theorem

    ERIC Educational Resources Information Center

    Garcia, Stephan Ramon; Ross, William T.

    2017-01-01

    We hope to initiate a discussion about various methods for introducing Cauchy's Theorem. Although Cauchy's Theorem is the fundamental theorem upon which complex analysis is based, there is no "standard approach." The appropriate choice depends upon the prerequisites for the course and the level of rigor intended. Common methods include…

  5. ADEpedia: a scalable and standardized knowledge base of Adverse Drug Events using semantic web technology.

    PubMed

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-01-01

    A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype was successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation was performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query services.

  6. Implementation of an Evidence-Based and Content Validated Standardized Ostomy Algorithm Tool in Home Care: A Quality Improvement Project.

    PubMed

    Bare, Kimberly; Drain, Jerri; Timko-Progar, Monica; Stallings, Bobbie; Smith, Kimberly; Ward, Naomi; Wright, Sandra

    Many nurses have limited experience with ostomy management. We sought to provide a standardized approach to ostomy education and management to support nurses in early identification of stomal and peristomal complications, pouching problems, and provide standardized solutions for managing ostomy care in general while improving utilization of formulary products. This article describes development and testing of an ostomy algorithm tool.

  7. Identification of watershed priority management areas under water quality constraints: A simulation-optimization approach with ideal load reduction

    NASA Astrophysics Data System (ADS)

    Dong, Feifei; Liu, Yong; Wu, Zhen; Chen, Yihui; Guo, Huaicheng

    2018-07-01

    Targeting nonpoint source (NPS) pollution hot spots is of vital importance for the placement of best management practices (BMPs). Although physically-based watershed models have been widely used to estimate nutrient emissions, connections between nutrient abatement and compliance with water quality standards have rarely been considered in NPS hotspot ranking, which may lead to ineffective decision-making. It is critical to develop a strategy to identify priority management areas (PMAs) based on the water quality response to nutrient load mitigation. A water quality constrained PMA identification framework was thereby proposed in this study, based on the simulation-optimization approach with ideal load reduction (ILR-SO). It integrates the physically-based Soil and Water Assessment Tool (SWAT) model and an optimization model under constraints of site-specific water quality standards. To our knowledge, it was the first effort to identify PMAs with simulation-based optimization. The SWAT model was established to simulate temporal and spatial nutrient loading and evaluate the effectiveness of pollution mitigation. A metamodel was trained to establish a quantitative relationship between sources and water quality. Ranking of priority areas is based on the nutrient load reduction required in each sub-watershed to satisfy water quality standards in waterbodies, which was calculated with a genetic algorithm (GA). The proposed approach was used for identification of PMAs on the basis of diffuse total phosphorus (TP) in Lake Dianchi Watershed, one of the three most eutrophic large lakes in China. The modeling results demonstrated that 85% of diffuse TP came from 30% of the watershed area. Compared with the two conventional targeting strategies based on overland nutrient loss and instream nutrient loading, the ILR-SO model identified distinct PMAs and narrowed down the coverage of management areas. This study addressed the urgent need to incorporate water quality response into PMA identification and showed that the ILR-SO approach is effective to guide watershed management for aquatic ecosystem restoration.
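    The ILR-SO idea, searching for per-subwatershed load reductions that meet a water-quality target at least cost, can be sketched schematically with a simple genetic algorithm and a linear "metamodel" standing in for the calibrated SWAT-based source-to-concentration relationship. All loads, response coefficients, and the target below are invented; this is not the study's model.

        # Schematic sketch: GA search for load reductions satisfying a TP standard.
        import random

        random.seed(0)
        LOADS = [12.0, 8.0, 20.0, 5.0]            # baseline TP loads per subwatershed (t/yr)
        RESPONSE = [0.004, 0.002, 0.006, 0.001]    # metamodel: mg/L in-lake TP per t/yr of load
        TARGET = 0.15                               # water-quality standard for TP (mg/L)

        def concentration(reductions):
            return sum(coef * load * (1 - red)
                       for coef, load, red in zip(RESPONSE, LOADS, reductions))

        def fitness(reductions):
            """Minimize total load removed; heavily penalize violating the standard."""
            removed = sum(load * red for load, red in zip(LOADS, reductions))
            penalty = 1e3 * max(0.0, concentration(reductions) - TARGET)
            return removed + penalty

        pop = [[random.random() for _ in LOADS] for _ in range(60)]
        for _ in range(300):
            pop.sort(key=fitness)
            parents = pop[:20]
            children = []
            for _ in range(40):
                a, b = random.sample(parents, 2)
                child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
                i = random.randrange(len(child))                       # point mutation
                child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
                children.append(child)
            pop = parents + children

        best = min(pop, key=fitness)
        print("required reductions per subwatershed:", [round(r, 2) for r in best],
              "resulting TP:", round(concentration(best), 3))

    Subwatersheds whose required reductions are largest under the optimized solution are the ones ranked as priority management areas.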

  8. Recommendation of standardized health learning contents using archetypes and semantic web technologies.

    PubMed

    Legaz-García, María del Carmen; Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2012-01-01

    Linking Electronic Healthcare Records (EHR) content to educational materials has been considered a key international recommendation to enable clinical engagement and to promote patient safety. This would encourage citizens to access reliable information available on the web and guide them to it properly. In this paper, we describe an approach in that direction, based on the use of dual-model EHR standards and standardized educational contents. The recommendation method is based on the semantic coverage of the learning content repository for a particular archetype, which is calculated by applying semantic web technologies such as ontologies and semantic annotations.

  9. Multiresidue determination of pesticides in crop plants by the quick, easy, cheap, effective, rugged, and safe method and ultra-high-performance liquid chromatography tandem mass spectrometry using a calibration based on a single level standard addition in the sample.

    PubMed

    Viera, Mariela S; Rizzetti, Tiele M; de Souza, Maiara P; Martins, Manoel L; Prestes, Osmar D; Adaime, Martha B; Zanella, Renato

    2017-12-01

    In this study, a QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method, optimized by a 2³ full factorial design, was developed for the determination of 72 pesticides in plant parts of carrot, corn, melon, rice, soy, silage, tobacco, cassava, lettuce and wheat by ultra-high-performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Considering the complexity of these matrices and the need to use matrix-based calibration, a new calibration approach based on single level standard addition in the sample (SLSAS) was proposed in this work and compared with matrix-matched calibration (MMC), procedural standard calibration (PSC) and diluted standard addition calibration (DSAC). All approaches presented satisfactory validation parameters, with recoveries from 70 to 120% and relative standard deviations ≤ 20%. SLSAS was the most practical of the evaluated approaches and proved to be an effective way of calibration. Method limits of detection were between 4.8 and 48 μg kg⁻¹ and limits of quantification were from 16 to 160 μg kg⁻¹. Application of the method to different kinds of plants found residues of 20 pesticides, which were quantified with z-score values ≤ 2 in comparison with other calibration approaches. The proposed QuEChERS method combined with UHPLC-MS/MS analysis and using an easy and effective calibration procedure presented satisfactory results for pesticide residue determination in different crop plants and is a good alternative for routine analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Modeling and Control of a Tailsitter with a Ducted Fan

    NASA Astrophysics Data System (ADS)

    Argyle, Matthew Elliott

    There are two traditional aircraft categories: fixed-wing aircraft, which have long endurance and a high cruise airspeed, and rotorcraft, which can take off and land vertically. The tailsitter is a type of aircraft that has the strengths of both platforms, with no additional mechanical complexity, because it takes off and lands vertically on its tail and can transition the entire aircraft horizontally into high-speed flight. In this dissertation, we develop the entire control system for a tailsitter with a ducted fan. The standard method to compute the quaternion-based attitude error does not generate ideal trajectories for a hovering tailsitter in some situations. In addition, the only approach in the literature to mitigate this breaks down for large attitude errors. We develop an alternative quaternion-based error method which generates better trajectories than the standard approach and can handle large errors. We also derive a hybrid backstepping controller with almost global asymptotic stability based on this error method. Many common altitude and airspeed control schemes for a fixed-wing airplane assume that the altitude and airspeed dynamics are decoupled, which leads to errors. The Total Energy Control System (TECS) is an approach that controls the altitude and airspeed by manipulating the total energy rate and energy distribution rate of the aircraft in a manner which accounts for the dynamic coupling. In this dissertation, a nonlinear controller, which can handle inaccurate thrust and drag models, based on the TECS principles is derived. Simulation results show that the nonlinear controller has better performance than the standard PI TECS control schemes. Most constant altitude transitions are accomplished by generating an optimal trajectory, and potentially actuator inputs, based on a high fidelity model of the aircraft. While there are several approaches to mitigate the effects of modeling errors, these do not fully remove the requirement for an accurate model. In this dissertation, we develop two different approaches that can achieve near constant altitude transitions for some types of aircraft. The first method, based on multiple LQR controllers, requires a high fidelity model of the aircraft. However, the second method, based on the energy along the body axes, requires almost no aerodynamic information.
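    For context on the attitude-error discussion, the sketch below shows the conventional quaternion error computation that the dissertation identifies as problematic in some situations, not the proposed alternative. The [w, x, y, z] ordering, Hamilton product convention, shortest-rotation sign choice, and test attitudes are assumptions for illustration.

        # Sketch: conventional quaternion attitude-error computation for a hover controller.
        import numpy as np

        def quat_multiply(q, r):
            """Hamilton product of two quaternions given as [w, x, y, z]."""
            w1, x1, y1, z1 = q
            w2, x2, y2, z2 = r
            return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                             w1*x2 + x1*w2 + y1*z2 - z1*y2,
                             w1*y2 - x1*z2 + y1*w2 + z1*x2,
                             w1*z2 + x1*y2 - y1*x2 + z1*w2])

        def quat_conjugate(q):
            return np.array([q[0], -q[1], -q[2], -q[3]])

        def attitude_error(q_current, q_desired):
            """Vector part of the error quaternion, signed for the shortest rotation."""
            q_err = quat_multiply(quat_conjugate(q_current), q_desired)
            return np.sign(q_err[0]) * q_err[1:]

        q_now = np.array([np.cos(0.2), 0.0, 0.0, np.sin(0.2)])   # small yaw offset in hover
        q_ref = np.array([1.0, 0.0, 0.0, 0.0])                    # level hover reference
        print("attitude error vector:", attitude_error(q_now, q_ref).round(4))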

  11. 78 FR 9698 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... effective at improving health care quality. While evidence-based approaches for decision-making have become standard in healthcare, this has been limited in laboratory medicine. No single-evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...

  12. Polarization Impacts on the Water-Leaving Radiance Retrieval from Above-Water Radiometric Measurements

    DTIC Science & Technology

    2012-12-10

    shown, based on time series of collocated data acquired in coastal waters, that the azimuth range of measurements leading to good-quality data is...radiometry, the standard way to derive the sea surface reflectance is based on sky radiance measurements Lsky acquired at the same time as the total sea...approach for correcting the reflected sea surface signal. Because the radiative-transfer-based approach has been validated over the whole time series

  13. Contextualizing Learning Scenarios According to Different Learning Management Systems

    ERIC Educational Resources Information Center

    Drira, R.; Laroussi, M.; Le Pallec, X.; Warin, B.

    2012-01-01

    In this paper, we first demonstrate that an instructional design process of Technology Enhanced Learning (TEL) systems based on a Model Driven Approach (MDA) addresses the limits of Learning Technology Standards (LTS), such as SCORM and IMS-LD. Although these standards ensure the interoperability of TEL systems across different Learning Management…

  14. Can a Competence or Standards Model Facilitate an Inclusive Approach to Teacher Education?

    ERIC Educational Resources Information Center

    Moran, Anne

    2009-01-01

    The paper seeks to determine whether programmes of initial teacher education (ITE) can contribute to the development of beginning teachers' inclusive attitudes, values and practices. The majority of ITE programmes are based on government prescribed competence or standards frameworks, which are underpinned by Codes of Professional Values. It is…

  15. 77 FR 1013 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-09

    ... airports. These regulatory actions are needed because of the adoption of new or revised criteria, or... this amendment are based on the criteria contained in the U.S. Standard for Terminal Instrument Procedures (TERPS). In developing these changes to SIAPs, the TERPS criteria were applied only to specific...

  16. Combining the Best of Two Standard Setting Methods: The Ordered Item Booklet Angoff

    ERIC Educational Resources Information Center

    Smith, Russell W.; Davis-Becker, Susan L.; O'Leary, Lisa S.

    2014-01-01

    This article describes a hybrid standard setting method that combines characteristics of the Angoff (1971) and Bookmark (Mitzel, Lewis, Patz & Green, 2001) methods. The proposed approach utilizes strengths of each method while addressing weaknesses. An ordered item booklet, with items sorted based on item difficulty, is used in combination…

  17. Writing Instruction in First Grade: An Observational Study

    ERIC Educational Resources Information Center

    Coker, David L., Jr.; Farley-Ripple, Elizabeth; Jackson, Allison F.; Wen, Huijing; MacArthur, Charles A.; Jennings, Austin S.

    2016-01-01

    As schools work to meet the ambitious Common Core State Standards in writing (Common Core State Standards Initiative, 2010), instructional approaches are likely to be examined. However, there is little research that describes the current state of instruction. This study was designed to expand the empirical base on writing instruction in first…

  18. Diagnostic Pathway with Multiparametric Magnetic Resonance Imaging Versus Standard Pathway: Results from a Randomized Prospective Study in Biopsy-naïve Patients with Suspected Prostate Cancer.

    PubMed

    Porpiglia, Francesco; Manfredi, Matteo; Mele, Fabrizio; Cossu, Marco; Bollito, Enrico; Veltri, Andrea; Cirillo, Stefano; Regge, Daniele; Faletti, Riccardo; Passera, Roberto; Fiori, Cristian; De Luca, Stefano

    2017-08-01

    An approach based on multiparametric magnetic resonance imaging (mpMRI) might increase the detection rate (DR) of clinically significant prostate cancer (csPCa). To compare an mpMRI-based pathway with the standard approach for the detection of prostate cancer (PCa) and csPCa. Between November 2014 and April 2016, 212 biopsy-naïve patients with suspected PCa (prostate specific antigen level ≤15 ng/ml and negative digital rectal examination results) were included in this randomized clinical trial. Patients were randomized into a prebiopsy mpMRI group (arm A, n=107) or a standard biopsy (SB) group (arm B, n=105). In arm A, patients with mpMRI evidence of lesions suspected for PCa underwent mpMRI/transrectal ultrasound fusion software-guided targeted biopsy (TB) (n=81). The remaining patients in arm A (n=26) with negative mpMRI results and patients in arm B underwent 12-core SB. The primary end point was comparison of the DR of PCa and csPCa between the two arms of the study; the secondary end point was comparison of the DR between TB and SB. The overall DRs were higher in arm A versus arm B for PCa (50.5% vs 29.5%, respectively; p=0.002) and csPCa (43.9% vs 18.1%, respectively; p<0.001). Concerning the biopsy approach, that is, TB in arm A, SB in arm A, and SB in arm B, the overall DRs were significantly different for PCa (60.5% vs 19.2% vs 29.5%, respectively; p<0.001) and for csPCa (56.8% vs 3.8% vs 18.1%, respectively; p<0.001). The reproducibility of the study could have been affected by the single-center nature. A diagnostic pathway based on mpMRI had a higher DR than the standard pathway in both PCa and csPCa. In this randomized trial, a pathway for the diagnosis of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) was compared with the standard pathway based on random biopsy. The mpMRI-based pathway had better performance than the standard pathway. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  19. SLAE–CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies

    PubMed Central

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-01-01

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE–CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to the design and implementation of a CPS-based smart Jidoka system. A comprehensive architecture and a set of standardized key technologies are needed to achieve this goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE–CPS. Then, several standardized key techniques are proposed under this architecture. The first converts heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second is a set of Jidoka scene rules, abstracted from an analysis of operator, machine, material, quality, and other factors across different time dimensions; these Jidoka rules allow executive FBs to perform different Jidoka functions. Finally, supported by the integrated and standardized approach of the proposed engine, a case study is conducted to verify the current research results. The proposed SLAE–CPS can serve as an important reference for combining the benefits of innovative technology and proper methodology. PMID:28657577

  20. SLAE-CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies.

    PubMed

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-06-28

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE-CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to the design and implementation of a CPS-based smart Jidoka system. A comprehensive architecture and a set of standardized key technologies are needed to achieve this goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE-CPS. Then, several standardized key techniques are proposed under this architecture. The first converts heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second is a set of Jidoka scene rules, abstracted from an analysis of operator, machine, material, quality, and other factors across different time dimensions; these Jidoka rules allow executive FBs to perform different Jidoka functions. Finally, supported by the integrated and standardized approach of the proposed engine, a case study is conducted to verify the current research results. The proposed SLAE-CPS can serve as an important reference for combining the benefits of innovative technology and proper methodology.
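
    The Jidoka scene rules above are described only abstractly. The following Python sketch is purely illustrative of how such rules might be encoded and evaluated against incoming shop-floor data; the field names, thresholds, and actions are hypothetical and are not taken from the paper.

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class JidokaRule:
            name: str
            condition: Callable[[Dict[str, float]], bool]  # abnormality test on one data sample
            action: str                                     # e.g. "stop_station", "alert_operator"

        rules: List[JidokaRule] = [
            JidokaRule("spindle_overload", lambda s: s["spindle_torque"] > 42.0, "stop_station"),
            JidokaRule("quality_drift", lambda s: s["defect_rate"] > 0.02, "alert_operator"),
        ]

        def evaluate(sample: Dict[str, float]) -> List[str]:
            # Return the actions triggered by a single sample of uniform (service-level) data
            return [rule.action for rule in rules if rule.condition(sample)]

        print(evaluate({"spindle_torque": 45.3, "defect_rate": 0.01}))  # -> ['stop_station']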

  1. Fibrinolysis standards: a review of the current status.

    PubMed

    Thelwell, C

    2010-07-01

    Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products, or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both standard and test preparation contain the same analyte, and the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology, and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must however be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.

  2. Plane-Based Registration of Several Thousand Laser Scans on Standard Hardware

    NASA Astrophysics Data System (ADS)

    Wujanz, D.; Schaller, S.; Gielsdorf, F.; Gründig, L.

    2018-05-01

    The automatic registration of terrestrial laser scans appears to be a solved problem in science as well as in practice. However, this assumption is questionable, especially in the context of large projects where an object of interest is described by several thousand scans. A critical issue inherently linked to this task is memory management, especially if point-cloud-based registration approaches such as the ICP are deployed. In order to process even thousands of scans on standard hardware, a plane-based registration approach is applied. As a first step, planar features are detected within the unregistered scans; this step drastically reduces the amount of data that has to be handled by the hardware. After determination of corresponding planar features, a pairwise registration procedure is initiated based on a graph that represents topological relations among all scans. For every feature, individual stochastic characteristics are computed and carried through the algorithm. Finally, a block adjustment is carried out that minimises the residuals between redundantly captured areas. The algorithm is demonstrated on a practical survey campaign featuring a historic town hall. In total, 4853 scans were registered on a standard PC with four processors (3.07 GHz) and 12 GB of RAM.
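
    As an illustration of the pairwise plane-to-plane step (this is a generic sketch, not the authors' implementation, and it omits the per-feature stochastic weighting and the final block adjustment described above), a rigid transform can be estimated in Python from three or more corresponding planes, each written as n·x = d with unit normal n:

        import numpy as np

        def register_from_planes(normals_src, d_src, normals_dst, d_dst):
            # Estimate (R, t) with x_dst = R @ x_src + t from corresponding planes.
            # Requires at least three planes with linearly independent normals.
            N_src = np.asarray(normals_src, dtype=float)   # shape (k, 3)
            N_dst = np.asarray(normals_dst, dtype=float)   # shape (k, 3)
            # Rotation aligning source normals with destination normals (Kabsch/SVD)
            H = N_src.T @ N_dst
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T
            # Plane offsets transform as d_dst = d_src + n_dst . t, giving a linear system for t
            rhs = np.asarray(d_dst, dtype=float) - np.asarray(d_src, dtype=float)
            t, *_ = np.linalg.lstsq(N_dst, rhs, rcond=None)
            return R, t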

  3. Undoing Appropriateness: Raciolinguistic Ideologies and Language Diversity in Education

    ERIC Educational Resources Information Center

    Flores, Nelson; Rosa, Jonathan

    2015-01-01

    In this article, Nelson Flores and Jonathan Rosa critique appropriateness-based approaches to language diversity in education. Those who subscribe to these approaches conceptualize standardized linguistic practices as an objective set of linguistic forms that are appropriate for an academic setting. In contrast, Flores and Rosa highlight the…

  4. Teaching the Korean Folk Song ("Arirang") through Performing, Creating, and Responding

    ERIC Educational Resources Information Center

    Yoo, Hyesoo; Kang, Sangmi

    2017-01-01

    This article introduces a pedagogical approach to teaching one of the renowned Korean folk songs ("Arirang") based on the comprehensive musicianship approach and the 2014 Music Standards (competencies in performing, creating, and responding to music). The authors provide in-depth information for music educators to help their students…

  5. Saxon Elementary School Math. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2006

    2006-01-01

    The What Works Clearinghouse (WWC) reviewed seven studies of the "Saxon Elementary School Math program." A distinguishing feature of "Saxon Elementary School Math" is its use of a distributed approach, as opposed to a chapter-based approach, for instruction and assessment. One of these studies met WWC standards with…

  6. Australian Recognition Framework Arrangements. Australia's National Training Framework.

    ERIC Educational Resources Information Center

    Australian National Training Authority, Brisbane.

    This document explains the objectives, principles, standards, and protocols of the Australian Recognition Framework (ARF), which is a comprehensive approach to national recognition of vocational education and training (VET) that is based on a quality-assured approach to the registration of training organizations seeking to deliver training, assess…

  7. Does the type of weight loss diet affect who participates in a behavioral weight loss intervention? A comparison of participants for a plant-based diet versus a standard diet trial

    PubMed Central

    Turner-McGrievy, Gabrielle M.; Davidson, Charis R.; Wilcox, Sara

    2014-01-01

    Studies have found that people following plant-based eating styles, such as vegan or vegetarian diets, often have different demographic characteristics, eating styles, and physical activity (PA) levels than individuals following an omnivorous dietary pattern. There has been no research examining if there are differences in these characteristics among people who are willing to participate in a weight loss intervention using plant-based dietary approaches as compared to a standard reduced calorie approach, which does not exclude food groups. The present study compared baseline characteristics (demographics, dietary intake, eating behaviors (Eating Behavior Inventory), and PA (Paffenbarger Physical Activity Questionnaire)) of participants enrolling in two different 6-month behavioral weight loss studies: the mobile Pounds Off Digitally (mPOD) study, which used a standard reduced calorie dietary approach, and the New Dietary Interventions to Enhance the Treatments for weight loss (New DIETs) study, which randomized participants to follow one of five different dietary approaches (vegan, vegetarian, pesco-vegetarian, semi-vegetarian, or omnivorous diets). There were no differences in baseline demographics with the exception of New DIETs participants being older (48.5 ± 8.3 years vs. 42.9 ± 11.2, P=0.001) and having a higher Body Mass Index (BMI, 35.2 ± 5.3 kg/m2 vs. 32.6 ± 4.7 kg/m2, P=0.001) than mPOD participants. In age- and BMI-adjusted models, there were no differences in EBI scores or in any dietary variables, with the exception of vitamin C (85.6 ± 5.9 mg/d mPOD vs. 63.4 ± 7.4 mg/d New DIETs, P=0.02). New DIETs participants reported higher levels of intentional PA/day (180.0 ± 18.1 kcal/d) than mPOD participants (108.8 ± 14.4 kcal/d, P=0.003), which may have been the result of New DIETs study recommendations to avoid increasing or decreasing PA during the study. The findings of this study demonstrate that using plant-based dietary approaches for weight loss intervention studies does not lead to a population which is significantly different from those who enroll in a standard behavioral weight loss study using a reduced calorie dietary approach. PMID:24269507

  8. Does the type of weight loss diet affect who participates in a behavioral weight loss intervention? A comparison of participants for a plant-based diet versus a standard diet trial.

    PubMed

    Turner-McGrievy, Gabrielle M; Davidson, Charis R; Wilcox, Sara

    2014-02-01

    Studies have found that people following plant-based eating styles, such as vegan or vegetarian diets, often have different demographic characteristics, eating styles, and physical activity (PA) levels than individuals following an omnivorous dietary pattern. There has been no research examining if there are differences in these characteristics among people who are willing to participate in a weight loss intervention using plant-based dietary approaches as compared to a standard reduced calorie approach, which does not exclude food groups. The present study compared baseline characteristics (demographics, dietary intake, eating behaviors (Eating Behavior Inventory), and PA (Paffenbarger Physical Activity Questionnaire)) of participants enrolling in two different 6-month behavioral weight loss studies: the mobile Pounds Off Digitally (mPOD) study, which used a standard reduced calorie dietary approach, and the New Dietary Interventions to Enhance the Treatments for weight loss (New DIETs) study, which randomized participants to follow one of five different dietary approaches (vegan, vegetarian, pesco-vegetarian, semi-vegetarian, or omnivorous diets). There were no differences in baseline demographics with the exception of New DIETs participants being older (48.5±8.3 years versus 42.9±11.2, P=0.001) and having a higher Body Mass Index (BMI, 35.2±5.3 kg/m(2) versus 32.6±4.7 kg/m(2), P=0.001) than mPOD participants. In age- and BMI-adjusted models, there were no differences in EBI scores or in any dietary variables, with the exception of vitamin C (85.6±5.9 mg/d mPOD versus 63.4±7.4 mg/d New DIETs, P=0.02). New DIETs participants reported higher levels of intentional PA/day (180.0±18.1 kcal/d) than mPOD participants (108.8±14.4 kcal/d, P=0.003), which may have been the result of New DIETs study recommendations to avoid increasing or decreasing PA during the study. The findings of this study demonstrate that using plant-based dietary approaches for weight loss intervention studies does not lead to a population which is significantly different from those who enroll in a standard behavioral weight loss study using a reduced calorie dietary approach. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NASA Astrophysics Data System (ADS)

    Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga

    The increased awareness of non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible yet systematic approach to early requirements-based effort estimation, based on a Non-Functional Requirements ontology. It complementarily uses a standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
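
    As a toy illustration of combining a functional size measure with a requirements-related predictor in a linear regression (the predictors, numbers, and units below are hypothetical and are not from the paper's case study):

        import numpy as np

        # Hypothetical historical projects: functional size (e.g., function points),
        # a simple count of non-functional requirements, and actual effort in person-hours.
        size   = np.array([120, 340, 210,  95, 400, 260], dtype=float)
        nfr    = np.array([  8,  25,  14,   5,  30,  18], dtype=float)
        effort = np.array([950, 3100, 1700, 700, 3800, 2200], dtype=float)

        # Ordinary least squares fit: effort ~ b0 + b1*size + b2*nfr
        X = np.column_stack([np.ones_like(size), size, nfr])
        coeffs, *_ = np.linalg.lstsq(X, effort, rcond=None)

        # Early estimate for a new project with size 180 and 10 non-functional requirements
        new_project = np.array([1.0, 180.0, 10.0])
        print("estimated effort (person-hours):", new_project @ coeffs)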

  10. Small drinking water systems under spatiotemporal water quality variability: a risk-based performance benchmarking framework.

    PubMed

    Bereskie, Ty; Haider, Husnain; Rodriguez, Manuel J; Sadiq, Rehan

    2017-08-23

    Traditional approaches for benchmarking drinking water systems are binary, based solely on the compliance and/or non-compliance of one or more water quality performance indicators against defined regulatory guidelines/standards. The consequence of water quality failure is dependent on location within a water supply system as well as time of the year (i.e., season) with varying levels of water consumption. Conventional approaches used for water quality comparison purposes fail to incorporate spatiotemporal variability and degrees of compliance and/or non-compliance. This can lead to misleading or inaccurate performance assessment data used in the performance benchmarking process. In this research, a hierarchical risk-based water quality performance benchmarking framework is proposed to evaluate small drinking water systems (SDWSs) through cross-comparison amongst similar systems. The proposed framework (R WQI framework) is designed to quantify consequence associated with seasonal and location-specific water quality issues in a given drinking water supply system to facilitate more efficient decision-making for SDWSs striving for continuous performance improvement. Fuzzy rule-based modelling is used to address imprecision associated with measuring performance based on singular water quality guidelines/standards and the uncertainties present in SDWS operations and monitoring. This proposed R WQI framework has been demonstrated using data collected from 16 SDWSs in Newfoundland and Labrador and Quebec, Canada, and compared to the Canadian Council of Ministers of the Environment WQI, a traditional, guidelines/standard-based approach. The study found that the R WQI framework provides an in-depth state of water quality and benchmarks SDWSs more rationally based on the frequency of occurrence and consequence of failure events.

  11. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  12. Enabling locally-developed content for access through the infobutton by means of automated concept annotation.

    PubMed

    Hulse, Nathan C; Long, Jie; Xu, Xiaomin; Tao, Cui

    2014-01-01

    Infobuttons have proven to be an increasingly important resource in providing a standardized approach to integrating useful educational materials at the point of care in electronic health records (EHRs). They provide a simple, uniform pathway for both patients and providers to receive pertinent education materials in a quick fashion from within EHRs and Personalized Health Records (PHRs). In recent years, the international standards organization Health Level Seven has balloted and approved a standards-based pathway for requesting and receiving data for infobuttons, simplifying some of the barriers for their adoption in electronic medical records and amongst content providers. Local content, developed by the hosting organization themselves, still needs to be indexed and annotated with appropriate metadata and terminologies in order to be fully accessible via the infobutton. In this manuscript we present an approach for automating the annotation of internally-developed patient education sheets with standardized terminologies and compare and contrast the approach with manual approaches used previously. We anticipate that a combination of system-generated and human reviewed annotations will provide the most comprehensive and effective indexing strategy, thereby allowing best access to internally-created content via the infobutton.

  13. Quantum chemical approach to estimating the thermodynamics of metabolic reactions.

    PubMed

    Jinich, Adrian; Rappoport, Dmitrij; Dunn, Ian; Sanchez-Lengeling, Benjamin; Olivares-Amaya, Roberto; Noor, Elad; Bar-Even, Arren; Aspuru-Guzik, Alán

    2014-11-12

    Thermodynamics plays an increasingly important role in modeling and engineering metabolism. We present the first nonempirical computational method for estimating standard Gibbs reaction energies of metabolic reactions based on quantum chemistry, which can help fill in the gaps in the existing thermodynamic data. When applied to a test set of reactions from core metabolism, the quantum chemical approach is comparable in accuracy to group contribution methods for isomerization and group transfer reactions and for reactions not including multiply charged anions. The errors in standard Gibbs reaction energy estimates are correlated with the charges of the participating molecules. The quantum chemical approach is amenable to systematic improvements and holds potential for providing thermodynamic data for all of metabolism.
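
    For orientation only (this is standard thermodynamics, not a method detail from the paper), the estimated quantities combine linearly: for a reaction with stoichiometric coefficients \nu, the standard Gibbs reaction energy is

        \Delta_r G^{\circ} \;=\; \sum_{i \in \text{products}} \nu_i \, \Delta_f G^{\circ}_i \;-\; \sum_{j \in \text{reactants}} \nu_j \, \Delta_f G^{\circ}_j ,

    so errors in the per-compound estimates, which the abstract reports as correlated with molecular charge, propagate directly into the reaction-energy estimate.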

  14. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    NASA Astrophysics Data System (ADS)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages present a special deviation from the original language. Malay Tweets are currently widely used by Twitter users, especially in the Malay archipelago. Thus, it is important to build a normalization system which can translate Malay Tweet language into standard Malay. Several natural language processing studies have focused mainly on normalizing English Twitter messages, while few studies have addressed normalizing Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. This approach normalizes noisy Malay Twitter messages such as colloquial language, novel words, and interjections into standard Malay. The research uses a language model and an N-gram model.
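
    As a purely illustrative sketch of the dictionary-plus-N-gram idea (the lexicon entries and bigram log-probabilities below are hypothetical placeholders, not the paper's model), candidate normalizations can be ranked in Python with a language model:

        # Tiny lexicon mapping noisy tokens to candidate standard Malay words,
        # and a bigram log-probability table used to rank the candidates.
        lexicon = {"x": ["tidak"], "sy": ["saya"], "mkn": ["makan", "makin"]}
        bigram_logp = {("saya", "makan"): -1.2, ("saya", "makin"): -3.5, ("tidak", "makan"): -1.8}

        def normalize(tokens):
            out = []
            for tok in tokens:
                candidates = lexicon.get(tok, [tok])
                prev = out[-1] if out else "<s>"
                # choose the candidate the bigram model scores highest after the previous word
                best = max(candidates, key=lambda c: bigram_logp.get((prev, c), -10.0))
                out.append(best)
            return out

        print(normalize(["sy", "mkn"]))  # -> ['saya', 'makan']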

  15. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.

  16. Partially composite particle physics with and without supersymmetry

    NASA Astrophysics Data System (ADS)

    Kramer, Thomas A.

    Theories in which the Standard Model fields are partially composite provide elegant and phenomenologically viable solutions to the Hierarchy Problem. In this thesis we study these types of models from two different perspectives. We first derive an effective field theory describing the interactions of the Standard Model fields with their lightest composite partners, based on two weakly coupled sectors. Technically, via the AdS/CFT correspondence, our model is dual to a highly deconstructed theory with a single warped extra dimension. This two-sector theory provides a simplified approach to the phenomenology of this important class of theories. We then use this effective field theoretic approach to study models with weak-scale accidental supersymmetry. In particular, we investigate the possibility that the Standard Model Higgs field is a member of a composite supersymmetric sector interacting weakly with the known Standard Model fields.

  17. Mutation-based learning to improve student autonomy and scientific inquiry skills in a large genetics laboratory course.

    PubMed

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the "mutations"; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional "cookbook"-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class.

  18. Using a flipped classroom in an algebra-based physics course

    NASA Astrophysics Data System (ADS)

    Smith, Leigh

    2013-03-01

    The algebra-based physics course is taken by Biology, Pre-Pharmacy, and Pre-Medical students, as well as other health-related majors such as medical imaging, physical therapy, and so on. Nearly 500 students take the course each semester. Student learning is adversely impacted by poor math backgrounds as well as extensive work schedules outside of the classroom. We have been researching the use of an intensive flipped-classroom approach in which students spend one to two hours each week preparing for class by reading the book, completing a series of conceptual problems, and viewing videos which describe the material. In class, the new response system Learning Catalytics is used, which allows much richer problems to be posed, including sketching figures, numerical or symbolic entries, short answers, and highlighting text, in addition to the standard multiple choice questions. We make a direct comparison of student learning for 1200 students who have taken the same tests, 25% of whom used the flipped-classroom approach and 75% of whom took a more standard lecture. There is significant evidence of improvements in student learning for students taking the flipped-classroom approach over standard lectures. These benefits appear to impact students at all math backgrounds.

  19. Education in Disaster Management and Emergencies: Defining a New European Course.

    PubMed

    Khorram-Manesh, Amir; Ashkenazi, Michael; Djalali, Ahmadreza; Ingrassia, Pier Luigi; Friedl, Tom; von Armin, Gotz; Lupesco, Olivera; Kaptan, Kubilay; Arculeo, Chris; Hreckovski, Boris; Komadina, Radko; Fisher, Philipp; Voigt, Stefan; James, James; Gursky, Elin

    2015-06-01

    Unremitting natural disasters, deliberate threats, pandemics, and humanitarian suffering resulting from conflict situations necessitate swift and effective response paradigms. The European Union's (EU) increasing visibility as a disaster response enterprise suggests the need not only for financial contribution but also for instituting a coherent disaster response approach and management structure. The DITAC (Disaster Training Curriculum) project identified deficiencies in current responder training approaches and analyzed the characteristics and content required for a new, standardized European course in disaster management and emergencies. Over 35 experts from within and outside the EU representing various organizations and specialties involved in disaster management composed the DITAC Consortium. These experts were also organized into 5 specifically tasked working groups. Extensive literature reviews were conducted to identify requirements and deficiencies and to craft a new training concept based on research trends and lessons learned. A pilot course and program dissemination plan was also developed. The lack of standardization was repeatedly highlighted as a serious deficiency in current disaster training methods, along with gaps in the command, control, and communication levels. A blended and competency-based teaching approach using exercises combined with lectures was recommended to improve intercultural and interdisciplinary integration. The goal of a European disaster management course should be to standardize and enhance intercultural and inter-agency performance across the disaster management cycle. A set of minimal standards and evaluation metrics can be achieved through consensus, education, and training in different units. The core of the training initiative will be a unit that presents a realistic situation "scenario-based training."

  20. A Gold Standards Approach to Training Instructors to Evaluate Crew Performance

    NASA Technical Reports Server (NTRS)

    Baker, David P.; Dismukes, R. Key

    2003-01-01

    The Advanced Qualification Program requires that airlines evaluate crew performance in Line Oriented Simulation. For this evaluation to be meaningful, instructors must observe relevant crew behaviors and evaluate those behaviors consistently and accurately against standards established by the airline. The airline industry has largely settled on an approach in which instructors evaluate crew performance on a series of event sets, using standardized grade sheets on which behaviors specific to event set are listed. Typically, new instructors are given a class in which they learn to use the grade sheets and practice evaluating crew performance observed on videotapes. These classes emphasize reliability, providing detailed instruction and practice in scoring so that all instructors within a given class will give similar scores to similar performance. This approach has value but also has important limitations; (1) ratings within one class of new instructors may differ from those of other classes; (2) ratings may not be driven primarily by the specific behaviors on which the company wanted the crews to be scored; and (3) ratings may not be calibrated to company standards for level of performance skill required. In this paper we provide a method to extend the existing method of training instructors to address these three limitations. We call this method the "gold standards" approach because it uses ratings from the company's most experienced instructors as the basis for training rater accuracy. This approach ties the training to the specific behaviors on which the experienced instructors based their ratings.

  1. Narrative Interest Standard: A Novel Approach to Surrogate Decision-Making for People With Dementia.

    PubMed

    Wilkins, James M

    2017-06-17

    Dementia is a common neurodegenerative process that can significantly impair decision-making capacity as the disease progresses. When a person is found to lack capacity to make a decision, a surrogate decision-maker is generally sought to aid in decision-making. Typical bases for surrogate decision-making include the substituted judgment standard and the best interest standard. Given the heterogeneous and progressive course of dementia, however, these standards for surrogate decision-making are often insufficient in providing guidance for the decision-making for a person with dementia, escalating the likelihood of conflict in these decisions. In this article, the narrative interest standard is presented as a novel and more appropriate approach to surrogate decision-making for people with dementia. Through case presentation and ethical analysis, the standard mechanisms for surrogate decision-making for people with dementia are reviewed and critiqued. The narrative interest standard is then introduced and discussed as a dementia-specific model for surrogate decision-making. Through incorporation of elements of a best interest standard in focusing on the current benefit-burden ratio and elements of narrative to provide context, history, and flexibility for values and preferences that may change over time, the narrative interest standard allows for elaboration of an enriched context for surrogate decision-making for people with dementia. More importantly, however, a narrative approach encourages the direct contribution from people with dementia in authoring the story of what matters to them in their lives.

  2. Handbook for Implementing a Comprehensive Work-Based Learning Program According to the Fair Labor Standards Act. Third Edition. Essential Tools: Improving Secondary Education with Transition for Youth with Disabilities

    ERIC Educational Resources Information Center

    Johnson, David R.; Sword, Carrie; Habhegger, Barbara

    2005-01-01

    Work-Based Learning (WBL) is an effective approach in delivering career and technical education and training to youth with disabilities. This handbook provides guidance to schools operating WBL programs and encourages the adoption of WBL programs by schools not presently using this approach. By following the information and examples in this…

  3. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  4. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS.

    PubMed

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-05-18

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013-2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes.

  5. Query Health: standards-based, cross-platform population health surveillance

    PubMed Central

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Objective Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Materials and methods Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. Results We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. Discussions This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Conclusions Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. PMID:24699371

  6. Query Health: standards-based, cross-platform population health surveillance.

    PubMed

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  7. Intra-operative adjustment of standard planes in C-arm CT image data.

    PubMed

    Brehler, Michael; Görres, Joseph; Franke, Jochen; Barth, Karl; Vetter, Sven Y; Grützner, Paul A; Meinzer, Hans-Peter; Wolf, Ivo; Nabers, Diana

    2016-03-01

    With the help of an intra-operative mobile C-arm CT, medical interventions can be verified and corrected, avoiding the need for a post-operative CT and a second intervention. An exact adjustment of standard plane positions is necessary for the best possible assessment of the anatomical regions of interest but the mobility of the C-arm causes the need for a time-consuming manual adjustment. In this article, we present an automatic plane adjustment at the example of calcaneal fractures. We developed two feature detection methods (2D and pseudo-3D) based on SURF key points and also transferred the SURF approach to 3D. Combined with an atlas-based registration, our algorithm adjusts the standard planes of the calcaneal C-arm images automatically. The robustness of the algorithms is evaluated using a clinical data set. Additionally, we tested the algorithm's performance for two registration approaches, two resolutions of C-arm images and two methods for metal artifact reduction. For the feature extraction, the novel 3D-SURF approach performs best. As expected, a higher resolution ([Formula: see text] voxel) leads also to more robust feature points and is therefore slightly better than the [Formula: see text] voxel images (standard setting of device). Our comparison of two different artifact reduction methods and the complete removal of metal in the images shows that our approach is highly robust against artifacts and the number and position of metal implants. By introducing our fast algorithmic processing pipeline, we developed the first steps for a fully automatic assistance system for the assessment of C-arm CT images.
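
    The record above builds on SURF key points combined with atlas-based registration. The Python/OpenCV sketch below shows only a generic 2D key-point-matching step; ORB is substituted for SURF because SURF requires the opencv-contrib build, the file names are hypothetical placeholders, and this is not the authors' pipeline.

        import cv2
        import numpy as np

        # Hypothetical 2D slices: one from an annotated atlas, one from the patient scan
        atlas = cv2.imread("atlas_slice.png", cv2.IMREAD_GRAYSCALE)
        patient = cv2.imread("patient_slice.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=1000)
        kp_a, des_a = orb.detectAndCompute(atlas, None)
        kp_p, des_p = orb.detectAndCompute(patient, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_a, des_p), key=lambda m: m.distance)

        # A transform estimated from the best matches can then map atlas-defined
        # standard planes into the patient image.
        src = np.float32([kp_a[m.queryIdx].pt for m in matches[:50]])
        dst = np.float32([kp_p[m.trainIdx].pt for m in matches[:50]])
        M, inliers = cv2.estimateAffinePartial2D(src, dst)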

  8. Efficient approach to the free energy of crystals via Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Navascués, G.; Velasco, E.

    2015-08-01

    We present a general approach to compute the absolute free energy of a system of particles with constrained center of mass based on the Monte Carlo thermodynamic coupling integral method. The version of the Frenkel-Ladd approach [J. Chem. Phys. 81, 3188 (1984); doi:10.1063/1.448024], which uses a harmonic coupling potential, is recovered. Also, we propose a different choice, based on one-particle square-well coupling potentials, which is much simpler, more accurate, and free from some of the difficulties of the Frenkel-Ladd method. We apply our approach to hard spheres and compare with the standard harmonic method.
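
    As background only (the generic thermodynamic-integration identity underlying such coupling methods, written here with the harmonic Frenkel-Ladd coupling; the paper's square-well variant replaces the coupling potential, and this is not a derivation from the paper): with

        U_\lambda = U_0 + \lambda \sum_i \lvert \mathbf{r}_i - \mathbf{r}_i^{0} \rvert^{2} ,

    the absolute free energy follows from

        F(0) = F(\lambda_{\max}) - \int_{0}^{\lambda_{\max}} \Big\langle \sum_i \lvert \mathbf{r}_i - \mathbf{r}_i^{0} \rvert^{2} \Big\rangle_{\lambda} \, d\lambda ,

    where F(\lambda_{\max}) is well approximated by an analytically known reference (an Einstein crystal with fixed center of mass) and the ensemble average is evaluated by Monte Carlo at a set of \lambda values.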

  9. PheProb: probabilistic phenotyping using diagnosis codes to improve power for genetic association studies.

    PubMed

    Sinnott, Jennifer A; Cai, Fiona; Yu, Sheng; Hejblum, Boris P; Hong, Chuan; Kohane, Isaac S; Liao, Katherine P

    2018-05-17

    Standard approaches for large scale phenotypic screens using electronic health record (EHR) data apply thresholds, such as ≥2 diagnosis codes, to define subjects as having a phenotype. However, the variation in the accuracy of diagnosis codes can impair the power of such screens. Our objective was to develop and evaluate an approach which converts diagnosis codes into a probability of a phenotype (PheProb). We hypothesized that this alternate approach for defining phenotypes would improve power for genetic association studies. The PheProb approach employs unsupervised clustering to separate patients into 2 groups based on diagnosis codes. Subjects are assigned a probability of having the phenotype based on the number of diagnosis codes. This approach was developed using simulated EHR data and tested in a real world EHR cohort. In the latter, we tested the association between low density lipoprotein cholesterol (LDL-C) genetic risk alleles known for association with hyperlipidemia and hyperlipidemia codes (ICD-9 272.x). PheProb and thresholding approaches were compared. Among n = 1462 subjects in the real world EHR cohort, the threshold-based p-values for association between the genetic risk score (GRS) and hyperlipidemia were 0.126 (≥1 code), 0.123 (≥2 codes), and 0.142 (≥3 codes). The PheProb approach produced the expected significant association between the GRS and hyperlipidemia: p = .001. PheProb improves statistical power for association studies relative to standard thresholding approaches by leveraging information about the phenotype in the billing code counts. The PheProb approach has direct applications where efficient approaches are required, such as in Phenome-Wide Association Studies.
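
    The published PheProb model is not detailed in this record; the Python sketch below only illustrates the general idea of converting per-patient diagnosis-code counts into a phenotype probability with a two-component mixture fit by EM (a Poisson mixture is used here purely for illustration).

        import numpy as np

        def phenotype_probabilities(counts, n_iter=200):
            # Fit a two-component Poisson mixture to diagnosis-code counts and return each
            # patient's posterior probability of the higher-rate ("phenotype") component.
            counts = np.asarray(counts, dtype=float)
            lam = np.array([0.5 * counts.mean() + 1e-3, 2.0 * counts.mean() + 1e-3])
            pi = np.array([0.5, 0.5])
            for _ in range(n_iter):
                # E-step: responsibilities from Poisson log-likelihoods (constant log k! cancels)
                log_post = np.log(pi)[None, :] + counts[:, None] * np.log(lam)[None, :] - lam[None, :]
                log_post -= log_post.max(axis=1, keepdims=True)
                resp = np.exp(log_post)
                resp /= resp.sum(axis=1, keepdims=True)
                # M-step: update mixing weights and component rates
                pi = resp.mean(axis=0)
                lam = (resp * counts[:, None]).sum(axis=0) / resp.sum(axis=0)
            return resp[:, np.argmax(lam)]

        # Hypothetical hyperlipidemia code counts for ten patients
        print(phenotype_probabilities([0, 1, 0, 2, 7, 9, 1, 12, 0, 3]))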

  10. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  11. Entropy based file type identification and partitioning

    DTIC Science & Technology

    2017-06-01

    ...the identification of file types and file partitioning. This approach has applications in cybersecurity as it allows for a quick determination of
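
    The surviving fragment above does not describe the method, but the core idea named in the title, byte-level Shannon entropy as a signal for file type identification and partitioning, can be illustrated with a short Python sketch (the window and step sizes are arbitrary choices, not values from the thesis):

        import math
        from collections import Counter

        def byte_entropy(data: bytes) -> float:
            # Shannon entropy in bits per byte; encrypted or compressed content tends
            # toward 8, while plain text (e.g., ASCII) sits much lower.
            if not data:
                return 0.0
            n = len(data)
            return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

        def entropy_profile(data: bytes, window: int = 4096, step: int = 1024):
            # Sliding-window profile, e.g. to locate high-entropy regions inside a file
            end = max(len(data) - window, 0)
            return [byte_entropy(data[i:i + window]) for i in range(0, end + 1, step)]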

  12. 75 FR 42820 - Notice of Availability of a Final Environmental Assessment (Final EA) and a Finding of No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... evaluated the construction and operation of a new 20,000-square-foot standard design Terminal Radar Approach Control Facility/Base Building conforming to the guidelines of the Terminal Facilities Design Standards... Airport Layout Plan. The Final EA has been prepared in accordance with the National Environmental Policy...

  13. A Regression Model Approach to First-Year Honors Program Admissions Serving a High-Minority Population

    ERIC Educational Resources Information Center

    Rhea, David M.

    2017-01-01

    Many honors programs make admissions decisions based on student high school GPA and a standardized test score. However, McKay argued that standardized test scores can be a barrier to honors program participation, particularly for minority students. Minority students, particularly Hispanic and African American students, are apt to have lower…

  14. Critical Multimodal Literacy and the Common Core: Subversive Curriculum in the Age of Accountability

    ERIC Educational Resources Information Center

    Perttula, Jill

    2017-01-01

    The purpose of this case study research was to understand the ways in which an innovative, urban secondary English teacher (Ms. B) approached English Language Arts, when a set, standardized curriculum and testing were in place. The Common Core standards were prescribed within a required module-based presentation format. New literacies pedagogy…

  15. In Search of the Excellent Literature Teacher: An Inductive Approach to Constructing Professional Teaching Standards

    ERIC Educational Resources Information Center

    Witte, T. C. H.; Jansen, E. P. W. A.

    2015-01-01

    This study makes a contribution to the development of empirically based, domain-specific teaching standards that are acknowledged by the professional community of teachers and which, therefore, have a good chance of being successfully implemented and used for professional development purposes. It was prompted by the resistance on the part of many…

  16. 78 FR 34559 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    ... airports. These regulatory actions are needed because of the adoption of new or revised criteria, or... Minimums and ODPS contained in this amendment are based on the criteria contained in the U.S. Standard for... criteria were applied to the conditions existing or anticipated at the affected airports. Because of the...

  17. 77 FR 31180 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-25

    ... airports. These regulatory actions are needed because of the adoption of new or revised criteria, or... Minimums and ODPS contained in this amendment are based on the criteria contained in the U.S. Standard for... criteria were applied to the conditions existing or anticipated at the affected airports. Because of the...

  18. LESSONS FROM A RETROSPECTIVE ANALYSIS OF A 5-YR PERIOD OF PRESHIPMENT TESTING AT SAN DIEGO ZOO: A RISK-BASED APPROACH TO PRESHIPMENT TESTING MAY BENEFIT ANIMAL WELFARE.

    PubMed

    Marinkovich, Matt; Wallace, Chelsea; Morris, Pat J; Rideout, Bruce; Pye, Geoffrey W

    2016-03-01

    The preshipment examination, with associated transmissible disease testing, has become standard practice in the movement of animals between zoos. An alternative disease risk-based approach, based on a comprehensive surveillance program including necropsy and preventive medicine examination testing and data, has been in practice since 2006 between the San Diego Zoo and San Diego Zoo Safari Park. A retrospective analysis, evaluating comprehensive necropsy data and preshipment testing over a 5-yr study period, was performed to determine the viability of this model for use with sending animals to other institutions. Animals (607 birds, 704 reptiles and amphibians, and 341 mammals) were shipped to 116 Association of Zoos and Aquariums (AZA)-accredited and 29 non-AZA-accredited institutions. The evaluation showed no evidence of the specific transmissible diseases tested for during the preshipment exam being present within the San Diego Zoo collection. We suggest that a risk-based animal and institution-specific approach to transmissible disease preshipment testing is more cost effective and is in the better interest of animal welfare than the current industry standard of dogmatic preshipment testing.

  19. An Ontology Based Approach to Information Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The semantic structuring of knowledge based on ontology approaches has been increasingly adopted by experts from diverse domains. Ontologies have recently moved from the philosophical and metaphysical disciplines into the construction of models that describe a specific theory of a domain. The development and use of ontologies promote the creation of a unique standard for representing concepts within a specific knowledge domain. In the scope of information security systems, the use of an ontology to formalize and represent security concepts challenges the mechanisms and techniques currently used. This paper presents a conceptual implementation model of an ontology defined in the security domain. The model contains semantic concepts based on the information security standard ISO/IEC_JTC1, and their relationships to other concepts defined in a subset of the information security domain.

  20. Identifying important and feasible policies and actions for health at community sports clubs: a consensus-generating approach.

    PubMed

    Kelly, Bridget; King, Lesley; Bauman, Adrian E; Baur, Louise A; Macniven, Rona; Chapman, Kathy; Smith, Ben J

    2014-01-01

    Children's high participation in organised sport in Australia makes sport an ideal setting for health promotion. This study aimed to generate consensus on priority health promotion objectives for community sports clubs, based on informed expert judgements. Delphi survey using three structured questionnaires. Forty-six health promotion, nutrition, physical activity and sport management/delivery professionals were approached to participate in the survey. Questionnaires used an iterative process to determine aspects of sports clubs deemed necessary for developing healthy sporting environments for children. Initially, participants were provided with a list of potential standards for a range of health promotion areas and asked to rate standards based on their importance and feasibility, and any barriers to implementation. Subsequently, participants were provided with information that summarised ratings for each standard to indicate convergence of the group, and asked to review and potentially revise their responses where they diverged. In a third round, participants ranked confirmed standards by priority. 26 professionals completed round 1, 21 completed round 2, and 18 completed round 3. The highest ranked standards related to responsible alcohol practices, availability of healthy food and drinks at sports canteens, smoke-free club facilities, restricting the sale and consumption of alcohol during junior sporting activities, and restricting unhealthy food and beverage company sponsorship. Identifying and prioritising health promotion areas that are relevant to children's sports clubs assists in focusing public health efforts and may guide future engagement of sports clubs. Approaches for providing informational and financial support to clubs to operationalise these standards are proposed. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  1. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    PubMed

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
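
    For illustration only, a minimal sketch of term-level exact matching after simple string normalization, one ingredient of automated mapping of this kind; MTERMS itself performs much richer NLP and concept-level mapping, and the sample terms and placeholder concept identifiers below are hypothetical.

    import re

    def normalize(term):
        """Lowercase, strip punctuation and collapse whitespace for comparison."""
        term = re.sub(r"[^\w\s%./-]", " ", term.lower())
        return re.sub(r"\s+", " ", term).strip()

    # Hypothetical target vocabulary: normalized drug string -> placeholder concept id.
    target_index = {
        normalize("Atorvastatin 10 MG Oral Tablet"): "RX0001",
        normalize("Lisinopril 20 MG Oral Tablet"): "RX0002",
    }

    def map_term(local_term):
        """Return a concept id for a local dictionary term, or None for manual review."""
        return target_index.get(normalize(local_term))

    print(map_term("atorvastatin 10 mg oral tablet"))   # exact match after normalization
    print(map_term("aspirin 81 mg chewable tablet"))    # None -> route to expert review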

  2. Candidate gene prioritization by network analysis of differential expression using machine learning approaches

    PubMed Central

    2010-01-01

    Background Discovering novel disease genes is still challenging for diseases for which no prior knowledge - such as known disease genes or disease-related pathways - is available. Performing genetic studies frequently results in large lists of candidate genes of which only few can be followed up for further investigation. We have recently developed a computational method for constitutional genetic disorders that identifies the most promising candidate genes by replacing prior knowledge by experimental data of differential gene expression between affected and healthy individuals. To improve the performance of our prioritization strategy, we have extended our previous work by applying different machine learning approaches that identify promising candidate genes by determining whether a gene is surrounded by highly differentially expressed genes in a functional association or protein-protein interaction network. Results We have proposed three strategies scoring disease candidate genes relying on network-based machine learning approaches, such as kernel ridge regression, heat kernel, and Arnoldi kernel approximation. For comparison purposes, a local measure based on the expression of the direct neighbors is also computed. We have benchmarked these strategies on 40 publicly available knockout experiments in mice, and performance was assessed against results obtained using a standard procedure in genetics that ranks candidate genes based solely on their differential expression levels (Simple Expression Ranking). Our results showed that our four strategies could outperform this standard procedure and that the best results were obtained using the Heat Kernel Diffusion Ranking leading to an average ranking position of 8 out of 100 genes, an AUC value of 92.3% and an error reduction of 52.8% relative to the standard procedure approach which ranked the knockout gene on average at position 17 with an AUC value of 83.7%. Conclusion In this study we could identify promising candidate genes using network based machine learning approaches even if no knowledge is available about the disease or phenotype. PMID:20840752
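
    A compact sketch of the heat-kernel diffusion ranking idea on a toy network, assuming a small made-up adjacency matrix, an arbitrary diffusion parameter and invented differential-expression scores; it illustrates the ranking principle only and does not reproduce the paper's kernels or benchmark data.

    import numpy as np
    from scipy.linalg import expm

    # Toy undirected network: adjacency matrix over 5 genes (assumed).
    A = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0],
                  [1, 1, 0, 0, 0],
                  [0, 1, 0, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
    beta = 0.5                              # diffusion time (assumed)
    H = expm(-beta * L)                     # heat kernel

    # Absolute differential-expression scores of the genes (assumed values).
    expr = np.array([0.2, 2.5, 1.8, 0.1, 0.3])

    # Diffuse expression evidence through the network and rank candidates:
    # genes surrounded by strongly differentially expressed genes score high.
    scores = H @ expr
    ranking = np.argsort(-scores)
    print("candidate ranking (best first):", ranking)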

  3. An Inquiry-Based Contextual Approach as the Primary Mode of Learning Science with Microcomputer-Based Laboratory Technology

    ERIC Educational Resources Information Center

    Espinoza, Fernando; Quarless, Duncan

    2010-01-01

    Science instruction can be designed to be laboratory-data driven. We report on an investigation of the use of thematic inquiry-based tasks with active incorporation of mathematics, science, and microcomputer-based laboratory technology in standards-correlated activities that enhanced learning experiences. Activities involved students in two major…

  4. Arthroscopic Talar Dome Access Using a Standard Versus Wire-Based Traction Method for Ankle Joint Distraction.

    PubMed

    Barg, Alexej; Saltzman, Charles L; Beals, Timothy C; Bachus, Kent N; Blankenhorn, Brad D; Nickisch, Florian

    2016-07-01

    To evaluate the accessibility of the talar dome through anterior and posterior portals for ankle arthroscopy with the standard noninvasive distraction versus wire-based longitudinal distraction using a tensioned wire placed transversely through the calcaneal tuberosity. Seven matched pairs of thigh-to-foot specimens underwent ankle arthroscopy with 1 of 2 methods of distraction: a standard noninvasive strapping technique or a calcaneal tuberosity wire-based technique. The order of the arthroscopic approach and use of a distraction method was randomly determined. The areas accessed from both 2-portal anterior and 2-portal posterior approaches were determined by using a molded translucent grid. The mean talar surface accessible by anterior ankle arthroscopy was comparable with noninvasive versus calcaneal wire distraction with 57.8% ± 17.2% (range, 32.9% to 75.7%) versus 61.5% ± 15.2% (range, 38.5% to 79.1%) of the talar dome, respectively (P = .590). The use of calcaneal wire distraction significantly improved posterior talar dome accessibility compared with noninvasive distraction, with 56.4% ± 20.0% (range, 14.4% to 78.0%) versus 39.8% ± 14.9% (range, 20.0% to 57.6%) of the talar dome, respectively (P = .031). Under the conditions studied, our cadaveric model showed equivalent talar dome access with 2-portal anterior arthroscopy of calcaneal wire-based distraction versus noninvasive strap distraction, but improved access for 2-portal posterior arthroscopy with calcaneal wire-based distraction versus noninvasive strap distraction. The posterior 40% of the talar dome is difficult to access via anterior ankle arthroscopy. Posterior calcaneal tuberosity wire-based longitudinal distraction improved arthroscopic access to the centro-posterior talar dome with a posterior arthroscopic approach. Published by Elsevier Inc.

  5. Synergistic relationships between Analytical Chemistry and written standards.

    PubMed

    Valcárcel, Miguel; Lucena, Rafael

    2013-07-25

    This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Using Arden Syntax to Identify Registry-Eligible Very Low Birth Weight Neonates from the Electronic Health Record

    PubMed Central

    Sarkar, Indra Neil; Chen, Elizabeth S.; Rosenau, Paul T.; Storer, Matthew B.; Anderson, Beth; Horbar, Jeffrey D.

    2014-01-01

    Condition-specific registries are essential resources for supporting epidemiological, quality improvement, and clinical trial studies. The identification of potentially eligible patients for a given registry often involves a manual process or use of ad hoc software tools. With the increased availability of electronic health data, such as within Electronic Health Record (EHR) systems, there is potential to develop healthcare standards based approaches for interacting with these data. Arden Syntax, which has traditionally been used to represent medical knowledge for clinical decision support, is one such standard that may be adapted for the purpose of registry eligibility determination. In this feasibility study, Arden Syntax was explored for its ability to represent eligibility criteria for a registry of very low birth weight neonates. The promising performance (100% recall; 97% precision) of the Arden Syntax approach at a single institution suggests that a standards-based methodology could be used to robustly identify registry-eligible patients from EHRs. PMID:25954412

  7. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    PubMed

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95 % credible interval when Bayesian analysis has been employed.
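
    A minimal ABC sketch in the spirit of the approach described, assuming a normal data-generating model, flat priors and a quantile-based acceptance rule; the reported summary values below are invented for illustration and are not the authors' algorithm settings.

    import numpy as np

    rng = np.random.default_rng(1)
    reported = {"n": 50, "median": 12.0, "min": 4.0, "max": 20.0}   # assumed study summaries
    target = np.array([reported["median"], reported["min"], reported["max"]])

    draws, dists = [], []
    for _ in range(50_000):
        mu = rng.uniform(0, 30)                      # flat prior (assumed)
        sigma = rng.uniform(0.1, 15)                 # flat prior (assumed)
        sim = rng.normal(mu, sigma, reported["n"])   # assumed normal model
        stats = np.array([np.median(sim), sim.min(), sim.max()])
        draws.append((mu, sigma))
        dists.append(np.linalg.norm(stats - target))

    draws, dists = np.array(draws), np.array(dists)
    keep = dists <= np.quantile(dists, 0.01)         # accept the closest 1% of draws
    print("ABC estimate of mean:", draws[keep, 0].mean())
    print("ABC estimate of SD:  ", draws[keep, 1].mean())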

  8. Green-noise halftoning with dot diffusion

    NASA Astrophysics Data System (ADS)

    Lippens, Stefaan; Philips, Wilfried

    2007-02-01

    Dot diffusion is a halftoning technique that is based on the traditional error diffusion concept, but offers a high degree of parallel processing by its block based approach. Traditional dot diffusion however suffers from periodicity artifacts. To limit the visibility of these artifacts, we propose grid diffusion, which applies different class matrices for different blocks. Furthermore, in this paper we will discuss two approaches in the dot diffusion framework to generate green-noise halftone patterns. The first approach is based on output dependent feedback (hysteresis), analogous to the standard green-noise error diffusion techniques. We observe that the resulting halftones are rather coarse and highly dependent on the used dot diffusion class matrices. In the second approach we don't limit the diffusion to the nearest neighbors. This leads to less coarse halftones, compared to the first approach. The drawback is that it can only cope with rather limited cluster sizes. We can reduce these drawbacks by combining the two approaches.
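
    To make the block-based mechanism concrete, a compact sketch of classical (Knuth-style) dot diffusion follows; the 4x4 class matrix, neighbour weights and threshold are illustrative choices, and the grid-diffusion and green-noise variants discussed above build on this basic scheme.

    import numpy as np

    # Hypothetical 4x4 class matrix: processing order of pixels within each block.
    CLASS = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]])

    def dot_diffuse(img):
        """Halftone a grayscale image in [0, 1] with basic dot diffusion."""
        h, w = img.shape
        work = img.astype(float).copy()
        out = np.zeros_like(work)
        cls = np.tile(CLASS, (h // 4 + 1, w // 4 + 1))[:h, :w]
        order = np.argsort(cls.ravel(), kind="stable")   # process pixels by class value
        for flat in order:
            y, x = divmod(flat, w)
            out[y, x] = 1.0 if work[y, x] >= 0.5 else 0.0
            err = work[y, x] - out[y, x]
            # Diffuse the error to not-yet-processed neighbours (higher class value),
            # weighting orthogonal neighbours 2 and diagonal neighbours 1.
            nbrs, weights = [], []
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if (dy or dx) and 0 <= ny < h and 0 <= nx < w and cls[ny, nx] > cls[y, x]:
                        nbrs.append((ny, nx))
                        weights.append(2 if dy == 0 or dx == 0 else 1)
            total = sum(weights)
            if total:
                for (ny, nx), wgt in zip(nbrs, weights):
                    work[ny, nx] += err * wgt / total
        return out

    print(dot_diffuse(np.full((8, 8), 0.3)))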

  9. Towards a Standards-Based Approach to E-Learning Personalization Using Reusable Learning Objects.

    ERIC Educational Resources Information Center

    Conlan, Owen; Dagger, Declan; Wade, Vincent

    E-Learning systems that produce personalized course offerings for the learner are often expensive, both from a time and financial perspective, to develop and maintain. Learning content personalized to a learners' cognitive preferences has been shown to produce more effective learning, however many approaches to realizing this form of…

  10. Comparative Analysis of Future Cooks' Training in Vocational Institutions in Ukraine and Abroad

    ERIC Educational Resources Information Center

    Kankovsky, Ihor; Krasylnykova, Hanna; Drozich, Iryna

    2017-01-01

    The article deals with comparative analysis of conceptual approaches and content of cooks' training in Ukraine, European countries, the USA and Eastern Partnership countries. It has been found out that national vocational education is grounded on education standards and activity-based approach to forming the training content, subject-based…

  11. Crystalline cellulose elastic modulus predicted by atomistic models of uniform deformation and nanoscale indentation

    Treesearch

    Xiawa Wu; Robert J. Moon; Ashlie Martini

    2013-01-01

    The elastic modulus of cellulose Iß in the axial and transverse directions was obtained from atomistic simulations using both the standard uniform deformation approach and a complementary approach based on nanoscale indentation. This allowed comparisons between the methods and closer connectivity to experimental measurement techniques. A reactive...

  12. STS-33 Discovery, OV-103, approaches concrete runway 04 at EAFB, California

    NASA Technical Reports Server (NTRS)

    1989-01-01

    STS-33 Discovery, Orbiter Vehicle (OV) 103, approaches runway 04 at Edwards Air Force Base (EAFB), California. OV-103, with landing gear deployed, is silhouetted against the orange sky of a sunset as it glides over the mountains. The landing occurred at 16:31:02 Pacific Standard Time (PST).

  13. An Integer Programming-Based Generalized Vehicle Routing Approach for Printed Circuit Board Assembly Optimization

    ERIC Educational Resources Information Center

    Seth, Anupam

    2009-01-01

    Production planning and scheduling for printed circuit board assembly has so far defied standard operations research approaches due to the size and complexity of the underlying problems, resulting in unexploited automation flexibility. In this thesis, the increasingly popular collect-and-place machine configuration is studied and the assembly…

  14. Experimental Evaluation of Unicast and Multicast CoAP Group Communication

    PubMed Central

    Ishaq, Isam; Hoebeke, Jeroen; Moerman, Ingrid; Demeester, Piet

    2016-01-01

    The Internet of Things (IoT) is expanding rapidly to new domains in which embedded devices play a key role and gradually outnumber traditionally-connected devices. These devices are often constrained in their resources and are thus unable to run standard Internet protocols. The Constrained Application Protocol (CoAP) is a new alternative standard protocol that implements the same principles as the Hypertext Transfer Protocol (HTTP), but is tailored towards constrained devices. In many IoT application domains, devices need to be addressed in groups in addition to being addressable individually. Two main approaches are currently being proposed in the IoT community for CoAP-based group communication. The main difference between the two approaches lies in the underlying communication type: multicast versus unicast. In this article, we experimentally evaluate those two approaches using two wireless sensor testbeds and under different test conditions. We highlight the pros and cons of each of them and propose combining these approaches in a hybrid solution to better suit certain use case requirements. Additionally, we provide a solution for multicast-based group membership management using CoAP. PMID:27455262

  15. Reducing data friction through site-based data curation

    NASA Astrophysics Data System (ADS)

    Thomer, A.; Palmer, C. L.

    2017-12-01

    Much of geoscience research takes place at "scientifically significant sites": localities which have attracted a critical mass of scientific interest, and thereby merit protection by government bodies, as well as the preservation of specimen and data collections and the development of site-specific permitting requirements for access to the site and its associated collections. However, many data standards and knowledge organization schemas do not adequately describe key characteristics of the sites, despite their centrality to research projects. Through work conducted as part of the IMLS-funded Site-Based Data Curation (SBDC) project, we developed a Minimum Information Framework (MIF) for site-based science, in which "information about a site's structure" is considered a core class of information. Here we present our empirically-derived information framework, as well as the methods used to create it. We believe these approaches will lead to the development of more effective data repositories and tools, and thereby will reduce "data friction" in interdisciplinary, yet site-based, geoscience workflows. The Minimum Information Framework for Site-based Research was developed through work at two scientifically significant sites: the hot springs at Yellowstone National Park, which are key to geobiology research; and the La Brea Tar Pits, an important paleontology locality in Southern California. We employed diverse methods of participatory engagement, in which key stakeholders at our sites (e.g. curators, collections managers, researchers, permit officers) were consulted through workshops, focus groups, interviews, action research methods, and collaborative information modeling and systems analysis. These participatory approaches were highly effective in fostering on-going partnership among a diverse team of domain scientists, information scientists, and software developers. The MIF developed in this work may be viewed as a "proto-standard" that can inform future repository development and data standards. Further, the approaches used to develop the MIF represent an important step toward systematic methods of developing geoscience data standards. Finally, we argue that organizing data around aspects of a site makes data collections more accessible to a range of scientific communities.

  16. [Standardization of the terms for Chinese herbal functions based on functional targeting].

    PubMed

    Xiao, Bin; Tao, Ou; Gu, Hao; Wang, Yun; Qiao, Yan-Jiang

    2011-03-01

    Functional analysis concisely summarizes and concentrates on the therapeutic characteristics and features of Chinese herbal medicine. Standardization of the terms for Chinese herbal functions not only plays a key role in modern research and development of Chinese herbal medicine, but also has far-reaching clinical applications. In this paper, a new method for standardizing the terms for Chinese herbal function was proposed. Firstly, functional targets were collected. Secondly, the pathological conditions and the mode of action of every functional target were determined by analyzing the references. Thirdly, the relationships between the pathological condition and the mode of action were determined based on Chinese medicine theory and data. This three-step approach allows for standardization of the terms for Chinese herbal functions. Promoting the standardization of Chinese medicine terms will benefit the overall clinical application of Chinese herbal medicine.

  17. A clinical decision support system for diagnosis of Allergic Rhinitis based on intradermal skin tests.

    PubMed

    Jabez Christopher, J; Khanna Nehemiah, H; Kannan, A

    2015-10-01

    Allergic Rhinitis is a common disease worldwide, especially in populated cities and urban areas. Diagnosis and treatment of Allergic Rhinitis will improve the quality of life of allergic patients. Though skin tests remain the gold standard test for diagnosis of allergic disorders, clinical experts are required for accurate interpretation of test outcomes. This work presents a clinical decision support system (CDSS) to assist junior clinicians in the diagnosis of Allergic Rhinitis. Intradermal skin tests were performed on patients who had plausible allergic symptoms. Based on patient's history, 40 clinically relevant allergens were tested. 872 patients who had allergic symptoms were considered for this study. The rule-based classification approach and the clinical test results were used to develop and validate the CDSS. Clinical relevance of the CDSS was compared with the Score for Allergic Rhinitis (SFAR). Tests were conducted for junior clinicians to assess their diagnostic capability in the absence of an expert. The class-based association rule generation approach provides a concise set of rules that is further validated by clinical experts. The interpretations of the experts are considered as the gold standard. The CDSS diagnoses the presence or absence of rhinitis with an accuracy of 88.31%. The allergy specialist and the junior clinicians prefer the rule-based approach for its comprehensible knowledge model. The clinical decision support system with a rule-based classification approach assists junior doctors and clinicians in the diagnosis of Allergic Rhinitis, enabling reliable decisions based on the reports of intradermal skin tests. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Interoperability in Personalized Adaptive Learning

    ERIC Educational Resources Information Center

    Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin

    2006-01-01

    Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…

  19. Nanophotonic light-trapping theory for solar cells

    NASA Astrophysics Data System (ADS)

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2011-11-01

    Conventional light-trapping theory, based on a ray-optics approach, was developed for standard thick photovoltaic cells. The classical theory established an upper limit for possible absorption enhancement in this context and provided a design strategy for reaching this limit. This theory has become the foundation for light management in bulk silicon PV cells, and has had enormous influence on the optical design of solar cells in general. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the standard limit can be substantially surpassed when optical modes in the active layer are confined to deep-subwavelength scale, opening new avenues for highly efficient next-generation solar cells.
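
    For reference, the "standard limit" mentioned here is the conventional ray-optics (Yablonovitch) bound on the absorption-path enhancement factor F in a weakly absorbing layer of refractive index n, usually stated as

    \[
    F_{\max} = 4 n^{2} \quad \text{(isotropic incidence)}, \qquad
    F_{\max} = \frac{4 n^{2}}{\sin^{2}\theta} \quad \text{(emission restricted to a cone of half-angle } \theta\text{)};
    \]

    the statistical temporal coupled-mode treatment described above shows how deep-subwavelength mode confinement can exceed this bound.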

  20. Credit risk migration rates modeling as open systems: A micro-simulation approach

    NASA Astrophysics Data System (ADS)

    Landini, S.; Uberti, M.; Casellina, S.

    2018-05-01

    The financial crisis of 2008 stimulated the development of new regulatory criteria (commonly known as Basel III) that pushed banking activity to become more prudent, in both the short and the long run. As is well known, in 2014 the International Accounting Standards Board (IASB) promulgated the new International Financial Reporting Standard 9 (IFRS 9) for financial instruments, which will become effective in January 2018. Since the delayed recognition of credit losses on loans was identified as a weakness in existing accounting standards, the IASB has introduced an Expected Loss model that requires more timely recognition of credit losses. Specifically, the new standards require entities to account for expected losses both from when impairments are first recognized and over the full loan lifetime; moreover, a clear preference for forward-looking models is expressed. In this new framework, a re-thinking of the widespread standard theoretical approach on which the well-known prudential model is founded is necessary. The aim of this paper is therefore to define an original methodological approach to migration rates modeling for credit risk that is innovative with respect to the standard method, both from the point of view of a bank and from a regulatory perspective. Accordingly, the proposed non-standard approach considers a portfolio as an open sample, allowing for entries and exits as well as migrations of stayers. While consistent with empirical observations, this open-sample approach contrasts with the standard closed-sample method. In particular, this paper offers a methodology to integrate the outcomes of the standard closed-sample method within the open-sample perspective while removing some of the assumptions of the standard method. Three main conclusions can be drawn in terms of economic capital provision: (a) based on the Markovian hypothesis with an a-priori absorbing state at default, the standard closed-sample method is to be abandoned, since by construction it never predicts lenders' bankruptcy; (b) to obtain more reliable estimates in line with the new regulatory standards, the sample used to estimate migration rate matrices for credit risk should include both entries and exits; (c) the static eigen-decomposition procedure used to forecast migration rates should be replaced with a stochastic process dynamics methodology, conditioning forecasts on macroeconomic scenarios.
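
    As a point of reference for conclusion (a), a short sketch of the standard closed-sample Markov projection with default as an absorbing state, using an invented three-state migration matrix rather than estimates from any real portfolio.

    import numpy as np

    states = ["Investment", "Speculative", "D"]
    P = np.array([[0.93, 0.06, 0.01],     # one-period migration probabilities (assumed)
                  [0.08, 0.82, 0.10],
                  [0.00, 0.00, 1.00]])    # absorbing default row

    def project(P, periods):
        """Multi-period migration matrix under the time-homogeneous Markov hypothesis."""
        return np.linalg.matrix_power(P, periods)

    P5 = project(P, 5)
    for i, s in enumerate(states):
        print(f"{s:>11} -> D after 5 periods: {P5[i, -1]:.3f}")

    # Because default is absorbing and the sample is closed (no entries or exits),
    # probability mass only accumulates in D; the open-sample approach instead
    # models inflows, migrations of stayers and outflows explicitly.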

  1. Risk analysis and its link with standards of the World Organisation for Animal Health.

    PubMed

    Sugiura, K; Murray, N

    2011-04-01

    Among the agreements included in the treaty that created the World Trade Organization (WTO) in January 1995 is the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) that sets out the basic rules for food safety and animal and plant health standards. The SPS Agreement designates the World Organisation for Animal Health (OIE) as the organisation responsible for developing international standards for animal health and zoonoses. The SPS Agreement requires that the sanitary measures that WTO members apply should be based on science and encourages them to either apply measures based on the OIE standards or, if they choose to adopt a higher level of protection than that provided by these standards, apply measures based on a science-based risk assessment. The OIE also provides a procedural framework for risk analysis for its Member Countries to use. Despite the inevitable challenges that arise in carrying out a risk analysis of the international trade in animals and animal products, the OIE risk analysis framework provides a structured approach that facilitates the identification, assessment, management and communication of these risks.

  2. An innovative approach to capability-based emergency operations planning

    PubMed Central

    Keim, Mark E

    2013-01-01

    This paper describes the innovative use of information technology to assist disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. This process is used to identify all key objectives of the emergency response according to the capabilities of the institution, community or society. The approach then uses a standardized, objective-based format, along with a consensus-based method, for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology. PMID:28228987

  3. An innovative approach to capability-based emergency operations planning.

    PubMed

    Keim, Mark E

    2013-01-01

    This paper describes the innovative use of information technology to assist disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. This process is used to identify all key objectives of the emergency response according to the capabilities of the institution, community or society. The approach then uses a standardized, objective-based format, along with a consensus-based method, for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology.

  4. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.

  5. Can the Diagnostics of Triangular Fibrocartilage Complex Lesions Be Improved by MRI-Based Soft-Tissue Reconstruction? An Imaging-Based Workup and Case Presentation.

    PubMed

    Hammer, Niels; Hirschfeld, Ulrich; Strunz, Hendrik; Werner, Michael; Wolfskämpf, Thomas; Löffler, Sabine

    2017-01-01

    Introduction. The triangular fibrocartilage complex (TFCC) provides both mobility and stability of the radiocarpal joint. TFCC lesions are difficult to diagnose due to the complex anatomy. The standard treatment for TFCC lesions is arthroscopy, posing surgery-related risks onto the patients. This feasibility study aimed at developing a workup for soft-tissue reconstruction using clinical imaging, to verify these results in retrospective patient data. Methods. Microcomputed tomography (μ-CT), 3 T magnetic resonance imaging (MRI), and plastination were used to visualize the TFCC in cadaveric specimens applying segmentation-based 3D reconstruction. This approach further trialed the MRI dataset of a patient with minor radiological TFCC alterations but persistent pain. Results. TFCC reconstruction was impossible using μ-CT only but feasible using MRI, resulting in an appreciation of its substructures, as seen in the plastinates. Applying this approach allowed for visualizing a Palmer 2C lesion in a patient, confirming ex postum the arthroscopy findings, being markedly different from MRI (Palmer 1B). Discussion. This preliminary study showed that image-based TFCC reconstruction may help to identify pathologies invisible in standard MRI. The combined approach of μ-CT, MRI, and plastination allowed for a three-dimensional appreciation of the TFCC. Image quality and time expenditure limit the approach's usefulness as a diagnostic tool.

  6. Can the Diagnostics of Triangular Fibrocartilage Complex Lesions Be Improved by MRI-Based Soft-Tissue Reconstruction? An Imaging-Based Workup and Case Presentation

    PubMed Central

    Hirschfeld, Ulrich; Strunz, Hendrik; Werner, Michael; Wolfskämpf, Thomas; Löffler, Sabine

    2017-01-01

    Introduction. The triangular fibrocartilage complex (TFCC) provides both mobility and stability of the radiocarpal joint. TFCC lesions are difficult to diagnose due to the complex anatomy. The standard treatment for TFCC lesions is arthroscopy, posing surgery-related risks onto the patients. This feasibility study aimed at developing a workup for soft-tissue reconstruction using clinical imaging, to verify these results in retrospective patient data. Methods. Microcomputed tomography (μ-CT), 3 T magnetic resonance imaging (MRI), and plastination were used to visualize the TFCC in cadaveric specimens applying segmentation-based 3D reconstruction. This approach further trialed the MRI dataset of a patient with minor radiological TFCC alterations but persistent pain. Results. TFCC reconstruction was impossible using μ-CT only but feasible using MRI, resulting in an appreciation of its substructures, as seen in the plastinates. Applying this approach allowed for visualizing a Palmer 2C lesion in a patient, confirming ex postum the arthroscopy findings, being markedly different from MRI (Palmer 1B). Discussion. This preliminary study showed that image-based TFCC reconstruction may help to identify pathologies invisible in standard MRI. The combined approach of μ-CT, MRI, and plastination allowed for a three-dimensional appreciation of the TFCC. Image quality and time expenditure limit the approach's usefulness as a diagnostic tool. PMID:28246600

  7. Changing the Rules: Social Architectures in Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Yee, Nick

    In the late 1960s, Mischel (1968) sparked a debate in personality psychology by critiquing the reliance on trait-based frameworks of behavior. While the standard approach had been to measure stable dispositions (such as Extraversion), Mischel argued that behavior was largely determined by situational demands (such as being at a party). In the decades that followed, while there have been loud calls within the field to embrace an interactionist approach, research in personality psychology has still largely sidelined situational factors (Endler and Parker 1992) and has continued to focus on standardizing trait measures (Costa and McCrae 1985; Goldberg 1992).

  8. Standardization of domestic frying processes by an engineering approach.

    PubMed

    Franke, K; Strijowski, U

    2011-05-01

    An approach was developed to enable a better standardization of domestic frying of potato products. For this purpose, 5 domestic fryers differing in heating power and oil capacity were used. A very defined frying process using a highly standardized model product and a broad range of frying conditions was carried out in these fryers and the development of browning representing an important quality parameter was measured. Product-to-oil ratio, oil temperature, and frying time were varied. Quite different color changes were measured in the different fryers although the same frying process parameters were applied. The specific energy consumption for water evaporation (spECWE) during frying related to product amount was determined for all frying processes to define an engineering parameter for characterizing the frying process. A quasi-linear regression approach was applied to calculate this parameter from frying process settings and fryer properties. The high significance of the regression coefficients and a coefficient of determination close to unity confirmed the suitability of this approach. Based on this regression equation, curves for standard frying conditions (SFC curves) were calculated which describe the frying conditions required to obtain the same level of spECWE in the different domestic fryers. Comparison of browning results from the different fryers operated at conditions near the SFC curves confirmed the applicability of the approach. © 2011 Institute of Food Technologists®
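
    An illustrative least-squares sketch of the kind of quasi-linear regression described above, predicting spECWE from frying settings and fryer properties; the feature set and the synthetic data are assumptions for demonstration only and do not reproduce the authors' fitted model.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 40
    oil_temp = rng.uniform(150, 190, n)          # degrees C (assumed range)
    product_oil_ratio = rng.uniform(0.05, 0.25, n)
    heating_power = rng.uniform(1.5, 2.3, n)     # kW (assumed range)
    frying_time = rng.uniform(3, 8, n)           # min (assumed range)

    # Synthetic response with noise, standing in for measured spECWE values.
    spECWE = (0.8 + 0.004 * oil_temp + 1.5 * product_oil_ratio
              - 0.1 * heating_power + 0.02 * frying_time + rng.normal(0, 0.02, n))

    X = np.column_stack([np.ones(n), oil_temp, product_oil_ratio,
                         heating_power, frying_time])
    coef, *_ = np.linalg.lstsq(X, spECWE, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((spECWE - pred) ** 2) / np.sum((spECWE - spECWE.mean()) ** 2)
    print("fitted coefficients:", np.round(coef, 4))
    print("coefficient of determination:", round(r2, 3))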

  9. Goal is to give every doc an 'AIDS cookbook'.

    PubMed

    1997-06-01

    Before leaving Kaiser Family Foundation in California, Mark Smith helped to lay the groundwork for the first comprehensive HIV treatment standard in almost five years. Smith, former chairman of the Centers for Disease Control and Prevention's (CDC) Advisory Committee on HIV, noted that the most sophisticated practitioners may be using certain information as a basis for their decisions months before it gets published in mainstream journals. The committee to develop the standard includes AIDS clinicians and researchers, as well as managed care medical directors and others who follow evidence-based medicine. The cookbook approach will help unsophisticated practitioners develop effective treatment regimens for their patients, and will direct medical care decisions in managed health facilities. The standards will be a one-stop source for making decisions on everything from viral load testing to immunizations. Critics charge that the standard's approach will leave no room for up-to-date changes.

  10. Systems Harmonization and Convergence - the GIGAS Approach

    NASA Astrophysics Data System (ADS)

    Marchetti, P. G.; Biancalana, A.; Coene, Y.; Uslander, T.

    2009-04-01

    0.1 Background
    The GIGAS Support Action promotes the coherent and interoperable development of the GMES, INSPIRE and GEOSS initiatives through their concerted adoption of standards, protocols, and open architectures.
    0.2 Preparing for Coordinated Data Access
    The GMES Coordinated Data Access System is under design and implementation. This objective has motivated the definition of the interoperability standards between the contributing missions. The following elements have been addressed, with associated papers submitted to OGC:
    - EO Product Metadata: based on the OGC Geographic Markup Language, addressing sensor characteristics for optical, radar and atmospheric products.
    - Collection and service discovery: an ISO extension package for CSW ebRIM has been proposed.
    - Catalogue Service (CSW): an Earth Observation extension package of the CSW ebRIM has been proposed.
    - Feasibility Analysis and Order: an Order interface control document and an Earth Observation profile of the Sensor Planning Service have been proposed.
    - Online Data Access: an Earth Observation profile of the Web Map Services (WMS) for visualization and evaluation purposes has been proposed.
    - Identity (user) management: the long-term objective is to allow a single sign-on to the Coordinated Data Access system by users registered in the various Earth Observation ground segments, by providing a federated identity across participating ground segments and exploiting OASIS standards.
    0.3 The GIGAS proposed harmonization approach
    The approach proposed by GIGAS is based on three elements: technology watch, comparative analysis, and shaping of initiatives and standards. This paper concentrates on the methodology for technology watch and comparative analysis. The complexity of the GIGAS scenario, involving huge systems (i.e. GEOSS, INSPIRE, GMES, etc.), entails interaction with different heterogeneous partners, each with specific competence, expertise and know-how.
    0.3.1 Technology watch
    The methodology proposed is based on an RM-ODP based study supported by interoperability use cases and scenarios used to derive requirements. GIGAS will monitor:
    - the INSPIRE, GMES and GEOSS evolution, analyzing the requirements, standards, services, architecture, models, processes and consensus mechanisms of each against the same elements of the other systems under analysis;
    - activities in the fields of standards development that are part of the three initiatives; this task will provide the basis for how these three initiatives will strategically support consensus and efficient standards development going forward;
    - the architecture, specifications, innovative concepts and software developments of past or ongoing FP6/FP7 research topics.
    The RM-ODP approach is selected because most of the architectural approaches to be compared are based on RM-ODP, it supports distributed processing, it aims at fostering interoperability across heterogeneous systems, and it tries to hide distribution from system developers. However, as most of the systems to be considered have the character of a loosely-coupled network of systems and services rather than a "distributed processing system based on interacting objects", the RM-ODP concepts are tailored to the GIGAS needs. The usage of RM-ODP for GIGAS Requirements and Technology Watch is two-fold:
    - Architectural analysis: performed for all projects and initiatives. Its purpose is to identify possibilities, but also major obstacles, for interoperability. Furthermore, it identifies the major use cases to be analysed in more detail.
    - Use Case Implementation Analysis: describes how selected use cases of the projects and initiatives are implemented in the different architectures. Its purpose is to identify technological gaps and concrete problems of interoperability. It is performed only for selected use cases.
    The output of the Technology Watch is an RM-ODP based report containing parallel analyses of the same aspects across the three initiatives, integrated with analyses of relevant FP6/FP7 projects and standardization activities.
    0.3.2 Comparative Analysis
    Based on the outcomes of the previous monitoring tasks, GIGAS undertakes a comparative analysis of solutions, requirements, architecture, models, processes and consensus mechanisms used by INSPIRE, GMES and GEOSS, taking into account the inputs from the monitoring of FP6/FP7 research projects and the ongoing standardization activities. Initiative Contact Points will ensure that the overall policy framework and schedules for each of the three initiatives are factored in. The result of the Comparative Analysis includes:
    - a list of recommendations to GEOSS, INSPIRE and GMES to be expanded and processed in depth in the following shaping phase;
    - the identification of technological gaps to be explored in the following shaping phase;
    - guidelines and objectives for the architectural approach within GIGAS;
    - an analysis of the schedules of the three initiatives and of the FP6/FP7 programs and standardization activities, with identification of key milestones or intervention points.

  11. Legal ecotones: A comparative analysis of riparian policy protection in the Oregon Coast Range, USA.

    PubMed

    Boisjolie, Brett A; Santelmann, Mary V; Flitcroft, Rebecca L; Duncan, Sally L

    2017-07-15

    Waterways of the USA are protected under the public trust doctrine, placing responsibility on the state to safeguard public resources for the benefit of current and future generations. This responsibility has led to the development of management standards for lands adjacent to streams. In the state of Oregon, policy protection for riparian areas varies by ownership (e.g., federal, state, or private), land use (e.g., forest, agriculture, rural residential, or urban) and stream attributes, creating varying standards for riparian land-management practices along the stream corridor. Here, we compare state and federal riparian land-management standards in four major policies that apply to private and public lands in the Oregon Coast Range. We use a standard template to categorize elements of policy protection: (1) the regulatory approach, (2) policy goals, (3) stream attributes, and (4) management standards. All four policies have similar goals for achieving water-quality standards, but differ in their regulatory approach. Plans for agricultural lands rely on outcome-based standards to treat pollution, in contrast with the prescriptive policy approaches for federal, state, and private forest lands, which set specific standards with the intent of preventing pollution. Policies also differ regarding the stream attributes considered when specifying management standards. Across all policies, 25 categories of unique standards are identified. Buffer widths vary from 0 to ∼152 m, with no buffer requirements for streams in agricultural areas or small, non-fish-bearing, seasonal streams on private forest land; narrow buffer requirements for small, non-fish-bearing perennial streams on private forest land (3 m); and the widest buffer requirements for fish-bearing streams on federal land (two site-potential tree-heights, up to an estimated 152 m). Results provide insight into how ecosystem concerns are addressed by variable policy approaches in multi-ownership landscapes, an important consideration to recovery-planning efforts for threatened species. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
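
    One plausible form of the cross-contribution model implied here, stated only as a sketch and not necessarily the authors' exact calibration function: if a fraction a of the analyte signal appears in the internal-standard channel (through naturally occurring isotopes) and a fraction b of the internal-standard signal appears in the analyte channel (through a labeled impurity), then for analyte amount x at a fixed internal-standard amount the measured response ratio behaves as

    \[
    R(x) \;=\; \frac{k\,x + b}{1 + a\,k\,x},
    \]

    which is linear in x only when a and b are negligible; fixing a and b from experiment and leaving k adjustable restores accurate quantitation at high analyte/internal-standard ratios.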

  13. Proposed standards for reporting outcomes of treating biliary injuries.

    PubMed

    Cho, Jai Young; Baron, Todd H; Carr-Locke, David L; Chapman, William C; Costamagna, Guido; de Santibanes, Eduardo; Dominguez Rosado, Ismael; Garden, O James; Gouma, Dirk; Lillemoe, Keith D; Angel Mercado, Miguel; Mullady, Daniel K; Padbury, Robert; Picus, Daniel; Pitt, Henry A; Sherman, Stuart; Shlansky-Goldberg, Richard; Tornqvist, Bjorn; Strasberg, Steven M

    2018-04-01

    There is no standard nor widely accepted way of reporting outcomes of treatment of biliary injuries. This hinders comparison of results among approaches and among centers. This paper presents a proposal to standardize terminology and reporting of results of treating biliary injuries. The proposal was developed by an international group of surgeons, biliary endoscopists and interventional radiologists. The method is based on the concept of "patency" and is similar to the approach used to create reporting standards for arteriovenous hemodialysis access. The group considered definitions and gradings under the following headings: Definition of Patency, Definition of Index Treatment Periods, Grading of Severity of Biliary Injury, Grading of Patency, Metrics, Comparison of Surgical to Non Surgical Treatments and Presentation of Case Series. A standard procedure for reporting outcomes of treating biliary injuries has been produced. It is applicable to presenting results of treatment by surgery, endoscopy, and interventional radiology. Copyright © 2017 International Hepato-Pancreato-Biliary Association Inc. Published by Elsevier Ltd. All rights reserved.

  14. Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters.

    PubMed

    Chung, SungWon; Lu, Ying; Henry, Roland G

    2006-11-01

    Bootstrap is an empirical non-parametric statistical technique based on data resampling that has been used to quantify uncertainties of diffusion tensor MRI (DTI) parameters, useful in tractography and in assessing DTI methods. The current bootstrap method (repetition bootstrap) used for DTI analysis performs resampling within the data sharing common diffusion gradients, requiring multiple acquisitions for each diffusion gradient. Recently, wild bootstrap was proposed that can be applied without multiple acquisitions. In this paper, two new approaches are introduced called residual bootstrap and repetition bootknife. We show that repetition bootknife corrects for the large bias present in the repetition bootstrap method and, therefore, better estimates the standard errors. Like wild bootstrap, residual bootstrap is applicable to single acquisition scheme, and both are based on regression residuals (called model-based resampling). Residual bootstrap is based on the assumption that non-constant variance of measured diffusion-attenuated signals can be modeled, which is actually the assumption behind the widely used weighted least squares solution of diffusion tensor. The performances of these bootstrap approaches were compared in terms of bias, variance, and overall error of bootstrap-estimated standard error by Monte Carlo simulation. We demonstrate that residual bootstrap has smaller biases and overall errors, which enables estimation of uncertainties with higher accuracy. Understanding the properties of these bootstrap procedures will help us to choose the optimal approach for estimating uncertainties that can benefit hypothesis testing based on DTI parameters, probabilistic fiber tracking, and optimizing DTI methods.
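
    A generic residual-bootstrap sketch for regression standard errors, illustrating the model-based resampling idea described above with a plain straight-line fit; in DTI the regression is the tensor fit to log diffusion-attenuated signals, typically weighted, so the simple model and synthetic data here are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(7)
    x = np.linspace(0, 1, 30)
    y = 2.0 + 3.0 * x + rng.normal(0, 0.2, x.size)   # synthetic measurements
    X = np.column_stack([np.ones_like(x), x])

    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta_hat
    residuals = y - fitted            # leverage corrections omitted for brevity

    boot = []
    for _ in range(2000):
        # Resample regression residuals (model-based resampling) and refit.
        y_star = fitted + rng.choice(residuals, size=residuals.size, replace=True)
        b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        boot.append(b_star)

    boot = np.array(boot)
    print("bootstrap SE of intercept and slope:", boot.std(axis=0, ddof=1))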

  15. Accelerated pharmacokinetic map determination for dynamic contrast enhanced MRI using frequency-domain based Tofts model.

    PubMed

    Vajuvalli, Nithin N; Nayak, Krupa N; Geethanath, Sairam

    2014-01-01

    Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) is widely used in the diagnosis of cancer and is also a promising tool for monitoring tumor response to treatment. The Tofts model has become a standard for the analysis of DCE-MRI. The curve fitting employed with the Tofts equation to obtain the pharmacokinetic (PK) parameters is time-consuming for high-resolution scans. The current work demonstrates a frequency-domain approach applied to the standard Tofts equation to speed up curve fitting for obtaining the pharmacokinetic parameters. The results show that, with the frequency-domain approach, curve fitting is computationally more efficient than with the time-domain approach.
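
    Because the standard Tofts equation expresses the tissue curve as a convolution of the arterial input function with an exponential kernel, the model evaluation inside curve fitting can be done with FFT-based convolution instead of a time-domain sum. The sketch below illustrates that equivalence with an assumed bi-exponential input function and illustrative parameter values; it is not the authors' implementation.

```python
# Minimal sketch of the standard Tofts model evaluated by time-domain and
# FFT-based (frequency-domain) convolution; the AIF and parameter values
# are illustrative assumptions, not the paper's acquisition protocol.
import numpy as np

dt = 1.0                       # seconds between dynamic frames (assumed)
t = np.arange(0, 300, dt)

def aif(t):
    # Simple bi-exponential arterial input function (illustrative)
    return 5.0 * (np.exp(-0.01 * t) - np.exp(-0.1 * t))

def tofts_time_domain(ktrans, ve, t, cp):
    kep = ktrans / ve
    kernel = np.exp(-kep * t)
    # Discrete convolution approximating the Tofts integral
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

def tofts_fft(ktrans, ve, t, cp):
    kep = ktrans / ve
    kernel = np.exp(-kep * t)
    n = 2 * len(t)             # zero-pad to avoid circular wrap-around
    ct = np.fft.irfft(np.fft.rfft(cp, n) * np.fft.rfft(kernel, n), n)
    return ktrans * ct[: len(t)] * dt

cp = aif(t)
ct_time = tofts_time_domain(0.2, 0.3, t, cp)
ct_freq = tofts_fft(0.2, 0.3, t, cp)
print("max |difference|:", np.abs(ct_time - ct_freq).max())
```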

  16. Predicting ESI/MS Signal Change for Anions in Different Solvents.

    PubMed

    Kruve, Anneli; Kaupmees, Karl

    2017-05-02

    LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which standard substances are available, as the ionization efficiency of different compounds in the ESI source differs by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting the ionization efficiencies in the ESI source based on a model that uses physicochemical parameters of the analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. This approach has been validated both in flow injection and in chromatographic mode with gradient elution.
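
    The core of such an approach is a regression model that maps physicochemical descriptors of an analyte to its (log) ionization efficiency, calibrated on compounds with known responses. The sketch below is a hedged stand-in with placeholder descriptors (pKa, logP) and invented values; the published model uses its own descriptor set and calibration scheme.

```python
# Minimal sketch of predicting (log) ionization efficiency from
# physicochemical descriptors with a linear model; the descriptor names,
# values, and resulting coefficients are placeholders, not the published model.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training set: [pKa, logP] descriptors and measured log(IE)
X_train = np.array([[4.2, 1.1], [9.5, 3.0], [2.1, -0.5], [7.8, 2.2], [5.0, 0.3]])
y_train = np.array([2.1, 4.0, 0.8, 3.2, 1.9])

model = LinearRegression().fit(X_train, y_train)

# Predict log(IE) for a new analyte without a standard substance
X_new = np.array([[6.0, 1.5]])
log_ie = model.predict(X_new)[0]
print(f"predicted log ionization efficiency: {log_ie:.2f}")
```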

  17. Quantum Chemical Approach to Estimating the Thermodynamics of Metabolic Reactions

    PubMed Central

    Jinich, Adrian; Rappoport, Dmitrij; Dunn, Ian; Sanchez-Lengeling, Benjamin; Olivares-Amaya, Roberto; Noor, Elad; Even, Arren Bar; Aspuru-Guzik, Alán

    2014-01-01

    Thermodynamics plays an increasingly important role in modeling and engineering metabolism. We present the first nonempirical computational method for estimating standard Gibbs reaction energies of metabolic reactions based on quantum chemistry, which can help fill in the gaps in the existing thermodynamic data. When applied to a test set of reactions from core metabolism, the quantum chemical approach is comparable in accuracy to group contribution methods for isomerization and group transfer reactions and for reactions not including multiply charged anions. The errors in standard Gibbs reaction energy estimates are correlated with the charges of the participating molecules. The quantum chemical approach is amenable to systematic improvements and holds potential for providing thermodynamic data for all of metabolism. PMID:25387603
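
    Once per-metabolite standard Gibbs energies are available (whether from group contribution or quantum chemistry), a standard Gibbs reaction energy follows from the stoichiometry-weighted sum over products and reactants. The snippet below shows only that bookkeeping step with placeholder numbers, not the quantum chemical estimation itself.

```python
# Minimal sketch of assembling a standard Gibbs reaction energy from
# per-metabolite standard Gibbs energies of formation; the values below are
# placeholders, not quantum-chemical estimates from the paper.
dg_formation = {          # kJ/mol, illustrative numbers only
    "glucose-6-phosphate": -1764.0,
    "fructose-6-phosphate": -1761.0,
}

# Reaction: glucose-6-phosphate -> fructose-6-phosphate (stoichiometry map)
reaction = {"glucose-6-phosphate": -1, "fructose-6-phosphate": +1}

dg_reaction = sum(coef * dg_formation[met] for met, coef in reaction.items())
print(f"estimated standard Gibbs reaction energy: {dg_reaction:.1f} kJ/mol")
```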

  18. FloPSy - Search-Based Floating Point Constraint Solving for Symbolic Execution

    NASA Astrophysics Data System (ADS)

    Lakhotia, Kiran; Tillmann, Nikolai; Harman, Mark; de Halleux, Jonathan

    Recently there has been an upsurge of interest in both Search-Based Software Testing (SBST) and Dynamic Symbolic Execution (DSE). Each of these two approaches has complementary strengths and weaknesses, making it a natural choice to explore the degree to which the strengths of one can be exploited to offset the weaknesses of the other. This paper introduces an augmented version of DSE that uses an SBST-based approach to handle floating point computations, which are known to be problematic for vanilla DSE. The approach has been implemented as a plug-in for the Microsoft Pex DSE testing tool. The paper presents results from both standard evaluation benchmarks and two open-source programs.
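
    The search-based ingredient works by turning an unsolved floating-point branch condition into a numeric "branch distance" and minimizing it with a local search over the input. The sketch below shows that general idea with a simple alternating-variable search on an assumed constraint (x * x == 2.0); it is an illustration of the technique, not the Pex plug-in described in the paper.

```python
# Minimal sketch of search-based constraint solving for a floating-point
# branch condition: minimize a branch-distance fitness with a simple
# step-halving local search. The constraint and parameters are illustrative.
def branch_distance(x):
    # Target branch: take the 'true' side of  x * x == 2.0
    return abs(x * x - 2.0)

def alternating_variable_search(x0, step=1.0, tol=1e-9, max_iter=10_000):
    x, best = x0, branch_distance(x0)
    for _ in range(max_iter):
        improved = False
        for direction in (+step, -step):
            candidate = x + direction
            d = branch_distance(candidate)
            if d < best:
                x, best, improved = candidate, d, True
        if best <= tol:
            break
        if not improved:
            step /= 2.0        # refine the step size when stuck
    return x, best

x, dist = alternating_variable_search(x0=0.5)
print(f"x = {x:.6f}, remaining branch distance = {dist:.2e}")
```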

  19. Ontology- and graph-based similarity assessment in biological networks.

    PubMed

    Wang, Haiying; Zheng, Huiru; Azuaje, Francisco

    2010-10-15

    A standard systems-based approach to biomarker and drug target discovery consists of placing putative biomarkers in the context of a network of biological interactions, followed by different 'guilt-by-association' analyses. The latter is typically done based on network structural features. Here, an alternative analysis approach in which the networks are analyzed on a 'semantic similarity' space is reported. Such information is extracted from ontology-based functional annotations. We present SimTrek, a Cytoscape plugin for ontology-based similarity assessment in biological networks. Availability: http://rosalind.infj.ulst.ac.uk/SimTrek.html. Contact: francisco.azuaje@crp-sante.lu. Supplementary data are available at Bioinformatics online.
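
    Ontology-based similarity between two network nodes can be scored from the overlap of their functional annotation sets. The sketch below uses a simple Jaccard overlap of hypothetical GO term sets as a stand-in; SimTrek itself may use other ontology-based measures.

```python
# Minimal sketch of scoring 'semantic similarity' between two network nodes
# from their ontology annotations, here with a Jaccard overlap of GO term
# sets; the annotations below are hypothetical.
def jaccard_similarity(terms_a, terms_b):
    a, b = set(terms_a), set(terms_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

annotations = {   # hypothetical GO annotations for three proteins
    "P1": ["GO:0006915", "GO:0008283", "GO:0005515"],
    "P2": ["GO:0006915", "GO:0005515"],
    "P3": ["GO:0007165"],
}

print(jaccard_similarity(annotations["P1"], annotations["P2"]))  # high overlap
print(jaccard_similarity(annotations["P1"], annotations["P3"]))  # no overlap
```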

  20. Quantum Fragment Based ab Initio Molecular Dynamics for Proteins.

    PubMed

    Liu, Jinfeng; Zhu, Tong; Wang, Xianwei; He, Xiao; Zhang, John Z H

    2015-12-08

    Developing ab initio molecular dynamics (AIMD) methods for practical application in protein dynamics is of significant interest. Due to the large size of biomolecules, applying standard quantum chemical methods to compute energies for dynamic simulation is computationally prohibitive. In this work, a fragment based ab initio molecular dynamics approach is presented for practical application in protein dynamics study. In this approach, the energy and forces of the protein are calculated by a recently developed electrostatically embedded generalized molecular fractionation with conjugate caps (EE-GMFCC) method. For simulation in explicit solvent, mechanical embedding is introduced to treat protein interaction with explicit water molecules. This AIMD approach has been applied to MD simulations of a small benchmark protein Trpcage (with 20 residues and 304 atoms) in both the gas phase and in solution. Comparison to the simulation result using the AMBER force field shows that the AIMD gives a more stable protein structure in the simulation, indicating that quantum chemical energy is more reliable. Importantly, the present fragment-based AIMD simulation captures quantum effects including electrostatic polarization and charge transfer that are missing in standard classical MD simulations. The current approach is linear-scaling, trivially parallel, and applicable to performing the AIMD simulation of proteins with a large size.

  1. Medical paraclinical standards, political economy of clinic, and patients' clinical dependency; a critical conversation analysis of clinical counseling in South of Iran.

    PubMed

    Kalateh Sadati, Ahmad; Iman, Mohammad Taghi; Bagheri Lankarani, Kamran

    2014-07-01

    Despite its benefits and importance, clinical counseling affects the patient both psychologically and socially. Illness labeling not only leads to many problems for the patient and his/her family but also imposes high costs on the health care system. Among various factors, the doctor-patient relationship has an important role in clinical counseling and its medical approach. The goal of this study is to evaluate the nature of clinical counseling based on a critical approach. The context of the research is the second major medical training center in Shiraz, Iran. In this study, Critical Conversation Analysis was used, based on the methodologies of critical theories. Of about 50 consultation meetings digitally recorded, 33 were selected for this study. Results show that the nature of the doctor-patient relationship in these cases is based on a paternalistic model. Moreover, in all consultations, the values most strongly legitimated by physicians were medical paraclinical standards. Paternalism on the one hand and standardization on the other lead to patients' dependency on the clinic. Although paraclinical standards cannot be dismissed, clinical counseling and the doctor-patient relationship need to reduce this dominance by grounding counseling in the interpretation of human relations, attention to social and economic differences among people and to biosocial and biocultural differences, and a focus on clinical examination. We also need to accept that medicine is an art of interaction that cannot be reduced to instrumental and linear methods of treating the body.

  2. Using Semantic Web technologies for the generation of domain-specific templates to support clinical study metadata standards.

    PubMed

    Jiang, Guoqian; Evans, Julie; Endle, Cory M; Solbrig, Harold R; Chute, Christopher G

    2016-01-01

    The Biomedical Research Integrated Domain Group (BRIDG) model is a formal domain analysis model for protocol-driven biomedical research, and serves as a semantic foundation for application and message development in standards developing organizations (SDOs). The increasing sophistication and complexity of the BRIDG model require new approaches to the management and utilization of the underlying semantics to harmonize domain-specific standards. The objective of this study is to develop and evaluate a Semantic Web-based approach that integrates the BRIDG model with ISO 21090 data types to generate domain-specific templates to support clinical study metadata standards development. We developed a template generation and visualization system based on an open-source Resource Description Framework (RDF) store backend, a SmartGWT-based web user interface, and a "mind map" based tool for the visualization of generated domain-specific templates. We also developed a RESTful Web Service, informed by the Clinical Information Modeling Initiative (CIMI) reference model, for access to the generated domain-specific templates. A preliminary usability study was performed, and all reviewers (n = 3) gave very positive responses to the evaluation questions in terms of usability and the capability to meet the system requirements (average score 4.6). Semantic Web technologies provide a scalable infrastructure and have great potential to enable computable semantic interoperability of models at the intersection of health care and clinical research.
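
    At its core, such a system stores template definitions as RDF triples and retrieves them with SPARQL queries. The sketch below shows that pattern with rdflib, using a hypothetical namespace and property names rather than the actual BRIDG/ISO 21090 vocabulary used by the system described above.

```python
# Minimal sketch of storing and querying template metadata as RDF triples
# with rdflib; the namespace and property names are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/template#")

g = Graph()
g.bind("ex", EX)
g.add((EX.DemographicsTemplate, RDF.type, EX.DomainTemplate))
g.add((EX.DemographicsTemplate, EX.hasElement, EX.BirthDate))
g.add((EX.BirthDate, EX.dataType, Literal("TS")))   # ISO 21090-style type code

# SPARQL query: list every element of every domain-specific template
query = """
SELECT ?template ?element WHERE {
    ?template a ex:DomainTemplate ;
              ex:hasElement ?element .
}
"""
for row in g.query(query, initNs={"ex": EX}):
    print(row.template, row.element)
```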

  3. Distance Education Assessment Infrastructure and Process Design Based on International Standard 23988

    ERIC Educational Resources Information Center

    Shaffer, Steven C.

    2012-01-01

    Assessment is an important part of distance education (DE). As class sizes get larger and workloads increase, the IT infrastructure and processes used for DE assessments become more of an issue. Using the BS ISO/IEC 23988:2007 Standard for the use of technology in the delivery of assessments as a guide, this paper describes a rational approach to…

  4. Implementing an Exemplar-Based Approach in an Interaction Design Subject: Enhancing Students' Awareness of the Need to Be Creative

    ERIC Educational Resources Information Center

    Hendry, Graham D.; Tomitsch, Martin

    2014-01-01

    In higher education effective teaching includes making learning goals and standards clear to students. In architecture and design education in particular, goals and standards around assessment are often not well articulated. There is good evidence that when teachers engage students before an assessment in marking exemplars, and explain why the…

  5. An agenda for assessing and improving conservation impacts of sustainability standards in tropical agriculture.

    PubMed

    Milder, Jeffrey C; Arbuthnot, Margaret; Blackman, Allen; Brooks, Sharon E; Giovannucci, Daniele; Gross, Lee; Kennedy, Elizabeth T; Komives, Kristin; Lambin, Eric F; Lee, Audrey; Meyer, Daniel; Newton, Peter; Phalan, Ben; Schroth, Götz; Semroc, Bambi; Van Rikxoort, Henk; Zrust, Michal

    2015-04-01

    Sustainability standards and certification serve to differentiate and provide market recognition to goods produced in accordance with social and environmental good practices, typically including practices to protect biodiversity. Such standards have seen rapid growth, including in tropical agricultural commodities such as cocoa, coffee, palm oil, soybeans, and tea. Given the role of sustainability standards in influencing land use in hotspots of biodiversity, deforestation, and agricultural intensification, much could be gained from efforts to evaluate and increase the conservation payoff of these schemes. To this end, we devised a systematic approach for monitoring and evaluating the conservation impacts of agricultural sustainability standards and for using the resulting evidence to improve the effectiveness of such standards over time. The approach is oriented around a set of hypotheses and corresponding research questions about how sustainability standards are predicted to deliver conservation benefits. These questions are addressed through data from multiple sources, including basic common information from certification audits; field monitoring of environmental outcomes at a sample of certified sites; and rigorous impact assessment research based on experimental or quasi-experimental methods. Integration of these sources can generate time-series data that are comparable across sites and regions and provide detailed portraits of the effects of sustainability standards. To implement this approach, we propose new collaborations between the conservation research community and the sustainability standards community to develop common indicators and monitoring protocols, foster data sharing and synthesis, and link research and practice more effectively. As the role of sustainability standards in tropical land-use governance continues to evolve, robust evidence on the factors contributing to effectiveness can help to ensure that such standards are designed and implemented to maximize benefits for biodiversity conservation. © 2014 Society for Conservation Biology.

  6. Does science speak clearly and fairly in trade and food safety disputes? The search for an optimal response of WTO adjudication to problematic international standard-making.

    PubMed

    Ni, Kuei-Jung

    2013-01-01

    Most international health-related standards are voluntary per se. However, the incorporation of international standard-making into WTO agreements like the SPS Agreement has drastically changed the status and effectiveness of the standards. WTO members are urged to follow international standards, even when not required to comply fully with them. Indeed, such standards have attained great influence in the trade system. Yet evidence shows that the credibility of the allegedly scientific approach of these international standard-setting institutions, especially the Codex Alimentarius Commission (Codex) governing food safety standards, has been eroded and diluted by industrial and political influences. Its decision-making is no longer based on consensus, but voting. The adoption of new safety limits for the veterinary drug ractopamine in 2012, by a very close vote, is simply another instance of the problematic operations of the Codex. These dynamics have led skeptics to question the legitimacy of the standard setting body and to propose solutions to rectify the situation. Prior WTO rulings have yet to pay attention to the defect in the decision-making processes of the Codex. Nevertheless, the recent Appellate Body decision on Hormones II is indicative of a deferential approach to national measures that are distinct from Codex formulas. The ruling also rejects the reliance on those experts who authored the Codex standards to assess new measures of the European Community. This approach provides an opportunity to contemplate what the proper relationship between the WTO and Codex ought to be. Through a critical review of WTO rulings and academic proposals, this article aims to analyze how the WTO ought to define such interactions and respond to the politicized standard-making process in an optimal manner. This article argues that building a more systematic approach and normative basis for WTO judicial review of standard-setting decisions and the selection of technical experts would be instrumental to strengthening the mutual supports between the WTO and international standard-setting organizations, and may help avoid the introduction of a prejudice toward a justified science finding.

  7. Patient-reported outcomes in child and adolescent mental health services (CAMHS): use of idiographic and standardized measures.

    PubMed

    Wolpert, Miranda; Ford, Tamsin; Trustam, Emma; Law, Duncan; Deighton, Jessica; Flannery, Halina; Fugard, Andrew J B; Fugard, Rew J B

    2012-04-01

    There is increasing emphasis on use of patient-reported outcome measures (PROMs) in mental health but little research on the best approach, especially where there are multiple perspectives. To present emerging findings from both standardized and idiographic child-, parent- and clinician-rated outcomes in child and adolescent mental health services (CAMHS) and consider their correlations. Outcomes were collected in CAMHS across the UK. These comprised idiographic measures (goal-based outcomes) and standardized measures (practitioner-rated Children's Global Assessment Scale; child- and parent-rated Strengths and Difficulties Questionnaire). There was reliable positive change from the beginning of treatment to later follow-up according to all informants. Standardized clinician function report was correlated with standardized child difficulty report (r = -0.26), standardized parent report (r = -0.28) and idiographic joint client-determined goals (r = 0.38) in the expected directions. These results suggest that routine outcome monitoring is feasible, and suggest the possibility of using jointly agreed idiographic measures alongside particular perspectives on outcome as part of a PROMs approach.

  8. Integration of Multiple Components in Polystyrene-based Microfluidic Devices Part 1: Fabrication and Characterization

    PubMed Central

    Johnson, Alicia S.; Anderson, Kari B.; Halpin, Stephen T.; Kirkpatrick, Douglas C.; Spence, Dana M.; Martin, R. Scott

    2012-01-01

    In Part I of a two-part series, we describe a simple and inexpensive approach to fabricate polystyrene devices that is based upon melting polystyrene (from either a Petri dish or powder form) against PDMS molds or around electrode materials. The ability to incorporate microchannels in polystyrene and integrate the resulting device with standard laboratory equipment such as an optical plate reader for analyte readout and micropipettors for fluid propulsion is first described. A simple approach for sample and reagent delivery to the device channels using a standard multi-channel micropipette and a PDMS-based injection block is detailed. Integration of the microfluidic device with these off-chip functions (sample delivery and readout) enables high throughput screens and analyses. An approach to fabricate polystyrene-based devices with embedded electrodes is also demonstrated, thereby enabling the integration of microchip electrophoresis with electrochemical detection through the use of a palladium electrode (for a decoupler) and carbon-fiber bundle (for detection). The device was sealed against a PDMS-based microchannel and used for the electrophoretic separation and amperometric detection of dopamine, epinephrine, catechol, and 3,4-dihydroxyphenylacetic acid. Finally, these devices were compared against PDMS-based microchips in terms of their optical transparency and absorption of an anti-platelet drug, clopidogrel. Part I of this series lays the foundation for Part II, where these devices were utilized for various on-chip cellular analyses. PMID:23120747

  9. Confidence intervals for the population mean tailored to small sample sizes, with applications to survey sampling.

    PubMed

    Rosenblum, Michael A; Laan, Mark J van der

    2009-01-07

    The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
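
    A tail bound such as Bernstein's inequality can be inverted to obtain a confidence half-width that holds for any sample size when the observations are bounded. The sketch below does this with a plug-in sample variance for observations in [0, M]; it illustrates the idea only and is not the exact construction proposed in the paper.

```python
# Minimal sketch of a confidence interval for a mean from Bernstein's tail
# inequality, for observations known to lie in [0, M]; this plugs in the
# sample variance and is only an illustration of the idea.
import math
import numpy as np

def bernstein_ci(x, M, alpha=0.05):
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean = x.mean()
    var = x.var(ddof=1)
    L = math.log(2.0 / alpha)
    # Solve  n t^2 = L * (2 var + (2/3) M t)  for the half-width t
    a, b, c = n, -(2.0 / 3.0) * M * L, -2.0 * var * L
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return mean - t, mean + t

rng = np.random.default_rng(1)
sample = rng.uniform(0, 10, size=15)   # small sample of values bounded by M = 10
print(bernstein_ci(sample, M=10))
```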

  10. A Comparison of Computer-Based Classification Testing Approaches Using Mixed-Format Tests with the Generalized Partial Credit Model

    ERIC Educational Resources Information Center

    Kim, Jiseon

    2010-01-01

    Classification testing has been widely used to make categorical decisions by determining whether an examinee has a certain degree of ability required by established standards. As computer technologies have developed, classification testing has become more computerized. Several approaches have been proposed and investigated in the context of…

  11. Defining Openness: Updating the Concept of "Open" for a Connected World

    ERIC Educational Resources Information Center

    McAndrew, Patrick

    2010-01-01

    The release of free resources by the education sector has led to reconsideration of how the open approach implied by Open Educational Resources (OER) impacts on the educator and the learner. However this work has tended to consider the replication of standard campus based approaches and the characteristics of content that will encourage other…

  12. For the Record: What Education Policy Could Be

    ERIC Educational Resources Information Center

    Tienken, Christopher H.

    2012-01-01

    A review of education reform policies reveals a shift from an input guarantee approach aimed at providing funds to level the playing field for all students to an output guarantee approach based on the expectation of achieving standardized results regardless of inputs. The shift reflects a belief that where a child starts his or her cognitive,…

  13. The Pedagogical Orientations of South African Physical Sciences Teachers towards Inquiry or Direct Instructional Approaches

    ERIC Educational Resources Information Center

    Ramnarain, Umesh; Schuster, David

    2014-01-01

    In recent years, inquiry-based science instruction has become widely advocated in science education standards in many countries and, hence, in teacher preparation programmes. Nevertheless, in practice, one finds a wide variety of science instructional approaches. In South Africa, as in many countries, there is also a great disparity in school…

  14. A Proposed Pedagogical Approach for Preparing Teacher Candidates to Incorporate Academic Language in Mathematics Classrooms

    ERIC Educational Resources Information Center

    Lim, Woong; Stallings, Lynn; Kim, Dong Joong

    2015-01-01

    The purpose of this article is to present issues related to prioritizing academic language in teaching performance assessments and to propose a pedagogical approach that prepares middle grades mathematics teacher candidates to teach academic language. Based on our experience with teacher candidates and our knowledge of edTPA standards involving…

  15. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    ERIC Educational Resources Information Center

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  16. A Semantic-Oriented Approach for Organizing and Developing Annotation for E-Learning

    ERIC Educational Resources Information Center

    Brut, Mihaela M.; Sedes, Florence; Dumitrescu, Stefan D.

    2011-01-01

    This paper presents a solution to extend the IEEE LOM standard with ontology-based semantic annotations for efficient use of learning objects outside Learning Management Systems. The data model corresponding to this approach is first presented. The proposed indexing technique for this model development in order to acquire a better annotation of…

  17. Modelling of plug and play interface for energy router based on IEC61850

    NASA Astrophysics Data System (ADS)

    Shi, Y. F.; Yang, F.; Gan, L.; He, H. L.

    2017-11-01

    Against the background of the “Internet Plus” initiative, the energy router, as a core piece of energy internet infrastructure, is expected to be widely developed. The IEC61850 standard is the only universal standard in the field of power system automation, and it standardizes the engineering operation of intelligent substations. To address the lack of a unified international standard for energy router communication, this paper proposes applying IEC61850 to the plug-and-play interface and establishes the plug-and-play interface information model and information transfer services. The paper provides a research approach for the establishment of energy router communication standards and promotes the development of energy routers.

  18. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
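
    Consumers of such standards-based result objects typically only need to read the encoded DICOM attributes. The sketch below uses pydicom to inspect placeholder Segmentation and Structured Report files; attribute access is guarded because the elements present depend on the object type, and the file names are assumptions.

```python
# Minimal sketch of inspecting DICOM analysis-result objects (e.g. the
# Segmentation and Structured Reporting instances mentioned above) with
# pydicom; the file paths are placeholders.
import pydicom

for path in ["seg_example.dcm", "sr_example.dcm"]:      # placeholder paths
    ds = pydicom.dcmread(path)
    print(path, ds.Modality, ds.SOPClassUID)
    # Segmentation objects carry one item per segment with a human-readable label
    for segment in getattr(ds, "SegmentSequence", []):
        print("  segment:", segment.SegmentLabel)
```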

  19. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Improving hospital cost accounting with activity-based costing.

    PubMed

    Chan, Y C

    1993-01-01

    In this article, activity-based costing, an approach that has proved to be an improvement over the conventional costing system in product costing, is introduced. By combining activity-based costing with standard costing, health care administrators can better plan and control the costs of health services provided while ensuring that the organization's bottom line is healthy.

  1. Developing a Deeper Understanding of Community-Based Pedagogies with Teachers: Learning with and from Teachers in Colombia

    ERIC Educational Resources Information Center

    Sharkey, Judy; Clavijo Olarte, Amparo; Ramírez, Luz Maribel

    2016-01-01

    Here we share findings from a 9-month qualitative case study involving a school-university professional development inquiry into how teachers develop, implement, and interpret community-based pedagogies (CBPs), an asset-based approach to curriculum that acknowledges mandated standards but begins with recognizing and valuing local knowledge. After…

  2. Neck-Related Physical Function, Self-Efficacy, and Coping Strategies in Patients With Cervical Radiculopathy: A Randomized Clinical Trial of Postoperative Physiotherapy.

    PubMed

    Wibault, Johanna; Öberg, Birgitta; Dedering, Åsa; Löfgren, Håkan; Zsigmond, Peter; Persson, Liselott; Andell, Maria; R Jonsson, Margareta; Peolsson, Anneli

    2017-06-01

    The purpose of this study was to compare postoperative rehabilitation with structured physiotherapy to the standard approach in patients with cervical radiculopathy (CR) in a prospective randomized study at 6 months follow-up based on measures of neck-related physical function, self-efficacy, and coping strategies. Patients with persistent CR and scheduled for surgery (N = 202) were randomly assigned to structured postoperative physiotherapy or a standard postoperative approach. Structured postoperative physiotherapy combined neck-specific exercises with a behavioral approach. Baseline, 3-month, and 6-month evaluations included questionnaires and clinical examinations. Neck muscle endurance, active cervical range of motion, self-efficacy, pain catastrophizing (CSQ-CAT), perceived control over pain, and ability to decrease pain were analyzed for between-group differences using complete case and per-protocol approaches. No between-group difference was reported at the 6-month follow-up (P = .05-.99), but all outcomes had improved from baseline (P < .001). Patients undergoing structured postoperative physiotherapy with ≥50% attendance at treatment sessions had larger improvements in CSQ-CAT (P = .04) during the rehabilitation period from 3 to 6 months after surgery compared with the patients who received standard postoperative approach. No between-group difference was found at 6 months after surgery based on measures of neck-related physical function, self-efficacy, and coping strategies. However, the results confirm that neck-specific exercises are tolerated by patients with CR after surgery and may suggest a benefit from combining surgery with structured postoperative physiotherapy for patients with CR. Copyright © 2017. Published by Elsevier Inc.

  3. Fetal heart rate deceleration detection using a discrete cosine transform implementation of singular spectrum analysis.

    PubMed

    Warrick, P A; Precup, D; Hamilton, E F; Kearney, R E

    2007-01-01

    To develop a singular-spectrum analysis (SSA) based change-point detection algorithm applicable to fetal heart rate (FHR) monitoring to improve the detection of deceleration events. We present a method for decomposing a signal into near-orthogonal components via the discrete cosine transform (DCT) and apply this in a novel online manner to change-point detection based on SSA. The SSA technique forms models of the underlying signal that can be compared over time; models that are sufficiently different indicate signal change points. To adapt the algorithm to deceleration detection where many successive similar change events can occur, we modify the standard SSA algorithm to hold the reference model constant under such conditions, an approach that we term "base-hold SSA". The algorithm is applied to a database of 15 FHR tracings that have been preprocessed to locate candidate decelerations and is compared to the markings of an expert obstetrician. Of the 528 true and 1285 false decelerations presented to the algorithm, the base-hold approach improved on standard SSA, reducing the number of missed decelerations from 64 to 49 (21.9%) while maintaining the same reduction in false-positives (278). The standard SSA assumption that changes are infrequent does not apply to FHR analysis where decelerations can occur successively and in close proximity; our base-hold SSA modification improves detection of these types of event series.
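
    The "base-hold" idea can be illustrated with a simplified model-based detector: a low-order DCT description of a reference window is compared with successive test windows, and the reference is advanced only while no change is flagged, so that a run of successive decelerations is measured against the same pre-change baseline. The sketch below uses illustrative window lengths, model order, and threshold, not the paper's settings.

```python
# Minimal sketch of 'base-hold' change detection on a heart-rate-like trace:
# a low-order DCT model of a reference window is compared against successive
# test windows, and the reference is held fixed while a change is flagged.
import numpy as np
from scipy.fft import dct

def dct_model(window, order=5):
    # Leading DCT coefficients as a compact model of the window's shape
    return dct(window, norm="ortho")[:order]

def detect_changes(signal, win=40, order=5, threshold=8.0):
    flags = []
    ref = dct_model(signal[:win], order)
    for start in range(win, len(signal) - win, win):
        test = dct_model(signal[start:start + win], order)
        changed = np.linalg.norm(test - ref) > threshold
        flags.append((start, changed))
        if not changed:
            ref = test      # base-hold: only advance the reference when no change
    return flags

rng = np.random.default_rng(2)
trace = 140 + rng.normal(0, 1, 400)
trace[200:260] -= 30            # a synthetic deceleration-like dip
print(detect_changes(trace))
```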

  4. Intrathoracic airway measurement: ex-vivo validation

    NASA Astrophysics Data System (ADS)

    Reinhardt, Joseph M.; Raab, Stephen A.; D'Souza, Neil D.; Hoffman, Eric A.

    1997-05-01

    High-resolution x-ray CT (HRCT) provides detailed images of the lungs and bronchial tree. HRCT-based imaging and quantitation of peripheral bronchial airway geometry provides a valuable tool for assessing regional airway physiology. Such measurements have been used to address physiological questions related to the mechanics of airway collapse in sleep apnea, the measurement of airway response to broncho-constriction agents, and to evaluate and track the progression of disease affecting the airways, such as asthma and cystic fibrosis. Significant attention has been paid to the measurement of extra- and intra-thoracic airways in 2D sections from volumetric x-ray CT. A variety of manual and semi-automatic techniques have been proposed for airway geometry measurement, including the use of standardized display window and level settings for caliper measurements, methods based on manual or semi-automatic border tracing, and more objective, quantitative approaches such as the use of the 'half-max' criterion. A recently proposed measurement technique uses a model-based deconvolution to estimate the location of the inner and outer airway walls. Validation using a plexiglass phantom indicates that the model-based method is more accurate than the half-max approach for thin-walled structures. In vivo validation of these airway measurement techniques is difficult because of the problems in identifying a reliable measurement 'gold standard.' In this paper we report on ex vivo validation of the half-max and model-based methods using an excised pig lung. The lung is sliced into thin sections of tissue and scanned using an electron beam CT scanner. Airways of interest are measured from the CT images, and also measured using a microscope and micrometer to obtain a measurement gold standard. The results show no significant difference between the model-based measurements and the gold standard, while the half-max estimates exhibited a measurement bias and were significantly different from the gold standard.
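
    The half-max criterion places the airway wall edge where a 1-D intensity profile crosses half of its peak value. The sketch below applies it to a synthetic Gaussian profile, with linear interpolation between samples for the sub-pixel crossing; it is a schematic of the criterion, not the CT measurement software.

```python
# Minimal sketch of the 'half-max' criterion on a 1-D intensity profile
# across an airway wall: edges are placed where the profile crosses half of
# its peak value. The profile below is synthetic, not CT data.
import numpy as np

def half_max_width(x, profile):
    half = profile.max() / 2.0
    above = profile >= half
    i_first = np.argmax(above)
    i_last = len(above) - 1 - np.argmax(above[::-1])
    # Interpolate the sub-pixel crossing on each side of the peak region
    def crossing(i_lo, i_hi):
        y0, y1 = profile[i_lo], profile[i_hi]
        return x[i_lo] + (half - y0) / (y1 - y0) * (x[i_hi] - x[i_lo])
    left = crossing(i_first - 1, i_first)
    right = crossing(i_last, i_last + 1)
    return right - left

x = np.linspace(-3, 3, 121)
profile = np.exp(-x**2 / (2 * 0.5**2))   # Gaussian "wall" with sigma = 0.5
print(half_max_width(x, profile))        # ~2.355 * 0.5 for a Gaussian
```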

  5. Polymer-and glass-based fluorescence standards for the near infrared (NIR) spectral region.

    PubMed

    Würth, Christian; Hoffmann, Katrin; Behnke, Thomas; Ohnesorge, Marius; Resch-Genger, Ute

    2011-05-01

    The widespread use and acceptance of fluorescence techniques especially in regulated areas like medical diagnostics is closely linked to standardization concepts that guarantee and improve the comparability and reliability of fluorescence measurements. At the core of such concepts are dependable fluorescence standards that are preferably certified. The ever rising interest in fluorescence measurements in the near-infrared (NIR) spectral region renders the availability of spectral and intensity standards for this wavelength region increasingly important. This encouraged us to develop approaches to solid NIR standards based upon dye-doped polymers and assess their application-relevant properties in comparison to metal ion-doped glasses. The overall goal is here to provide inexpensive, easily fabricated, and robust internal and external calibration tools for a broad variety of fluorescence instruments ranging e.g. from spectrofluorometers over fluorescence microscopes to miniaturized fluorescence sensors. © Springer Science+Business Media, LLC 2010

  6. Pharmacologic Therapy for Type 2 Diabetes: Synopsis of the 2017 American Diabetes Association Standards of Medical Care in Diabetes.

    PubMed

    Chamberlain, James J; Herman, William H; Leal, Sandra; Rhinehart, Andrew S; Shubrook, Jay H; Skolnik, Neil; Kalyani, Rita Rastogi

    2017-04-18

    The American Diabetes Association (ADA) annually updates the Standards of Medical Care in Diabetes to provide clinicians, patients, researchers, payers, and other interested parties with evidence-based recommendations for the diagnosis and management of patients with diabetes. For the 2017 Standards, the ADA Professional Practice Committee updated previous MEDLINE searches performed from 1 January 2016 to November 2016 to add, clarify, or revise recommendations based on new evidence. The committee rates the recommendations as A, B, or C, depending on the quality of evidence, or E for expert consensus or clinical experience. The Standards were reviewed and approved by the Executive Committee of the ADA Board of Directors, which includes health care professionals, scientists, and laypersons. Feedback from the larger clinical community informed revisions. This synopsis focuses on recommendations from the 2017 Standards about pharmacologic approaches to glycemic treatment of type 2 diabetes.

  7. A flexible and accurate quantification algorithm for electron probe X-ray microanalysis based on thin-film element yields

    NASA Astrophysics Data System (ADS)

    Schalm, O.; Janssens, K.

    2003-04-01

    Quantitative analysis by means of electron probe X-ray microanalysis (EPXMA) of low Z materials such as silicate glasses can be hampered by the fact that ice or other contaminants build up on the Si(Li) detector beryllium window or (in the case of a windowless detector) on the Si(Li) crystal itself. These layers act as an additional absorber in front of the detector crystal, decreasing the detection efficiency at low energies (<5 keV). Since the layer thickness gradually changes with time, the detector efficiency in the low-energy region is also not constant. Using the normal ZAF approach to quantification of EPXMA data is cumbersome in these conditions, because spectra from reference materials and from unknown samples must be acquired within a fairly short period of time in order to avoid the effect of the change in efficiency. To avoid this problem, an alternative approach to quantification of EPXMA data is proposed, following a philosophy often employed in quantitative analysis of X-ray fluorescence (XRF) and proton-induced X-ray emission (PIXE) data. This approach is based on the (experimental) determination of thin-film element yields, rather than starting from infinitely thick, single-element calibration standards. These thin-film sensitivity coefficients can also be interpolated to allow quantification of elements for which no suitable standards are available. The change in detector efficiency can be monitored by collecting an X-ray spectrum of one multi-element glass standard. This information is used to adapt the previously determined thin-film sensitivity coefficients to the actual detector efficiency conditions valid on the day that the experiments were carried out. The main advantage of this method is that spectra collected from the standards and from the unknown samples need not be acquired within a short period of time. This new approach is evaluated for glass and metal matrices and is compared with a standard ZAF method.
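
    The yield-based quantification amounts to dividing net peak intensities by thin-film sensitivity coefficients, after rescaling those coefficients with an efficiency factor measured on the monitor glass for the day of the experiment. The sketch below shows only this arithmetic with invented numbers; real analyses also include matrix and normalization corrections.

```python
# Minimal sketch of yield-based quantification: element concentrations are
# obtained from net peak intensities and thin-film sensitivity coefficients,
# rescaled by a per-day efficiency factor from a monitor standard.
# All numbers are illustrative.
sensitivity = {"Si": 1200.0, "Ca": 950.0, "Fe": 400.0}   # counts per unit dose per unit mass

# Efficiency correction for today's detector state, from the monitor glass:
# ratio of today's yield to the yield at calibration time.
efficiency_today = {"Si": 0.80, "Ca": 0.92, "Fe": 0.99}  # low-energy lines drop most

# Net peak intensities measured on the unknown sample (counts per unit dose)
net_counts = {"Si": 310.0, "Ca": 410.0, "Fe": 150.0}

concentration = {
    el: net_counts[el] / (sensitivity[el] * efficiency_today[el])
    for el in net_counts
}
# Normalize to fractions summing to 1 (illustrative convention)
total = sum(concentration.values())
for el, c in concentration.items():
    print(f"{el}: {c / total:.3f}")
```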

  8. Implementing the Next Generation Science Standards: Impacts on Geoscience Education

    NASA Astrophysics Data System (ADS)

    Wysession, M. E.

    2014-12-01

    This is a critical time for the geoscience community. The Next Generation Science Standards (NGSS) have been released and are now being adopted by states (a dozen states and Washington, DC, at the time of writing this), with dramatic implications for national K-12 science education. Curriculum developers and textbook companies are working hard to construct educational materials that match the new standards, which emphasize a hands-on practice-based approach that focuses on working directly with primary data and other forms of evidence. While the set of 8 science and engineering practices of the NGSS lend themselves well to the observation-oriented approach of much of the geosciences, there is currently not a sufficient number of geoscience educational modules and activities geared toward the K-12 levels, and geoscience research organizations need to be mobilizing their education & outreach programs to meet this need. It is a rare opportunity that will not come again in this generation. There are other significant issues surrounding the implementation of the NGSS. The NGSS involves a year of Earth and space science at the high school level, but there is not a sufficient workforce of geoscience teachers to meet this need. The form and content of the geoscience standards are also very different from past standards, moving away from a memorization and categorization approach and toward a complex Earth Systems Science approach. Combined with the shift toward practice-based teaching, this means that significant professional development will be required for the existing K-12 geoscience education workforce. How the NGSS are to be assessed is another significant question, with an NRC report providing some guidance but leaving many questions unanswered. There is also an uneasy relationship between the NGSS and the Common Core of math and English, and the recent push-back against the Common Core in many states may impact the implementation of the NGSS.

  9. Mutation-Based Learning to Improve Student Autonomy and Scientific Inquiry Skills in a Large Genetics Laboratory Course

    PubMed Central

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a “mutation” method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the “mutations”; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional “cookbook”-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class. PMID:24006394

  10. Soft but Strong. Neg-Raising, Soft Triggers, and Exhaustification

    ERIC Educational Resources Information Center

    Romoli, Jacopo

    2012-01-01

    In this thesis, I focus on scalar implicatures, presuppositions and their connections. In chapter 2, I propose a scalar implicature-based account of neg-raising inferences, standardly analyzed as a presuppositional phenomenon (Gajewski 2005, 2007). I show that an approach based on scalar implicatures can straightforwardly account for the…

  11. SETTING EXPECTATIONS FOR THE OHIO RIVER FISH INDEX BASED ON IN-STREAM HABITAT

    EPA Science Inventory

    The use of habitat criteria for setting fish community assessment expectations is common for streams, but a standard approach for great rivers remains largely undeveloped. We developed assessment expectations for the Ohio River Fish Index (ORFIN) based on measures of in-stream h...

  12. Automatic Syllabification in English: A Comparison of Different Algorithms

    ERIC Educational Resources Information Center

    Marchand, Yannick; Adsett, Connie R.; Damper, Robert I.

    2009-01-01

    Automatic syllabification of words is challenging, not least because the syllable is not easy to define precisely. Consequently, no accepted standard algorithm for automatic syllabification exists. There are two broad approaches: rule-based and data-driven. The rule-based method effectively embodies some theoretical position regarding the…

  13. 0-6759 : developing a business process and logical model to support a tour-based travel demand model design for TxDOT.

    DOT National Transportation Integrated Search

    2013-08-01

    The Texas Department of Transportation : (TxDOT) created a standardized trip-based : modeling approach for travel demand modeling : called the Texas Package Suite of Travel Demand : Models (referred to as the Texas Package) to : oversee the travel de...

  14. Best Available Evidence: Three Complementary Approaches

    ERIC Educational Resources Information Center

    Slocum, Timothy A.; Spencer, Trina D.; Detrich, Ronnie

    2012-01-01

    The best available evidence is one of the three critical features of evidence-based practice. Best available evidence is often considered to be synonymous with extremely high standards for research methodology. However, this notion may limit the scope and impact of evidence based practice to those educational decisions on which high quality…

  15. The Impact of the AACTE-Microsoft Grant on Elementary Reading & Writing

    ERIC Educational Resources Information Center

    Borgia, Laurel; Cheek, Earl H., Jr.

    2005-01-01

    Accountability for student learning and support of evidence-based instructional approaches are critical responsibilities for teachers. Both are particularly significant with the current reliance on state standards, assessment tests and the No Child Left Behind Act (Shanahan 2002). Every elementary teacher must have research-based resources to help…

  16. Algorithms of maximum likelihood data clustering with applications

    NASA Astrophysics Data System (ADS)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.
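
    A greedy merging scheme over the correlation matrix conveys the flavor of such algorithms: clusters are repeatedly merged as long as an objective computed from Pearson coefficients improves. In the sketch below the objective is the mean within-cluster correlation, used as a simplified stand-in for the paper's likelihood expression.

```python
# Minimal sketch of greedy agglomerative clustering driven by the Pearson
# correlation matrix; the objective here is a simplified stand-in for the
# likelihood maximized in the published method.
import numpy as np

def mean_within_correlation(C, clusters):
    score = 0.0
    for members in clusters:
        if len(members) > 1:
            block = C[np.ix_(members, members)]
            n = len(members)
            score += (block.sum() - n) / (n * (n - 1))   # off-diagonal mean
    return score

def greedy_cluster(C):
    clusters = [[i] for i in range(C.shape[0])]
    best = mean_within_correlation(C, clusters)
    while len(clusters) > 1:
        best_pair, best_score = None, best
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                trial = [m for k, m in enumerate(clusters) if k not in (a, b)]
                trial.append(clusters[a] + clusters[b])
                s = mean_within_correlation(C, trial)
                if s > best_score:
                    best_pair, best_score = (a, b), s
        if best_pair is None:
            break                      # no merge improves the objective
        a, b = best_pair
        merged = clusters[a] + clusters[b]
        clusters = [m for k, m in enumerate(clusters) if k not in (a, b)] + [merged]
        best = best_score
    return clusters

rng = np.random.default_rng(3)
common = rng.normal(size=200)
data = np.vstack([common + rng.normal(scale=0.3, size=200) for _ in range(4)]
                 + [rng.normal(size=200) for _ in range(4)])
print(greedy_cluster(np.corrcoef(data)))
```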

  17. ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES

    PubMed Central

    LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.

    2008-01-01

    Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
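
    Once per-source reliabilities are available, a common way to combine them for an interaction reported by several sources is a noisy-OR rule: the interaction is taken as real if at least one supporting source is correct. The snippet below shows that combination; it is a generic scheme, not necessarily the one proposed in the paper.

```python
# Minimal sketch of combining per-source reliabilities for an interaction
# observed in several data sources with a noisy-OR rule.
def noisy_or(reliabilities):
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong

# Hypothetical reliabilities of three sources reporting the same interaction
print(noisy_or([0.5, 0.7, 0.3]))   # 0.895: higher than any single source
```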

  18. Color standardization and optimization in whole slide imaging.

    PubMed

    Yagi, Yukako

    2011-03-30

    Standardization and validation of the color displayed by digital slides is an important aspect of digital pathology implementation. While the most common reason for color variation is variance in the protocols and practices of the histology lab, the color displayed can also be affected by variation in capture parameters (for example, illumination and filters), image processing, and display factors in the digital systems themselves. We have been developing techniques for color validation and optimization along two paths. The first is based on two standard slides that are scanned and displayed by the imaging system in question. In this approach, one slide is embedded with nine filters with colors selected especially for H&E stained slides (resembling a tiny Macbeth color chart); the specific colors of the nine filters were determined in our previous study and modified for whole slide imaging (WSI). The other slide is an H&E stained mouse embryo. Both of these slides were scanned and the displayed images were compared to a standard. The second approach is based on our previous multispectral imaging research. As a first step, the two-slide method (above) was used to identify inaccurate display of color and its cause, and to understand the importance of accurate color in digital pathology. We have also improved the multispectral-based algorithm for more consistent results in stain standardization. In the near future, the results of the two-slide and multispectral techniques can be combined and will be widely available. We have been conducting a series of research and development projects to improve image quality and to establish image quality standardization. This paper discusses one of the most important aspects of image quality: color.
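
    The first calibration path can be illustrated by the simplest possible correction: a least-squares 3x3 matrix mapping the scanned colors of reference patches to their target values. The sketch below uses invented patch values and is only a schematic of the idea, not the validated workflow described above.

```python
# Minimal sketch of deriving a color-correction matrix from a slide with
# known reference patches: scanned RGB values are mapped to reference RGB
# values by least squares. All patch values below are invented.
import numpy as np

# Rows = patches; columns = R, G, B (illustrative numbers)
scanned = np.array([[200, 60, 70], [60, 190, 80], [50, 70, 210],
                    [180, 180, 60], [170, 60, 170], [60, 170, 170],
                    [230, 230, 230], [120, 120, 120], [30, 30, 30]], float)
reference = np.array([[210, 50, 60], [50, 200, 70], [40, 60, 220],
                      [190, 190, 50], [180, 50, 180], [50, 180, 180],
                      [240, 240, 240], [128, 128, 128], [25, 25, 25]], float)

# Solve scanned @ M ~= reference for the 3x3 correction matrix M
M, *_ = np.linalg.lstsq(scanned, reference, rcond=None)

corrected = scanned @ M
print("mean absolute error after correction:", np.abs(corrected - reference).mean())
```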

  19. Purpose, Processes, Partnerships, and Products: 4Ps to advance Participatory Socio-Environmental Modeling

    NASA Astrophysics Data System (ADS)

    Gray, S. G.; Voinov, A. A.; Jordan, R.; Paolisso, M.

    2016-12-01

    Model-based reasoning is a basic part of human understanding, decision-making, and communication. Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding environmental change since stakeholders often hold valuable knowledge about socio-environmental dynamics and since collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four dimensional framework that includes reporting on dimensions of: (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of environmental changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of environmental policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.

  20. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist, such as time-domain measures, Minimum Variance benchmarks, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that signals originating from industrial installations have multifractal properties, and that such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, and helps to discover internal dependencies and human factors that are otherwise hardly detectable.
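
    The abstract does not specify which multifractal estimator was used. As a loose illustration of the kind of analysis described, the sketch below estimates generalized Hurst exponents h(q) for a control-variable time series via a simplified (forward-segments-only) multifractal detrended fluctuation analysis; the function name, scales, and q values are assumptions for the example, not the paper's method.

```python
import numpy as np

def mfdfa_hurst(x, scales, qs, order=1):
    """Generalized Hurst exponents h(q) via a simplified MFDFA (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())                 # integrated (profile) series
    fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        variances = []
        for k in range(n_seg):
            segment = profile[k * s:(k + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, segment, order)    # local polynomial trend
            detrended = segment - np.polyval(coeffs, t)
            variances.append(np.mean(detrended ** 2))
        variances = np.array(variances)
        for i, q in enumerate(qs):
            if q == 0:
                fq[i, j] = np.exp(0.5 * np.mean(np.log(variances)))
            else:
                fq[i, j] = np.mean(variances ** (q / 2.0)) ** (1.0 / q)
    log_s = np.log(scales)
    # h(q) is the slope of log F_q(s) versus log s
    return {q: np.polyfit(log_s, np.log(fq[i]), 1)[0] for i, q in enumerate(qs)}

# Example: a monofractal signal (white noise) should give h(q) close to 0.5 for all q
rng = np.random.default_rng(0)
print(mfdfa_hurst(rng.standard_normal(10000), scales=[16, 32, 64, 128, 256], qs=[-2, 2]))
```

    A strong dependence of h(q) on q would indicate multifractality of the kind the abstract attributes to industrial control signals.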

  1. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transferred to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  2. A Web-based approach to blood donor preparation.

    PubMed

    France, Christopher R; France, Janis L; Kowalsky, Jennifer M; Copley, Diane M; Lewis, Kristin N; Ellis, Gary D; McGlone, Sarah T; Sinclair, Kadian S

    2013-02-01

    Written and video approaches to donor education have been shown to enhance donation attitudes and intentions to give blood, particularly when the information provides specific coping suggestions for donation-related concerns. This study extends this work by comparing Web-based approaches to donor preparation among donors and nondonors. Young adults (62% female; mean [±SD] age, 19.3 [±1.5] years; mean [range] number of prior blood donations, 1.1 [0-26]; 60% nondonors) were randomly assigned to view 1) a study Web site designed to address common blood donor concerns and suggest specific coping strategies (n = 238), 2) a standard blood center Web site (n = 233), or 3) a control Web site where participants viewed videos of their choice (n = 202). Measures of donation attitude, anxiety, confidence, intention, anticipated regret, and moral norm were completed before and after the intervention. Among nondonors, the study Web site produced greater changes in donation attitude, confidence, intention, and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for moral norm and anxiety. Among donors, the study Web site produced greater changes in donation confidence and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for donation attitude, anxiety, intention, and moral norm. Web-based donor preparation materials may provide a cost-effective way to enhance donation intentions and encourage donation behavior. © 2012 American Association of Blood Banks.

  3. Empirically based comparisons of the reliability and validity of common quantification approaches for eyeblink startle potentiation in humans

    PubMed Central

    Bradford, Daniel E.; Starr, Mark J.; Shackman, Alexander J.

    2015-01-01

    Abstract Startle potentiation is a well‐validated translational measure of negative affect. Startle potentiation is widely used in clinical and affective science, and there are multiple approaches for its quantification. The three most commonly used approaches quantify startle potentiation as the increase in startle response from a neutral to threat condition based on (1) raw potentiation, (2) standardized potentiation, or (3) percent‐change potentiation. These three quantification approaches may yield qualitatively different conclusions about effects of independent variables (IVs) on affect when within‐ or between‐group differences exist for startle response in the neutral condition. Accordingly, we directly compared these quantification approaches in a shock‐threat task using four IVs known to influence startle response in the no‐threat condition: probe intensity, time (i.e., habituation), alcohol administration, and individual differences in general startle reactivity measured at baseline. We confirmed the expected effects of time, alcohol, and general startle reactivity on affect using self‐reported fear/anxiety as a criterion. The percent‐change approach displayed apparent artifact across all four IVs, which raises substantial concerns about its validity. Both raw and standardized potentiation approaches were stable across probe intensity and time, which supports their validity. However, only raw potentiation displayed effects that were consistent with a priori specifications and/or the self‐report criterion for the effects of alcohol and general startle reactivity. Supplemental analyses of reliability and validity for each approach provided additional evidence in support of raw potentiation. PMID:26372120
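
    The three quantification approaches compared in this abstract are simple to express numerically. The sketch below, using invented per-trial data, computes raw, standardized, and percent-change potentiation from neutral- and threat-condition startle magnitudes; the particular standardization shown (pooled within-subject standard deviation across all trials) is one common variant and an assumption here, not necessarily the paper's exact formula.

```python
import numpy as np

# Hypothetical per-trial startle magnitudes (arbitrary EMG units) for one participant
neutral = np.array([48.0, 52.0, 50.0, 47.0, 55.0])
threat  = np.array([70.0, 66.0, 74.0, 69.0, 71.0])

# (1) Raw potentiation: mean threat response minus mean neutral response
raw = threat.mean() - neutral.mean()

# (2) Standardized potentiation: raw difference scaled by the pooled
#     within-subject standard deviation across all trials (one common variant)
all_trials = np.concatenate([neutral, threat])
standardized = raw / all_trials.std(ddof=1)

# (3) Percent-change potentiation: raw difference relative to the neutral baseline
percent_change = 100.0 * raw / neutral.mean()

print(f"raw = {raw:.1f}, standardized = {standardized:.2f}, percent change = {percent_change:.1f}%")
```

    The abstract's concern is that (3) inherits any within- or between-group differences in the neutral denominator, which is exactly what this arithmetic makes visible.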

  4. Evaluating the Validity of Technology-Enhanced Educational Assessment Items and Tasks: An Empirical Approach to Studying Item Features and Scoring Rubrics

    ERIC Educational Resources Information Center

    Thomas, Ally

    2016-01-01

    With the advent of the newly developed Common Core State Standards and the Next Generation Science Standards, innovative assessments, including technology-enhanced items and tasks, will be needed to meet the challenges of developing valid and reliable assessments in a world of computer-based testing. In a recent critique of the next generation…

  5. 77 FR 15577 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ....S.C. 552(a), 1 CFR part 51, and Sec. 97.20 of Title 14 of the Code of Federal Regulations. The large... contained in this amendment are based on the criteria contained in the U.S. Standard for Terminal Instrument... 97 continues to read as follows: Authority: 49 U.S.C. 106(g), 40103, 40106, 40113, 40114, 40120...

  6. Connecting Emergent Curriculum and Standards in the Early Childhood Classroom: Strengthening Content and Teaching Practice. Early Childhood Education Series

    ERIC Educational Resources Information Center

    Schwartz, Sydney L.; Copeland, Sherry M.

    2010-01-01

    The most pressing challenge in early childhood education today is to find a way to meet the standards within a developmentally appropriate approach. In this book, two active early childhood educators provide teachers with resources to bring content alive and document it in every-day, action-based pre-K and Kindergarten classrooms. The book…

  7. Simplified Approach Charts Improve Data Retrieval Performance

    PubMed Central

    Stewart, Michael; Laraway, Sean; Jordan, Kevin; Feary, Michael S.

    2016-01-01

    The effectiveness of different instrument approach charts to deliver minimum visibility and altitude information during airport equipment outages was investigated. Eighteen pilots flew simulated instrument approaches in three conditions: (a) normal operations using a standard approach chart (standard-normal), (b) equipment outage conditions using a standard approach chart (standard-outage), and (c) equipment outage conditions using a prototype decluttered approach chart (prototype-outage). Errors and retrieval times in identifying minimum altitudes and visibilities were measured. The standard-outage condition produced significantly more errors and longer retrieval times versus the standard-normal condition. The prototype-outage condition had significantly fewer errors and shorter retrieval times than did the standard-outage condition. The prototype-outage condition produced significantly fewer errors but similar retrieval times when compared with the standard-normal condition. Thus, changing the presentation of minima may reduce risk and increase safety in instrument approaches, specifically with airport equipment outages. PMID:28491009

  8. Helicopter approach capability using the differential global positioning system. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kaufmann, David N.

    1993-01-01

    The results of flight tests to determine the feasibility of using the Global Positioning System (GPS) in the differential mode (DGPS) to provide high accuracy, precision navigation, and guidance for helicopter approaches to landing are presented. The airborne DGPS receiver and associated equipment are installed in a NASA UH-60 Black Hawk helicopter. The ground-based DGPS reference receiver is located at a surveyed test site and is equipped with a real-time VHF data link to transmit correction information to the airborne DGPS receiver. The corrected airborne DGPS information, together with the preset approach geometry, is used to calculate guidance commands which are sent to the aircraft's approach guidance instruments. The use of DGPS derived guidance for helicopter approaches to landing is evaluated by comparing the DGPS data with the laser tracker truth data. Both standard (3 deg) and steep (6 deg and 9 deg) glideslope straight-in approaches were flown. DGPS positioning accuracy based on a time history analysis of the entire approach was 0.2 m (mean) +/- 1.8 m (2 sigma) laterally and -2.0 m (mean) +/- 3.5 m (2 sigma) vertically for 3 deg glideslope approaches, -0.1 m (mean) +/- 1.5 m (2 sigma) laterally and -1.1 m (mean) +/- 3.5 m (2 sigma) vertically for 6 deg glideslope approaches, and 0.2 m (mean) +/- 1.3 m (2 sigma) laterally and -1.0 m (mean) +/- 2.8 m (2 sigma) vertically for 9 deg glideslope approaches. DGPS positioning accuracy at the 200 ft decision height (DH) on a standard 3 deg glideslope approach was 0.3 m (mean) +/- 1.5 m (2 sigma) laterally and -2.3 m (mean) +/- 1.6 m (2 sigma) vertically. These errors indicate that the helicopter position based on DGPS guidance satisfies the International Civil Aviation Organization (ICAO) Category 1 (CAT 1) lateral and vertical navigational accuracy requirements.

  9. Helicopter approach capability using the differential Global Positioning System

    NASA Technical Reports Server (NTRS)

    Kaufmann, David N.

    1993-01-01

    The results of flight tests to determine the feasibility of using the Global Positioning System (GPS) in the differential mode (DGPS) to provide high accuracy, precision navigation and guidance for helicopter approaches to landing are presented. The airborne DGPS receiver and associated equipment are installed in a NASA UH-60 Black Hawk helicopter. The ground-based DGPS reference receiver is located at a surveyed test site and is equipped with a real-time VHF data link to transmit correction information to the airborne DGPS receiver. The corrected airborne DGPS information, together with the preset approach geometry, is used to calculate guidance commands which are sent to the aircraft's approach guidance instruments. The use of DGPS derived guidance for helicopter approaches to landing is evaluated by comparing the DGPS data with the laser tracker truth data. Both standard (3 degrees) and steep (6 degrees and 9 degrees) glideslope straight-in approaches were flown. DGPS positioning accuracy based on a time history analysis of the entire approach was 0.2 m (mean) +/- 1.8 m (2 sigma) laterally and -2.0 m (mean) +/- 3.5 m (2 sigma) vertically for 3 degree glideslope approaches, -0.1 m (mean) +/- 1.5 m (2 sigma) laterally and -1.1 m (mean) +/- 3.5 m (2 sigma) vertically for 6 degree glideslope approaches, and 0.2 m (mean) +/- 1.3 m (2 sigma) laterally and -1.0 m (mean) +/- 2.8 m (2 sigma) vertically for 9 degree glideslope approaches. DGPS positioning accuracy at the 200 ft decision height on a standard 3 degree glideslope approach was 0.3 m (mean) +/- 1.5 m (2 sigma) laterally and -2.3 m (mean) +/- 1.6 m (2 sigma) vertically. These errors indicate that the helicopter position based on DGPS guidance satisfies the International Civil Aviation Organization Category 1 lateral and vertical accuracy requirements.

  10. Reverse breech extraction versus the standard approach of pushing the impacted fetal head up through the vagina in caesarean section for obstructed labour: A randomised controlled trial.

    PubMed

    Nooh, Ahmed Mohamed; Abdeldayem, Hussein Mohammed; Ben-Affan, Othman

    2017-05-01

    The objective of this study was to assess effectiveness and safety of the reverse breech extraction approach in Caesarean section for obstructed labour, and compare it with the standard approach of pushing the fetal head up through the vagina. This randomised controlled trial included 192 women. In 96, the baby was delivered by the 'reverse breech extraction approach', and in the remaining 96, by the 'standard approach'. Extension of uterine incision occurred in 18 participants (18.8%) in the reverse breech extraction approach group, and 46 (47.9%) in the standard approach group (p = .0003). Two women (2.1%) in the reverse breech extraction approach group needed blood transfusion and 11 (11.5%) in the standard approach group (p = .012). Pyrexia developed in 3 participants (3.1%) in the reverse breech extraction approach group, and 19 (19.8%) in the standard approach group (p = .0006). Wound infection occurred in 2 women (2.1%) in the reverse breech extraction approach group, and 12 (12.5%) in the standard approach group (p = .007). Apgar score <7 at 5 minutes was noted in 8 babies (8.3%) in the reverse breech extraction approach group, and 21 (21.9%) in the standard approach group (p = .015). In conclusion, reverse breech extraction in Caesarean section for obstructed labour is an effective and safe alternative to the standard approach of pushing the fetal head up through the vagina.

  11. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    NASA Astrophysics Data System (ADS)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    Models are useful tools for predicting the impact of global change on species distribution and abundance. As ectotherms, fish are being challenged to adapt or track changes in their environment, either in time through a phenological shift or in space by a biogeographic shift. Past modelling efforts have largely been based on correlative Species Distribution Models, which use known occurrences of species across landscapes of interest to define sets of conditions under which species are likely to maintain populations. The practical advantages of this correlative approach are its simplicity and the flexibility in terms of data requirements. However, effective conservation management requires models that make projections beyond the range of available data. One way to deal with such an extrapolation is to use a mechanistic approach based on physiological processes underlying climate change effects on organisms. Here we illustrate two approaches for developing physiology-based models to characterize fish habitat suitability. (i) Aerobic Scope Models (ASM) are based on the relationship between environmental factors and aerobic scope (defined as the difference between maximum and standard (basal) metabolism). This approach is based on experimental data collected by using a number of treatments that allow a function to be derived to predict aerobic metabolic scope from the stressor/environmental factor(s). This function is then integrated with environmental (oceanographic) data of current and future scenarios. For any given species, this approach allows habitat suitability maps to be generated at various spatiotemporal scales. The strength of the ASM approach relies on the estimate of relative performance when comparing, for example, different locations or different species. (ii) Dynamic Energy Budget (DEB) models are based on first principles including the idea that metabolism is organised in the same way within all animals. The (standard) DEB model aims to describe empirical relationships which can be found consistently within physiological data across the animal kingdom. The advantages of the DEB models are that they make use of the generalities found in terms of animal physiology and can therefore be applied to species for which little data or empirical observations are available. In addition, the limitations as well as useful potential refinements of these and other physiology-based modelling approaches are discussed. Inclusion of the physiological response of various life stages and modelling the patterns of extreme events observed in nature are suggested for future work.

  12. A Novel Approach to Simulation-Based Education for Veterinary Medical Communication Training Over Eight Consecutive Pre-Clinical Quarters.

    PubMed

    Englar, Ryane E

    Experiential learning through the use of standardized patients (SPs) is the primary way by which human medical schools teach clinical communication. The profession of veterinary medicine has followed suit in response to new graduates' and their employers' concerns that veterinary interpersonal skills are weak and unsatisfactory. As a result, standardized clients (SCs) are increasingly relied upon as invaluable teaching tools within veterinary curricula to advance relationship-centered care in the context of a clinical scenario. However, there is little to no uniformity in the approach that various colleges of veterinary medicine take when designing simulation-based education (SBE). A further complication is that programs with pre-conceived curricula must now make room for training in clinical communication. Curricular time constraints challenge veterinary colleges to individually decide how best to utilize SCs in what time is available. Because it is a new program, Midwestern University College of Veterinary Medicine (MWU CVM) has had the flexibility and the freedom to prioritize an innovative approach to SBE. The author discusses the SBE that is currently underway at MWU CVM, which incorporates 27 standardized client encounters over eight consecutive pre-clinical quarters. Prior to entering clinical rotations, MWU CVM students are exposed to a variety of simulation formats, species, clients, settings, presenting complaints, and communication tasks. These represent key learning opportunities for students to practice clinical communication, develop self-awareness, and strategize their approach to future clinical experiences.

  13. Developing accreditation for community based surgery: the Irish experience.

    PubMed

    Ní Riain, Ailís; Collins, Claire; O'Sullivan, Tony

    2018-02-05

    Purpose Carrying out minor surgery procedures in the primary care setting is popular with patients, cost effective and delivers at least as good outcomes as those performed in the hospital setting. This paper aims to describe the central role of clinical leadership in developing an accreditation system for general practitioners (GPs) undertaking community-based surgery in the Irish national setting where no mandatory accreditation process currently exists. Design/methodology/approach In all, 24 GPs were recruited to the GP network. Ten pilot standards were developed addressing GPs' experience and training, clinical activity and practice supporting infrastructure and tested, using information and document review, prospective collection of clinical data and a practice inspection visit. Two additional components were incorporated into the project (patient satisfaction survey and self-audit). A multi-modal evaluation was undertaken. A majority of GPs was included at all stages of the project, in line with the principles of action learning. The steering group had a majority of GPs with relevant expertise and representation of all other actors in the minor surgery arena. The GP research network contributed to each stage of the project. The project lead was a GP with minor surgery experience. Quantitative data collected were analysed using Predictive Analytic SoftWare. Krueger's framework analysis approach was used to analyse the qualitative data. Findings A total of 9 GPs achieved all standards at initial review, 14 successfully completed corrective actions and 1 GP did not achieve the required standard. Standards were then amended to reflect findings and a supporting framework was developed. Originality/value The flexibility of the action-learning approach and the clinical leadership design allowed for the development of robust quality standards in a short timeframe.

  14. Evaluation of online carbon isotope dilution mass spectrometry for the purity assessment of synthetic peptide standards.

    PubMed

    Cueto Díaz, Sergio; Ruiz Encinar, Jorge; García Alonso, J Ignacio

    2014-09-24

    We present a novel method for the purity assessment of peptide standards which is applicable to any water-soluble peptide. The method is based on the online (13)C isotope dilution approach in which the peptide is separated from its related impurities by liquid chromatography (LC) and the eluent is mixed post-column with a continuous flow of (13)C-enriched sodium bicarbonate. An online oxidation step using sodium persulfate in acidic media at 99°C provides quantitative oxidation to (12)CO2 and (13)CO2, respectively, which are extracted to a gaseous phase with the help of a gas-permeable membrane. The measurement of the isotope ratio 44/45 in the mass spectrometer allows the construction of the mass flow chromatogram. As the only species that is finally measured in the mass spectrometer is CO2, the peptide content in the standard can be quantified, on the basis of its carbon content, using a generic primary standard such as potassium hydrogen phthalate. The approach was validated by the analysis of a reference material (NIST 8327), and applied to the quantification of two commercial synthetic peptide standards. In that case, the results obtained were compared with those obtained using alternative methods, such as amino acid analysis and ICP-MS. The results proved the value of the method for the fast, accurate and precise mass purity assignment of synthetic peptide standards. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Error-Based Design Space Windowing

    NASA Technical Reports Server (NTRS)

    Papila, Melih; Papila, Nilay U.; Shyy, Wei; Haftka, Raphael T.; Fitz-Coy, Norman

    2002-01-01

    Windowing of design space is considered in order to reduce the bias errors due to low-order polynomial response surfaces (RS). Standard design space windowing (DSW) uses a region of interest by setting a requirement on response level and checks it by a global RS predictions over the design space. This approach, however, is vulnerable since RS modeling errors may lead to the wrong region to zoom on. The approach is modified by introducing an eigenvalue error measure based on point-to-point mean squared error criterion. Two examples are presented to demonstrate the benefit of the error-based DSW.
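
    As a rough illustration of the windowing idea, not the authors' eigenvalue-based criterion, the sketch below fits a low-order (quadratic) polynomial response surface to sampled design points by least squares, reports the point-to-point mean squared error, and keeps only the sub-region of the design space where the predicted response meets a requirement. The requirement level, design ranges, and synthetic data are assumptions for the example.

```python
import numpy as np

# Hypothetical design points (x1, x2) and noisy observed responses
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 1] + 0.1 * rng.standard_normal(40)

def quad_features(X):
    """Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

# Least-squares fit of the low-order response surface
A = quad_features(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Point-to-point errors of the fitted surface at the design points
mse = np.mean((y - A @ coef) ** 2)

# Window the design space: keep the grid region where the predicted
# response exceeds an assumed requirement level
grid = np.array([[a, b] for a in np.linspace(-1, 1, 21) for b in np.linspace(-1, 1, 21)])
pred = quad_features(grid) @ coef
window = grid[pred >= 2.0]          # assumed requirement: response >= 2.0
print(f"MSE = {mse:.3f}, windowed points = {len(window)} of {len(grid)}")
```

    The abstract's warning applies directly here: if the fitted surface is badly biased, the thresholded window can land on the wrong region, which motivates checking the modeling error before zooming in.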

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendell, Mark J.; Fisk, William J.

    Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each specific outcome threshold are estimated; and the highest of these MVRs, which would then meet all outcome thresholds, is selected as the target MVR. In a second step, implemented only if the target MVR from step 1 is judged impractically high, costs and benefits are estimated and this information is used in a risk management process. Four human outcomes with substantial quantitative evidence of relationships to VRs are identified for initial consideration in setting MVR standards. These are: building-related symptoms (sometimes called sick building syndrome symptoms), poor perceived indoor air quality, and diminished work performance, all with data relating them directly to VRs; and cancer and non-cancer chronic outcomes, related indirectly to VRs through specific VR-influenced indoor contaminants. In an application of step 1 for offices using a set of example outcome thresholds, a target MVR of 9 L/s (19 cfm) per person was needed. Because this target MVR was close to MVRs in current standards, use of a cost/benefit process seemed unnecessary. Selection of more stringent thresholds for one or more human outcomes, however, could raise the target MVR to 14 L/s (30 cfm) per person or higher, triggering the step 2 risk management process. Consideration of outdoor air pollutant effects would add further complexity to the framework. For balancing the objective and subjective factors involved in setting MVRs in a cost-benefit process, it is suggested that a diverse group of stakeholders make the determination after assembling as much quantitative data as possible.
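
    Step 1 of the proposed framework reduces to a simple rule: estimate, for each outcome, the ventilation rate needed to keep that outcome within its threshold, and take the maximum. The sketch below encodes that rule with entirely hypothetical outcome names and per-outcome MVR estimates; the report's actual dose-response functions and thresholds are not reproduced here.

```python
# Hypothetical per-outcome minimum ventilation rates (L/s per person) needed to keep
# each adverse outcome within its chosen threshold relative to the reference
# ventilation rate. Values are placeholders, not taken from the report.
required_mvr = {
    "building_related_symptoms": 8.5,
    "perceived_air_quality": 7.0,
    "work_performance": 9.0,
    "chronic_health_via_contaminants": 6.5,
}

# Step 1: the target MVR must satisfy every outcome threshold simultaneously,
# so it is the maximum of the per-outcome requirements.
target_mvr = max(required_mvr.values())
binding_outcome = max(required_mvr, key=required_mvr.get)
print(f"Target MVR = {target_mvr} L/s per person (set by {binding_outcome})")

# Step 2 (only if the target is judged impractically high): feed costs and
# benefits into a risk-management / stakeholder process instead.
PRACTICAL_LIMIT = 14.0  # assumed practicality cutoff, L/s per person
if target_mvr > PRACTICAL_LIMIT:
    print("Trigger the cost-benefit / risk-management step")
```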

  17. In the Face of Cybersecurity: How the Common Information Model Can Be Used

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skare, Paul; Falk, Herbert; Rice, Mark

    2016-01-01

    Efforts are underway to combine smart grid information, devices, networking, and emergency response information to create messages that are not dependent on specific standards development organizations (SDOs). This supports a future-proof approach of allowing changes in the canonical data models (CDMs) going forward without having to perform forklift replacements of solutions that use the messages. This also allows end users (electric utilities) to upgrade individual components of a larger system while keeping the message payload definitions intact. The goal is to enable public and private information sharing securely in a standards-based approach that can be integrated into existing operations. We provide an example architecture that could benefit from this multi-SDO, secure message approach. This article also describes how to improve message security.

  18. IT Security Standards and Legal Metrology - Transfer and Validation

    NASA Astrophysics Data System (ADS)

    Thiel, F.; Hartmann, V.; Grottker, U.; Richter, D.

    2014-08-01

    Legal Metrology's requirements can be transferred into the IT security domain applying a generic set of standardized rules provided by the Common Criteria (ISO/IEC 15408). We will outline the transfer and cross validation of such an approach. As an example serves the integration of Legal Metrology's requirements into a recently developed Common Criteria based Protection Profile for a Smart Meter Gateway designed under the leadership of the Germany's Federal Office for Information Security. The requirements on utility meters laid down in the Measuring Instruments Directive (MID) are incorporated. A verification approach to check for meeting Legal Metrology's requirements by their interpretation through Common Criteria's generic requirements is also presented.

  19. The sexual double standard in African American adolescent women's sexual risk reduction socialization.

    PubMed

    Fasula, Amy M; Miller, Kim S; Wiener, Jeffrey

    2007-01-01

    This study explored the sexual double standard (SDS) (in which males are afforded more freedom and power than females in heterosexual interactions) in African American mothers' sexual messages to sons and daughters. We used a convenience sample of 129 African American adolescents, aged 14 to 17 years, and their mothers who reported SDS attitudes. Qualitative analyses revealed gender differences based on an SDS in mothers' sexual risk reduction socialization. Mothers typically took a proactive approach with sons and a neutral or prohibitive approach with daughters. Findings provide directions for socially relevant programs for African American parents, schools, and communities.

  20. The fallopian canal: a comprehensive review and proposal of a new classification.

    PubMed

    Mortazavi, M M; Latif, B; Verma, K; Adeeb, N; Deep, A; Griessenauer, C J; Tubbs, R S; Fukushima, T

    2014-03-01

    The facial nerve follows a complex course through the skull base. Understanding its anatomy is crucial during standard skull base approaches and resection of certain skull base tumors closely related to the nerve, especially, tumors at the cerebellopontine angle. Herein, we review the fallopian canal and its implications in surgical approaches to the skull base. Furthermore, we suggest a new classification. Based on the anatomy and literature, we propose that the meatal segment of the facial nerve be included as a component of the fallopian canal. A comprehensive knowledge of the course of the facial nerve is important to those who treat patients with pathology of or near this cranial nerve.

  1. Affecting Student Perception and Performance by Grading with 10,000 Points

    ERIC Educational Resources Information Center

    Peterson, Claudette M.; Peterson, Tim O.

    2016-01-01

    As professors, we each have our own approach to grading which allows us to assess learning and provide useful feedback to our students, yet is not too onerous. This article explains one approach we have used that differs from standard grading scales we often hear about from our colleagues. Rather than being based on 100 points or 100% over the…

  2. The Elementary School Success Profile Model of Assessment and Prevention: Balancing Effective Practice Standards and Feasibility

    ERIC Educational Resources Information Center

    Bowen, Natasha K.; Powers, Joelle D.

    2011-01-01

    Evidence-based practice and data-driven decision making (DDDM) are two approaches to accountability that have been promoted in the school literature. In spite of the push to promote these approaches in schools, barriers to their widespread, appropriate, and effective use have limited their impact on practice and student outcomes. This article…

  3. Arabic Information Retrieval at UMass in TREC-10

    DTIC Science & Technology

    2006-01-01

    electronic bilingual dictionaries, and stemmers, and our unfamiliarity with Arabic, we had our hands full carrying out some standard approaches to... monolingual and cross-language Arabic retrieval, and did not submit any runs based on novel approaches. We submitted three monolingual runs and one... dictionary construction, expanded Arabic queries, improved estimation and smoothing in language models, and added combination of evidence, increasing

  4. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption-kinetics reaction rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling micro-liter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.
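
    The abstract mentions a mathematical model of the well-mixed reaction but does not give its equations. A minimal, commonly used stand-in is first-order reversible binding of analyte to microsphere-bound capture sites; the sketch below integrates that model with assumed rate constants and concentrations to show how bound signal tracks a step change in analyte. All parameter values and the step-change scenario are assumptions, not the paper's model.

```python
import numpy as np

# Assumed kinetic parameters (illustrative, not from the paper)
k_on = 1e5      # association rate constant, 1/(M*s)
k_off = 1e-3    # dissociation rate constant, 1/s
b_max = 1e-9    # total capture-site concentration on microspheres, M

def analyte(t):
    """Step change in analyte concentration at t = 300 s (assumed scenario)."""
    return 1e-9 if t < 300 else 5e-9   # M

# Explicit Euler integration of d[bound]/dt = k_on*A*(b_max - bound) - k_off*bound
dt = 0.1
t_grid = np.arange(0.0, 600.0, dt)
bound = np.zeros_like(t_grid)
for i in range(1, len(t_grid)):
    a = analyte(t_grid[i - 1])
    dbdt = k_on * a * (b_max - bound[i - 1]) - k_off * bound[i - 1]
    bound[i] = bound[i - 1] + dt * dbdt

print(f"Fraction of sites bound at t=300 s: {bound[int(300/dt)]/b_max:.2f}")
print(f"Fraction of sites bound at t=600 s: {bound[-1]/b_max:.2f}")
```

    In a well-mixed device the effective on-rate is no longer throttled by diffusion to the surface, which is the regime this simple ordinary differential equation assumes.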

  5. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    NASA Astrophysics Data System (ADS)

    Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.

    2012-01-01

    This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, as well as trifolium leaves as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The results obtained with this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements, including Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn, were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.

  6. Assessment of Microphysical Models in the National Combustion Code (NCC) for Aircraft Particulate Emissions: Particle Loss in Sampling Lines

    NASA Technical Reports Server (NTRS)

    Wey, Thomas; Liu, Nan-Suey

    2008-01-01

    This paper at first describes the fluid network approach recently implemented into the National Combustion Code (NCC) for the simulation of transport of aerosols (volatile particles and soot) in the particulate sampling systems. This network-based approach complements the other two approaches already in the NCC, namely, the lower-order temporal approach and the CFD-based approach. The accuracy and the computational costs of these three approaches are then investigated in terms of their application to the prediction of particle losses through sample transmission and distribution lines. Their predictive capabilities are assessed by comparing the computed results with the experimental data. The present work will help establish standard methodologies for measuring the size and concentration of particles in high-temperature, high-velocity jet engine exhaust. Furthermore, the present work also represents the first step of a long term effort of validating physics-based tools for the prediction of aircraft particulate emissions.

  7. Creation of a bovine herpes virus 1 (BoHV-1) quantitative particle standard by transmission electron microscopy and comparison with established standards for use in real-time PCR.

    PubMed

    Hoferer, Marc; Braun, Anne; Sting, Reinhard

    2017-07-01

    Standards are pivotal for pathogen quantification by real-time PCR (qPCR); however, the creation of a complete and universally applicable virus particle standard is challenging. In the present study a procedure based on purification of bovine herpes virus type 1 (BoHV-1) and subsequent quantification by transmission electron microscopy (TEM) is described. Accompanying quantitative quality controls of the TEM preparation procedure using qPCR yielded recovery rates of more than 95% of the BoHV-1 virus particles on the grid used for virus counting, which was attributed to pre-treatment of the grid with 5% bovine albumin. To compare the value of the new virus particle standard for use in qPCR, virus counter based quantification and established pure DNA standards represented by a plasmid and an oligonucleotide were included. It could be shown that the numbers of virus particles, plasmid and oligonucleotide equivalents were within one log10 range determined on the basis of standard curves indicating that different approaches provide comparable quantitative values. However, only virus particles represent a complete, universally applicable quantitative virus standard that meets the high requirements of an RNA and DNA virus gold standard. In contrast, standards based on pure DNA have to be considered as sub-standard due to limited applications. Copyright © 2017 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
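
    The comparison of particle, plasmid, and oligonucleotide standards in this abstract rests on ordinary qPCR standard curves. As a generic reminder of that calculation, not the study's actual data, the sketch below fits Ct versus log10 copy number for a hypothetical dilution series and converts an unknown Ct into a copy number; the Ct values and dilution levels are invented.

```python
import numpy as np

# Hypothetical 10-fold dilution series of a quantified standard
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])
ct     = np.array([15.1, 18.4, 21.8, 25.2, 28.6])   # assumed measured Ct values

# Linear standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0              # amplification efficiency

# Quantify an unknown sample from its Ct using the fitted curve
ct_unknown = 23.0
copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}, "
      f"unknown ≈ {copies_unknown:.2e} copies")
```

    Agreement "within one log10" between standards, as reported in the abstract, would show up here as standard curves whose quantifications of the same sample differ by less than a factor of ten.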

  8. A statistical analysis of energy and power demand for the tractive purposes of an electric vehicle in urban traffic - an analysis of a short and long observation period

    NASA Astrophysics Data System (ADS)

    Slaski, G.; Ohde, B.

    2016-09-01

    The article presents the results of a statistical dispersion analysis of the energy and power demand for tractive purposes of a battery electric vehicle. The authors compare data distributions for different values of average speed in two approaches, namely a short and a long period of observation. The short period of observation (generally around several hundred meters) results from a previously proposed macroscopic energy consumption model based on an average speed per road section. This approach yielded high values of standard deviation and coefficient of variation (the ratio between standard deviation and the mean), around 0.7-1.2. The long period of observation (several kilometers long) is similar in length to standardized speed cycles used in testing a vehicle's energy consumption and available range. The data were analysed to determine the impact of observation length on the variation of energy and power demand. The analysis was based on a simulation of electric power and energy consumption performed with speed-profile data recorded in the Poznan agglomeration.
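
    The dispersion statistics quoted here (standard deviation and coefficient of variation of energy demand, grouped by average speed and observation length) are straightforward to reproduce on recorded drive data. The sketch below shows the grouping logic on synthetic per-segment records; the speed bins, the linear energy model, and the noise level are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-segment records: average speed (km/h) and energy use (kWh/km)
avg_speed = rng.uniform(10, 60, size=500)
energy = 0.12 + 0.002 * avg_speed + 0.05 * rng.standard_normal(500)  # illustrative model

def dispersion_by_speed(avg_speed, energy, bin_edges):
    """Mean, standard deviation and coefficient of variation per average-speed bin."""
    stats = {}
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (avg_speed >= lo) & (avg_speed < hi)
        if mask.sum() < 2:
            continue
        mean = energy[mask].mean()
        std = energy[mask].std(ddof=1)
        stats[(lo, hi)] = {"mean": mean, "std": std, "cv": std / mean}
    return stats

for speed_bin, s in dispersion_by_speed(avg_speed, energy, [10, 20, 30, 40, 50, 60]).items():
    print(speed_bin, f"CV = {s['cv']:.2f}")
```

    Comparing such per-bin coefficients of variation for short segments against those for multi-kilometer windows is exactly the short- versus long-observation contrast the abstract describes.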

  9. Using Clinical Data Standards to Measure Quality: A New Approach.

    PubMed

    D'Amore, John D; Li, Chun; McCrary, Laura; Niloff, Jonathan M; Sittig, Dean F; McCoy, Allison B; Wright, Adam

    2018-04-01

    Value-based payment for care requires the consistent, objective calculation of care quality. Previous initiatives to calculate ambulatory quality measures have relied on billing data or individual electronic health records (EHRs) to calculate and report performance. New methods for quality measure calculation promoted by federal regulations allow qualified clinical data registries to report quality outcomes based on data aggregated across facilities and EHRs using interoperability standards. This research evaluates the use of clinical document interchange standards as the basis for quality measurement. Using data on 1,100 patients from 11 ambulatory care facilities and 5 different EHRs, challenges to quality measurement are identified and addressed for 17 certified quality measures. Iterative solutions were identified for 14 measures that improved patient inclusion and measure calculation accuracy. Findings validate this approach to improving measure accuracy while maintaining measure certification. Organizations that report care quality should be aware of how identified issues affect quality measure selection and calculation. Quality measure authors should consider increasing real-world validation and the consistency of measure logic with respect to issues identified in this research. Schattauer GmbH Stuttgart.

  10. An approach for software-driven and standard-based support of cross-enterprise tumor boards.

    PubMed

    Mangesius, Patrick; Fischer, Bernd; Schabetsberger, Thomas

    2015-01-01

    For tumor boards, the networking of different medical disciplines' expertise continues to gain importance. However, interdisciplinary tumor boards spread across several institutions are rarely supported by information technology tools today. The aim of this paper is to point out an approach for a tumor board management system prototype. For analyzing the requirements, an incremental process was used. The requirements were surveyed using Informal Conversational Interview and documented with Use Case Diagrams defined by the Unified Modeling Language (UML). Analyses of current EHR standards were conducted to evaluate technical requirements. Functional and technical requirements of clinical conference applications were evaluated and documented. In several steps, workflows were derived and application mockups were created. Although there is a vast amount of common understanding concerning how clinical conferences should be conducted and how their workflows should be structured, these are hardly standardized, neither on a functional nor on a technical level. This results in drawbacks for participants and patients. Using modern EHR technologies based on profiles such as IHE Cross Enterprise document sharing (XDS), these deficits could be overcome.

  11. Near-infrared fluorescence image quality test methods for standardized performance evaluation

    NASA Astrophysics Data System (ADS)

    Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua

    2017-03-01

    Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.

  12. Development of the supply chain oriented quality assurance system for aerospace manufacturing SMEs and its implementation perspectives

    NASA Astrophysics Data System (ADS)

    Hussein, Abdullahi; Cheng, Kai

    2016-10-01

    Aerospace manufacturing SMEs continuously face the challenge of managing their supply chain and complying with aerospace manufacturing quality standard requirements, owing to their limited resources and the nature of their business. In this paper, an ERP-system-based approach to quality control and assurance is presented, built on the seamless internal integration of in-process production data and information, thereby enabling suppliers to be managed more effectively and efficiently. The Aerospace Manufacturing Quality Assurance Standard (BS/EN9100) is one of the most recognised and essential protocols for developing industry-operated-and-driven quality assurance systems. The research investigates using the ERP-based system as an enabler to implement the BS/EN9100 quality management system at manufacturing SMEs, along with the associated implementation and application perspectives. An application case study on a manufacturing SME, using an SAP-based implementation, is presented, which helps to further evaluate and validate the approach and the application system development.

  13. Optimizing the early phase development of new analgesics by human pain biomarkers.

    PubMed

    Arendt-Nielsen, Lars; Hoeck, Hans Christian

    2011-11-01

    Human pain biomarkers are based on standardized acute activation of pain pathways/mechanisms and quantitative assessment of the evoked responses. This approach can be applied to healthy volunteers, to pain patients, and before and after pharmacological interventions to help understanding and profile the mode of action (proof-of-concept) of new and existing analgesic compounds. Standardized stimuli of different modalities can be applied to different tissues (multimodal and multi-tissue) for profiling analgesic compounds with respect to modulation of pain transduction, transmission, specific mechanisms and processing. This approach substantiates which specific compounds may work in particular clinical pain conditions. Human pain biomarkers can be translational and may bridge animal findings in clinical pain conditions, which in turn can provide new possibilities for designing more successful clinical trials. Biomarker based proof-of-concept drug studies in either volunteers or selected patient populations provide inexpensive, fast and reliable mechanism-based information about dose-efficacy relationships. This is important information in the early drug development phase and for designing large expensive clinical trials.

  14. Alternative calculations of individual patient time in therapeutic range while taking warfarin: results from the ROCKET AF trial.

    PubMed

    Singer, Daniel E; Hellkamp, Anne S; Yuan, Zhong; Lokhnygina, Yuliya; Patel, Manesh R; Piccini, Jonathan P; Hankey, Graeme J; Breithardt, Günter; Halperin, Jonathan L; Becker, Richard C; Hacke, Werner; Nessel, Christopher C; Mahaffey, Kenneth W; Fox, Keith A A; Califf, Robert M

    2015-03-03

    In the ROCKET AF (Rivaroxaban-Once-daily, oral, direct Factor Xa inhibition Compared with vitamin K antagonism for prevention of stroke and Embolism Trial in Atrial Fibrillation) trial, marked regional differences in control of warfarin anticoagulation, measured as the average individual patient time in the therapeutic range (iTTR) of the international normalized ratio (INR), were associated with longer inter-INR test intervals. The standard Rosendaal approach can produce biased low estimates of TTR after an appropriate dose change if the follow-up INR test interval is prolonged. We explored the effect of alternative calculations of TTR that more immediately account for dose changes on regional differences in mean iTTR in the ROCKET AF trial. We used an INR imputation method that accounts for dose change. We compared group mean iTTR values between our dose change-based method with the standard Rosendaal method and determined that the differences between approaches depended on the balance of dose changes that produced in-range INRs ("corrections") versus INRs that were out of range in the opposite direction ("overshoots"). In ROCKET AF, the overall mean iTTR of 55.2% (Rosendaal) increased up to 3.1% by using the dose change-based approach, depending on assumptions. However, large inter-regional differences in anticoagulation control persisted. TTR, the standard measure of control of warfarin anticoagulation, depends on imputing daily INR values for the vast majority of follow-up days. Our TTR calculation method may better reflect the impact of warfarin dose changes than the Rosendaal approach. In the ROCKET AF trial, this dose change-based approach led to a modest increase in overall mean iTTR but did not materially affect the large inter-regional differences previously reported. URL: ClinicalTrials.gov. Unique identifier: NCT00403767. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
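
    For readers unfamiliar with the Rosendaal method referenced in this abstract, the sketch below computes an individual patient's time in therapeutic range by linearly interpolating daily INR values between test dates and counting the fraction of days with INR between 2.0 and 3.0. The dose-change-based modification described in the trial analysis is not reproduced, and the follow-up data are invented.

```python
import numpy as np

def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range via Rosendaal linear interpolation.

    days: INR test days in ascending order (integers); inrs: INR value at each test.
    Returns the fraction of interpolated person-days with low <= INR <= high.
    """
    in_range = 0
    total = 0
    for d0, d1, i0, i1 in zip(days[:-1], days[1:], inrs[:-1], inrs[1:]):
        # Interpolate one INR value per day within the interval [d0, d1)
        daily = np.interp(np.arange(d0, d1), [d0, d1], [i0, i1])
        in_range += np.sum((daily >= low) & (daily <= high))
        total += d1 - d0
    return in_range / total

# Invented follow-up: INR tests on days 0, 14, 42, and 70
days = [0, 14, 42, 70]
inrs = [1.6, 2.4, 3.4, 2.6]
print(f"iTTR = {rosendaal_ttr(days, inrs):.1%}")
```

    The bias discussed in the abstract arises because this interpolation spreads an out-of-range INR across the whole interval even when the dose was corrected immediately after the test; the dose-change-based variant instead imputes the post-change trajectory differently.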

  15. Alternative Calculations of Individual Patient Time in Therapeutic Range While Taking Warfarin: Results From the ROCKET AF Trial

    PubMed Central

    Singer, Daniel E.; Hellkamp, Anne S.; Yuan, Zhong; Lokhnygina, Yuliya; Patel, Manesh R.; Piccini, Jonathan P.; Hankey, Graeme J.; Breithardt, Günter; Halperin, Jonathan L.; Becker, Richard C.; Hacke, Werner; Nessel, Christopher C.; Mahaffey, Kenneth W.; Fox, Keith A. A.; Califf, Robert M.

    2015-01-01

    Background In the ROCKET AF (Rivaroxaban–Once‐daily, oral, direct Factor Xa inhibition Compared with vitamin K antagonism for prevention of stroke and Embolism Trial in Atrial Fibrillation) trial, marked regional differences in control of warfarin anticoagulation, measured as the average individual patient time in the therapeutic range (iTTR) of the international normalized ratio (INR), were associated with longer inter‐INR test intervals. The standard Rosendaal approach can produce biased low estimates of TTR after an appropriate dose change if the follow‐up INR test interval is prolonged. We explored the effect of alternative calculations of TTR that more immediately account for dose changes on regional differences in mean iTTR in the ROCKET AF trial. Methods and Results We used an INR imputation method that accounts for dose change. We compared group mean iTTR values between our dose change–based method with the standard Rosendaal method and determined that the differences between approaches depended on the balance of dose changes that produced in‐range INRs (“corrections”) versus INRs that were out of range in the opposite direction (“overshoots”). In ROCKET AF, the overall mean iTTR of 55.2% (Rosendaal) increased up to 3.1% by using the dose change–based approach, depending on assumptions. However, large inter‐regional differences in anticoagulation control persisted. Conclusions TTR, the standard measure of control of warfarin anticoagulation, depends on imputing daily INR values for the vast majority of follow‐up days. Our TTR calculation method may better reflect the impact of warfarin dose changes than the Rosendaal approach. In the ROCKET AF trial, this dose change–based approach led to a modest increase in overall mean iTTR but did not materially affect the large inter‐regional differences previously reported. Clinical Trial Registration URL: ClinicalTrials.gov. Unique identifier: NCT00403767. PMID:25736441

  16. Pathways--A Case of Large-Scale Implementation of Evidence-Based Practice in Scientific Inquiry-Based Science Education

    ERIC Educational Resources Information Center

    Sotiriou, Sofoklis; Bybee, Rodger W.; Bogner, Franz X.

    2017-01-01

    The fundamental pioneering ideas about student-centered, inquiry-based learning initiatives are differing in Europe and the US. The latter had initiated various top-down schemes that have led to well-defined standards, while in Europe, with its some 50 independent educational systems, a wide variety of approaches has been evolved. In this present…

  17. Parents' Experiences of Applied Behaviour Analysis (ABA)-Based Interventions for Children Diagnosed with Autistic Spectrum Disorder

    ERIC Educational Resources Information Center

    McPhilemy, Catherine; Dillenburger, Karola

    2013-01-01

    Applied behaviour analysis (ABA)-based programmes are endorsed as the gold standard for treatment of children with autistic spectrum disorder (ASD) in most of North America. This is not the case in most of Europe, where instead a non-specified "eclectic" approach is adopted. We explored the social validity of ABA-based interventions with…

  18. Minimal intervention dentistry for early childhood caries and child dental anxiety: a randomized controlled trial.

    PubMed

    Arrow, P; Klobas, E

    2017-06-01

    To compare changes in child dental anxiety after treatment for early childhood caries (ECC) using two treatment approaches. Children with ECC were randomized to test (atraumatic restorative treatment (ART)-based approach) or control (standard care approach) groups. Children aged 3 years or older completed a dental anxiety scale at baseline and follow up. Changes in child dental anxiety from baseline to follow up were tested using the chi-squared statistic, Wilcoxon rank sum test, McNemar's test and multinomial logistic regression. Two hundred and fifty-four children were randomized (N = 127 test, N = 127 control). At baseline, 193 children completed the dental anxiety scale, 211 at follow up and 170 completed the scale on both occasions. Children who were anxious at baseline (11%) were no longer anxious at follow up, and 11% non-anxious children became anxious. Multinomial logistic regression found each increment in the number of visits increased the odds of worsening dental anxiety (odds ratio (OR), 2.2; P < 0.05), whereas each increment in the number of treatments lowered the odds of worsening anxiety (OR, 0.50; P = 0.05). The ART-based approach to managing ECC resulted in similar levels of dental anxiety to the standard treatment approach and provides a valuable alternative approach to the management of ECC in a primary dental care setting. © 2016 Australian Dental Association.

  19. NATIONAL ELECTRONIC DISEASE SURVEILLANCE SYSTEM (NEDSS)

    EPA Science Inventory

    The National Electronic Disease Surveillance System (NEDSS) project is a public health initiative to provide a standard-based, integrated approach to disease surveillance and to connect public health surveillance to the burgeoning clinical information systems infrastructure. NEDS...

  20. Finite element techniques in computational time series analysis of turbulent flows

    NASA Astrophysics Data System (ADS)

    Horenko, I.

    2009-04-01

    In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. The standard filtering approaches (e.g., wavelet-based spectral methods) have in general infeasible numerical complexity in high dimensions, while other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example and the results will be compared with the ones obtained by standard approaches. The importance of accounting for the mathematical assumptions used in the analysis will be pointed out in this example. Finally, applications to the analysis of meteorological and climate data will be presented.
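
    As a rough illustration of the kind of "hidden phase" identification discussed above (and emphatically not the regularized finite-element method itself), the sketch below combines PCA, standing in for EOF-like dimension reduction, with K-means clustering of the reduced coordinates of a toy multidimensional signal; the data and regime structure are invented for the example.

        # Crude illustration (not the FEM-based method of the abstract): reduce a
        # multidimensional signal with PCA (an EOF-like step) and label hidden phases
        # by clustering the reduced coordinates. A real regime-switching data set
        # would replace the toy signal below.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Toy 20-dimensional series with two regimes that differ in their mean state.
        regime = np.repeat([0, 1, 0], [300, 300, 300])
        signal = rng.normal(size=(900, 20)) + np.outer(regime, np.linspace(1, 2, 20))

        reduced = PCA(n_components=3).fit_transform(signal)       # dimension reduction
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)

        # Fraction of time steps whose cluster label matches the true regime
        # (up to label permutation).
        match = max(np.mean(labels == regime), np.mean(labels == 1 - regime))
        print(f"regime recovery: {match:.2f}")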

  1. An improved genetic algorithm for designing optimal temporal patterns of neural stimulation

    NASA Astrophysics Data System (ADS)

    Cassar, Isaac R.; Titus, Nathan D.; Grill, Warren M.

    2017-12-01

    Objective. Electrical neuromodulation therapies typically apply constant-frequency stimulation, but non-regular temporal patterns of stimulation may be more effective and more efficient. However, the design space for temporal patterns is exceedingly large, and model-based optimization is required for pattern design. We designed and implemented a modified genetic algorithm (GA) intended to design optimal temporal patterns of electrical neuromodulation. Approach. We tested and modified standard GA methods for application to designing temporal patterns of neural stimulation. We evaluated each modification individually and all modifications collectively by comparing performance to the standard GA across three test functions and two biophysically-based models of neural stimulation. Main results. The proposed modifications of the GA significantly improved performance across the test functions and performed best when all were used collectively. The standard GA found patterns that outperformed fixed-frequency, clinically-standard patterns in biophysically-based models of neural stimulation, but the modified GA, in many fewer iterations, consistently converged to higher-scoring, non-regular patterns of stimulation. Significance. The proposed improvements to standard GA methodology reduced the number of iterations required for convergence and identified superior solutions.
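
    For readers unfamiliar with the baseline being modified, the following is a minimal sketch of a standard GA over binary temporal stimulation patterns; the fitness function, population size, and operators are toy placeholders rather than the article's biophysically-based models or its proposed modifications.

        # Minimal standard GA over binary temporal stimulation patterns (1 = pulse in
        # that time bin). The fitness below is a toy placeholder, not the
        # biophysically-based models used in the article.
        import random

        random.seed(1)
        PATTERN_LEN, POP, GENERATIONS = 40, 30, 60

        def fitness(pattern):
            # Toy objective: reward a target number of pulses and irregular spacing.
            pulses = sum(pattern)
            transitions = sum(a != b for a, b in zip(pattern, pattern[1:]))
            return -abs(pulses - 10) + 0.5 * transitions

        def mutate(pattern, rate=0.05):
            return [1 - g if random.random() < rate else g for g in pattern]

        def crossover(a, b):
            cut = random.randrange(1, PATTERN_LEN)
            return a[:cut] + b[cut:]

        population = [[random.randint(0, 1) for _ in range(PATTERN_LEN)] for _ in range(POP)]
        for _ in range(GENERATIONS):
            population.sort(key=fitness, reverse=True)
            parents = population[: POP // 2]                      # truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP - len(parents))]
            population = parents + children

        best = max(population, key=fitness)
        print("best fitness:", fitness(best))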

  2. Medical Paraclinical Standards, Political Economy of Clinic, and Patients’ Clinical Dependency; A Critical Conversation Analysis of Clinical Counseling in South of Iran

    PubMed Central

    Kalateh Sadati, Ahmad; Iman, Mohammad Taghi; Bagheri Lankarani, Kamran

    2014-01-01

    Background: Despite its benefits and importance, clinical counseling affects the patient both psychologically and socially. Illness labeling not only leads to many problems for the patient and his/her family but also imposes high costs on the health care system. Among various factors, the doctor-patient relationship has an important role in clinical counseling and its medical approach. The goal of this study is to evaluate the nature of clinical counseling based on a critical approach. Methods: The context of the research is the second major medical training center in Shiraz, Iran. In this study, Critical Conversation Analysis was used, based on the methodologies of critical theories. Of about 50 consultation meetings digitally recorded, 33 were selected for this study. Results: Results show that the nature of the doctor-patient relationship in these cases is based on a paternalistic model. On the other hand, in all consultations, the important values that were legitimated by physicians were medical paraclinical standards. Paternalism on the one hand and standardization on the other lead to dependency of patients on the clinic. Conclusion: Although the paraclinical standards cannot be disregarded, clinical counseling and the doctor-patient relationship need to reduce their dominance by grounding counseling in the interpretation of human relations, paying attention to social and economic differences among people as well as biosocial and biocultural differences, and focusing on clinical examinations. Also, we need to accept that medicine is an art of interaction that cannot be reduced to instrumental and linear methods of body treatment. PMID:25349858

  3. Examining database persistence of ISO/EN 13606 standardized electronic health record extracts: relational vs. NoSQL approaches.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2017-08-18

    The objective of this research is to compare relational and non-relational (NoSQL) database system approaches for storing, recovering, querying and persisting standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes were created in order to evaluate and compare the response times (algorithmic complexity) of six queries of growing complexity, which were performed on them. Similar appropriate results available in the literature were also considered. Relational and non-relational NoSQL database systems show almost linear algorithmic complexity in query execution. However, they show very different linear slopes, the former being much steeper than the latter two. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem to be more appropriate than standard relational SQL databases when the database size is extremely high (secondary use, research applications). Document-based NoSQL databases perform in general better than native XML NoSQL databases. Visualization and editing of EHR extracts are also document-based tasks better suited to NoSQL database systems. However, the appropriate database solution depends greatly on the particular situation and specific problem.

  4. Should the Standard Count Be Excluded from Neutron Probe Calibration?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Z. Fred

    About 6 decades after its introduction, the neutron probe remains one of the most accurate methods for indirect measurement of soil moisture content. Traditionally, the calibration of a neutron probe involves the ratio of the neutron count in the soil to a standard count, which is the neutron count in a fixed environment such as the probe shield or a specially-designed calibration tank. The drawback of this count-ratio-based calibration is that the error in the standard count is carried through to all the measurements. An alternative calibration is to use the neutron counts only, not the ratio, with proper correction for radioactive decay and counting time. To evaluate both approaches, the shield counts of a neutron probe used for three decades were analyzed. The results show that the surrounding conditions have a substantial effect on the standard count. The error in the standard count also impacts the calculation of water storage and could indicate false consistency among replicates. The analysis of the shield counts indicates negligible aging effect of the instrument over a period of 26 years. It is concluded that, by excluding the standard count, the use of the count-based calibration is appropriate and sometimes even better than ratio-based calibration. The count-based calibration is especially useful for historical data when the standard count was questionable or absent.
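
    A count-based (ratio-free) calibration of the kind advocated above can be sketched as a decay and counting-time correction followed by a linear calibration. The half-life below assumes a typical Am-241/Be source, and the calibration coefficients are placeholders, not values from the report.

        # Sketch of count-based (ratio-free) calibration: correct the raw neutron
        # count for radioactive source decay and counting time, then apply a linear
        # calibration. Half-life assumes an Am-241/Be source (~432 yr); the
        # calibration coefficients below are placeholders, not values from the report.
        import math

        HALF_LIFE_YR = 432.2

        def corrected_count(raw_count, count_time_s, years_since_reference):
            decay = math.exp(-math.log(2) * years_since_reference / HALF_LIFE_YR)
            return raw_count / (count_time_s * decay)   # counts per second, decay-corrected

        def moisture_content(raw_count, count_time_s, years, a=-0.05, b=2.0e-4):
            # Linear calibration theta = a + b * corrected count (placeholder a, b).
            return a + b * corrected_count(raw_count, count_time_s, years)

        print(round(moisture_content(raw_count=24000, count_time_s=16, years=26), 3))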

  5. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    PubMed

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
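
    The core nonoverlap component of Tau-U reduces to pairwise comparisons between baseline (A) and intervention (B) scores, as in the sketch below; the full method additionally incorporates intervention-phase trend and the baseline-trend correction mentioned above, which are not reproduced here, and the example scores are invented.

        # Basic nonoverlap component of Tau-U: compare every baseline (A) score with
        # every intervention (B) score. The published method also adds intervention
        # trend and can correct for baseline trend; those terms are omitted here.
        def tau_ab(baseline, intervention):
            pos = sum(b > a for a in baseline for b in intervention)
            neg = sum(b < a for a in baseline for b in intervention)
            return (pos - neg) / (len(baseline) * len(intervention))

        # Example: repeated writing scores before and during treatment.
        print(tau_ab([3, 4, 3, 5], [6, 7, 5, 8, 7]))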

  6. Creating COMFORT: A Communication-Based Model for Breaking Bad News

    ERIC Educational Resources Information Center

    Villagran, Melinda; Goldsmith, Joy; Wittenberg-Lyles, Elaine; Baldwin, Paula

    2010-01-01

    This study builds upon existing protocols for breaking bad news (BBN), and offers an interaction-based approach to communicating comfort to patients and their families. The goal was to analyze medical students' (N = 21) videotaped standardized patient BBN interactions after completing an instructional unit on a commonly used BBN protocol, commonly…

  7. Teaching Middle School Physical Education: A Standards-Based Approach for Grades 5-8. Second Edition.

    ERIC Educational Resources Information Center

    Mohnsen, Bonnie S.

    This book provides a blueprint for developing environment, curriculum, instruction, and assessment based on high quality physical education guidelines. There are 17 chapters in four parts. Part 1, "Prepare for Your Journey," includes (1) "Physical Education in a Changing World"; (2) "Reform Efforts in the Middle…

  8. sTeam--Providing Primary Media Functions for Web-Based Computer-Supported Cooperative Learning.

    ERIC Educational Resources Information Center

    Hampel, Thorsten

    The World Wide Web has developed as the de facto standard for computer-based learning. However, as a server-centered approach, it confines readers and learners to passive nonsequential reading. Authoring and Web-publishing systems aim at supporting the authors' design process. Consequently, learners' activities are confined to selecting and…

  9. The 5E Instructional Model: A Learning Cycle Approach for Inquiry-Based Science Teaching

    ERIC Educational Resources Information Center

    Duran, Lena Ballone; Duran, Emilio

    2004-01-01

    The implementation of inquiry-based teaching is a major theme in national science education reform documents such as "Project 2061: Science for All Americans" (Rutherford & Alhgren, 1990) and the "National Science Education Standards" (NRC, 1996). These reports argue that inquiry needs to be a central strategy of all…

  10. Temperament Profiles from Infancy to Middle Childhood: Development and Associations with Behavior Problems

    ERIC Educational Resources Information Center

    Janson, Harald; Mathiesen, Kristin S.

    2008-01-01

    The authors applied I-States as Objects Analysis (ISOA), a recently proposed person-oriented analytic approach, to the study of temperament development in 921 Norwegian children from a population-based sample. A 5-profile classification based on cluster analysis of standardized mother reports of activity, sociability, emotionality, and shyness at…

  11. The Need for Consumer Education among Indians.

    ERIC Educational Resources Information Center

    Deloria, P. S.

    Since the standard approach to consumer education is based upon the economic situation of the average American, and since the degree of American Indian reservation poverty is substantially greater than that of other groups, it is clear that there is a need for Indian-oriented consumer education. Based upon a long-established credit system,…

  12. Developing a Common Metadata Model for Competencies Description

    ERIC Educational Resources Information Center

    Sampson, Demetrios; Karampiperis, Pythagoras; Fytros, Demetrios

    2007-01-01

    Competence-based approaches are frequently adopted as the key paradigm in both formal and non-formal education and training. To support the provision of competence-based learning services, it is necessary to be able to maintain a record of an individual's competences in a persistent and standard way. In this paper, we investigate potential issues…

  13. Putting the Focus on Student Engagement: The Benefits of Performance-Based Assessment

    ERIC Educational Resources Information Center

    Barlowe, Avram; Cook, Ann

    2016-01-01

    For more than two decades, the New York Performance Standards Consortium, a coalition of 38 public high schools, has steered clear of high-stakes testing, which superficially assesses student learning. Instead, the consortium's approach relies on performance-based assessments--essays, research papers, science experiments, and high-level mathematical…

  14. Risk-Based School Inspections: Impact of Targeted Inspection Approaches on Dutch Secondary Schools

    ERIC Educational Resources Information Center

    Ehren, Melanie C.; Shackleton, Nichola

    2016-01-01

    In most countries, publicly funded schools are held accountable to one inspectorate and are judged against agreed national standards. Many inspectorates of education have recently moved towards more proportional risk-based inspection models, targeting high-risk schools for visits, while schools with satisfactory student attainment levels are…

  15. [Creation of a medical work station for use in community-based care].

    PubMed

    Mercier, Samuel; Desauty, Fabrice; Lamache, Christophe; Lefort, Hugues

    2017-03-01

    In community-based care, the teams must adapt to the environment and perform a number of technical procedures. Foldable medical equipment has been developed and patented, enabling the care provision to approach hospital standards and improving working conditions in this context. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  16. Services for All: Are Outcome-Based Education and Flexible School Structures the Answer?

    ERIC Educational Resources Information Center

    Smith, Sarah J.

    1995-01-01

    This paper discusses the recent controversy over outcome-based education (OBE), arguing that while OBE may be correct in establishing high standards for student learning, its implementation has tended to establish rigid "assembly line" approaches to teaching. A call is made for more flexible and individualized systems that respond to…

  17. Transforming High Schools: Performance Systems for Powerful Teaching. Policy Brief

    ERIC Educational Resources Information Center

    Haynes, Mariana

    2011-01-01

    This policy brief examines standards-based approaches that hold promise for shaping a common vision of skilled teaching commensurate with the national goal of preparing all students for college and careers. Numerous studies confirm that teachers are the most significant school-based factor in improving student achievement, particularly for the…

  18. Developing a Competency-Based Pan-European Accreditation Framework for Health Promotion

    ERIC Educational Resources Information Center

    Battel-Kirk, Barbara; Van der Zanden, Gerard; Schipperen, Marielle; Contu, Paolo; Gallardo, Carmen; Martinez, Ana; Garcia de Sola, Silvia; Sotgiu, Alessandra; Zaagsma, Miriam; Barry, Margaret M.

    2012-01-01

    Background: The CompHP Pan-European Accreditation Framework for Health Promotion was developed as part of the CompHP Project that aimed to develop competency-based standards and an accreditation system for health promotion practice, education, and training in Europe. Method: A phased, multiple-method approach was employed to facilitate consensus…

  19. Integrating the Demonstration Orientation and Standards-Based Models of Achievement Goal Theory

    ERIC Educational Resources Information Center

    Wynne, Heather Marie

    2014-01-01

    Achievement goal theory and thus, the empirical measures stemming from the research, are currently divided on two conceptual approaches, namely the reason versus aims-based models of achievement goals. The factor structure and predictive utility of goal constructs from the Patterns of Adaptive Learning Strategies (PALS) and the latest two versions…

  20. First-Principles Approach to Model Electrochemical Reactions: Understanding the Fundamental Mechanisms behind Mg Corrosion

    NASA Astrophysics Data System (ADS)

    Surendralal, Sudarsan; Todorova, Mira; Finnis, Michael W.; Neugebauer, Jörg

    2018-06-01

    Combining concepts of semiconductor physics and corrosion science, we develop a novel approach that allows us to perform ab initio calculations under controlled potentiostat conditions for electrochemical systems. The proposed approach can be straightforwardly applied in standard density functional theory codes. To demonstrate the performance and the opportunities opened by this approach, we study the chemical reactions that take place during initial corrosion at the water-Mg interface under anodic polarization. Based on this insight, we derive an atomistic model that explains the origin of the anodic hydrogen evolution.

  1. Security challenges in integration of a PHR-S into a standards based national EHR.

    PubMed

    Mense, Alexander; Hoheiser Pförtner, Franz; Sauermann, Stefan

    2014-01-01

    Health related data provided by patients themselves is expected to play a major role in future healthcare. Data from personal health devices, vaccination records, health diaries or observations of daily living, for instance, is stored in personal health records (PHR) which are maintained by personal health record systems (PHR-S). Combining this information with medical records provided by healthcare providers in electronic health records (EHR) is one of the next steps towards "personal care". Austria is currently setting up a nationwide EHR system that incorporates all healthcare providers and is technically based on international standards (IHE, HL7, OASIS, ...). Given the expected potential of merging PHR and EHR data, it is worthwhile to analyse integration approaches. While an integration requires the coordination of processes, information models and technical architectures, this paper focuses specifically on security issues by evaluating general security requirements for a PHR-S (based on the HL7 PHR-S FM), comparing them with the information security specifications for Austria's national EHR (based on the ISO/IEC 27000 series) and identifying the main challenges as well as possible approaches.

  2. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed Central

    LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case-base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and making visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required but UML seems to be a promising formalism, improving the communication between the developers and users. PMID:9929346

  3. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed

    LeBozec, C; Jaulent, M C; Zapletal, E; Degoulet, P

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case-base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and making visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required but UML seems to be a promising formalism, improving the communication between the developers and users.

  4. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    PubMed

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which ISO standards are used. Although there is evidence of contributions provided by these standards, it is questionable whether their parameters converge toward a possible induction of sustainable development in organizations. This work presents a theoretical study, designed on a structuralist world view with a descriptive and deductive method, which aims to analyze the convergence of management tool parameters in ISO standards. In order to support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) with sustainable development and positioning them according to organization levels (strategic, tactical and operational). The framework was designed based on the Brundtland report concept. The analysis was performed by exploring the generic framework for possible convergence based on the Nadler and Tushman model. The results show that the standards can contribute to a possible induction of sustainable development in organizations, as long as they meet certain minimum conditions related to strategic alignment.

  5. Co-constructing cultural landscapes for disciplinary learning in and out of school: the next generation science standards and learning progressions in action

    NASA Astrophysics Data System (ADS)

    Córdova, Ralph A.; Balcerzak, Phyllis

    2016-12-01

    The authors of this study are teacher-researchers: the first is a university researcher and former third- and fourth-grade teacher, while the second is a university-based science educator. They report findings from a community-based study that Ralph, the first author, and his students conducted across two academic years (2001-2003) in order to illustrate the ways in which the next generation science standards and learning progressions can be appropriated as socially-constructed practices inside and outside of school. The authors argue that what constitutes science learning in school is not a "state of grace" dictated by standards. Rather, becoming a scientist within a community of learners is a cultural phenomenon that teachers and students co-construct, and as such teachers can approach the next generation science standards and learning progressions as opportunities to create intentional, disciplinary practice-based learning communities inside and outside of school.

  6. A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado

    USGS Publications Warehouse

    Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.

    2007-01-01

    Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.

  7. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
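
    To make the idea of semantic-based querying concrete, the sketch below builds a miniature process graph with rdflib and runs a SPARQL query for methods that estimate infiltration. The namespace, class, and property names are invented for illustration and are not the actual HP or WRC ontology vocabularies.

        # Toy illustration of semantic-based querying of a process ontology with
        # rdflib. The namespace, classes, and relations below are invented for the
        # example; they are not the actual HP or WRC ontology vocabularies.
        from rdflib import Graph, Literal, Namespace, RDF, RDFS

        HP = Namespace("http://example.org/hydrologic-process#")
        g = Graph()

        for name in ("Infiltration", "GreenAmpt", "PhilipMethod"):
            g.add((HP[name], RDF.type, HP.HydrologicProcess if name == "Infiltration"
                   else HP.ProcessMethod))
        g.add((HP.GreenAmpt, HP.estimates, HP.Infiltration))
        g.add((HP.PhilipMethod, HP.estimates, HP.Infiltration))
        g.add((HP.GreenAmpt, RDFS.comment, Literal("Sharp wetting-front approximation")))

        query = """
        PREFIX hp: <http://example.org/hydrologic-process#>
        SELECT ?method WHERE { ?method hp:estimates hp:Infiltration . }
        """
        for row in g.query(query):
            print(row.method)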

  8. Electronic book format EPUB and Japanese typography : Mainstream people in Japan are unlikely to win international standardization battles

    NASA Astrophysics Data System (ADS)

    Murata, Makoto

    EPUB3 is an electronic book format based on Web technologies such as HTML and CSS. EPUB3 is internationalized; in particular, it supports Japanese typography. Features such as vertical writing were introduced by first creating CSS Writing Modes and CSS Text at W3C and then creating EPUB3 at IDPF on top of them. On the basis of this standardization experience, common pitfalls for Japanese in international standardization are pointed out and a promising approach is suggested.

  9. Study on the Spatial Resolution of Single and Multiple Coincidences Compton Camera

    NASA Astrophysics Data System (ADS)

    Andreyev, Andriy; Sitek, Arkadiusz; Celler, Anna

    2012-10-01

    In this paper we study the image resolution that can be obtained from the Multiple Coincidences Compton Camera (MCCC). The principle of MCCC is based on the simultaneous acquisition of several gamma-rays emitted in cascade from a single nucleus. Contrary to a standard Compton camera, MCCC can theoretically provide the exact location of a radioactive source (based only on the identification of the intersection point of three cones created by a single decay), without complicated tomographic reconstruction. However, practical implementation of the MCCC approach encounters several problems; for example, low detection sensitivities result in a very low probability of coincident triple gamma-ray detection, which is necessary for the source localization. It is also important to evaluate how the detection uncertainties (finite energy and spatial resolution) influence identification of the intersection of the three cones, and thus the resulting image quality. In this study we investigate how the spatial resolution of images reconstructed using the triple-cone reconstruction (TCR) approach compares to images reconstructed from the same data using a standard iterative method based on single cones. Results show that the FWHM for a point source reconstructed with TCR was 20-30% higher than the one obtained from the standard iterative reconstruction based on the expectation maximization (EM) algorithm and conventional single-cone Compton imaging. Finite energy and spatial resolutions of the MCCC detectors lead to errors in the definition of the conical surfaces ("thick" conical surfaces), which are only amplified in image reconstruction when the intersection of three cones is sought. Our investigations show that, in spite of being conceptually appealing, the identification of the triple-cone intersection constitutes yet another restriction of the multiple coincidence approach, which limits the image resolution that can be obtained with MCCC and the TCR algorithm.

  10. Organizing Community-Based Data Standards: Lessons from Developing a Successful Open Standard in Systems Biology

    NASA Astrophysics Data System (ADS)

    Hucka, M.

    2015-09-01

    In common with many fields, including astronomy, a vast number of software tools for computational modeling and simulation are available today in systems biology. This wealth of resources is a boon to researchers, but it also presents interoperability problems. Despite working with different software tools, researchers want to disseminate their work widely as well as reuse and extend the models of other researchers. This situation led in the year 2000 to an effort to create a tool-independent, machine-readable file format for representing models: SBML, the Systems Biology Markup Language. SBML has since become the de facto standard for its purpose. Its success and general approach has inspired and influenced other community-oriented standardization efforts in systems biology. Open standards are essential for the progress of science in all fields, but it is often difficult for academic researchers to organize successful community-based standards. I draw on personal experiences from the development of SBML and summarize some of the lessons learned, in the hope that this may be useful to other groups seeking to develop open standards in a community-oriented fashion.

  11. Passive PE Sampling in Support of In Situ Remediation of Contaminated Sediments

    DTIC Science & Technology

    2015-08-01

    Acronyms: RPD, relative percent difference; RSD, relative standard deviation; SERDP, Strategic Environmental Research and Development Program; SOPs... sediments from 2 stations, each at 4 PCB spike levels, for four individual congeners was 22 ± 6% relative standard deviation (RSD). Also, comparison of... RSD (Table 3). However, larger congeners (e.g., congeners #153 and 180) whose approach to equilibrium is less certain, based on small fractions of

  12. Analysis of view synthesis prediction architectures in modern coding standards

    NASA Astrophysics Data System (ADS)

    Tian, Dong; Zou, Feng; Lee, Chris; Vetro, Anthony; Sun, Huifang

    2013-09-01

    Depth-based 3D formats are currently being developed as extensions to both AVC and HEVC standards. The availability of depth information facilitates the generation of intermediate views for advanced 3D applications and displays, and also enables more efficient coding of the multiview input data through view synthesis prediction techniques. This paper outlines several approaches that have been explored to realize view synthesis prediction in modern video coding standards such as AVC and HEVC. The benefits and drawbacks of various architectures are analyzed in terms of performance, complexity, and other design considerations. It is hence concluded that block-based VSP prediction for multiview video signals provides attractive coding gains with comparable complexity as traditional motion/disparity compensation.

  13. Performance metrics for the evaluation of hyperspectral chemical identification systems

    NASA Astrophysics Data System (ADS)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
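
    The basic Dice computation underlying the proposed identification metric can be sketched from a confusion matrix as below; the confusion counts are invented, and the paper's specific partitioning and weighting of the matrix are not reproduced.

        # Per-class Dice index from a confusion matrix (rows = true chemical, columns
        # = identified chemical). The weighting/partitioning used in the paper is not
        # reproduced; this only shows the basic Dice = 2TP / (2TP + FP + FN) step.
        import numpy as np

        confusion = np.array([[40,  3,  2],
                              [ 5, 30, 10],
                              [ 1,  4, 55]])

        tp = np.diag(confusion)
        fp = confusion.sum(axis=0) - tp     # identified as the class but truly another
        fn = confusion.sum(axis=1) - tp     # truly the class but identified as another
        dice = 2 * tp / (2 * tp + fp + fn)
        print(np.round(dice, 3))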

  14. Development and implementation of a competency-based clinical evaluation tool for midwifery education.

    PubMed

    Woeber, Kate

    2018-07-01

    The learning goals and evaluation strategies of competency-based midwifery programs must be explicit and well-defined. In the US, didactic learning is evaluated through a standardized certification examination, but standardized clinical competence evaluation is lacking. The Midwifery Competency Assessment Tool (MCAT) has been adapted from the International Confederation of Midwives' (ICM) "Essential Competencies" and from the American College of Nurse-Midwives' (ACNM) "Core Competencies", with student self-evaluation based on Benner's Novice-to-Expert theory. The MCAT allows for the measurement and monitoring of competence development in all domains of full-scope practice over the course of the midwifery program. Strengths of the MCAT are that it provides clear learning goals and performance evaluations for students, ensures and communicates content mapping across a curriculum, and highlights strengths and gaps in clinical opportunities at individual clinical sites and for entire programs. Challenges of the MCAT lie in balancing the number of competency items to be measured with the tedium of form completion, in ensuring the accuracy of student self-evaluation, and in determining "adequate" competence achievement when particular clinical opportunities are limited. Use of the MCAT with competency-based clinical education may facilitate a more standardized approach to clinical evaluation, as well as a more strategic approach to clinical site development and use. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Implementation methodology for interoperable personal health devices with low-voltage low-power constraints.

    PubMed

    Martinez-Espronceda, Miguel; Martinez, Ignacio; Serrano, Luis; Led, Santiago; Trigo, Jesús Daniel; Marzo, Asier; Escayola, Javier; Garcia, José

    2011-05-01

    Traditionally, e-Health solutions were located at the point of care (PoC), while the new ubiquitous user-centered paradigm draws on standard-based personal health devices (PHDs). Such devices place strict constraints on computation and battery efficiency, which encouraged the International Organization for Standardization/IEEE 11073 (X73) standard for medical devices to evolve from X73PoC to X73PHD. In this context, low-voltage low-power (LV-LP) technologies meet the restrictions of X73PHD-compliant devices. Since X73PHD does not address the software architecture, the accomplishment of an efficient design falls directly on the software developer. Therefore, the computational and battery performance of such LV-LP-constrained devices can be further improved through an efficient X73PHD implementation design. In this context, this paper proposes a new methodology to implement X73PHD on microcontroller-based platforms with LV-LP constraints. The implementation methodology has been developed through a patterns-based approach and applied to a number of X73PHD-compliant agents (including weighing scale, blood pressure monitor, and thermometer specializations) and microprocessor architectures (8, 16, and 32 bits) as a proof of concept. As a reference, the results obtained for the weighing scale guarantee all features of X73PHD running over a microcontroller architecture based on the ARM7TDMI, requiring only 168 B of RAM and 2546 B of flash memory.

  16. From agricultural fields to urban asphalt: the role of worker education to promote California's heat illness prevention standard.

    PubMed

    Riley, Kevin; Delp, Linda; Cornelio, Deogracia; Jacobs, Sarah

    2012-01-01

    This article describes an innovative approach to reach and educate workers and worker advocates about California's outdoor heat illness prevention standard. In 2010, Cal/OSHA initiated a statewide education campaign to reduce heat-related illnesses and fatalities and increase awareness of the standard's requirements. In Southern California, the UCLA Labor Occupational Safety and Health Program (LOSH) focused on three principal strategies of community-based outreach, popular education, and organizational capacity building. Central to the LOSH approach was the integration of health promotores into core program planning and training activities and the expansion of campaign activities to a wide variety of rural and urban workers. We describe each of these strategies and analyze the possibilities and constraints of worker education to support implementation of this standard, particularly given the vulnerabilities of the impacted workforce, the often precarious nature of employment arrangements for these workers, and the resource limitations of Cal/OSHA.

  17. Enhanced definition and required examples of common datum imposed by ISO standard

    NASA Astrophysics Data System (ADS)

    Yan, Yiqing; Bohn, Martin

    2017-12-01

    According to the ISO Geometrical Product Specifications (GPS), the establishment and definition of common datums for geometrical components are not fully specified. There are two main limitations of this standard: firstly, the explications of the ISO examples of common datums do not match their corresponding definitions, and secondly, a full definition of common datum is missing. This paper suggests a new approach for an enhanced definition and concrete examples of common datums and proposes a holistic methodology for the establishment of a common datum for each geometrical component. This research is based on the analysis of the physical behaviour of geometrical components, orientation constraints and invariance classes of datums. This approach fills the definition gaps of common datums based on ISO GPS, thereby eliminating those deficits. As a result, an improved methodology for a fully functional definition of common datum was formulated.

  18. Age-structured mark-recapture analysis: A virtual-population-analysis-based model for analyzing age-structured capture-recapture data

    USGS Publications Warehouse

    Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.

    2006-01-01

    We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. ?? Copyright by the American Fisheries Society 2006.

  19. Robust Mediation Analysis Based on Median Regression

    PubMed Central

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
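
    A minimal sketch of the median-regression mediation idea follows, assuming synthetic data and the statsmodels quantile-regression API: estimate the a path (M on X) and the b path (Y on M, adjusting for X) at the median and form the a*b mediated effect. Inference (e.g., bootstrap confidence intervals) is omitted.

        # Sketch of median-regression mediation: estimate the a path (M ~ X) and the
        # b path (Y ~ M + X) with quantile regression at q = 0.5 and form the a*b
        # mediated effect. Data are synthetic; confidence intervals (e.g., by
        # bootstrap) are omitted for brevity.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        x = rng.normal(size=n)
        m = 0.5 * x + rng.standard_t(df=3, size=n)          # heavy-tailed errors
        y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)

        a_fit = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5)
        b_fit = sm.QuantReg(y, sm.add_constant(np.column_stack([m, x]))).fit(q=0.5)

        a = a_fit.params[1]          # effect of X on M
        b = b_fit.params[1]          # effect of M on Y, adjusting for X
        print(f"mediated effect a*b = {a * b:.3f}")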

  20. A Synthetic Comparator Approach to Local Evaluation of School-Based Substance Use Prevention Programming.

    PubMed

    Hansen, William B; Derzon, James H; Reese, Eric L

    2014-06-01

    We propose a method for creating groups against which outcomes of local pretest-posttest evaluations of evidence-based programs can be judged. This involves assessing pretest markers for new and previously conducted evaluations to identify groups that have high pretest similarity. A database of 802 prior local evaluations provided six summary measures for analysis. The proximity of all groups using these variables is calculated as standardized proximities having values between 0 and 1. Five methods for creating standardized proximities are demonstrated. The approach allows proximity limits to be adjusted to find sufficient numbers of synthetic comparators. Several index cases are examined to assess the numbers of groups available to serve as comparators. Results show that most local evaluations would have sufficient numbers of comparators available for estimating program effects. This method holds promise as a tool for local evaluations to estimate relative effectiveness. © The Author(s) 2012.
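
    One simple way to turn pretest summary measures into standardized proximities in [0, 1] and select comparators above a cutoff is sketched below; the paper demonstrates five different proximity formulations, and the six measures and prior-evaluation values here are synthetic placeholders.

        # One way (of several discussed in the paper) to turn distances on pretest
        # summary measures into standardized proximities in [0, 1] and pick
        # comparators above a cutoff. The six measures and prior-evaluation values
        # here are synthetic placeholders.
        import numpy as np
        from scipy.spatial.distance import cdist

        rng = np.random.default_rng(0)
        prior_evals = rng.normal(size=(802, 6))     # 802 prior local evaluations
        new_eval = rng.normal(size=(1, 6))          # the index case to be matched

        # Standardize each measure, compute Euclidean distances, map to proximities.
        mu, sd = prior_evals.mean(axis=0), prior_evals.std(axis=0)
        z_prior = (prior_evals - mu) / sd
        z_new = (new_eval - mu) / sd
        dist = cdist(z_new, z_prior).ravel()
        proximity = 1 - dist / dist.max()           # 1 = identical, 0 = most distant

        comparators = np.flatnonzero(proximity >= 0.8)
        print(f"{comparators.size} synthetic comparators at proximity >= 0.8")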

  1. Use of the Budyko Framework to Estimate the Virtual Water Content in Shijiazhuang Plain, North China

    NASA Astrophysics Data System (ADS)

    Zhang, E.; Yin, X.

    2017-12-01

    One of the most challenging steps in analyzing the virtual water content (VWC) of agricultural crops is how to properly assess the volume of consumptive water use (CWU) for crop production. In practice, CWU is considered equivalent to the crop evapotranspiration (ETc). Following the crop coefficient method, ETc can be calculated under standard or non-standard conditions by multiplying the reference evapotranspiration (ET0) by one or a few coefficients. However, when actual crop growing conditions deviate from standard conditions, accurately determining the coefficients under non-standard conditions remains a complicated process and requires a large amount of field experimental data. Based on the regional surface water-energy balance, this research integrates the Budyko framework into the traditional crop coefficient approach to simplify the determination of the coefficients. This new method enables us to assess the volume of agricultural VWC based only on hydrometeorological data and agricultural statistics at the regional scale. To demonstrate the new method, we apply it to the Shijiazhuang Plain, an agricultural irrigation area in the North China Plain. The VWC of winter wheat and summer maize is calculated and further subdivided into blue and green water components. Compared with previous studies in this study area, the VWC calculated by the Budyko-based crop coefficient approach uses less data and agrees well with some of the previous research. This shows that the new method may serve as a more convenient tool for assessing VWC.
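
    As a hedged illustration of the Budyko step, the sketch below uses Fu's equation (one common Budyko-type curve) to estimate actual evapotranspiration from precipitation and reference ET and converts it into a virtual water content per unit yield; the omega parameter and all input values are placeholders rather than data for the Shijiazhuang Plain.

        # Sketch of a Budyko-type estimate of crop consumptive water use and virtual
        # water content. Fu's equation is used as one common Budyko curve; the omega
        # parameter, precipitation, reference ET, and yield values are illustrative
        # placeholders, not data for the Shijiazhuang Plain.
        def fu_actual_et(precip_mm, et0_mm, omega=2.6):
            phi = et0_mm / precip_mm                         # aridity index
            return precip_mm * (1 + phi - (1 + phi**omega) ** (1 / omega))

        precip, et0 = 480.0, 1050.0          # mm per growing season (placeholders)
        yield_kg_per_ha = 6000.0

        et_actual = fu_actual_et(precip, et0)                # mm
        cwu_m3_per_ha = et_actual * 10.0                     # 1 mm over 1 ha = 10 m3
        vwc_m3_per_kg = cwu_m3_per_ha / yield_kg_per_ha
        print(f"ET = {et_actual:.0f} mm, VWC = {vwc_m3_per_kg:.3f} m3/kg")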

  2. Periodontal Management by Risk Assessment: A Pragmatic Approach.

    PubMed

    Mullins, Joanna M; Even, Joshua B; White, Joel M

    2016-06-01

    An evidence-based periodontal disease risk assessment and diagnosis system has been developed and combined with a clinical decision support and management program to improve treatment and measure patient outcomes. There is little agreement on a universally accepted periodontal risk assessment, periodontal diagnosis, and treatment management tool and their incorporation into dental practice to improve patient care. This article highlights the development and use of a practical periodontal management and risk assessment program that can be implemented in dental settings. The approach taken by Willamette Dental Group to develop a periodontal disease risk assessment, periodontal diagnosis, and treatment management tool is described using evidence-based best practices. With goals of standardized treatment interventions while maintaining personalized care and improved communication, this process is described to facilitate its incorporation into other dental settings. Current electronic health records can be leveraged to enhance patient-centered care through the use of risk assessments and standardized guidelines to more effectively assess, diagnose, and treat patients to improve outcomes. Dental hygienists, and other committed providers, with their emphasis on prevention of periodontal disease can be principal drivers in creation and implementation of periodontal risk assessments and personalized treatment planning. Willamette Dental Group believes that such evidence-based tools can advance dentistry to new diagnostic and treatment standards. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Improving data quality in the linked open data: a survey

    NASA Astrophysics Data System (ADS)

    Hadhiatma, A.

    2018-03-01

    The Linked Open Data (LOD) is “web of data”, a different paradigm from “web of document” commonly used today. However, the huge LOD still suffers from data quality problems such as completeness, consistency, and accuracy. Data quality problems relate to designing effective methods both to manage and to retrieve information at various data quality levels. Based on review from papers and journals, addressing data quality requires some standards functioning to (1) identification of data quality problems, (2) assessment of data quality for a given context, and (3) correction of data quality problems. However, mostly the methods and strategies dealing with the LOD data quality were not as an integrative approach. Hence, based on those standards and an integrative approach, there are opportunities to improve the LOD data quality in the term of incompleteness, inaccuracy and inconsistency, considering to its schema and ontology, namely ontology refinement. Moreover, the term of the ontology refinement means that it copes not only to improve data quality but also to enrich the LOD. Therefore, it needs (1) a standard for data quality assessment and evaluation which is more appropriate to the LOD; (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.

  4. Low-sensitivity H∞ filter design for linear delta operator systems with sampling time jitter

    NASA Astrophysics Data System (ADS)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

    This article is concerned with the problem of designing H∞ filters for a class of linear discrete-time systems with low sensitivity to sampling time jitter via a delta operator approach. The delta-domain model is used to avoid the inherent numerical ill-conditioning resulting from the use of the standard shift-domain model at high sampling rates. Based on the projection lemma in combination with the descriptor system approach often used to solve problems related to delay, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. Then, the problem of designing a low-sensitivity filter can be reduced to a convex optimisation problem. An important consideration in the design of the filters is the optimal trade-off between the standard H∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.
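
    The numerical advantage of the delta-domain parameterization at fast sampling can be seen in a one-line comparison: the shift-form pole of a first-order system crowds toward 1 as the sample period shrinks, while the delta-form parameter stays near its continuous-time counterpart. The sketch below illustrates only this conditioning argument, not the H∞ filter design itself; the time constant and sample periods are arbitrary.

        # Why the delta operator helps at fast sampling: for a first-order system with
        # time constant tau, the shift-form pole exp(-T/tau) crowds toward 1 as the
        # sample period T shrinks, while the delta-form parameter
        # (exp(-T/tau) - 1) / T stays near the continuous-time value -1/tau.
        # This only illustrates conditioning, not the H-infinity filter design.
        import numpy as np

        tau = 0.5                                   # continuous-time constant (s)
        for T in (0.1, 0.01, 0.001):                # increasingly fast sampling
            a_shift = np.exp(-T / tau)
            a_delta = (a_shift - 1.0) / T
            print(f"T={T:>6}: shift pole={a_shift:.6f}  delta param={a_delta:.4f}  "
                  f"(continuous -1/tau={-1/tau:.1f})")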

  5. Variance computations for functional of absolute risk estimates.

    PubMed

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.

  6. Variance computations for functional of absolute risk estimates

    PubMed Central

    Pfeiffer, R.M.; Petracci, E.

    2011-01-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates. PMID:21643476
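
    A toy comparison of the two variance estimates for the simplest possible functional, the sample mean, is sketched below: the influence function of the mean at an observation is the observation minus the mean, and its squared sum over n^2 estimates the variance of the estimator. The paper's absolute-risk functionals are far richer, but the mechanics are analogous; the data here are synthetic.

        # Toy comparison of an influence-function variance estimate with the
        # bootstrap, for the simplest functional (the sample mean). The paper applies
        # the same idea to absolute-risk functionals; only the mechanics are shown.
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.exponential(scale=2.0, size=400)

        # Influence function of the mean at observation x_i is (x_i - mean);
        # Var(estimate) ~ sum(IF_i^2) / n^2.
        infl = x - x.mean()
        var_if = np.sum(infl**2) / len(x) ** 2

        boot = [rng.choice(x, size=len(x), replace=True).mean() for _ in range(2000)]
        var_boot = np.var(boot, ddof=1)

        print(f"influence-function variance: {var_if:.5f}")
        print(f"bootstrap variance:          {var_boot:.5f}")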

  7. Aescin-based topical formulation to prevent foot wounds and ulcerations in diabetic microangiopathy.

    PubMed

    Hu, S; Belcaro, G; Dugall, M; Hosoi, M; Togni, S; Maramaldi, G; Giacomelli, L

    2016-10-01

    Impairment of the peripheral microcirculation in diabetic patients often leads to severe complications in the lower extremities, such as foot infections and ulcerations. In this study, a novel aescin-based formulation has been evaluated as a potential approach to prevent skin breaks and ulcerations by improving the peripheral microcirculation and skin hydration. In this registry study, 63 patients with moderate diabetic microangiopathy were recruited. Informed participants freely decided to follow either a standard management (SM) to prevent diabetic foot diseases (n = 31) or SM associated with topical application of the aescin-based cream (n = 32). Peripheral microcirculatory parameters such as resting skin flux, venoarteriolar response and transcutaneous gas tension were evaluated at inclusion and after 8 weeks. In addition, several skin parameters of the foot area, such as integrity (as number of skin breaks/patient), hydration and content of dead cells, were assessed at the defined observational study periods. Improvements in cutaneous peripheral microcirculation parameters were observed at 8 weeks in both groups; however, a remarkably greater and significant benefit was exerted by the aescin-based cream treatment. In fact, the microcirculatory parameters evaluated improved significantly in the standard management + aescin-based cream group, compared with baseline and with the standard management group. Similar findings were reported for the skin parameters of the foot area. The topical formulation containing aescin could represent a valid approach to manage skin wounds and prevent skin ulcerations in patients affected by moderate diabetic microangiopathy.

  8. 78 FR 29815 - Control of Air Pollution From Motor Vehicles: Tier 3 Motor Vehicle Emission and Fuel Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-21

    ...This action would establish more stringent vehicle emissions standards and reduce the sulfur content of gasoline beginning in 2017, as part of a systems approach to addressing the impacts of motor vehicles and fuels on air quality and public health. The proposed gasoline sulfur standard would make emission control systems more effective for both existing and new vehicles, and would enable more stringent vehicle emissions standards. The proposed vehicle standards would reduce both tailpipe and evaporative emissions from passenger cars, light-duty trucks, medium-duty passenger vehicles, and some heavy-duty vehicles. This would result in significant reductions in pollutants such as ozone, particulate matter, and air toxics across the country and help state and local agencies in their efforts to attain and maintain health-based National Ambient Air Quality Standards. Motor vehicles are an important source of exposure to air pollution both regionally and near roads. These proposed vehicle standards are intended to harmonize with California's Low Emission Vehicle program, thus creating a federal vehicle emissions program that would allow automakers to sell the same vehicles in all 50 states. The proposed vehicle standards would be implemented over the same timeframe as the greenhouse gas/fuel efficiency standards for light-duty vehicles, as part of a comprehensive approach toward regulating emissions from motor vehicles.

  9. Regulatory approaches for addressing dissolved oxygen concerns at hydropower facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Mark J.; Cada, Glenn F.; Sale, Michael J.

    Low dissolved oxygen (DO) concentrations are a common water quality problem downstream of hydropower facilities. At some facilities, structural improvements (e.g., installation of weir dams or aerating turbines) or operational changes (e.g., spilling water over the dam) can be made to improve DO levels. In other cases, structural and operational approaches are too costly for the project to implement or are likely to be of limited effectiveness. Despite improvements in overall water quality below dams in recent years, many hydropower projects are unable to meet state water quality standards for DO. Regulatory agencies in the U.S. are considering or implementing dramatic changes in their approach to protecting the quality of the Nation’s waters. New policies and initiatives have emphasized flexibility, increased collaboration and shared responsibility among all parties, and market-based, economic incentives. The use of new regulatory approaches may now be a viable option for addressing the DO problem at some hydropower facilities. This report summarizes some of the regulatory-related options available to hydropower projects, including negotiation of site-specific water quality criteria, use of biological monitoring, watershed-based strategies for the management of water quality, and watershed-based trading. Key decision points center on the health of the local biological communities and whether there are contributing impacts (i.e., other sources of low DO effluents) in the watershed. If the biological communities downstream of the hydropower project are healthy, negotiation for site-specific water quality standards or biocriteria (discharge performance criteria based on characteristics of the aquatic biota) might be pursued. If there are other effluent dischargers in the watershed that contribute to low DO problems, watershed-scale strategies and effluent trading may be effective. This report examines the value of regulatory approaches by reviewing their use in other contexts.

  10. Quantifying the relative irreplaceability of important bird and biodiversity areas.

    PubMed

    Di Marco, Moreno; Brooks, Thomas; Cuttelod, Annabelle; Fishpool, Lincoln D C; Rondinini, Carlo; Smith, Robert J; Bennun, Leon; Butchart, Stuart H M; Ferrier, Simon; Foppen, Ruud P B; Joppa, Lucas; Juffe-Bignoli, Diego; Knight, Andrew T; Lamoreux, John F; Langhammer, Penny F; May, Ian; Possingham, Hugh P; Visconti, Piero; Watson, James E M; Woodley, Stephen

    2016-04-01

    World governments have committed to increase the global protected areas coverage by 2020, but the effectiveness of this commitment for protecting biodiversity depends on where new protected areas are located. Threshold- and complementarity-based approaches have been independently used to identify important sites for biodiversity. We brought together these approaches by performing a complementarity-based analysis of irreplaceability in important bird and biodiversity areas (IBAs), which are sites identified using a threshold-based approach. We determined whether irreplaceability values are higher inside than outside IBAs and whether any observed difference depends on known characteristics of the IBAs. We focused on 3 regions with comprehensive IBA inventories and bird distribution atlases: Australia, southern Africa, and Europe. Irreplaceability values were significantly higher inside than outside IBAs, although differences were much smaller in Europe than elsewhere. Higher irreplaceability values in IBAs were associated with the presence and number of restricted-range species; number of criteria under which the site was identified; and mean geographic range size of the species for which the site was identified (trigger species). In addition, IBAs were characterized by higher irreplaceability values when using proportional species representation targets, rather than fixed targets. There were broadly comparable results when measuring irreplaceability for trigger species and when considering all bird species, which indicates a good surrogacy effect of the former. Recently, the International Union for Conservation of Nature has convened a consultation to consolidate global standards for the identification of key biodiversity areas (KBAs), building from existing approaches such as IBAs. Our results informed this consultation, and in particular a proposed irreplaceability criterion that will allow the new KBA standard to draw on the strengths of both threshold- and complementarity-based approaches. © 2015 Society for Conservation Biology.

  11. A Deep Denoising Autoencoder Approach to Improving the Intelligibility of Vocoded Speech in Cochlear Implant Simulation.

    PubMed

    Lai, Ying-Hui; Chen, Fei; Wang, Syu-Siang; Lu, Xugang; Tsao, Yu; Lee, Chin-Hui

    2017-07-01

    In a cochlear implant (CI) speech processor, noise reduction (NR) is a critical component for enabling CI users to attain improved speech perception under noisy conditions. Identifying an effective NR approach has long been a key topic in CI research. Recently, a deep denoising autoencoder (DDAE) based NR approach was proposed and shown to be effective in restoring clean speech from noisy observations. It was also shown that DDAE could provide better performance than several existing NR methods in standardized objective evaluations. Following this success with normal speech, this paper further investigated the performance of DDAE-based NR to improve the intelligibility of envelope-based vocoded speech, which simulates speech signal processing in existing CI devices. We compared speech intelligibility between DDAE-based NR and conventional single-microphone NR approaches using the noise vocoder simulation. The results of both objective evaluations and listening tests showed that, under the conditions of nonstationary noise distortion, DDAE-based NR yielded higher intelligibility scores than conventional NR approaches. This study confirmed that DDAE-based NR could potentially be integrated into a CI processor to provide more benefits to CI users under noisy conditions.
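
    As a rough illustration of the denoising-autoencoder idea (a minimal sketch on synthetic feature frames, not the authors' architecture, features, or training data), the following PyTorch snippet trains a small network to map noisy spectral frames back to clean ones.

      # Minimal denoising-autoencoder sketch: learn a mapping from noisy to
      # clean feature frames. Dimensions, layers and data are placeholders.
      import torch
      import torch.nn as nn

      dim = 64                               # e.g. spectral bins per frame
      model = nn.Sequential(
          nn.Linear(dim, 128), nn.ReLU(),
          nn.Linear(128, 128), nn.ReLU(),
          nn.Linear(128, dim),
      )
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)
      loss_fn = nn.MSELoss()

      clean = torch.randn(1024, dim)         # stand-in clean frames
      noisy = clean + 0.3 * torch.randn_like(clean)

      for epoch in range(20):                # reconstruct clean from noisy
          opt.zero_grad()
          loss = loss_fn(model(noisy), clean)
          loss.backward()
          opt.step()

      enhanced = model(noisy[:1])            # denoised frame used for resynthesis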

  12. Improving Measurement in Health Education and Health Behavior Research Using Item Response Modeling: Comparison with the Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wilson, Mark; Allen, Diane D.; Li, Jun Corser

    2006-01-01

    This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…

  13. Changing Health Behaviors to Improve Health Outcomes after Angioplasty: A Randomized Trial of Net Present Value versus Future Value Risk Communication

    ERIC Educational Resources Information Center

    Charlson, M. E.; Peterson, J. C.; Boutin-Foster, C.; Briggs, W. M.; Ogedegbe, G. G.; McCulloch, C. E.; Hollenberg, J.; Wong, C.; Allegrante, J. P.

    2008-01-01

    Patients who have undergone angioplasty experience difficulty modifying at-risk behaviors for subsequent cardiac events. The purpose of this study was to test whether an innovative approach to framing of risk, based on "net present value" economic theory, would be more effective in behavioral intervention than the standard "future value approach"…

  14. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PROBABILISTIC APPROACH FOR CALCULATING INGESTION EXPOSURE FROM DAY 4 COMPOSITE MEASUREMENTS, THE DIRECT METHOD OF EXPOSURE ESTIMATION (IIT-A-15.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate the ingestion exposure using composite food chemical residue values from the day of direct measurements. The calculation is based on the probabilistic approach. This SOP uses data that have been proper...

  15. Tensorial dynamic time warping with articulation index representation for efficient audio-template learning.

    PubMed

    Le, Long N; Jones, Douglas L

    2018-03-01

    Audio classification techniques often depend on the availability of a large labeled training dataset for successful performance. However, in many application domains of audio classification (e.g., wildlife monitoring), obtaining labeled data is still a costly and laborious process. Motivated by this observation, a technique is proposed to efficiently learn a clean template from a few labeled, but likely corrupted (by noise and interferences), data samples. This learning can be done efficiently via tensorial dynamic time warping on the articulation index-based time-frequency representations of audio data. The learned template can then be used in audio classification following the standard template-based approach. Experimental results show that the proposed approach outperforms both (1) the recurrent neural network approach and (2) the state-of-the-art in the template-based approach on a wildlife detection application with few training samples.
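
    For readers unfamiliar with the template-matching step, the following minimal Python sketch implements standard dynamic time warping between two feature sequences (the tensorial variant and the articulation-index representation from the paper are not reproduced; the arrays are placeholders).

      # Minimal sketch of standard DTW between two (time x features) sequences.
      import numpy as np

      def dtw_distance(a, b):
          n, m = len(a), len(b)
          cost = np.full((n + 1, m + 1), np.inf)
          cost[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = np.linalg.norm(a[i - 1] - b[j - 1])   # local frame distance
                  cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                       cost[i, j - 1],      # deletion
                                       cost[i - 1, j - 1])  # match
          return cost[n, m]

      template = np.random.rand(40, 12)   # stand-in learned template
      query = np.random.rand(55, 12)      # stand-in test recording features
      print(dtw_distance(template, query))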

  16. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
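
    The quality-control-chart component can be illustrated with a simple Shewhart-type chart on mean scale scores (a minimal sketch with simulated scores; the model-based adjustments and time-series layers described in the paper are omitted).

      # Minimal sketch: flag administrations whose mean scale score falls
      # outside +/- 3 standard deviations of a baseline set of administrations.
      import numpy as np

      rng = np.random.default_rng(1)
      means = rng.normal(500, 5, size=71)   # stand-in mean score per administration
      means[60] += 20                       # inject an abrupt shift to detect

      baseline = means[:30]                 # administrations assumed in control
      center, sigma = baseline.mean(), baseline.std(ddof=1)
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      flags = np.where((means > ucl) | (means < lcl))[0]
      print("out-of-control administrations:", flags)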

  17. [The establishment, development and application of classification approach of freshwater phytoplankton based on the functional group: a review].

    PubMed

    Yang, Wen; Zhu, Jin-Yong; Lu, Kai-Hong; Wan, Li; Mao, Xiao-Hua

    2014-06-01

    Appropriate schemes for the classification of freshwater phytoplankton are prerequisites and important tools for revealing phytoplankton succession and studying freshwater ecosystems. An alternative approach, the functional group of freshwater phytoplankton, has been proposed and developed in response to the deficiencies of Linnaean and molecular identification in ecological applications. The functional group of phytoplankton is a classification scheme based on autecology. In this review, the theoretical basis and classification criteria of the functional group (FG), morpho-functional group (MFG) and morphology-based functional group (MBFG) schemes were summarized, together with their merits and demerits. FG was considered the optimal classification approach for aquatic ecology research and aquatic environment evaluation. The application status of FG was introduced, and the evaluation standards and limitations of two FG-based water quality assessment approaches, the Q and QR index methods, were briefly discussed.

  18. American National Standards: The Consensus Process

    NASA Technical Reports Server (NTRS)

    Schafer, Thom

    2000-01-01

    Since the early 20th Century, technical and professional societies have developed standards within their areas of expertise addressing aspects of their industries which they feel would benefit from a degree of standardization. From the beginning, the use of these standards was strictly voluntary. It did not take jurisdictional authorities long, however, to recognize that application of these voluntary standards enhanced public safety, as well as leveling the playing field in trade. Hence, laws were passed mandating their use. Purchasers of goods and services also recognized the advantages of standardization, and began requiring the use of standards in their procurement contracts. But how do jurisdictions and purchasers know that the standard they are mandating is a broad-based industry standard, or a narrowly focused set of rules which only apply to one company or institution, thereby giving them an unfair advantage? The answer is "consensus", and a unified approach in achieving it.

  19. Learning to Mean in Spanish Writing: A Case Study of a Genre-Based Pedagogy for Standards-Based Writing Instruction

    ERIC Educational Resources Information Center

    Troyan, Francis J.

    2016-01-01

    This case study reports the results of a genre-based approach, which was used to explicitly teach the touristic landmark description to fourth-grade students of Spanish as a foreign language. The instructional model and unit of instruction were informed by the pedagogies of the Sydney School of Linguistics and an instructional model for…

  20. Competency Based Training. How To Do It--for Trainers. A Guide for Teachers and Trainers on Approaches to Competency Based Training.

    ERIC Educational Resources Information Center

    Worsnop, Percy J.

    This booklet, which is intended for vocational educators/trainers in Australia, explains the principles and techniques of competency-based training (CBT). The following topics are discussed in the first 10 sections: the decision to adopt CBT in Australia; the meaning of competency; teaching and learning to become competent (competency standards as…

  1. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542
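
    One of the processing steps listed above, body-weight SUV normalization, follows a standard formula that can be sketched as below (a generic illustration with hypothetical values, not the study's DICOM pipeline; decay correction and DICOM attribute handling are omitted).

      # Minimal sketch: SUV_bw = activity [Bq/mL] / (injected dose [Bq] / body weight [g]).
      import numpy as np

      def suv_bw(activity_bq_per_ml, injected_dose_bq, body_weight_kg):
          body_weight_g = body_weight_kg * 1000.0
          return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

      voxels = np.array([12000.0, 18000.0, 25000.0])   # hypothetical voxel activities, Bq/mL
      print(suv_bw(voxels, injected_dose_bq=3.7e8, body_weight_kg=75.0))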

  2. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling framework optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model, producing flow time series that are routed along the channel using a two-dimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model removes the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjectivity and uncertainty in standard flood mapping methods. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.

  3. A first approach to a neuropsychological screening tool using eye-tracking for bedside cognitive testing based on the Edinburgh Cognitive and Behavioural ALS Screen.

    PubMed

    Keller, Jürgen; Krimly, Amon; Bauer, Lisa; Schulenburg, Sarah; Böhm, Sarah; Aho-Özhan, Helena E A; Uttner, Ingo; Gorges, Martin; Kassubek, Jan; Pinkhardt, Elmar H; Abrahams, Sharon; Ludolph, Albert C; Lulé, Dorothée

    2017-08-01

    Reliable assessment of cognitive functions is a challenging task in amyotrophic lateral sclerosis (ALS) patients unable to speak and write. We therefore present an eye-tracking based neuropsychological screening tool based on the Edinburgh Cognitive and Behavioural ALS Screen (ECAS), a standard screening tool for cognitive deficits in ALS. In total, 46 ALS patients and 50 healthy controls matched for age, gender and education were tested with an oculomotor based and a standard paper-and-pencil version of the ECAS. Significant correlation between both versions was observed for ALS patients and healthy controls in the ECAS total score and in all of its ALS-specific domains (all r > 0.3; all p < 0.05). The eye-tracking version of the ECAS reliably distinguished between ALS patients and healthy controls in the ECAS total score (p < 0.05). Also, cognitively impaired and non-impaired patients could be reliably distinguished with a specificity of 95%. This study provides first evidence that the eye-tracking based ECAS version is a promising approach for assessing cognitive deficits in ALS patients who are unable to speak or write.

  4. Adaptive Spatial Filter Based on Similarity Indices to Preserve the Neural Information on EEG Signals during On-Line Processing

    PubMed Central

    Villa-Parra, Ana Cecilia; Bastos-Filho, Teodiano; López-Delis, Alberto; Frizera-Neto, Anselmo; Krishnan, Sridhar

    2017-01-01

    This work presents a new on-line adaptive filter, based on a similarity analysis between standard electrode locations, that reduces artifacts and common interferences in electroencephalography (EEG) signals while preserving the useful information. The standard deviation and Concordance Correlation Coefficient (CCC) between target electrodes and their corresponding neighbor electrodes are analyzed on sliding windows to select those neighbors that are highly correlated. Afterwards, a model based on CCC is applied to assign higher weights to those correlated electrodes with lower similarity to the target electrode. The approach was applied to brain-computer interfaces (BCIs) based on Canonical Correlation Analysis (CCA) to recognize 40 targets of steady-state visual evoked potential (SSVEP), providing an accuracy (ACC) of 86.44 ± 2.81%. In addition, also using this approach, low-frequency features were selected in the pre-processing stage of another BCI to recognize gait planning. In this case, recognition was significantly (p<0.01) improved for most of the subjects (ACC≥74.79%) when compared with other BCIs based on Common Spatial Pattern, Filter Bank-Common Spatial Pattern, and Riemannian Geometry. PMID:29186848
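
    The similarity measure at the core of the filter, Lin's Concordance Correlation Coefficient, can be computed as in the minimal Python sketch below (one analysis window with simulated signals; the sliding-window neighbour selection and the CCC-based weighting model of the paper are not reproduced).

      # Minimal sketch: Lin's CCC between a target channel and a neighbour.
      import numpy as np

      def ccc(x, y):
          mx, my = x.mean(), y.mean()
          vx, vy = x.var(), y.var()
          cov = ((x - mx) * (y - my)).mean()
          return 2.0 * cov / (vx + vy + (mx - my) ** 2)

      rng = np.random.default_rng(2)
      target = rng.standard_normal(256)        # stand-in EEG window, target electrode
      neighbor = 0.8 * target + 0.2 * rng.standard_normal(256)
      print(ccc(target, neighbor))             # close to 1 => highly concordant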

  5. MCTDH on-the-fly: Efficient grid-based quantum dynamics without pre-computed potential energy surfaces

    NASA Astrophysics Data System (ADS)

    Richings, Gareth W.; Habershon, Scott

    2018-04-01

    We present significant algorithmic improvements to a recently proposed direct quantum dynamics method, based upon combining well established grid-based quantum dynamics approaches and expansions of the potential energy operator in terms of a weighted sum of Gaussian functions. Specifically, using a sum of low-dimensional Gaussian functions to represent the potential energy surface (PES), combined with a secondary fitting of the PES using singular value decomposition, we show how standard grid-based quantum dynamics methods can be dramatically accelerated without loss of accuracy. This is demonstrated by on-the-fly simulations (using both standard grid-based methods and multi-configuration time-dependent Hartree) of both proton transfer on the electronic ground state of salicylaldimine and the non-adiabatic dynamics of pyrazine.
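
    The idea of expanding the potential in a weighted sum of Gaussians can be illustrated in one dimension with a linear least-squares fit (a minimal sketch on a toy potential; the multi-dimensional on-the-fly machinery and the SVD-based secondary fit of the paper are not reproduced).

      # Minimal sketch: fit a 1D potential with fixed-centre Gaussians, solving
      # for the weights by linear least squares.
      import numpy as np

      x = np.linspace(-2.0, 2.0, 200)
      v = 0.5 * x**2 + 0.1 * x**4                  # stand-in potential energy curve

      centers = np.linspace(-2.0, 2.0, 15)
      width = 0.4
      basis = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

      weights, *_ = np.linalg.lstsq(basis, v, rcond=None)
      print("max abs fit error:", np.max(np.abs(basis @ weights - v)))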

  6. Dysfunctional problem-based learning curricula: resolving the problem

    PubMed Central

    2012-01-01

    Background Problem-based learning (PBL) has become the most significant innovation in medical education of the past 40 years. In contrast to exam-centered, lecture-based conventional curricula, PBL is a comprehensive curricular strategy that fosters student-centred learning and the skills desired in physicians. The rapid spread of PBL has produced many variants. One of the most common is 'hybrid PBL' where conventional teaching methods are implemented alongside PBL. This paper contends that the mixing of these two opposing educational philosophies can undermine PBL and nullify its positive benefits. Schools using hybrid PBL and lacking medical education expertise may end up with a dysfunctional curriculum worse off than the traditional approach. Discussion For hybrid PBL schools with a dysfunctional curriculum, standard PBL is a cost-feasible option that confers the benefits of the PBL approach. This paper describes the signs of a dysfunctional PBL curriculum to aid hybrid PBL schools in recognising curricular breakdown. Next it discusses alternative curricular strategies and costs associated with PBL. It then details the four critical factors for successful conversion to standard PBL: dealing with staff resistance, understanding the role of lectures, adequate time for preparation and support from the administrative leadership. Summary Hybrid PBL curricula without oversight by staff with medical education expertise can degenerate into dysfunctional curricula inferior even to the traditional approach from which PBL emerged. Such schools should inspect their curriculum periodically for signs of dysfunction to enable timely corrective action. A decision to convert fully to standard PBL is cost feasible but will require time, expertise and commitment which is only sustainable with supportive leadership. PMID:23009729

  7. Knowledge-rich temporal relation identification and classification in clinical notes

    PubMed Central

    D’Souza, Jennifer; Ng, Vincent

    2014-01-01

    Motivation: We examine the task of temporal relation classification for the clinical domain. Our approach to this task departs from existing ones in that it is (i) ‘knowledge-rich’, employing sophisticated knowledge derived from discourse relations as well as both domain-independent and domain-dependent semantic relations, and (ii) ‘hybrid’, combining the strengths of rule-based and learning-based approaches. Evaluation results on the i2b2 Clinical Temporal Relations Challenge corpus show that our approach yields a 17–24% and 8–14% relative reduction in error over a state-of-the-art learning-based baseline system when gold-standard and automatically identified temporal relations are used, respectively. Database URL: http://www.hlt.utdallas.edu/~jld082000/temporal-relations/ PMID:25414383

  8. Natural Language Processing–Enabled and Conventional Data Capture Methods for Input to Electronic Health Records: A Comparative Usability Study

    PubMed Central

    Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark

    2016-01-01

    Background: The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)–enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user’s experience. Objective: The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. Methods: This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods (“protocols”) of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. Results: A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. Conclusions: In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. PMID:27793791

  9. Natural Language Processing-Enabled and Conventional Data Capture Methods for Input to Electronic Health Records: A Comparative Usability Study.

    PubMed

    Kaufman, David R; Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark

    2016-10-28

    The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)-enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. ©David R. Kaufman, Barbara Sheehan, Peter Stetson, Ashish R. Bhatt, Adele I. Field, Chirag Patel, James Mark Maisel. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 28.10.2016.

  10. INTERNATIONAL STANDARDS ON FOOD AND ENVIRONMENTAL RADIOACTIVITY MEASUREMENT FOR RADIOLOGICAL PROTECTION: STATUS AND PERSPECTIVES.

    PubMed

    Calmet, D; Ameon, R; Bombard, A; Brun, S; Byrde, F; Chen, J; Duda, J-M; Forte, M; Fournier, M; Fronka, A; Haug, T; Herranz, M; Husain, A; Jerome, S; Jiranek, M; Judge, S; Kim, S B; Kwakman, P; Loyen, J; LLaurado, M; Michel, R; Porterfield, D; Ratsirahonana, A; Richards, A; Rovenska, K; Sanada, T; Schuler, C; Thomas, L; Tokonami, S; Tsapalov, A; Yamada, T

    2017-04-01

    Radiological protection is a matter of concern for members of the public, and thus national authorities are more likely to trust the quality of radioactivity data provided by accredited laboratories using common standards. A normative approach based on international standards aims to ensure the accuracy and validity of test results through calibrations and measurements traceable to the International System of Units. This approach guarantees that radioactivity test results on the same types of samples are comparable over time and space as well as between different testing laboratories. Today, testing laboratories involved in radioactivity measurement have a set of more than 150 international standards to help them perform their work. Most of them are published by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). This paper reviews the most essential ISO standards that give guidance to testing laboratories at different stages, from sampling planning to the transmission of the test report to their customers, summarizes recent activities and achievements, and presents perspectives on new standards under development by the ISO Working Groups dealing with radioactivity measurement in connection with radiological protection. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  12. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  13. Toward a sustainability label for food products: an analysis of experts' and consumers' acceptance.

    PubMed

    Engels, Stéphanie V; Hansmann, Ralf; Scholz, Roland W

    2010-01-01

    The recent proliferation of standards and labels for organic, fair-trade, locally produced, and healthy food products risks creating confusion among consumers. This study presents a standardized approach to developing a comprehensive sustainability label that incorporates ecological, economic, and social values. The methodology is based on an extension of modular life-cycle assessment to non-environmental sustainability criteria. Interviews with a wide range of experts (n=65) and a consumer survey (n=233) were conducted to analyze the feasibility and potential effectiveness of the approach. Responses indicated that a comprehensive sustainability label could considerably influence consumption patterns and facilitate cross-product comparisons. Copyright © Taylor & Francis Group, LLC

  14. Per capita invasion probabilities: an empirical model to predict rates of invasion via ballast water

    USGS Publications Warehouse

    Reusser, Deborah A.; Lee, Henry; Frazier, Melanie; Ruiz, Gregory M.; Fofonoff, Paul W.; Minton, Mark S.; Miller, A. Whitman

    2013-01-01

    Ballast water discharges are a major source of species introductions into marine and estuarine ecosystems. To mitigate the introduction of new invaders into these ecosystems, many agencies are proposing standards that establish upper concentration limits for organisms in ballast discharge. Ideally, ballast discharge standards will be biologically defensible and adequately protective of the marine environment. We propose a new technique, the per capita invasion probability (PCIP), for managers to quantitatively evaluate the relative risk of different concentration-based ballast water discharge standards. PCIP represents the likelihood that a single discharged organism will become established as a new nonindigenous species. This value is calculated by dividing the total number of ballast water invaders per year by the total number of organisms discharged from ballast. Analysis was done at the coast-wide scale for the Atlantic, Gulf, and Pacific coasts, as well as the Great Lakes, to reduce uncertainty due to secondary invasions between estuaries on a single coast. The PCIP metric is then used to predict the rate of new ballast-associated invasions given various regulatory scenarios. Depending upon the assumptions used in the risk analysis, this approach predicts that approximately one new species will invade every 10–100 years under the International Maritime Organization (IMO) discharge standard of fewer than 10 organisms (≥50 μm in minimum dimension) per m3 of ballast. This approach resolves many of the limitations associated with other methods of establishing ecologically sound discharge standards, and it allows policy makers to use risk-based methodologies to establish biologically defensible discharge standards.
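
    The PCIP arithmetic described above is simple enough to sketch directly (all numbers below are hypothetical placeholders, not the coast-wide values estimated in the study).

      # Minimal sketch of the PCIP logic: PCIP = invaders per year / organisms
      # discharged per year; the expected invasion rate under a proposed
      # concentration standard is PCIP * organisms discharged under that standard.
      invaders_per_year = 0.5                   # hypothetical historical invasions/yr
      organisms_discharged_per_year = 1e15      # hypothetical historical discharge/yr
      pcip = invaders_per_year / organisms_discharged_per_year

      allowed_conc = 10.0                       # organisms per m^3 under a standard
      discharge_m3_per_year = 1e8               # hypothetical annual ballast volume
      expected_invasions = pcip * allowed_conc * discharge_m3_per_year
      print(f"PCIP = {pcip:.2e}, expected invasions/yr = {expected_invasions:.1e}")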

  15. An evaluation of multiple trauma severity indices created by different index development strategies.

    PubMed

    Gustafson, D H; Fryback, D G; Rose, J H; Prokop, C T; Detmer, D E; Rossmeissl, J C; Taylor, C M; Alemi, F; Carnazzo, A J

    1983-07-01

    Evaluation of the effectiveness of emergency trauma care systems is complicated by the need to adjust for the widely variable case mix found in trauma patient populations. Several strategies have been advanced to construct the severity indices that can control for these population differences. This article describes a validity and reliability comparison of trauma severity indices developed under three different approaches: 1) use of a multi-attribute utility (MAU) model; 2) an actuarial approach relying on empirical data bases; and 3) an "ad hoc" approach. Seven criteria were identified to serve as standards of comparison for four different indices. The study's findings indicate that the index developed using the MAU theory approach associates most closely with physician judgments of trauma severity. When correlated with a morbidity outcome measure, the MAU-based index shows higher levels of agreement than the other indices. The index development approach based on the principles of MAU theory has several advantages and it appears to be a powerful tool in the creation of effective severity indices.

  16. Travel without Leaving the Classroom.

    ERIC Educational Resources Information Center

    Zertuche, Albert A.

    2002-01-01

    Describes a lesson on different world ecosystems in which activities are based on the constructivist approach to teaching that encourages learners to control their own learning. Includes a sample grading rubric and national science education standards related to these activities. (KHR)

  17. A data fusion framework for meta-evaluation of intelligent transportation system effectiveness

    DOT National Transportation Integrated Search

    This study presents a framework for the meta-evaluation of Intelligent Transportation System effectiveness. The framework is based on data fusion approaches that adjust for data biases and violations of other standard statistical assumptions. Operati...

  18. Shoulder-To-Shoulder Innovation

    ERIC Educational Resources Information Center

    Demski, Jennifer

    2011-01-01

    Arizona's Vail School District won this year's Sylvia Charp Award because of its revolutionary--and truly collaborative--approach to standards-based curriculum development that it is sharing with the rest of the state. This article takes a look at how they did it.

  19. Approaches to setting organism-based ballast water discharge standards

    EPA Science Inventory

    As a major vector by which foreign species invade coastal and freshwater waterbodies, ballast water discharge from ships is recognized as a major environmental threat. The International Maritime Organization (IMO) drafted an international ballast water treaty establishing ballast...

  20. Whole-genome-based Mycobacterium tuberculosis surveillance: a standardized, portable, and expandable approach.

    PubMed

    Kohl, Thomas A; Diel, Roland; Harmsen, Dag; Rothgänger, Jörg; Walter, Karen Meywald; Merker, Matthias; Weniger, Thomas; Niemann, Stefan

    2014-07-01

    Whole-genome sequencing (WGS) allows for effective tracing of Mycobacterium tuberculosis complex (MTBC) (tuberculosis pathogens) transmission. However, it is difficult to standardize and, therefore, is not yet employed for interlaboratory prospective surveillance. To allow its widespread application, solutions for data standardization and storage in an easily expandable database are urgently needed. To address this question, we developed a core genome multilocus sequence typing (cgMLST) scheme for clinical MTBC isolates using the Ridom SeqSphere(+) software, which transfers the genome-wide single nucleotide polymorphism (SNP) diversity into an allele numbering system that is standardized, portable, and not computationally intensive. To test its performance, we performed WGS analysis of 26 isolates with identical IS6110 DNA fingerprints and spoligotyping patterns from a longitudinal outbreak in the federal state of Hamburg, Germany (notified between 2001 and 2010). The cgMLST approach (3,041 genes) discriminated the 26 strains with a resolution comparable to that of SNP-based WGS typing (one major cluster of 22 identical or closely related and four outlier isolates with at least 97 distinct SNPs or 63 allelic variants). Resulting tree topologies are highly congruent and grouped the isolates in both cases analogously. Our data show that SNP- and cgMLST-based WGS analyses facilitate high-resolution discrimination of longitudinal MTBC outbreaks. cgMLST allows for a meaningful epidemiological interpretation of the WGS genotyping data. It enables standardized WGS genotyping for epidemiological investigations, e.g., on the regional public health office level, and the creation of web-accessible databases for global TB surveillance with an integrated early warning system. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
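
    The cgMLST comparison itself reduces to counting allele differences between isolates, as in the minimal sketch below (a generic illustration with hypothetical allele numbers, not the Ridom SeqSphere+ implementation).

      # Minimal sketch: pairwise allelic distance between two cgMLST profiles,
      # skipping loci with missing calls (encoded here as 0).
      def allelic_distance(profile_a, profile_b, missing=0):
          diffs = 0
          for a, b in zip(profile_a, profile_b):
              if a == missing or b == missing:
                  continue
              if a != b:
                  diffs += 1
          return diffs

      isolate_1 = [5, 12, 3, 7, 1, 9]     # hypothetical allele numbers at 6 loci
      isolate_2 = [5, 12, 4, 7, 0, 9]     # one differing allele, one missing locus
      print(allelic_distance(isolate_1, isolate_2))   # -> 1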

  1. A fluorometric paper-based sensor array for the discrimination of heavy-metal ions.

    PubMed

    Feng, Liang; Li, Hui; Niu, Li-Ya; Guan, Ying-Shi; Duan, Chun-Feng; Guan, Ya-Feng; Tung, Chen-Ho; Yang, Qing-Zheng

    2013-04-15

    A fluorometric paper-based sensor array has been developed for the sensitive and convenient determination of seven heavy-metal ions at their wastewater discharge standard concentrations. By combining nine cross-reactive BODIPY fluorescent indicators with array-based pattern recognition, we obtained the capability to discriminate seven different heavy-metal ions at their wastewater discharge standard concentrations. After the immobilization of the indicators and the enrichment of the analytes, identification of the heavy-metal ions was readily achieved using a standard chemometric approach. Clear differentiation among heavy-metal ions as a function of concentration was also achieved, even down to 10^-7 M. A semi-quantitative estimate of the heavy-metal ion concentration was obtained by comparing color changes with a set of known concentrations. The sensor array was tentatively investigated in spiked tap water and sea water, and showed possible feasibility for real-sample testing. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Robust optimization based energy dispatch in smart grids considering demand uncertainty

    NASA Astrophysics Data System (ADS)

    Nassourou, M.; Puig, V.; Blesa, J.

    2017-01-01

    In this study we discuss the application of robust optimization to the problem of economic energy dispatch in smart grids. Robust-optimization-based MPC strategies for tackling uncertain load demands are developed. Unexpected additive disturbances are modelled by defining an affine dependence between the control inputs and the uncertain load demands. The developed strategies were applied to a hybrid power system connected to an electrical power grid. Furthermore, to demonstrate the superiority of standard Economic MPC over MPC tracking, a comparison (e.g., average daily cost) between standard MPC tracking, standard Economic MPC, and the integration of both in one-layer and two-layer approaches was carried out. The goal of this research is to design a controller, based on Economic MPC strategies, that tackles uncertainties in order to minimise economic costs and guarantee the service reliability of the system.
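
    A bare-bones version of the dispatch problem can be posed as a convex program, as in the sketch below (hypothetical parameters; demand uncertainty is handled crudely by dispatching against forecast-plus-margin demand, and the affine feedback policies and one-/two-layer structures of the study are not reproduced).

      # Minimal sketch: finite-horizon economic dispatch of grid purchases and a
      # battery against worst-case (forecast + margin) demand, using cvxpy.
      import numpy as np
      import cvxpy as cp

      T = 24
      price = 0.10 + 0.05 * np.sin(np.linspace(0, 2 * np.pi, T))       # price per kWh
      demand = 5.0 + 2.0 * np.sin(np.linspace(0, 2 * np.pi, T)) + 0.5  # forecast + margin

      grid = cp.Variable(T, nonneg=True)   # energy bought from the grid
      batt = cp.Variable(T)                # battery power (+discharge / -charge)
      soc = cp.Variable(T + 1)             # battery state of charge

      constraints = [soc[0] == 10.0, soc >= 0, soc <= 20.0,
                     batt >= -3.0, batt <= 3.0]
      for t in range(T):
          constraints += [grid[t] + batt[t] == demand[t],     # meet (worst-case) demand
                          soc[t + 1] == soc[t] - batt[t]]     # SOC dynamics

      cp.Problem(cp.Minimize(price @ grid), constraints).solve()
      print("dispatch cost:", float(price @ grid.value))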

  3. Variable selection for confounder control, flexible modeling and Collaborative Targeted Minimum Loss-based Estimation in causal inference

    PubMed Central

    Schnitzer, Mireille E.; Lok, Judith J.; Gruber, Susan

    2015-01-01

    This paper investigates the appropriateness of the integration of flexible propensity score modeling (nonparametric or machine learning approaches) in semiparametric models for the estimation of a causal quantity, such as the mean outcome under treatment. We begin with an overview of some of the issues involved in knowledge-based and statistical variable selection in causal inference and the potential pitfalls of automated selection based on the fit of the propensity score. Using a simple example, we directly show the consequences of adjusting for pure causes of the exposure when using inverse probability of treatment weighting (IPTW). Such variables are likely to be selected when using a naive approach to model selection for the propensity score. We describe how the method of Collaborative Targeted minimum loss-based estimation (C-TMLE; van der Laan and Gruber, 2010) capitalizes on the collaborative double robustness property of semiparametric efficient estimators to select covariates for the propensity score based on the error in the conditional outcome model. Finally, we compare several approaches to automated variable selection in low- and high-dimensional settings through a simulation study. From this simulation study, we conclude that using IPTW with flexible prediction for the propensity score can result in inferior estimation, while Targeted minimum loss-based estimation and C-TMLE may benefit from flexible prediction and remain robust to the presence of variables that are highly correlated with treatment. However, in our study, standard influence function-based methods for the variance underestimated the standard errors, resulting in poor coverage under certain data-generating scenarios. PMID:26226129
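
    The IPTW estimator discussed above can be sketched on simulated data (a minimal illustration with a logistic-regression propensity score; C-TMLE and the flexible machine-learning fits compared in the paper are not implemented here).

      # Minimal sketch: IPTW (Horvitz-Thompson) estimate of the mean outcome
      # under treatment, E[Y(1)], with simulated confounded data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 5000
      w = rng.standard_normal((n, 2))                              # baseline confounders
      p_a = 1 / (1 + np.exp(-(0.5 * w[:, 0] - 0.3 * w[:, 1])))
      a = rng.binomial(1, p_a)                                     # treatment assignment
      y = 1.0 + 0.8 * a + 0.6 * w[:, 0] + rng.standard_normal(n)   # true E[Y(1)] = 1.8

      ps = LogisticRegression().fit(w, a).predict_proba(w)[:, 1]   # propensity score
      print(np.mean(a * y / ps))                                   # IPTW estimate of E[Y(1)]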

  4. Variable Selection for Confounder Control, Flexible Modeling and Collaborative Targeted Minimum Loss-Based Estimation in Causal Inference.

    PubMed

    Schnitzer, Mireille E; Lok, Judith J; Gruber, Susan

    2016-05-01

    This paper investigates the appropriateness of the integration of flexible propensity score modeling (nonparametric or machine learning approaches) in semiparametric models for the estimation of a causal quantity, such as the mean outcome under treatment. We begin with an overview of some of the issues involved in knowledge-based and statistical variable selection in causal inference and the potential pitfalls of automated selection based on the fit of the propensity score. Using a simple example, we directly show the consequences of adjusting for pure causes of the exposure when using inverse probability of treatment weighting (IPTW). Such variables are likely to be selected when using a naive approach to model selection for the propensity score. We describe how the method of Collaborative Targeted minimum loss-based estimation (C-TMLE; van der Laan and Gruber, 2010 [27]) capitalizes on the collaborative double robustness property of semiparametric efficient estimators to select covariates for the propensity score based on the error in the conditional outcome model. Finally, we compare several approaches to automated variable selection in low- and high-dimensional settings through a simulation study. From this simulation study, we conclude that using IPTW with flexible prediction for the propensity score can result in inferior estimation, while Targeted minimum loss-based estimation and C-TMLE may benefit from flexible prediction and remain robust to the presence of variables that are highly correlated with treatment. However, in our study, standard influence function-based methods for the variance underestimated the standard errors, resulting in poor coverage under certain data-generating scenarios.

  5. The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training

    ERIC Educational Resources Information Center

    Sandrey, Michelle A.; Bulger, Sean M.

    2008-01-01

    Objective: The growing importance of evidence-based practice in athletic training requires academics and clinicians to make judgments about the quality, or lack, of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…

  6. Designing a Clinical Framework to Guide Gross Motor Intervention Decisions for Infants and Young Children with Hypotonia

    ERIC Educational Resources Information Center

    Darrah, Johanna; O'Donnell, Maureen; Lam, Joyce; Story, Maureen; Wickenheiser, Diane; Xu, Kaishou; Jin, Xiaokun

    2013-01-01

    Clinical practice frameworks are a valuable component of clinical education, promoting informed clinical decision making based on the best available evidence and/or clinical experience. They encourage standardized intervention approaches and evaluation of practice. Based on an international project to support the development of an enhanced service…

  7. Inquiry-Based Science and Technology Enrichment Program for Middle School-Aged Female Students

    ERIC Educational Resources Information Center

    Kim, Hanna

    2016-01-01

    This study investigates the effects of an intensive 1-week Inquiry-Based Science and Technology Enrichment Program (InSTEP) designed for middle school-aged female students. InSTEP uses a guided/open inquiry approach that is deepened and redefined as eight sciences and engineering practices in the Next Generation Science Standards, which aimed at…

  8. Mutation-Based Learning to Improve Student Autonomy and Scientific Inquiry Skills in a Large Genetics Laboratory Course

    ERIC Educational Resources Information Center

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could…

  9. Data Manipulation in an XML-Based Digital Image Library

    ERIC Educational Resources Information Center

    Chang, Naicheng

    2005-01-01

    Purpose: To help to clarify the role of XML tools and standards in supporting transition and migration towards a fully XML-based environment for managing access to information. Design/methodology/approach: The Ching Digital Image Library, built on a three-tier architecture, is used as a source of examples to illustrate a number of methods of data…

  10. Is Inquiry-Based Science Teaching Worth the Effort? Some Thoughts Worth Considering

    ERIC Educational Resources Information Center

    Zhang, Lin

    2016-01-01

    Inquiry-based science teaching has been advocated by many science educational standards and reports from around the world. Disagreements about and concerns with this teaching approach, however, are often ignored. Opposing ideas and conflicting results have been bouncing around in the field. It seems that the field carries on with a hope that…

  11. A Modified Approach to Team-Based Learning in Linear Algebra Courses

    ERIC Educational Resources Information Center

    Nanes, Kalman M.

    2014-01-01

    This paper documents the author's adaptation of team-based learning (TBL), an active learning pedagogy developed by Larry Michaelsen and others, in the linear algebra classroom. The paper discusses the standard components of TBL and the necessary changes to those components for the needs of the course in question. There is also an empirically…

  12. Computer-Based Assessment of Collaborative Problem Solving: Exploring the Feasibility of Human-to-Agent Approach

    ERIC Educational Resources Information Center

    Rosen, Yigal

    2015-01-01

    How can activities in which collaborative skills of an individual are measured be standardized? In order to understand how students perform on collaborative problem solving (CPS) computer-based assessment, it is necessary to examine empirically the multi-faceted performance that may be distributed across collaboration methods. The aim of this…

  13. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG method, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore, it deserves careful examination and theoretical analysis. We show that the PG alarm-based version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  14. Managing public health in the Army through a standard community health promotion council model.

    PubMed

    Courie, Anna F; Rivera, Moira Shaw; Pompey, Allison

    2014-01-01

    Public health processes in the US Army remain uncoordinated due to competing lines of command, funding streams and multiple subject matter experts in overlapping public health concerns. The US Army Public Health Command (USAPHC) has identified a standard model for community health promotion councils (CHPCs) as an effective framework for synchronizing and integrating these overlapping systems to ensure a coordinated approach to managing the public health process. The purpose of this study is to test a foundational assumption of the CHPC effectiveness theory: the 3 features of a standard CHPC model - a CHPC chaired by a strong leader, ie, the senior commander; a full time health promotion team dedicated to the process; and centralized management through the USAPHC - will lead to high quality health promotion councils capable of providing a coordinated approach to addressing public health on Army installations. The study employed 2 evaluation questions: (1) Do CHPCs with centralized management through the USAPHC, alignment with the senior commander, and a health promotion operations team adhere more closely to the evidence-based CHPC program framework than CHPCs without these 3 features? (2) Do members of standard CHPCs report that participation in the CHPC leads to a well-coordinated approach to public health at the installation? The results revealed that both time (F(5,76)=25.02, P<.0001) and the 3 critical features of the standard CHPC model (F(1,76)=28.40, P<.0001) independently predicted program adherence. Evaluation evidence supports the USAPHC's approach to CHPC implementation as part of public health management on Army installations. Preliminary evidence suggests that the standard CHPC model may lead to a more coordinated approach to public health and may assure that CHPCs follow an evidence-informed design. This is consistent with past research demonstrating that community coalitions and public health systems that have strong leadership; dedicated staff time and expertise; influence over policy, governance and oversight; and formalized rules and regulations function more effectively than those without. It also demonstrates the feasibility of implementing an evidence-informed approach to community coalitions in an Army environment.

  15. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability... Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi... (MSDL).” Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  16. Measurements of Aircraft Wake Vortex Separation at High Arrival Rates and a Proposed New Wake Vortex Separation Philosophy

    NASA Technical Reports Server (NTRS)

    Rutishauser, David; Donohue, George L.; Haynie, Rudolph C.

    2003-01-01

    This paper presents data and a proposed new aircraft wake vortex separation standard that argues for a fundamental re-thinking of international practice. The current static standard, under certain atmospheric conditions, presents an unnecessary restriction on system capacity. A new approach, that decreases aircraft separation when atmospheric conditions dictate, is proposed based upon the availability of new instrumentation and a better understanding of wake physics.

  17. Purpose, processes, partnerships, and products: four Ps to advance participatory socio-environmental modeling

    USGS Publications Warehouse

    Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre D.; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt-Olabisi, Laura; Singer, Alison; Sterling, Eleanor J.; Zellner, Moira

    2018-01-01

    Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human–environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.

  18. Purpose, processes, partnerships, and products: four Ps to advance participatory socio-environmental modeling.

    PubMed

    Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt Olabisi, Laura; Singer, Alison; Sterling, Eleanor; Zellner, Moira

    2018-01-01

    Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM. © 2017 by the Ecological Society of America.

  19. Applying an innovative educational program for the education of today's engineers

    NASA Astrophysics Data System (ADS)

    Kans, M.

    2012-05-01

    Engineers require a broad spectrum of knowledge and skills: basic skills in mathematics and physics, skills and competencies within the major subject area as well as more general knowledge about business and enterprise contexts, society regulations and understanding of the future professions' characteristics. In addition, social, intercultural, analytical and managing competencies are desired. The CDIO educational program was initiated as a means to come closer to practice and to assure the training of engineering skills that are required of today's engineers. CDIO is short for Conceive-Design-Implement-Operate and describes the full life cycle understanding of a system or asset that engineering students should reach during education. The CDIO initiative is formulated in a program consisting of two important documents: the CDIO standards and the CDIO syllabus. The standards describe a holistic approach on education, from knowledge and skills to be trained, how to train and assess them, to how to develop the teaching staff and the work places for enabling the goals. The specific knowledge and skills to be achieved are accounted for in the syllabus. In this paper we share our more than 15 years of experiences in problem and project based learning from the perspective of the CDIO standards. For each standard, examples of how to set up the education and overcome challenges connected to the standard are given. The paper concludes with recommendations to others wishing to work toward problem and real-life based education without compromising the requirements of a scientific approach.

  20. Automatic selection of landmarks in T1-weighted head MRI with regression forests for image registration initialization.

    PubMed

    Wang, Jianing; Liu, Yuan; Noble, Jack H; Dawant, Benoit M

    2017-10-01

    Medical image registration establishes a correspondence between images of biological structures, and it is at the core of many applications. Commonly used deformable image registration methods depend on a good preregistration initialization. We develop a learning-based method to automatically find a set of robust landmarks in three-dimensional MR image volumes of the head. These landmarks are then used to compute a thin plate spline-based initialization transformation. The process involves two steps: (1) identifying a set of landmarks that can be reliably localized in the images and (2) selecting among them the subset that leads to a good initial transformation. To validate our method, we use it to initialize five well-established deformable registration algorithms that are subsequently used to register an atlas to MR images of the head. We compare our proposed initialization method with a standard approach that involves estimating an affine transformation with an intensity-based approach. We show that for all five registration algorithms the final registration results are statistically better when they are initialized with the method that we propose than when a standard approach is used. The technique that we propose is generic and could be used to initialize nonrigid registration algorithms for other applications.
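    The paper derives its initialization from automatically selected landmarks via a thin plate spline; purely as a rough illustration of the general landmark-based idea, the sketch below fits an initial affine transform to hypothetical matched landmark pairs by least squares (all data and names are invented for the example).

```python
# Illustrative sketch (not the authors' pipeline): derive an initial affine
# transform from corresponding landmark pairs by least squares. The paper fits a
# thin plate spline to the selected landmarks; the least-squares affine below
# only conveys the general idea of landmark-based registration initialization.
import numpy as np

def affine_from_landmarks(moving_pts, fixed_pts):
    """Fit x_fixed ~ A @ x_moving + t in 3-D from matched landmark pairs."""
    moving_h = np.hstack([moving_pts, np.ones((len(moving_pts), 1))])  # homogeneous coords
    # Least-squares solution of moving_h @ M = fixed_pts, where M is 4 x 3
    M, *_ = np.linalg.lstsq(moving_h, fixed_pts, rcond=None)
    A, t = M[:3].T, M[3]
    return A, t

# Hypothetical matched landmarks (e.g., output of a landmark detector)
rng = np.random.default_rng(1)
moving = rng.random((10, 3)) * 100
A_true = np.diag([1.05, 0.95, 1.0])
fixed = moving @ A_true.T + np.array([2.0, -3.0, 1.5])

A_est, t_est = affine_from_landmarks(moving, fixed)
print(np.round(A_est, 3), np.round(t_est, 3))
```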

  1. Use of the ICRP system for the protection of marine ecosystems.

    PubMed

    Telleria, D; Cabianca, T; Proehl, G; Kliaus, V; Brown, J; Bossio, C; Van der Wolf, J; Bonchuk, I; Nilsen, M

    2015-06-01

    The International Commission on Radiological Protection (ICRP) recently reinforced the international system of radiological protection, initially focused on humans, by identifying principles of environmental protection and proposing a framework for assessing impacts of ionising radiation on non-human species, based on a reference flora and fauna approach. For this purpose, ICRP developed dosimetric models for a set of Reference Animals and Plants, which are representative of flora and fauna in different environments (terrestrial, freshwater, marine), and produced criteria based on information on radiation effects, with the aim of evaluating the level of potential or actual radiological impacts, and as an input for decision making. The approach developed by ICRP for flora and fauna is consistent with the approach used to protect humans. The International Atomic Energy Agency (IAEA) includes considerations on the protection of the environment in its safety standards, and is currently developing guidelines to assess radiological impacts based on the aforementioned ICRP approach. This paper presents the method developed by IAEA, in a series of meetings with international experts, to enable assessment of the radiological impact to the marine environment in connection with the Convention on the Prevention of Marine Pollution by Dumping of Wastes and Other Matter 1972 (London Convention 1972). This method is based on IAEA's safety standards and ICRP's recommendations, and was presented in 2013 for consideration by representatives of the contracting parties of the London Convention 1972; it was approved for inclusion in its procedures, and is in the process of being incorporated into guidelines.

  2. Event-driven, pattern-based methodology for cost-effective development of standardized personal health devices.

    PubMed

    Martínez-Espronceda, Miguel; Trigo, Jesús D; Led, Santiago; Barrón-González, H Gilberto; Redondo, Javier; Baquero, Alfonso; Serrano, Luis

    2014-11-01

    Experiences applying standards in personal health devices (PHDs) show an inherent trade-off between interoperability and costs (in terms of processing load and development time). Therefore, reducing hardware and software costs as well as time-to-market is crucial for standards adoption. The ISO/IEEE11073 PHD family of standards (also referred to as X73PHD) provides interoperable communication between PHDs and aggregators. Nevertheless, the responsibility of achieving inexpensive implementations of X73PHD in limited resource microcontrollers falls directly on the developer. Hence, the authors previously presented a methodology based on patterns to implement X73-compliant PHDs into devices with low-voltage low-power constraints. That version was based on multitasking, which required additional features and resources. This paper therefore presents an event-driven evolution of the patterns-based methodology for cost-effective development of standardized PHDs. The results of comparing between the two versions showed that the mean values of decrease in memory consumption and cycles of latency are 11.59% and 45.95%, respectively. In addition, several enhancements in terms of cost-effectiveness and development time can be derived from the new version of the methodology. Therefore, the new approach could help in producing cost-effective X73-compliant PHDs, which in turn could foster the adoption of standards. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
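    The published methodology targets C implementations on resource-limited microcontrollers; purely as an illustration of the generic event-driven pattern it relies on (a single queue feeding a handler table instead of a multitasking scheduler), here is a small Python sketch with hypothetical event and state names, not the X73PHD patterns themselves.

```python
# Illustrative event-driven dispatch loop (hypothetical event and state names;
# not the published X73PHD patterns). A single queue and a handler table replace
# the multitasking scheduler used in the earlier version of the methodology.
from collections import deque

events = deque(["transport_connected", "association_request", "measurement_ready"])
state = "disconnected"

def on_transport_connected(st):
    return "unassociated"

def on_association_request(st):
    # A real agent would build and send an association response APDU here.
    return "associated" if st == "unassociated" else st

def on_measurement_ready(st):
    # Only transmit measurements once the association is established.
    return st  # state unchanged; a measurement report would be queued here

handlers = {
    "transport_connected": on_transport_connected,
    "association_request": on_association_request,
    "measurement_ready": on_measurement_ready,
}

while events:                      # the event loop: no threads, no scheduler
    event = events.popleft()
    state = handlers[event](state)
    print(f"{event} -> {state}")
```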

  3. Income or living standard and health in Germany: different ways of measurement of relative poverty with regard to self-rated health.

    PubMed

    Pfoertner, Timo-Kolja; Andress, Hans-Juergen; Janssen, Christian

    2011-08-01

    The current study introduces the living standard concept as an alternative approach to measuring poverty and compares its explanatory power to an income-based poverty measure with regard to the subjective health status of the German population. Analyses are based on the German Socio-Economic Panel (2001, 2003 and 2005) and refer to binary logistic regressions of poor subjective health status with regard to each poverty condition, their duration and their causal influence from a previous time point. To calculate the discriminant power of both poverty indicators, the indicators were first considered separately in regression models and subsequently both were included simultaneously. The analyses reveal a stronger poverty-health relationship for the living standard indicator. An inadequate living standard in 2005, longer spells of an inadequate living standard between 2001, 2003 and 2005, as well as an inadequate living standard at a previous time point are significantly more strongly associated with poor subjective health than income poverty. Our results challenge conventional measurements of the relationship between poverty and health, which has probably been underestimated by income measures so far.

  4. Effect of music therapy with emotional-approach coping on preprocedural anxiety in cardiac catheterization: a randomized controlled trial.

    PubMed

    Ghetti, Claire M

    2013-01-01

    Individuals undergoing cardiac catheterization are likely to experience elevated anxiety periprocedurally, with highest anxiety levels occurring immediately prior to the procedure. Elevated anxiety has the potential to negatively impact these individuals psychologically and physiologically in ways that may influence the subsequent procedure. This study evaluated the use of music therapy, with a specific emphasis on emotional-approach coping, immediately prior to cardiac catheterization to impact periprocedural outcomes. The randomized, pretest/posttest control group design consisted of two experimental groups--the Music Therapy with Emotional-Approach Coping group [MT/EAC] (n = 13), and a talk-based Emotional-Approach Coping group (n = 14), compared with a standard care Control group (n = 10). MT/EAC led to improved positive affective states in adults awaiting elective cardiac catheterization, whereas a talk-based emphasis on emotional-approach coping or standard care did not. All groups demonstrated a significant overall decrease in negative affect. The MT/EAC group demonstrated a statistically significant, but not clinically significant, increase in systolic blood pressure most likely due to active engagement in music making. The MT/EAC group trended toward shortest procedure length and least amount of anxiolytic required during the procedure, while the EAC group trended toward least amount of analgesic required during the procedure, but these differences were not statistically significant. Actively engaging in a session of music therapy with an emphasis on emotional-approach coping can improve the well-being of adults awaiting cardiac catheterization procedures.

  5. A Cost-Effective Transparency-Based Digital Imaging for Efficient and Accurate Wound Area Measurement

    PubMed Central

    Li, Pei-Nan; Li, Hong; Wu, Mo-Li; Wang, Shou-Yu; Kong, Qing-You; Zhang, Zhen; Sun, Yuan; Liu, Jia; Lv, De-Cheng

    2012-01-01

    Wound measurement is an objective and direct way to trace the course of wound healing and to evaluate therapeutic efficacy. Nevertheless, the accuracy and efficiency of the current measurement methods need to be improved. Taking advantage of the reliability of transparency tracing and the accuracy of computer-aided digital imaging, a transparency-based digital imaging approach is established, by which data from 340 wound tracings were collected from 6 experimental groups (8 rats/group) at 8 experimental time points (Day 1, 3, 5, 7, 10, 12, 14 and 16) and archived in order onto a transparency model sheet. This sheet was scanned and its image was saved in JPG form. Since a set of standard area units from 1 mm² to 1 cm² was integrated into the sheet, the tracing areas in the JPG image were measured directly, using the “Magnetic lasso tool” in the Adobe Photoshop program. The pixel values (PVs) of individual outlined regions were obtained and recorded at an average speed of 27 seconds/region. All PV data were saved in an Excel file and their corresponding areas were calculated simultaneously by the formula Y (PV of the outlined region)/X (PV of standard area unit) × Z (area of standard unit). It took a researcher less than 3 hours to finish the area calculation of 340 regions. In contrast, over 3 hours were expended by three skillful researchers to accomplish the above work with the traditional transparency-based method. Moreover, unlike the results obtained traditionally, little variation was found among the data calculated by different persons and with standard area units of different sizes and shapes. Given its accurate, reproducible and efficient properties, this transparency-based digital imaging approach would be of significant value in basic wound healing research and clinical practice. PMID:22666449
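    The area formula quoted above reduces to a single ratio; a minimal sketch with made-up pixel counts (not data from the study) is shown below.

```python
# Minimal sketch of the area calculation described above: the pixel value (PV)
# of an outlined wound region is referenced to the PV of a standard area unit
# integrated into the same scanned sheet. Numbers are invented for illustration.
def wound_area_mm2(region_pv, standard_pv, standard_area_mm2):
    """Area = Y (PV of outlined region) / X (PV of standard unit) * Z (unit area)."""
    return region_pv / standard_pv * standard_area_mm2

# Hypothetical values: a 1 cm^2 reference square covering 40,000 pixels
print(wound_area_mm2(region_pv=15_300, standard_pv=40_000, standard_area_mm2=100.0))  # -> 38.25 mm^2
```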

  6. Simultaneous quantitation of sphingoid bases by UPLC-ESI-MS/MS with identical 13C-encoded internal standards.

    PubMed

    Mirzaian, M; Wisse, P; Ferraz, M J; Marques, A R A; Gaspar, P; Oussoren, S V; Kytidou, K; Codée, J D C; van der Marel, G; Overkleeft, H S; Aerts, J M

    2017-03-01

    Free sphingoid bases (lysosphingolipids) of primary storage sphingolipids are increased in tissues and plasma of several sphingolipidoses. As shown earlier by us, sphingoid bases can be accurately quantified using UPLC-ESI-MS/MS, particularly in combination with identical ¹³C-encoded internal standards. The feasibility of simultaneous quantitation of sphingoid bases in plasma specimens spiked with a mixture of such standards is here described. The sensitivity and linearity of detection are excellent for all examined sphingoid bases (sphingosine, sphinganine, hexosyl-sphingosine (glucosylsphingosine), hexosyl₂-sphingosine (lactosylsphingosine), hexosyl₃-sphingosine (globotriaosylsphingosine), phosphorylcholine-sphingosine) in the relevant concentration range, and the measurements show very acceptable intra- and inter-assay variation (<10% average). Plasma samples of a series of male and female Gaucher Disease and Fabry Disease patients were analyzed with the multiplex assay. The obtained data compare well to those earlier determined for plasma globotriaosylsphingosine and glucosylsphingosine in GD and FD patients. The same approach can also be applied to measure sphingolipids in the same sample. Following extraction of sphingolipids from the same sample, these can be converted to sphingoid bases by microwave exposure and subsequently quantified using ¹³C-encoded internal standards. Copyright © 2017 Elsevier B.V. All rights reserved.
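    Quantitation against a spiked ¹³C-encoded internal standard is conventionally a response-ratio calculation; the sketch below shows that generic calculation with hypothetical peak areas and concentrations, not values or response factors from the paper.

```python
# Illustrative isotope-dilution style calculation with a 13C-encoded internal
# standard: the analyte concentration is the analyte/standard peak-area ratio
# scaled by the known amount of spiked standard. All numbers are hypothetical.
def analyte_conc(analyte_area, istd_area, istd_conc_nM, response_factor=1.0):
    """Concentration of e.g. glucosylsphingosine from MRM peak areas."""
    return (analyte_area / istd_area) * istd_conc_nM * response_factor

print(analyte_conc(analyte_area=8.4e5, istd_area=2.1e6, istd_conc_nM=50.0))  # -> 20.0 nM
```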

  7. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-02

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards.

  8. A modified artificial immune system based pattern recognition approach -- an application to clinic diagnostics

    PubMed Central

    Zhao, Weixiang; Davis, Cristina E.

    2011-01-01

    Objective This paper introduces a modified artificial immune system (AIS)-based pattern recognition method to enhance the recognition ability of the existing conventional AIS-based classification approach and demonstrates the superiority of the proposed new AIS-based method via two case studies of breast cancer diagnosis. Methods and materials Conventionally, the AIS approach is often coupled with the k nearest neighbor (k-NN) algorithm to form a classification method called AIS-kNN. In this paper we discuss the basic principle and possible problems of this conventional approach, and propose a new approach in which AIS is integrated with radial basis function-partial least squares regression (AIS-RBFPLS). Additionally, both AIS-based approaches are compared with two classical and powerful machine learning methods, the back-propagation neural network (BPNN) and the orthogonal radial basis function network (Ortho-RBF network). Results The diagnosis results show that: (1) both AIS-kNN and AIS-RBFPLS proved to be good machine learning methods for clinical diagnosis, but the proposed AIS-RBFPLS generated an even lower misclassification ratio, especially in the cases where the conventional AIS-kNN approach generated poor classification results because of possible improper AIS parameters. For example, based upon the AIS memory cells of "replacement threshold = 0.3", the average misclassification ratios of the two approaches for study 1 are 3.36% (AIS-RBFPLS) and 9.07% (AIS-kNN), and the misclassification ratios for study 2 are 19.18% (AIS-RBFPLS) and 28.36% (AIS-kNN); (2) the proposed AIS-RBFPLS demonstrated its robustness in terms of the AIS-created memory cells, showing a smaller standard deviation of the results from the multiple trials than AIS-kNN. For example, using the result from the first set of AIS memory cells as an example, the standard deviations of the misclassification ratios for study 1 are 0.45% (AIS-RBFPLS) and 8.71% (AIS-kNN) and those for study 2 are 0.49% (AIS-RBFPLS) and 6.61% (AIS-kNN); and (3) the proposed AIS-RBFPLS classification approach also yielded better diagnosis results than the two classical neural network approaches, BPNN and the Ortho-RBF network. Conclusion In summary, this paper proposed a new machine learning method for complex systems by integrating the AIS system with RBFPLS. This new method demonstrates its satisfactory effect on classification accuracy for clinical diagnosis, and also indicates its wide potential applications to other diagnosis and detection problems. PMID:21515033
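    As context for the conventional coupling discussed above, the sketch below stands in scikit-learn's k-NN classifier over a hypothetical set of "memory cells" (here simply a random subsample of the training data); the immune-inspired cell generation and the proposed RBFPLS integration are not implemented.

```python
# Sketch of the conventional AIS-kNN coupling described above: an artificial
# immune system condenses the training data into "memory cells", and new samples
# are classified by k-NN over those cells. The immune-inspired cell generation
# is omitted; a random subsample stands in for the memory cells.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Placeholder for AIS training: pretend the memory cells are a reduced subset
rng = np.random.default_rng(0)
idx = rng.choice(len(X_train), size=len(X_train) // 3, replace=False)
memory_cells, memory_labels = X_train[idx], y_train[idx]

knn = KNeighborsClassifier(n_neighbors=5).fit(memory_cells, memory_labels)
print("misclassification ratio:", 1.0 - knn.score(X_test, y_test))
```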

  9. Does the form or the amount of exposure make a difference in the cognitive-behavioral therapy treatment of social phobia?

    PubMed

    Borgeat, François; Stankovic, Miroslava; Khazaal, Yasser; Rouget, Beatrice Weber; Baumann, Marie-Claude; Riquier, Françoise; O'Connor, Kieron; Jermann, Françoise; Zullino, Daniele; Bondolfi, Guido

    2009-07-01

    Exposure is considered to be an essential ingredient of cognitive-behavioral therapy treatment of social phobia and of most anxiety disorders. To assess the impact of the amount of exposure on outcome, 30 social phobic patients were randomly allocated to 1 of 2 group treatments of 8 weekly sessions: Self-Focused Exposure Therapy which is based essentially on prolonged exposure to public speaking combined with positive feedback or a more standard cognitive and behavioral method encompassing psychoeducation, cognitive work, working through exposure hierarchies of feared situations for exposure within and outside the group. The results show that the 2 methods led to significant and equivalent symptomatic improvements which were maintained at 1-year follow-up. There was a more rapid and initially more pronounced decrease in negative cognitions with the Self-Focused Exposure Therapy, which included no formal cognitive work, than with the more standard approach in which approximately a third of the content was cognitive. In contrast, decrease in social avoidance was more persistent with standard cognitive-behavior therapy which involved less exposure. The results indicate that positive cognitive change can be achieved more rapidly with non cognitive methods while avoidance decreases more reliably with a standard approach rather than an approach with an exclusive focus on exposure.

  10. Effective spatial database support for acquiring spatial information from remote sensing images

    NASA Astrophysics Data System (ADS)

    Jin, Peiquan; Wan, Shouhong; Yue, Lihua

    2009-12-01

    In this paper, a new approach to maintaining spatial information acquired from remote-sensing images is presented, based on an object-relational DBMS (ORDBMS). According to this approach, the detection and recognition results for targets are stored in an ORDBMS-based spatial database system, where they can be further accessed, and users can access the spatial information using the standard SQL interface. This approach differs from the traditional ArcSDE-based method because the spatial information management module is fully integrated into the DBMS and becomes one of its core modules. We focus on three issues, namely the general framework for the ORDBMS-based spatial database system, the definitions of the add-in spatial data types and operators, and the process of developing a spatial DataBlade on Informix. The results show that ORDBMS-based spatial database support for image-based target detection and recognition is easy and practical to implement.
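    As a purely hypothetical illustration of what the "standard SQL interface" looks like from client code once spatial types and operators live inside the DBMS, the snippet below builds a bounding-box query; the table, column, and OGC-style function names are invented and are not the DataBlade definitions from the paper.

```python
# Hypothetical illustration of the "standard SQL interface" idea: client code
# queries detected targets through ordinary SQL once spatial types and operators
# live inside the DBMS. Table, column, and function names here are invented for
# the example and are not the DataBlade definitions from the paper.
def build_query(min_x, min_y, max_x, max_y):
    # Targets recognized from remote-sensing images, filtered by a bounding box
    return (
        "SELECT target_id, target_class "
        "FROM detected_targets "
        f"WHERE st_intersects(footprint, st_makeenvelope({min_x}, {min_y}, {max_x}, {max_y}))"
    )

print(build_query(117.0, 31.5, 117.5, 32.0))
# The resulting string would be passed to any DB-API cursor, e.g. cursor.execute(...).
```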

  11. An Approach to Information Management for AIR7000 with Metadata and Ontologies

    DTIC Science & Technology

    2009-10-01

    metadata. We then propose an approach based on Semantic Technologies, including the Resource Description Framework (RDF) and Upper Ontologies, for the... mandating specific metadata schemas can result in interoperability problems. For example, many standards within the ADO mandate the use of XML for metadata... such problems, we propose an architecture in which different metadata schemes can interoperate. By using RDF (Resource Description Framework) as a
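    A small sketch of the RDF-based description the report advocates, written with the rdflib package; the namespace and property names are invented for illustration and do not come from the report.

```python
# Small RDF sketch (using the rdflib package) of the kind of metadata
# interoperation the report argues for: resources described as triples under a
# shared vocabulary. The namespace and property names below are invented.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/air7000/")
g = Graph()
image = URIRef("http://example.org/air7000/image/42")

g.add((image, RDF.type, EX.SurveillanceProduct))
g.add((image, EX.collectedBy, EX.MaritimePatrolSensor))
g.add((image, EX.classification, Literal("UNCLASSIFIED")))

print(g.serialize(format="turtle"))
```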

  12. Characterization, modeling and validation of the radiative transfer of non-standard atmospheres; impact on the atmospheric correction of remote sensing images

    NASA Astrophysics Data System (ADS)

    Zidane, Shems

    This study is based on data acquired with an airborne multi-altitude sensor in July 2004 during a non-standard atmospheric event in the region of Saint-Jean-sur-Richelieu, Quebec. By non-standard atmospheric event, we mean an aerosol atmosphere that does not obey the typical monotonic scale-height variation employed in virtually all atmospheric correction codes. The surfaces imaged during this field campaign included a diverse variety of targets: agricultural land, water bodies, urban areas and forests. The multi-altitude approach employed in this campaign allowed us to better understand the altitude-dependent influence of the atmosphere over the array of ground targets and thus to better characterize the perturbation induced by a non-standard (smoke) plume. The transformation of the apparent radiance at 3 different altitudes into apparent reflectance and the insertion of the plume optics into an atmospheric correction model permitted an atmospheric correction of the apparent reflectance at the two higher altitudes. The results showed consistency with the validation reflectances derived from the lowest-altitude radiances, which effectively confirmed the accuracy of our non-standard atmospheric correction approach. This test was particularly relevant at the highest altitude of 3.17 km: the apparent reflectances at this altitude were above most of the plume and therefore represented a good test of our ability to adequately correct for the influence of the perturbation. Standard atmospheric disturbances are taken into account in most atmospheric correction models, but these are based on monotonically decreasing aerosol variations with increasing altitude. When the atmospheric radiation is affected by a plume or a local, non-standard pollution event, one must adapt the existing models to the radiative transfer constraints of the local perturbation and to the reality of the measurable parameters available for ingestion into the model. The main inputs of this study were those normally used in an atmospheric correction: apparent at-sensor radiance and the aerosol optical depth (AOD) acquired using ground-based sunphotometry. The procedure we employed made use of a standard atmospheric correction code (CAM5S, for Canadian Modified 5S, which comes from the 5S radiative transfer model in the visible and near infrared); however, we also used other parameters and data to adapt and correctly model the special atmospheric situation which affected the multi-altitude images acquired during the St. Jean field campaign. We then developed a modeling protocol for these atmospheric perturbations in which auxiliary data were employed to complement our main dataset. This allowed for the development of a robust and simple methodology adapted to this atmospheric situation. The auxiliary data, i.e. meteorological data, LIDAR profiles, various satellite images and sun photometer retrievals of the scattering phase function, were sufficient to accurately model the observed plume in terms of an unusual vertical distribution. This distribution was transformed into an aerosol optical depth profile that replaced the standard aerosol optical depth profile employed in the CAM5S atmospheric correction model. Based on this model, a comparison between the apparent ground reflectances obtained after atmospheric corrections and validation values of R*(0) obtained from the lowest-altitude data showed that the error between the two was less than 0.01 rms.
This correction was shown to be a significantly better estimation of the surface reflectance than that obtained using the atmospheric correction model. Significant differences were nevertheless observed in the non-standard solution : these were mainly caused by the difficulties brought about by the acquisition conditions, significant disparities attributable to inconsistencies in the co-sampling / co-registration of different targets from three different altitudes, and possibly modeling errors and / or calibration. There is accordingly room for improvement in our approach to dealing with such conditions. The modeling and forecasting of such a disturbance is explicitly described in this document: our goal in so doing is to permit the establishment of a better protocol for the acquisition of more suitable supporting data. The originality of this study stems from a new approach for incorporating a plume structure into an operational atmospheric correction model and then demonstrating that the approach was a significant improvement over an approach that ignored the perturbations in the vertical profile while employing the correct overall AOD. The profile model we employed was simple and robust but captured sufficient plume detail to achieve significant improvements in atmospheric correction accuracy. The overall process of addressing all the problems encountered in the analysis of our aerosol perturbation helped us to build an appropriate methodology for characterizing such events based on data availability, distributed freely and accessible to the scientific community. This makes our study adaptable and exportable to other types of non-standard atmospheric events. Keywords : non-standard atmospheric perturbation, multi-altitude apparent radiances, smoke plume, Gaussian plume modelization, radiance fit, AOD, CASI
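    The radiance-to-apparent-reflectance conversion mentioned in this record is conventionally the standard top-of-atmosphere reflectance formula; the sketch below applies that textbook formula with made-up values and is not the CAM5S correction or the plume-profile modeling from the study.

```python
# The radiance-to-apparent-reflectance step is conventionally computed with the
# standard formula rho* = pi * L * d^2 / (E_sun * cos(theta_s)). This sketch
# applies that textbook conversion with invented values; it is not the CAM5S
# correction or the plume-profile modeling described in the study.
import math

def apparent_reflectance(L, esun, sun_zenith_deg, d_au=1.0):
    """L: at-sensor radiance [W m-2 sr-1 um-1]; esun: solar irradiance [W m-2 um-1]."""
    return math.pi * L * d_au ** 2 / (esun * math.cos(math.radians(sun_zenith_deg)))

print(round(apparent_reflectance(L=80.0, esun=1550.0, sun_zenith_deg=35.0), 4))
```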

  13. Laparoscopic and Robotic Total Mesorectal Excision in the Treatment of Rectal Cancer. Brief Review and Personal Remarks

    PubMed Central

    Bianchi, Paolo Pietro; Petz, Wanda; Luca, Fabrizio; Biffi, Roberto; Spinoglio, Giuseppe; Montorsi, Marco

    2014-01-01

    The current standard treatment for rectal cancer is based on a multimodality approach with preoperative radiochemotherapy in advanced cases and complete surgical removal through total mesorectal excision (TME). The most frequent surgical approach is traditional open surgery, as laparoscopic TME requires high technical skill, a long learning curve, and is not widespread, still being confined to centers with great experience in minimally invasive techniques. Nevertheless, in several studies, the laparoscopic approach, when compared to open surgery, has shown some better short-term clinical outcomes and at least comparable oncologic results. Robotic surgery for the treatment of rectal cancer is an emerging technique, which could overcome some of the technical difficulties posed by standard laparoscopy, but evidence from the literature regarding its oncologic safety and clinical outcomes is still lacking. This brief review analyses the current status of minimally invasive surgery for rectal cancer therapy, focusing on oncologic safety and the new robotic approach. PMID:24834429

  14. Content-based image retrieval for interstitial lung diseases using classification confidence

    NASA Astrophysics Data System (ADS)

    Dash, Jatindra Kumar; Mukhopadhyay, Sudipta; Prabhakar, Nidhi; Garg, Mandeep; Khandelwal, Niranjan

    2013-02-01

    A Content-Based Image Retrieval (CBIR) system could exploit the wealth of High-Resolution Computed Tomography (HRCT) data stored in the archive by finding similar images to assist radiologists in self-learning and differential diagnosis of Interstitial Lung Diseases (ILDs). HRCT findings of ILDs are classified into several categories (e.g. consolidation, emphysema, ground glass, nodular, etc.) based on their texture-like appearances. Therefore, analysis of ILDs is considered a texture analysis problem. Many approaches have been proposed for CBIR of lung images using texture as the primitive visual content. This paper presents a new approach to CBIR for ILDs. The proposed approach makes use of a trained neural network (NN) to find the output class label of the query image. The degree of confidence of the NN classifier is analyzed using a Naive Bayes classifier that dynamically decides the size of the search space to be used for retrieval. The proposed approach is compared with three simple distance-based texture retrieval approaches and one classifier-based approach. Experimental results show that the proposed technique achieved the highest average precision of 92.60% with the lowest standard deviation of 20.82%.
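    The confidence-driven search-space idea can be illustrated with a simple stand-in: a classifier predicts the query's class and, when its confidence is high, retrieval is restricted to that class. The fixed threshold and synthetic data below are assumptions for illustration; the paper's Naive Bayes decision rule and HRCT texture features are not reproduced.

```python
# Illustrative sketch of confidence-driven retrieval: a trained classifier
# predicts the query's class, and the size of the search space depends on how
# confident it is. The fixed threshold stands in for the Naive Bayes decision
# used in the paper; the data and features are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=20, n_classes=4,
                           n_informative=8, random_state=0)
archive_X, archive_y, query = X[:-1], y[:-1], X[-1:]

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(archive_X, archive_y)
proba = clf.predict_proba(query)[0]
pred, conf = int(np.argmax(proba)), float(np.max(proba))

# Confident prediction -> search only that texture class; otherwise search all images
candidates = np.where(archive_y == pred)[0] if conf > 0.8 else np.arange(len(archive_X))
dists = np.linalg.norm(archive_X[candidates] - query, axis=1)
top5 = candidates[np.argsort(dists)[:5]]
print(f"class={pred} confidence={conf:.2f} retrieved={top5}")
```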

  15. A BRIEF ORAL OVERVIEW OF ENVIRONMENTAL ECONOMICS

    EPA Science Inventory

    A brief 1 hour oral presentation to professional staff of Cincinnati Nature Center is intended to provide a lay audience with a general understanding of how market-based approaches to environmental protection can meet (or exceed) regulatory efforts at enforcing pollution standard...

  16. Transportation performance measures for outcome based system management and monitoring.

    DOT National Transportation Integrated Search

    2014-09-01

    The Oregon Department of Transportation (ODOT) is mature in its development and use of performance measures; however, there was not a standard approach for selecting measures nor for evaluating whether existing ones were used to inform decision-making. Thi...

  17. The SGML Standardization Framework and the Introduction of XML

    PubMed Central

    Grütter, Rolf

    2000-01-01

    Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future. PMID:11720931

  18. The SGML standardization framework and the introduction of XML.

    PubMed

    Fierz, W; Grütter, R

    2000-01-01

    Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future.

  19. Enhanced sensitivity and multiplexing with 2D LC/MRM-MS and labeled standards for deeper and more comprehensive protein quantitation.

    PubMed

    Percy, Andrew J; Simon, Romain; Chambers, Andrew G; Borchers, Christoph H

    2014-06-25

    Mass spectrometry (MS)-based protein quantitation is increasingly being employed to verify candidate protein biomarkers. Multiple or selected reaction monitoring-mass spectrometry (MRM-MS or SRM-MS) with isotopically labeled internal standards has proven to be a successful approach in that regard, but has yet to reach its full potential in terms of multiplexing and sensitivity. Here, we report the development of a new MRM method for the quantitation of 253 disease-associated proteins (represented by 625 interference-free peptides) in 13 LC fractions. This 2D RPLC/MRM-MS approach extends the depth and breadth of the assay by 2 orders of magnitude over pre-fractionation-free assays, with 31 proteins below 10 ng/mL and 41 proteins above 10 ng/mL now quantifiable. Standard flow rates are used in both chromatographic dimensions, and up-front depletion or antibody-based enrichment is not required. The LC separations utilize high and low pH conditions, with the former employing an ammonium hydroxide-based eluent, instead of the conventional ammonium formate, resulting in improved LC column lifetime and performance. The high sensitivity (determined concentration range: 15 mg/mL to 452 pg/mL) and robustness afforded by this method makes the full MRM panel, or subsets thereof, useful for the verification of disease-associated plasma protein biomarkers in patient samples. The described research extends the breadth and depth of protein quantitation in undepleted and non-enriched human plasma by employing standard-flow 2D RPLC/MRM-MS in conjunction with a complex mixture of isotopically labeled peptide standards. The proteins quantified are mainly putative biomarkers of non-communicable (i.e., non-infectious) disease (e.g., cardiovascular or cancer), which require pre-clinical verification and validation before clinical implementation. Based on the enhanced sensitivity and multiplexing, this quantitative plasma proteomic method should prove useful in future candidate biomarker verification studies. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. 7 CFR 220.8 - Nutrition standards and menu planning approaches for breakfasts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 Nutrition standards and menu planning approaches for... NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE, CHILD NUTRITION PROGRAMS, SCHOOL BREAKFAST PROGRAM § 220.8 Nutrition standards and menu planning approaches for breakfasts. (a) What are the nutrition standards for...
