A Decision Tree for Nonmetric Sex Assessment from the Skull.
Langley, Natalie R; Dudzik, Beatrix; Cloutier, Alesia
2018-01-01
This study uses five well-documented cranial nonmetric traits (glabella, mastoid process, mental eminence, supraorbital margin, and nuchal crest) and one additional trait (zygomatic extension) to develop a validated decision tree for sex assessment. The decision tree was built and cross-validated on a sample of 293 U.S. White individuals from the William M. Bass Donated Skeletal Collection. Ordinal scores from the six traits were analyzed using the partition modeling option in JMP Pro 12. A holdout sample of 50 skulls was used to test the model. The most accurate decision tree includes three variables: glabella, zygomatic extension, and mastoid process. This decision tree yielded 93.5% accuracy on the training sample, 94% on the cross-validated sample, and 96% on a holdout validation sample. Linear weighted kappa statistics indicate acceptable agreement among observers for these variables. Mental eminence should be avoided, and definitions and figures should be referenced carefully to score nonmetric traits. © 2017 American Academy of Forensic Sciences.
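To make the modeling step concrete, here is a minimal sketch, assuming invented ordinal trait scores and labels (not the study's data or its JMP partition model), of fitting and cross-validating a decision tree on three traits such as glabella, zygomatic extension, and mastoid process:

```python
# Hypothetical sketch: a decision tree on ordinal cranial trait scores (1-5).
# The scores, labels, and accuracy here are illustrative, not the study's results.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X = np.array([[4, 3, 4], [2, 1, 2], [5, 3, 5],      # columns: glabella,
              [1, 2, 1], [3, 3, 4], [2, 2, 1]])      # zygomatic ext., mastoid
y = np.array([1, 0, 1, 0, 1, 0])                     # 1 = male, 0 = female

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
print("cross-validated accuracy:", cross_val_score(tree, X, y, cv=2).mean())
tree.fit(X, y)                                       # final model on all data
```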
If Anything Can Go Wrong, Maybe It Will.
ERIC Educational Resources Information Center
Wager, Jane C.; Rayner, Gail T.
Thirty personnel involved in various stages of the Training Extension Course (TEC) design, development, and distribution process were interviewed by telephone to determine the major problems perceived within each stage of the program, which provides validated extension training wherever U.S. soldiers are stationed. Those interviewed included TEC…
USDA-ARS?s Scientific Manuscript database
Watershed simulation models are used extensively to investigate hydrologic processes, landuse and climate change impacts, pollutant load assessments and best management practices (BMPs). Developing, calibrating and validating these models require a number of critical decisions that will influence t...
Romero-Franco, Natalia; Jiménez-Reyes, Pedro; Montaño-Munuera, Juan A
2017-11-01
Lower limb isometric strength is a key parameter for monitoring the training process or recognising muscle weakness and injury risk. However, valid and reliable methods to evaluate it often require high-cost tools. The aim of this study was to analyse the concurrent validity and reliability of a low-cost digital dynamometer for measuring isometric strength in the lower limb. Eleven physically active and healthy participants performed maximal isometric strength tests for flexion and extension of the ankle; flexion and extension of the knee; and flexion, extension, adduction, abduction, and internal and external rotation of the hip. Data obtained with the digital dynamometer were compared with the isokinetic dynamometer to examine concurrent validity. Data obtained with the digital dynamometer by 2 different evaluators and in 2 different sessions were compared to examine inter-rater and intra-rater reliability. Intra-class correlation (ICC) for validity was excellent for every movement (ICC > 0.9). Intra- and inter-tester reliability was excellent for all the movements assessed (ICC > 0.75). The low-cost digital dynamometer demonstrated strong concurrent validity and excellent intra- and inter-tester reliability for assessing isometric strength in the main lower limb movements.
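As a rough illustration of the reliability statistic used above, the sketch below computes a two-way, absolute-agreement intraclass correlation, ICC(2,1), from an invented subjects-by-raters score matrix; it is the standard formula, not the authors' analysis pipeline:

```python
# Hedged sketch: ICC(2,1) from a (subjects x raters/devices) score matrix.
# The strength values below are invented for illustration.
import numpy as np

def icc_2_1(scores):
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

scores = np.array([[102.0, 99.5], [87.2, 90.1], [120.3, 118.8], [95.0, 96.4]])
print(icc_2_1(scores))
```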
Development and Validation of the Career Competencies Indicator (CCI)
ERIC Educational Resources Information Center
Francis-Smythe, Jan; Haase, Sandra; Thomas, Erica; Steele, Catherine
2013-01-01
This article describes the development and validation of the Career Competencies Indicator (CCI); a 43-item measure to assess career competencies (CCs). Following an extensive literature review, a comprehensive item generation process involving consultation with subject matter experts, a pilot study and a factor analytic study on a large sample…
Validation of a 30-year-old process for the manufacture of L-asparaginase from Erwinia chrysanthemi.
Gervais, David; Allison, Nigel; Jennings, Alan; Jones, Shane; Marks, Trevor
2013-04-01
A 30-year-old manufacturing process for the biologic product L-asparaginase from the plant pathogen Erwinia chrysanthemi was rigorously qualified and validated, with a high level of agreement between validation data and the 6-year process database. L-Asparaginase exists in its native state as a tetrameric protein and is used as a chemotherapeutic agent in the treatment regimen for Acute Lymphoblastic Leukaemia (ALL). The manufacturing process involves fermentation of the production organism, extraction and purification of the L-asparaginase to make drug substance (DS), and finally formulation and lyophilisation to generate drug product (DP). The extensive manufacturing experience with the product was used to establish ranges for all process parameters and product quality attributes. The product and in-process intermediates were rigorously characterised, and new assays, such as size-exclusion and reversed-phase UPLC, were developed, validated, and used to analyse several pre-validation batches. Finally, three prospective process validation batches were manufactured and product quality data generated using both the existing and the new analytical methods. These data demonstrated the process to be robust, highly reproducible and consistent, and the validation was successful, contributing to the granting of an FDA product license in November, 2011.
DOT National Transportation Integrated Search
2016-08-01
This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee provided oversight of the process. The research process included extensive statistical analyses of test data supplied by SCDOT. A total of 2,789 AC tes...
On the validation of a code and a turbulence model appropriate to circulation control airfoils
NASA Technical Reports Server (NTRS)
Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.
1988-01-01
A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.
Constraint processing in our extensible language for cooperative imaging system
NASA Astrophysics Data System (ADS)
Aoki, Minoru; Murao, Yo; Enomoto, Hajime
1996-02-01
The extensible WELL (Window-based elaboration language) has been developed using the concept of a common platform, where client and server can communicate with each other with support from a communication manager. This extensible language is based on an object-oriented design and introduces constraint processing. Any kind of service in the extensible language, including imaging, is controlled by constraints. Interactive functions between client and server are extended by introducing agent functions, including a request-respond relation. Necessary service integrations are satisfied by cooperative processes using constraints. Constraints are treated similarly to data, because the system should remain flexible in executing many kinds of services. The corresponding control process is defined using intensional logic. There are two kinds of constraints, temporal and modal constraints. By rendering the constraints, the predicate format, as a relation between attribute values, can warrant the validity of entities as data. As an imaging example, a processing procedure for interaction between multiple objects is shown as an image application for the extensible system. This paper describes how the procedure proceeds in the system and how the constraints work for generating moving pictures.
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
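The following is an illustrative sketch, not LIVVkit's actual API, of the simplest check named above: a bit-for-bit comparison of a model output array against a reference array (file names are hypothetical):

```python
# Hedged sketch of a bit-for-bit regression check on a model variable.
import numpy as np

def bit_for_bit(test, reference, name="thickness"):
    if test.shape != reference.shape:
        return f"{name}: shape mismatch {test.shape} vs {reference.shape}"
    diff = test != reference            # exact equality is what bit-for-bit means
    if not diff.any():
        return f"{name}: bit-for-bit match"
    first = tuple(np.argwhere(diff)[0])
    return f"{name}: {int(diff.sum())} differing cells, first at index {first}"

ref = np.load("reference_thickness.npy")   # hypothetical reference data
out = np.load("model_thickness.npy")       # hypothetical new model output
print(bit_for_bit(out, ref))
```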
Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang
1999-01-01
Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
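A minimal sketch of the validation step described above, assuming hypothetical file names and using the lxml library (the original work used its own XML validating parser, so this is only an illustration of the same idea):

```python
# Hedged sketch: check an NLP-generated XML report against a DTD.
from lxml import etree

dtd = etree.DTD(open("clinical_report.dtd", "rb"))   # hypothetical DTD file
doc = etree.parse("report_001.xml")                  # hypothetical NLP output

if dtd.validate(doc):
    print("valid XML, consistent with the DTD")
else:
    for error in dtd.error_log.filter_from_errors():
        print(error.message)
```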
DPOD2014: a new DORIS extension of ITRF2014 for Precise Orbit Determination
NASA Astrophysics Data System (ADS)
Moreaux, G.; Willis, P.; Lemoine, F. G.; Zelensky, N. P.
2016-12-01
DORIS is one of the tracking systems used to determine the orbits of the altimeter mission satellites (such as TOPEX/Poseidon, Envisat, Jason-1/2/3 and Cryosat-2), so the positions of the DORIS tracking stations provide a fundamental reference for the estimation of the precise orbits and, by extension, for the quality of the altimeter data and derived products. Therefore, the time evolution of the position of both the existing and the newest DORIS stations must be precisely modeled and regularly updated. To satisfy operational requirements for precise orbit determination and routine delivery of geodetic products, the International DORIS Service maintains the so-called DPOD solutions, which can be seen as extensions of the latest available ITRF solution from the International Earth Rotation and Reference Systems Service (IERS). In mid-2016, the IDS agreed to change the processing strategy of the DPOD solution. The new solution from the IDS Combination Center (CC) consists of a DORIS cumulative position and velocity solution using the latest IDS combined weekly solutions. The first objective of this study is to describe the new DPOD elaboration scheme and to show the IDS CC internal validation steps. The second purpose is to present the external validation process carried out by an external team before the new DPOD is made available to all users. The elaboration and validation procedures will be illustrated by the presentation of the first version of DPOD2014 (the ITRF2014 DORIS extension), and focus will be given to the update of the position and velocity of two DORIS sites: Everest (after the M7.8 Gorkha earthquake in April 2015) and Thule (Greenland).
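For context, a cumulative position-and-velocity solution is typically applied by linear extrapolation from a reference epoch; the sketch below shows that arithmetic with invented coordinates, not actual DPOD2014 values:

```python
# Minimal sketch: extrapolate a station position from a cumulative solution.
import numpy as np

def station_position(x_ref, v, t, t_ref=2010.0):
    """x_ref in meters, v in meters/year, epochs in decimal years."""
    return x_ref + v * (t - t_ref)

x_ref = np.array([4027893.123, 307045.876, 4919475.432])  # illustrative XYZ [m]
v = np.array([-0.0132, 0.0175, 0.0089])                    # illustrative [m/yr]
print(station_position(x_ref, v, t=2016.5))
```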
Recent Improvements in the FDNS CFD Code and its Associated Process
NASA Technical Reports Server (NTRS)
West, Jeff S.; Dorney, Suzanne M.; Turner, Jim (Technical Monitor)
2002-01-01
This viewgraph presentation gives an overview of recent improvements in the Finite Difference Navier Stokes (FDNS) computational fluid dynamics (CFD) code and its associated process. The development of a utility, PreViewer, has essentially eliminated the creep of simple human error into the FDNS solution process. Extension of PreViewer to encapsulate the domain decomposition process has made the routine use of parallel processing practical. The combination of CVS source control and ATS consistency validation significantly increases the efficiency of the CFD process.
Mayorga-Vega, Daniel; Merino-Marban, Rafael; Viciana, Jesús
2014-01-01
The main purpose of the present meta-analysis was to examine the scientific literature on the criterion-related validity of sit-and-reach tests for estimating hamstring and lumbar extensibility. For this purpose relevant studies were searched from seven electronic databases dated up through December 2012. Primary outcomes of criterion-related validity were Pearson's zero-order correlation coefficients (r) between sit-and-reach tests and hamstring and/or lumbar extensibility criterion measures. Then, from the included studies, the Hunter-Schmidt psychometric meta-analysis approach was conducted to estimate population criterion-related validity of sit-and-reach tests. Firstly, the corrected correlation mean (rp), unaffected by statistical artefacts (i.e., sampling error and measurement error), was calculated separately for each sit-and-reach test. Subsequently, the three potential moderator variables (sex of participants, age of participants, and level of hamstring extensibility) were examined by a partially hierarchical analysis. Of the 34 studies included in the present meta-analysis, 99 correlation values across eight sit-and-reach tests and 51 across seven sit-and-reach tests were retrieved for hamstring and lumbar extensibility, respectively. The overall results showed that all sit-and-reach tests had a moderate mean criterion-related validity for estimating hamstring extensibility (rp = 0.46-0.67), but a low mean for estimating lumbar extensibility (rp = 0.16-0.35). Generally, females, adults and participants with high levels of hamstring extensibility tended to have greater mean values of criterion-related validity for estimating hamstring extensibility. When the use of angular tests is limited, such as in a school setting or in large-scale studies, scientists and practitioners could use the sit-and-reach tests as a useful alternative for hamstring extensibility estimation, but not for estimating lumbar extensibility. Key Points: Overall sit-and-reach tests have a moderate mean criterion-related validity for estimating hamstring extensibility, but they have a low mean validity for estimating lumbar extensibility. Among all the sit-and-reach test protocols, the Classic sit-and-reach test seems to be the best option to estimate hamstring extensibility. End scores (e.g., the Classic sit-and-reach test) are a better indicator of hamstring extensibility than the modifications that incorporate fingers-to-box distance (e.g., the Modified sit-and-reach test). When angular tests such as straight leg raise or knee extension tests cannot be used, sit-and-reach tests seem to be a useful field test alternative to estimate hamstring extensibility, but not to estimate lumbar extensibility. PMID:24570599
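As a rough illustration of the Hunter-Schmidt approach mentioned above, the sketch below computes a sample-size-weighted mean correlation and the classic correction for attenuation due to measurement error; the correlations, sample sizes, and reliabilities are invented, not values from the meta-analysis:

```python
# Hedged sketch of psychometric meta-analysis corrections.
import numpy as np

r = np.array([0.55, 0.62, 0.48])     # observed validity coefficients (invented)
n = np.array([40, 120, 65])          # study sample sizes (invented)
rxx, ryy = 0.90, 0.85                # illustrative reliabilities of the measures

r_bar = np.average(r, weights=n)     # weighted mean handles sampling error
rho = r_bar / np.sqrt(rxx * ryy)     # corrected (disattenuated) correlation
print(round(r_bar, 3), round(rho, 3))
```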
A simplified conjoint recognition paradigm for the measurement of gist and verbatim memory.
Stahl, Christoph; Klauer, Karl Christoph
2008-05-01
The distinction between verbatim and gist memory traces has furthered the understanding of numerous phenomena in various fields, such as false memory research, research on reasoning and decision making, and cognitive development. To measure verbatim and gist memory empirically, an experimental paradigm and multinomial measurement model has been proposed but rarely applied. In the present article, a simplified conjoint recognition paradigm and multinomial model is introduced and validated as a measurement tool for the separate assessment of verbatim and gist memory processes. A Bayesian metacognitive framework is applied to validate guessing processes. Extensions of the model toward incorporating the processes of phantom recollection and erroneous recollection rejection are discussed.
Translating the short version of the Perinatal Grief Scale: process and challenges.
Capitulo, K L; Cornelio, M A; Lenz, E R
2001-08-01
Non-English-speaking populations may be excluded from rigorous clinical research because of the lack of reliable and valid instrumentation to measure psychosocial variables. The purpose of this article is to describe the process and challenges of translating a research instrument. The process is illustrated with the project of translating the Short Version of the Perinatal Grief Scale, which has been extensively studied in English-speaking, primarily Caucasian populations, into Spanish. Translation methods, errors, and tips are included. Tools cannot be used in transcultural research and practice without careful and accurate translation and subsequent psychometric evaluation, which are essential to generate credible and valid findings. Copyright 2001 by W.B. Saunders Company
Eliciting design patterns for e-learning systems
NASA Astrophysics Data System (ADS)
Retalis, Symeon; Georgiakakis, Petros; Dimitriadis, Yannis
2006-06-01
Design pattern creation, especially in the e-learning domain, is a highly complex process that has not been sufficiently studied and formalized. In this paper, we propose a systematic pattern development cycle, whose most important aspects focus on reverse engineering of existing systems in order to elicit features that are cross-validated through the use of appropriate, authentic scenarios. However, an iterative pattern process is proposed that takes advantage of multiple data sources, thus emphasizing a holistic view of the teaching-learning processes. The proposed schema of pattern mining has been extensively validated for Asynchronous Network Supported Collaborative Learning (ANSCL) systems, as well as for other types of tools in a variety of scenarios, with promising results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...
2017-03-23
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Valid and Reliable Science Content Assessments for Science Teachers
NASA Astrophysics Data System (ADS)
Tretter, Thomas R.; Brown, Sherri L.; Bush, William S.; Saderholm, Jon C.; Holmes, Vicki-Lynn
2013-03-01
Science teachers' content knowledge is an important influence on student learning, highlighting an ongoing need for programs, and assessments of those programs, designed to support teacher learning of science. Valid and reliable assessments of teacher science knowledge are needed for direct measurement of this crucial variable. This paper describes multiple sources of validity and reliability (Cronbach's alpha greater than 0.8) evidence for physical, life, and earth/space science assessments—part of the Diagnostic Teacher Assessments of Mathematics and Science (DTAMS) project. Validity was strengthened by systematic synthesis of relevant documents, extensive use of external reviewers, and field tests with 900 teachers during the assessment development process. Subsequent results from 4,400 teachers, analyzed with Rasch IRT modeling techniques, offer construct and concurrent validity evidence.
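For reference, the internal-consistency statistic cited above can be computed as in the short sketch below; the item responses are invented and this is the textbook formula, not the DTAMS analysis code:

```python
# Hedged sketch: Cronbach's alpha from a (respondents x items) score matrix.
import numpy as np

def cronbach_alpha(items):
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

scores = np.array([[1, 1, 0, 1], [1, 0, 1, 1], [0, 0, 0, 1],
                   [1, 1, 1, 1], [0, 1, 0, 0]])
print(cronbach_alpha(scores))
```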
NASA Astrophysics Data System (ADS)
Strippoli, L. S.; Gonzalez-Arjona, D. G.
2018-04-01
GMV has worked extensively on activities aimed at developing, validating, and verifying, up to TRL-6, advanced GNC and IP algorithms for Mars Sample Return rendezvous, working under different ESA contracts on the development of advanced algorithms for the VBN sensor.
USDA-ARS?s Scientific Manuscript database
Although numerous chemical interventions have been implemented and validated to decontaminate meat and meat products during the harvesting process, more novel technologies are under development. UV light ionizing irradiation has been used extensively in pharmaceutical and medical device companies to...
An Assessment of the Myers-Briggs Type Indicator
ERIC Educational Resources Information Center
Carlyn, Marcia
1977-01-01
The Myers-Briggs Type Indicator is a self-report inventory developed to measure variables in Carl Jung's personality typology. The four personality scales measured by the instrument and the scoring process are described, and an extensive review of the intercorrelation, reliability, and validity research is presented. (Author/MV)
Determining Content Validity for the Transition Awareness and Possibilities Scale (TAPS)
ERIC Educational Resources Information Center
Ross, Melynda Burck
2011-01-01
The Transition Awareness & Possibilities Scale (TAPS) was crafted after an extensive review of literature was conducted to find research that examined and described specific aspects of transition programming: inputs, including supports and skill instruction; processes, including parent and support provider perceptions of the transition experience;…
NASA Technical Reports Server (NTRS)
Woodbury, Sarah K.
2008-01-01
The introduction of United Space Alliance's Human Engineering Modeling and Performance Laboratory began in early 2007 in an attempt to address the problematic workspace design issues that the Space Shuttle has imposed on technicians performing maintenance and inspection operations. The Space Shuttle was not expected to require the extensive maintenance it undergoes between flights. As a result, extensive, costly resources have been expended on workarounds and modifications to accommodate ground processing personnel. Consideration of basic human factors principles for design of maintenance is essential during the design phase of future space vehicles, facilities, and equipment. Simulation will be needed to test and validate designs before implementation.
TMATS/ IHAL/ DDML Schema Validation
2017-02-01
The task was to create a method for performing IRIG eXtensible Markup Language (XML) schema validation, as opposed to XML instance document validation, for TMATS, IHAL, and DDML (Data Display Markup Language) documents.
Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins
NASA Technical Reports Server (NTRS)
Brenner, Marty; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.
CMOS array design automation techniques. [metal oxide semiconductors
NASA Technical Reports Server (NTRS)
Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.
1975-01-01
A low-cost, quick-turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations, particularly high circuit speed.
ERIC Educational Resources Information Center
Fountain, Lily
2011-01-01
This cross-sectional descriptive study of the Model of Domain Learning, which describes learners' progress from acclimation through competence to proficiency through the interplay of knowledge, interest and strategic processing/critical thinking (CT), examined its extension to maternity nursing. Based on the identified need for valid, reliable…
Burris, Silas E.; Brown, Danielle D.
2014-01-01
Narratives, also called stories, can be found in conversations, children's play interactions, reading material, and television programs. From infancy to adulthood, narrative comprehension processes interpret events and inform our understanding of physical and social environments. These processes have been extensively studied to ascertain the multifaceted nature of narrative comprehension. From this research we know that three overlapping processes (i.e., knowledge integration, goal structure understanding, and causal inference generation) proposed by the constructionist paradigm are necessary for narrative comprehension, narrative comprehension has a predictive relationship with children's later reading performance, and comprehension processes are generalizable to other contexts. Much of the previous research has emphasized internal and predictive validity, thus limiting the generalizability of previous findings. We are concerned these limitations may be excluding underrepresented populations from benefits and implications identified by early comprehension processes research. This review identifies gaps in extant literature regarding external validity and argues for increased emphasis on externally valid research. We highlight limited research on narrative comprehension processes in children from low-income and minority populations, and argue for changes in comprehension assessments. Specifically, we argue both on- and off-line assessments should be used across various narrative types (e.g., picture books, televised narratives) with traditionally underserved and underrepresented populations. We propose that increasing the generalizability of narrative comprehension processes research can inform persistent reading achievement gaps and has practical implications for how children learn from narratives. PMID:24659973
Bicarbonate of soda paint stripping process validation and material characterization
NASA Technical Reports Server (NTRS)
Haas, Michael N.
1995-01-01
The Aircraft Production Division at San Antonio Air Logistics Center has conducted extensive investigation into the replacement of hazardous chemicals in aircraft component cleaning, degreasing, and depainting. One of the most viable solutions is process substitution utilizing abrasive techniques. SA-ALC has incorporated the use of Bicarbonate of Soda Blasting as one such substitution. Previous utilization of methylene chloride based chemical strippers and carbon removal agents has been replaced by a walk-in blast booth in which we remove carbon from engine nozzles and various gas turbine engine parts, depaint cowlings, and perform various other functions on a variety of parts. Prior to implementation of this new process, validation of the process was performed, and materials and waste stream characterization studies were conducted. These characterization studies examined the effects of the blasting process on the integrity of the thin-skinned aluminum substrates, the effects of the process on both air emissions and effluent disposal, and the effects on the personnel exposed to the process.
On-Line Robust Modal Stability Prediction using Wavelet Processing
NASA Technical Reports Server (NTRS)
Brenner, Martin J.; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time- frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time- varying nonstationary test points.
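An illustrative sketch of nonparametric wavelet filtering of the kind described (a generic soft-thresholding scheme on a synthetic signal, not the paper's actual procedure or its flight data):

```python
# Hedged sketch: wavelet denoising of a noisy signal with PyWavelets.
import numpy as np
import pywt

t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)  # synthetic data

coeffs = pywt.wavedec(signal, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate, finest scale
thresh = sigma * np.sqrt(2 * np.log(signal.size))     # universal threshold
denoised = pywt.waverec(
    [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]],
    "db4")
```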
Chafetz, M D; Williams, M A; Ben-Porath, Y S; Bianchini, K J; Boone, K B; Kirkwood, M W; Larrabee, G J; Ord, J S
2015-01-01
The milestone publication by Slick, Sherman, and Iverson (1999) of criteria for determining malingered neurocognitive dysfunction led to extensive research on validity testing. Position statements by the National Academy of Neuropsychology and the American Academy of Clinical Neuropsychology (AACN) recommended routine validity testing in neuropsychological evaluations. Despite this widespread scientific and professional support, the Social Security Administration (SSA) continued to discourage validity testing, a stance that led to a congressional initiative for SSA to reevaluate their position. In response, SSA commissioned the Institute of Medicine (IOM) to evaluate the science concerning the validation of psychological testing. The IOM concluded that validity assessment was necessary in psychological and neuropsychological examinations (IOM, 2015 ). The AACN sought to provide independent expert guidance and recommendations concerning the use of validity testing in disability determinations. A panel of contributors to the science of validity testing and its application to the disability process was charged with describing why the disability process for SSA needs improvement, and indicating the necessity for validity testing in disability exams. This work showed how the determination of malingering is a probability proposition, described how different types of validity tests are appropriate, provided evidence concerning non-credible findings in children and low-functioning individuals, and discussed the appropriate evaluation of pain disorders typically seen outside of mental consultations. A scientific plan for validity assessment that additionally protects test security is needed in disability determinations and in research on classification accuracy of disability decisions.
DEVA: An extensible ontology-based annotation model for visual document collections
NASA Astrophysics Data System (ADS)
Jelmini, Carlo; Marchand-Maillet, Stephane
2003-01-01
The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from being perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, which is an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.
The cross-cultural equivalence of participation instruments: a systematic review.
Stevelink, S A M; van Brakel, W H
2013-07-01
Concepts such as health-related quality of life, disability and participation may differ across cultures. Consequently, when assessing such a concept using a measure developed elsewhere, it is important to test its cultural equivalence. Previous research suggested a lack of cultural equivalence testing in several areas of measurement. This paper reviews the process of cross-cultural equivalence testing of instruments to measure participation in society. An existing cultural equivalence framework was adapted and used to assess participation instruments on five categories of equivalence: conceptual, item, semantic, measurement and operational equivalence. For each category, several aspects were rated, resulting in an overall category rating of 'minimal/none', 'partial' or 'extensive'. The best possible overall study rating was five 'extensive' ratings. Articles were included if the instruments focussed explicitly on measuring 'participation' and were theoretically grounded in the ICIDH(-2) or ICF. Cross-validation articles were only included if they concerned an adaptation of an instrument developed in a high or middle-income country to a low-income country or vice versa. Eight cross-cultural validation studies were included in which five participation instruments were tested (Impact on Participation and Autonomy, London Handicap Scale, Perceived Impact and Problem Profile, Craig Handicap Assessment Reporting Technique, Participation Scale). Of these eight studies, only three received at least two 'extensive' ratings for the different categories of equivalence. The majority of the cultural equivalence ratings given were 'partial' and 'minimal/none'. The majority of the 'none/minimal' ratings were given for item and measurement equivalence. The cross-cultural equivalence testing of the participation instruments included leaves much to be desired. A detailed checklist is proposed for designing a cross-validation study. Once a study has been conducted, the checklist can be used to ensure comprehensive reporting of the validation (equivalence) testing process and its results. • Participation instruments are often used in a different cultural setting than they were initially developed for. • The conceptualization of participation may vary across cultures. Therefore, cultural equivalence – the extent to which an instrument is equally suitable for use in two or more cultures – is an important concept to address. • This review showed that the process of cultural equivalence testing of the included participation instruments was often addressed insufficiently. • Clinicians should be aware that application of participation instruments in a different culture than the one they were initially developed for needs prior testing of cultural validity in the new context.
A framework for the direct evaluation of large deviations in non-Markovian processes
NASA Astrophysics Data System (ADS)
Cavallaro, Massimo; Harris, Rosemary J.
2016-11-01
We propose a general framework to simulate stochastic trajectories with arbitrarily long memory dependence and efficiently evaluate large deviation functions associated with time-extensive observables. This extends the ‘cloning’ procedure of Giardiná et al (2006 Phys. Rev. Lett. 96 120603) to non-Markovian systems. We demonstrate the validity of this method by testing non-Markovian variants of an ion-channel model and the totally asymmetric exclusion process, recovering results obtainable by other means.
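For background, the large deviation quantities referred to above are conventionally defined through the scaled cumulant generating function and its Legendre-Fenchel transform; these are the standard textbook definitions for a time-extensive observable A_t, not results specific to this paper:

```latex
\lambda(s) = \lim_{t\to\infty} \frac{1}{t}\,\ln \left\langle e^{s A_t} \right\rangle,
\qquad
I(a) = \sup_{s}\bigl[s\,a - \lambda(s)\bigr],
\qquad
P\!\left(A_t/t \approx a\right) \asymp e^{-t\,I(a)} .
```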
STILTS -- Starlink Tables Infrastructure Library Tool Set
NASA Astrophysics Data System (ADS)
Taylor, Mark
STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.
Störmer, M; Radojska, S; Hos, N J; Gathof, B S
2015-04-01
In order to generate standardized conditions for the microbiological control of HPCs, the PEI recommended defined steps for validation that will lead to extensive validation as shown in this study, where a possible validation principle for the microbiological control of allogeneic SCPs is presented. Although it could be demonstrated that automated culture improves microbial safety of cellular products, the requirement for extensive validation studies needs to be considered. © 2014 International Society of Blood Transfusion.
8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...
8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...
8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...
8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...
8 CFR 1214.1 - Review of requirements for admission, extension, and maintenance of status.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) of the Act. Upon application for admission, the alien shall present a valid passport and valid visa... present a passport only if requested to do so by the Service. The passport of an alien applying for... conditions of his or her admission. The passport of an alien applying for extension of stay shall be valid at...
NASA Technical Reports Server (NTRS)
Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio
1992-01-01
This paper covers the verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together. This is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model inclusive of formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.
Adam S. Ward; Robert A. Payn; Michael N. Gooseff; Brian L. McGlynn; Kenneth E. Bencala; Christa A. Kellecher; Steven M. Wondzell; Thorsten Wagener
2013-01-01
The accumulation of discharge along a stream valley is frequently assumed to be the primary control on solute transport processes. Relationships of both increasing and decreasing transient storage, and decreased gross losses of stream water have been reported with increasing discharge; however, we have yet to validate these relationships with extensive field study. We...
Spring 2013 Graduate Engineering Internship Summary
NASA Technical Reports Server (NTRS)
Ehrlich, Joshua
2013-01-01
In the spring of 2013, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my final internship opportunity with NASA, a third consecutive extension from a summer 2012 internship. Since the start of my tenure here at KSC, I have gained an invaluable depth of engineering knowledge and extensive hands-on experience. These opportunities have granted me the ability to enhance my systems engineering approach in the field of payload design and testing as well as develop a strong foundation in the area of composite fabrication and testing for repair design on space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with final acceptance testing of the Vegetable Production System, commonly referred to as Veggie. Verification and validation (V and V) of Veggie was carried out prior to qualification testing of the payload, which incorporated the process of confirming the system's design requirements dependent on one or more validation methods: inspection, analysis, demonstration, and testing.
NASA Astrophysics Data System (ADS)
Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.
2014-05-01
Multidimensional covariance analysis and its validity for correlation of processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on a simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
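A minimal numerical sketch of the two-variable partial covariance idea underlying the threefold case described above: the false correlation induced by a fluctuating parameter I (for example, shot-to-shot pulse intensity) is subtracted from the plain covariance. The data are synthetic and the scaling is assumed linear:

```python
# Hedged sketch: covariance vs. partial covariance with a fluctuating parameter.
import numpy as np

rng = np.random.default_rng(0)
I = rng.normal(1.0, 0.2, 5000)            # fluctuating "pulse intensity"
X = 3.0 * I + rng.normal(0, 0.1, 5000)    # two signals that both scale with I
Y = 2.0 * I + rng.normal(0, 0.1, 5000)

def cov(a, b):
    return np.mean(a * b) - np.mean(a) * np.mean(b)

pcov = cov(X, Y) - cov(X, I) * cov(I, Y) / cov(I, I)
print(cov(X, Y), pcov)                    # the partial covariance is close to zero
```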
Land Ice Verification and Validation Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-07-15
To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity
Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.
2010-01-01
An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183
Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.
Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L
2010-02-01
An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.
Osmotic pressure beyond concentration restrictions.
Grattoni, Alessandro; Merlo, Manuele; Ferrari, Mauro
2007-10-11
Osmosis is a fundamental physical process that involves the transit of solvent molecules across a membrane separating two liquid solutions. Osmosis plays a role in many biological processes such as fluid exchange in animal cells (Cell Biochem. Biophys. 2005, 42, 277-345; J. Periodontol. 2007, 78, 757-763) and water transport in plants. It is also involved in many technological applications such as drug delivery systems (Crit. Rev. Ther. Drug. 2004, 21, 477-520; J. Micro-Electromech. Syst. 2004, 13, 75-82) and water purification. Extensive attention has been dedicated in the past to the modeling of osmosis, starting with the classical theories of van't Hoff and Morse. These are predictive, in the sense that they do not involve adjustable parameters; however, they are directly applicable only to limited regimes of dilute solute concentrations. Extensions beyond the domains of validity of these classical theories have required recourse to fitting parameters, transitioning therefore to semiempirical, or nonpredictive models. A novel approach was presented by Granik et al., which is not a priori restricted in concentration domains, presents no adjustable parameters, and is mechanistic, in the sense that it is based on a coupled diffusion model. In this work, we examine the validity of predictive theories of osmosis, by comparison with our new experimental results, and a meta-analysis of literature data.
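As a reminder of the dilute-limit baseline discussed above, the classical van't Hoff relation Pi = i c R T gives, for an illustrative 0.1 mol/L solution of a salt that dissociates into two ions:

```python
# Worked example of the van't Hoff osmotic pressure (valid only for dilute solutions).
R = 8.314            # gas constant, J/(mol K)
i = 2                # van't Hoff factor (e.g., NaCl -> two ions), illustrative
c = 100.0            # concentration in mol/m^3 (0.1 mol/L), illustrative
T = 298.15           # temperature, K
pi = i * c * R * T   # osmotic pressure in Pa
print(pi / 1e5, "bar")   # roughly 5 bar
```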
Global dynamics for switching systems and their extensions by linear differential equations
NASA Astrophysics Data System (ADS)
Huttinga, Zane; Cummins, Bree; Gedeon, Tomáš; Mischaikow, Konstantin
2018-03-01
Switching systems use piecewise constant nonlinearities to model gene regulatory networks. This choice provides advantages in the analysis of behavior and allows the global description of dynamics in terms of Morse graphs associated to nodes of a parameter graph. The parameter graph captures spatial characteristics of a decomposition of parameter space into domains with identical Morse graphs. However, there are many cellular processes that do not exhibit threshold-like behavior and thus are not well described by a switching system. We consider a class of extensions of switching systems formed by a mixture of switching interactions and chains of variables governed by linear differential equations. We show that the parameter graphs associated to the switching system and any of its extensions are identical. For each parameter graph node, there is an order-preserving map from the Morse graph of the switching system to the Morse graph of any of its extensions. We provide counterexamples that show why possible stronger relationships between the Morse graphs are not valid.
Global dynamics for switching systems and their extensions by linear differential equations.
Huttinga, Zane; Cummins, Bree; Gedeon, Tomáš; Mischaikow, Konstantin
2018-03-15
Switching systems use piecewise constant nonlinearities to model gene regulatory networks. This choice provides advantages in the analysis of behavior and allows the global description of dynamics in terms of Morse graphs associated to nodes of a parameter graph. The parameter graph captures spatial characteristics of a decomposition of parameter space into domains with identical Morse graphs. However, there are many cellular processes that do not exhibit threshold-like behavior and thus are not well described by a switching system. We consider a class of extensions of switching systems formed by a mixture of switching interactions and chains of variables governed by linear differential equations. We show that the parameter graphs associated to the switching system and any of its extensions are identical. For each parameter graph node, there is an order-preserving map from the Morse graph of the switching system to the Morse graph of any of its extensions. We provide counterexamples that show why possible stronger relationships between the Morse graphs are not valid.
Experimental measurement of flexion-extension movement in normal and corpse prosthetic elbow joint.
TarniŢă, Daniela; TarniŢă, DănuŢ Nicolae
2016-01-01
This paper presents a comparative experimental study of flexion-extension movement in the healthy elbow and in a prosthetic elbow joint fixed on an original experimental bench. Measurements were carried out in order to validate the functional morphology and a new ball-head-type elbow prosthesis. The three-dimensional (3D) model and the physical prototype of our experimental bench, used to test elbow endoprostheses in flexion-extension and pronation-supination movements, are presented. The measurements were carried out on a group of nine healthy subjects and on the prosthetic corpse elbow, the experimental data being obtained for flexion-extension movement cycles. Experimental data for the two different flexion-extension tests for the nine subjects and for the corpse prosthetic elbow were acquired using the SimiMotion video system. Experimental data were processed statistically. The corresponding graphs were obtained for all subjects in the experimental group, and for the corpse prosthetic elbow, for both flexion-extension tests. The statistical analysis showed that the flexion angles of healthy elbows were significantly close to the values measured on the prosthetic elbow fixed on the experimental bench. The studied elbow prosthesis manages to re-establish mobility of the elbow joint close to that of the normal joint.
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A computer program partly automates the task of determining whether an HDF-EOS 5 file is valid in that it conforms to specifications for such characteristics as attribute names, dimensionality of data products, and ranges of legal data values. ["HDF-EOS" and variants thereof are defined in "Converting EOS Data From HDF-EOS to netCDF" (GSC-15007-1), which is the first of several preceding articles in this issue of NASA Tech Briefs.] Previously, validity of a file was determined in a tedious and error-prone process in which a person examined human-readable dumps of data-file-format information. The present software helps a user to encode the specifications for an HDF-EOS 5 file, and then inspects the file for conformity with the specifications: First, the user writes the specifications in Extensible Markup Language (XML) by use of a document type definition (DTD) that is part of the program. Next, the portion of the program (denoted the validator) that performs the inspection is executed, using, as inputs, the specifications in XML and the HDF-EOS 5 file to be validated. Finally, the user examines the output of the validator.
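A minimal sketch of the same idea is given below; the XML layout, tag names, dataset names and file names are hypothetical, and the actual NASA validator and its DTD are not reproduced here.

# Conceptual sketch only: check an HDF5 file against a small XML "specification"
# listing required datasets, their attributes and legal value ranges.
import xml.etree.ElementTree as ET
import h5py

def validate(h5_path, spec_path):
    """Return a list of violations of the XML spec found in the HDF5 file."""
    errors = []
    spec = ET.parse(spec_path).getroot()
    with h5py.File(h5_path, "r") as f:
        for ds_spec in spec.findall("dataset"):
            name = ds_spec.get("name")
            if name not in f:
                errors.append("missing dataset: %s" % name)
                continue
            lo, hi = ds_spec.get("min"), ds_spec.get("max")
            if lo is not None and hi is not None:
                data = f[name][...]
                if data.min() < float(lo) or data.max() > float(hi):
                    errors.append("%s: values outside [%s, %s]" % (name, lo, hi))
            for attr in ds_spec.findall("attribute"):
                if attr.get("name") not in f[name].attrs:
                    errors.append("%s: missing attribute %s" % (name, attr.get("name")))
    return errors

# Example: validate("granule.he5", "spec.xml") returns an empty list if the
# file conforms to the encoded specification.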
Trojanowicz, K; Plaza, E; Trela, J
2017-11-09
In this paper, the extension of a mathematical model of the partial nitritation-anammox process in a moving bed biofilm reactor (MBBR) is presented. The model was calibrated with a set of kinetic, stoichiometric and biofilm parameters whose values were taken from the literature and batch tests. The model was validated with data obtained from laboratory batch experiments, a pilot-scale MBBR for reject water deammonification operated at the Himmerfjärden wastewater treatment plant, and a pilot-scale MBBR for mainstream wastewater deammonification at the Hammarby Sjöstadsverk research facility, Sweden. Simulations were conducted in the AQUASIM software. The proposed, extended model proved useful for simulating the partial nitritation/anammox process in a biofilm reactor for both reject water and mainstream wastewater at variable substrate concentrations (influent total ammonium-nitrogen concentrations of 530 ± 68, 45 ± 2.6 and 38 ± 3 g N/m³ for reject water and the two mainstream wastewater cases, respectively), temperatures (24 ± 2.8, 15 ± 1.1 and 18 ± 0.5°C), pH (7.8 ± 0.2, 7.3 ± 0.1 and 7.4 ± 0.1) and aeration patterns (continuous aeration and intermittent aeration with variable dissolved oxygen concentrations and lengths of aerated and anoxic phases). The model can be utilized for optimizing and testing different operational strategies of the deammonification process in biofilm systems.
Planetary Geology and Geophysics Program
NASA Technical Reports Server (NTRS)
McGill, George E.
2004-01-01
Geological mapping and topical studies, primarily in the southern Acidalia Planitia/Cydonia Mensae region of Mars, are presented. The overall objective was to understand geologic processes and crustal history in the northern lowland in order to assess the probability that an ocean once existed in this region. The major deliverable is a block of six 1:500,000-scale geologic maps that will be published in 2004 as a single map at 1:1,000,000 scale, along with extensive descriptive and interpretive text. A major issue addressed by the mapping was the relative ages of the extensive plains of Acidalia Planitia and the knobs and mesas of Cydonia Mensae. The mapping results clearly favor a younger age for the plains. Topical studies included a preliminary analysis of the very abundant small domes and cones to assess the possibility that their origins could be determined by detailed mapping and remote-sensing analysis. We also tested the validity of putative shorelines by using GIS to co-register full-resolution MOLA altimetry data and Viking images with these shorelines plotted on them. Of the three proposed shorelines in this area, one is probably valid, one is definitely not valid, and the third is apparently two shorelines closely spaced in elevation. Publications supported entirely or in part by this grant are included.
Jutte, Lisa S; Long, Blaine C; Knight, Kenneth L
2010-01-01
Thermocouples' leads are often too short, necessitating the use of an extension lead. To determine if temperature measures were influenced by extension-lead use or lead temperature changes. Descriptive laboratory study. Laboratory. Experiment 1: 10 IT-21 thermocouples and 5 extension leads. Experiment 2: 5 IT-21 and PT-6 thermocouples. In experiment 1, temperature data were collected on 10 IT-21 thermocouples in a stable water bath with and without extension leads. In experiment 2, temperature data were collected on 5 IT-21 and PT-6 thermocouples in a stable water bath before, during, and after ice-pack application to extension leads. In experiment 1, extension leads did not influence IT-21 validity (P = .45) or reliability (P = .10). In experiment 2, postapplication IT-21 temperatures were greater than preapplication and application measures (P < .05). Extension leads had no influence on temperature measures. Ice application to leads may increase measurement error.
Validation of Organics for Advanced Stirling Convertor (ASC)
NASA Astrophysics Data System (ADS)
Shin, E. Eugene; Scheiman, Dan; Cybulski, Michelle; Quade, Derek; Inghram, Linda; Burke, Chris
2008-01-01
Organic materials are an essential part of the Advanced Stirling Convertor (ASC) construction, serving as adhesives, potting compounds, wire insulation, lubrication coatings, bobbins, bumpers, insulators, and thread lockers. Because a long lifetime of such convertors, sometimes up to 17 years, is required for the Advanced Stirling Radioisotope Generator (ASRG) in space applications such as Mars rovers, deep space missions, and lunar surface power, the performance, durability and reliability of those organics should be critically evaluated across every relevant material-process-fabrication-service environment relationship. The objective of this study was to evaluate, validate, and recommend organics for use in ASCs. Systematic and extensive evaluation methodologies were developed and applied to the various organic materials. The overall efforts on organic materials over the last several years are summarized in the key areas, e.g., process-fabrication optimization, adhesive bonding integrity, outgassing, thermal stability, and durability.
Isazadeh, Siavash; Feng, Min; Urbina Rivas, Luis Enrique; Frigon, Dominic
2014-04-15
Two pilot-scale activated sludge reactors were operated for 98 days to provide the necessary data to develop and validate a new mathematical model predicting the reduction of biosolids production by ozonation of the return activated sludge (RAS). Three ozone doses were tested during the study. In addition to the pilot-scale study, laboratory-scale experiments were conducted with mixed liquor suspended solids and with pure cultures to parameterize the biomass inactivation process during exposure to ozone. The experiments revealed that biomass inactivation occurred even at the lowest doses, but that it was not associated with extensive COD solubilization. For validation, the model was used to simulate the temporal dynamics of the pilot-scale operational data. Increasing the description accuracy of the inactivation process improved the precision of the model in predicting the operational data. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.
Analytical Modeling and Performance Prediction of Remanufactured Gearbox Components
NASA Astrophysics Data System (ADS)
Pulikollu, Raja V.; Bolander, Nathan; Vijayakar, Sandeep; Spies, Matthew D.
Gearbox components operate in extreme environments, often leading to premature removal or overhaul. Though worn or damaged, these components can still function provided the appropriate remanufacturing processes are deployed. Doing so saves a significant amount of the resources (time, materials, energy, manpower) otherwise required to produce a replacement part. Unfortunately, current design and analysis approaches require extensive testing and evaluation to validate the effectiveness and safety of a component that has been used in the field and then processed outside original OEM specification. Testing every possible combination of component, level of potential damage, and repair processing option would be an expensive and time-consuming feat, thus prohibiting a broad deployment of remanufacturing processes across industry. However, such evaluation and validation can occur through Integrated Computational Materials Engineering (ICME) modeling and simulation. Sentient developed a microstructure-based component life prediction (CLP) tool to quantify and assist the remanufacturing of gearbox components. This was achieved by modeling the design-manufacturing-microstructure-property relationship. The CLP tool assists in the remanufacturing of high-value, high-demand rotorcraft, automotive and wind turbine gears and bearings. This paper summarizes the development of the CLP models and the validation efforts, comparing simulation results with rotorcraft spiral bevel gear physical test data. CLP analyzes gear components and systems for safety, longevity, reliability and cost by (1) predicting new gearbox component performance and the optimal time-to-remanufacture, (2) qualifying used gearbox components for the remanufacturing process, and (3) predicting remanufactured component performance.
Schmettow, Martin; Schnittker, Raphaela; Schraagen, Jan Maarten
2017-05-01
This paper proposes and demonstrates an extended protocol for usability validation testing of medical devices. A review of currently used methods for the usability evaluation of medical devices revealed two main shortcomings: first, a lack of methods to closely trace interaction sequences and derive performance measures; second, a prevailing focus on cross-sectional validation studies, which ignores issues of learnability and training. The U.S. Food and Drug Administration's recent proposal for a validation testing protocol for medical devices is then extended to address these shortcomings: (1) a novel process measure, 'normative path deviations', is introduced that is useful for both quantitative and qualitative usability studies, and (2) a longitudinal, completely within-subject study design is presented that assesses learnability and training effects and allows analysis of the diversity of users. A reference regression model is introduced to analyze data from this and similar studies, drawing upon generalized linear mixed-effects models and a Bayesian estimation approach. The extended protocol is implemented and demonstrated in a study comparing a novel syringe infusion pump prototype to an existing design with a sample of 25 healthcare professionals. Strong performance differences between designs were observed with a variety of usability measures, as well as varying training-on-the-job effects. We discuss our findings with regard to validation testing guidelines, reflect on the extensions and discuss the perspectives they add to the validation process. Copyright © 2017 Elsevier Inc. All rights reserved.
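As an illustration of the kind of analysis described above (the column names, data file and the frequentist approximation below are assumptions; the paper itself uses a Bayesian generalized linear mixed-effects model), a longitudinal within-subject study can be summarised with a mixed-effects regression.

# Illustrative sketch, not the authors' analysis: repeated sessions per
# participant analysed with a random-intercept mixed model in statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("usability_trials.csv")  # hypothetical long-format data

# Fixed effects: device design, training session and their interaction
# (captures learnability / training-on-the-job); random intercept: participant.
model = smf.mixedlm("time_on_task ~ design * session", df,
                    groups=df["participant"])
result = model.fit()
print(result.summary())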
Crack propagation analysis using acoustic emission sensors for structural health monitoring systems.
Kral, Zachary; Horn, Walter; Steck, James
2013-01-01
Aerospace systems are expected to remain in service well beyond their designed life. Consequently, maintenance is an important issue. A novel method of implementing artificial neural networks and acoustic emission sensors to form a structural health monitoring (SHM) system for aerospace inspection routines was the focus of this research. Simple structural elements, consisting of flat aluminum plates of AL 2024-T3, were subjected to increasing static tensile loading. As the loading increased, designed cracks extended in length, releasing strain waves in the process. Strain wave signals, measured by acoustic emission sensors, were further analyzed in post-processing by artificial neural networks (ANN). Several experiments were performed to determine the severity and location of the crack extensions in the structure. ANNs were trained on a portion of the data acquired by the sensors and the ANNs were then validated with the remaining data. The combination of a system of acoustic emission sensors, and an ANN could determine crack extension accurately. The difference between predicted and actual crack extensions was determined to be between 0.004 in. and 0.015 in. with 95% confidence. These ANNs, coupled with acoustic emission sensors, showed promise for the creation of an SHM system for aerospace systems.
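A hedged sketch of the described approach follows; the feature names, data files and network size are hypothetical, and scikit-learn is used here simply as one convenient way to train and validate such a network.

# Train a small neural network to map acoustic-emission features to crack
# extension, hold out part of the data for validation, report the residuals.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

X = np.load("ae_features.npy")        # assumed AE hit features (amplitude, energy, ...)
y = np.load("crack_extension.npy")    # measured crack extension, inches

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)

residuals = ann.predict(X_te) - y_te
print("95th percentile of |error|: %.4f in." % np.percentile(np.abs(residuals), 95))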
Lindemann, Ulrich; Zijlstra, Wiebren; Aminian, Kamiar; Chastin, Sebastien F M; de Bruin, Eling D; Helbostad, Jorunn L; Bussmann, Johannes B J
2014-01-10
Physical activity is an important determinant of health and well-being in older persons and contributes to their social participation and quality of life. Hence, assessment tools are needed to study this physical activity in free-living conditions. Wearable motion sensing technology is used to assess physical activity. However, there is a lack of harmonisation of validation protocols and applied statistics, which makes it hard to compare available and future studies. Therefore, the aim of this paper is to formulate recommendations for assessing the validity of sensor-based activity monitoring in older persons, with a focus on the measurement of body postures and movements. Validation studies of body-worn devices providing parameters on body postures and movements were identified and summarized, and an extensive interactive process between the authors resulted in recommendations concerning information on the assessed persons, the technical system, and the analysis of relevant parameters of physical activity, based on a standardized and semi-structured protocol. The recommended protocols can be regarded as a first attempt to standardize validity studies in the area of monitoring physical activity.
Perception of competence in middle school physical education: instrument development and validation.
Scrabis-Fletcher, Kristin; Silverman, Stephen
2010-03-01
Perception of Competence (POC) has been studied extensively in physical activity (PA) research with similar instruments adapted for physical education (PE) research. Such instruments do not account for the unique PE learning environment. Therefore, an instrument was developed and the scores validated to measure POC in middle school PE. A multiphase design was used consisting of an intensive theoretical review, elicitation study, prepilot study, pilot study, content validation study, and final validation study (N = 1281). Data analysis included a multistep iterative process to identify the best model fit. A three-factor model for POC was tested and resulted in root mean square error of approximation = .09, root mean square residual = .07, goodness-of-fit index = .90, and adjusted goodness-of-fit index = .86, values in the acceptable range (Hu & Bentler, 1999). A two-factor model was also tested and resulted in a good fit (two-factor fit index values = .05, .03, .98, .97, respectively). The results of this study suggest that an instrument using a three- or two-factor model provides reliable and valid scores of POC measurement in middle school PE.
Three validation metrics for automated probabilistic image segmentation of brain tumours
Zou, Kelly H.; Wells, William M.; Kikinis, Ron; Warfield, Simon K.
2005-01-01
The validity of brain tumour segmentation is an important issue in image processing because it has a direct impact on surgical planning. We examined the segmentation accuracy based on three two-sample validation metrics against the estimated composite latent gold standard, which was derived from several experts' manual segmentations by an EM algorithm. The distribution functions of the tumour and control pixel data were parametrically assumed to be a mixture of two beta distributions with different shape parameters. We estimated the corresponding receiver operating characteristic curve, Dice similarity coefficient, and mutual information, over all possible decision thresholds. Based on each validation metric, an optimal threshold was then computed via maximization. We illustrated these methods on MR imaging data from nine brain tumour cases of three different tumour types, each consisting of a large number of pixels. The automated segmentation yielded satisfactory accuracy with varied optimal thresholds. The performances of these validation metrics were also investigated via Monte Carlo simulation. Extensions incorporating spatial correlation structures using a Markov random field model were considered.
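One of the three metrics above, the Dice similarity coefficient, is simple enough to state directly; the following is a generic implementation for binary masks (illustrative, not the authors' code).

# Dice similarity coefficient between a thresholded automated segmentation and
# a reference (e.g. the composite latent gold standard), both as binary masks.
import numpy as np

def dice(seg, ref):
    seg, ref = seg.astype(bool), ref.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    denom = seg.sum() + ref.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Toy example:
auto = np.array([[0, 1, 1], [0, 1, 0]])
gold = np.array([[0, 1, 0], [0, 1, 1]])
print(dice(auto, gold))  # 0.666...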
NASA Astrophysics Data System (ADS)
Bardin, D.; Bondarenko, S.; Christova, P.; Kalinovskaya, L.; von Schlippe, W.; Uglov, E.
2017-11-01
The implementation of the process γγ → ZZ at the one-loop level within the SANC system multichannel approach is considered. The derived one-loop scalar form factors can be used for any cross channel after an appropriate permutation of their arguments, the Mandelstam variables s, t, u. To check the correctness of the results we verify the independence of the scalar form factors from the gauge parameters and the validity of the Ward identity (external photon transversality). We present the complete analytical results for the covariant and tensor structures and helicity amplitudes for this process. We make an extensive comparison of our analytical and numerical results with those existing in the literature.
2014-05-01
solver to treat the spray process. An Adaptive Mesh Refinement (AMR) and fixed embedding technique is employed to capture the gas-liquid interface with high fidelity while keeping the cell ... in single and multi-hole nozzle configurations. The models were added to the present CONVERGE liquid fuel database and validated extensively.
NASA Astrophysics Data System (ADS)
Asadizadeh, Mostafa; Moosavi, Mahdi; Hossaini, Mohammad Farouq; Masoumi, Hossein
2018-02-01
In this paper, a number of artificial rock specimens with two parallel (stepped and coplanar) non-persistent joints were subjected to direct shearing. The effects of bridge length (L), bridge angle (γ), joint roughness coefficient (JRC) and normal stress (σn) on the shear strength and cracking process of non-persistent jointed rock were studied extensively. The experimental program was designed based on the Taguchi method, and the validity of the resulting data was assessed using analysis of variance. The results revealed that σn and γ have the maximum and minimum effects on shear strength, respectively. Also, an increase in L from 10 to 60 mm led to a decrease in shear strength, while high levels of the JRC profile and σn led to the initiation of tensile cracks due to asperity interlocking. Such tensile cracks are known as "interlocking cracks", which normally initiate from the asperity and then propagate toward the specimen boundaries. Finally, the cracking process of the specimens was classified into three categories, namely tensile cracking, shear cracking and a combination of tension and shear, or mixed-mode tensile-shear cracking.
Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.
Berthelsen, Hanne; Hakanen, Jari J; Westerlund, Hugo
2018-01-01
This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as the outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement Scale, and the one-item Work Ability Score in combination with a proprietary item. The data was analysed by Structural Equation Modelling. This study contributed to the literature by showing that: A) the scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) both the health impairment and motivational processes were associated with work ability (WA), and the results suggested that leadership may impact WA, in particular by securing task resources. In conclusion, the nomological validity of COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may be helpful for transferring complex survey results and work life theories to practitioners in the field.
Fracture mechanics validity limits
NASA Technical Reports Server (NTRS)
Lambert, Dennis M.; Ernst, Hugo A.
1994-01-01
Fracture behavior is characterized by a dramatic loss of strength compared to elastic deformation behavior. Fracture parameters have been developed, and each exhibits a range within which it is valid for predicting crack growth. Each is limited by the assumptions made in its development: all are defined within a specific context. For example, the stress intensity parameter, K, and the crack driving force, G, are derived using an assumption of linear elasticity. To use K or G, the zone of plasticity must be small compared to the physical dimensions of the object being loaded. This ensures an elastic response, and in this context, K and G work well. Rice's J-integral has been used beyond the limits imposed on K and G. J requires an assumption of nonlinear elasticity, which is not characteristic of real material behavior, but is thought to be a reasonable approximation if unloading is kept to a minimum. In addition, the constraint cannot change dramatically (typically, the crack extension is limited to ten percent of the initial remaining ligament length). Rice et al. investigated the properties required of J-type parameters, J_x, and showed that the time rate, dJ_x/dt, must not be a function of the crack extension rate, da/dt. Ernst devised the modified-J parameter, J_M, that meets this criterion. J_M correlates fracture data to much higher crack growth than does J. Ultimately, a limit of the validity of J_M is anticipated, and this has been estimated to be at a crack extension of about 40 percent of the initial remaining ligament length. None of the various parameters can be expected to describe fracture in an environment of gross plasticity, in which case the process is better described by deformation parameters, e.g., stress and strain. In the current study, various schemes to identify the onset of plasticity-dominated behavior, i.e., the end of fracture mechanics validity, are presented. Each validity limit parameter is developed in detail; then data are presented and the various schemes for establishing a validity limit are compared. The selected limiting parameter is applied to a set of fracture data, showing the improvement in correlation gained.
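For the linear-elastic context referred to above, the relation between the crack driving force G and the stress intensity factor K is the standard textbook one (stated here for orientation only; it is not a result of this report).

% Linear-elastic (Mode I) relation between energy release rate and stress
% intensity factor; valid only while the plastic zone stays small compared
% with the specimen dimensions, as discussed above.
\[
  G = \frac{K^{2}}{E'}, \qquad
  E' =
  \begin{cases}
    E & \text{plane stress},\\
    E/(1-\nu^{2}) & \text{plane strain}.
  \end{cases}
\]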
Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.
Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao
2017-06-30
Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.
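The error-simulation idea can be sketched as follows; the data files, descriptor matrix and model choice are placeholders, and this is not the authors' code.

# Shuffle the activities of a chosen fraction of the modeling-set compounds to
# mimic experimental errors, then measure five-fold cross-validated performance.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = np.load("descriptors.npy")   # hypothetical descriptor matrix
y = np.load("activities.npy")    # hypothetical continuous endpoint

for error_ratio in (0.0, 0.1, 0.2, 0.3):
    y_noisy = y.copy()
    idx = rng.choice(len(y), size=int(error_ratio * len(y)), replace=False)
    y_noisy[idx] = rng.permutation(y_noisy[idx])  # randomize those activities
    r2 = cross_val_score(RandomForestRegressor(random_state=0),
                         X, y_noisy, cv=5, scoring="r2").mean()
    print("simulated error ratio %.0f%%: mean CV R2 = %.3f" % (100 * error_ratio, r2))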
ERIC Educational Resources Information Center
Calloway, Pauline Frances
This study investigated the construct validity of the Herzberg (1964) theory of motivation as it relates to county Extension agents; and developed an inventory to measure the job satisfaction of county agents in North Carolina. The inventory was administered to 419 agents in 79 counties. Factor analysis was used to determine the number of job…
Assessing the stretch-blow moulding FE simulation of PET over a large process window
NASA Astrophysics Data System (ADS)
Nixon, J.; Menary, G. H.; Yan, S.
2017-10-01
Injection stretch blow moulding has been extensively researched for numerous years and is a well-established method of forming thin-walled containers. This paper is concerned with validating the finite element analysis of the stretch-blow-moulding (SBM) process in an effort to progress the development of injection stretch blow moulding of poly(ethylene terephthalate). Extensive data was obtained experimentally over a wide process window accounting for material temperature, air flow rate and stretch-rod speed while capturing cavity pressure, stretch-rod reaction force, in-mould contact timing and material thickness distribution. This data was then used to assess the accuracy of the correlating FE simulation constructed using ABAQUS/Explicit solver and an appropriate user-defined viscoelastic material subroutine. Results reveal that the simulation was able to pick up the general trends of how the pressure, reaction force and in-mould contact timings vary with the variation in preform temperature and air flow rate. Trends in material thickness were also accurately predicted over the length of the bottle relative to the process conditions. The knowledge gained from these analyses provides insight into the mechanisms of bottle formation, subsequently improving the blow moulding simulation and potentially providing a reduction in production costs.
Quek, June; Brauer, Sandra G; Treleaven, Julia; Pua, Yong-Hao; Mentiplay, Benjamin; Clark, Ross Allan
2014-04-17
Concurrent validity and intra-rater reliability of a customized Android phone application to measure cervical-spine range of motion (ROM) had not previously been validated against a gold-standard three-dimensional motion analysis (3DMA) system. Twenty-one healthy individuals (age: 31 ± 9.1 years; male: 11) participated, with 16 re-examined for intra-rater reliability 1-7 days later. An Android phone was fixed on a helmet, which was then securely fastened on the participant's head. Cervical-spine ROM in flexion, extension, lateral flexion and rotation was performed in sitting, with concurrent measurements obtained from both a 3DMA system and the phone. The phone demonstrated moderate to excellent (ICC = 0.53-0.98, Spearman ρ = 0.52-0.98) concurrent validity for ROM measurements in cervical flexion, extension, lateral flexion and rotation. However, cervical rotation demonstrated both proportional and fixed bias. Excellent intra-rater reliability was demonstrated for cervical flexion, extension and lateral flexion (ICC = 0.82-0.90), but poor reliability for right and left rotation (ICC = 0.05-0.33) using the phone. Possible reasons for this outcome are that flexion, extension and lateral-flexion measurements are detected by gravity-dependent accelerometers, while rotation measurements are detected by the magnetometer, which can be adversely affected by surrounding magnetic fields. The results of this study demonstrate that the tested Android phone application is valid and reliable for measuring cervical-spine ROM in flexion, extension and lateral flexion, but not in rotation, likely due to magnetic interference. The clinical implication is that therapists should be mindful of the plane of measurement when using the Android phone to measure cervical-spine ROM.
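For reference, the concurrent-validity statistic reported above can be computed per movement direction from long-format data; the column names, file name and use of the pingouin package below are assumptions, not details of the study.

# Intraclass correlation of phone vs. 3DMA measurements for one movement.
import pandas as pd
import pingouin as pg

df = pd.read_csv("rom_measurements.csv")   # long format: subject, device, movement, rom
flexion = df[df["movement"] == "flexion"]
icc = pg.intraclass_corr(data=flexion, targets="subject",
                         raters="device", ratings="rom")
print(icc[["Type", "ICC", "CI95%"]])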
Thermal Model Development for Ares I-X
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; DelCorso, Joe
2008-01-01
Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Extensive use of model logic was used to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.
Graphics processing unit (GPU) real-time infrared scene generation
NASA Astrophysics Data System (ADS)
Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.
2007-04-01
VIRSuite, the GPU-based suite of software tools developed at DSTO for real-time infrared scene generation, is described. The tools include the painting of scene objects with radiometrically-associated colours, translucent object generation, polar plot validation and versatile scene generation. Special features include radiometric scaling within the GPU and the presence of zoom anti-aliasing at the core of VIRSuite. Extension of the zoom anti-aliasing construct to cover target embedding and the treatment of translucent objects is described.
Using Genotype Abundance to Improve Phylogenetic Inference
Mesin, Luka; Victora, Gabriel D; Minin, Vladimir N; Matsen, Frederick A
2018-01-01
Modern biological techniques enable very dense genetic sampling of unfolding evolutionary histories, and thus frequently sample some genotypes multiple times. This motivates strategies to incorporate genotype abundance information in phylogenetic inference. In this article, we synthesize a stochastic process model with standard sequence-based phylogenetic optimality, and show that tree estimation is substantially improved by doing so. Our method is validated with extensive simulations and an experimental single-cell lineage tracing study of germinal center B cell receptor affinity maturation.
Quantum image processing: A review of advances in its security technologies
NASA Astrophysics Data System (ADS)
Yan, Fei; Iliyasu, Abdullah M.; Le, Phuc Q.
In this review, we present an overview of the advances made in quantum image processing (QIP), comprising the image representations, the operations realizable on them, and the likely protocols and algorithms for their applications. In particular, we focus on recent progress in QIP-based security technologies, including quantum watermarking, quantum image encryption, and quantum image steganography. This review is aimed at providing readers with a succinct, yet adequate compendium of the progress made in the QIP sub-area. Hopefully, this effort will stimulate further interest aimed at the pursuit of more advanced algorithms and experimental validations for available technologies and extensions to other domains.
Extension of moment projection method to the fragmentation process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Shaohua; Yapp, Edward K.Y.; Akroyd, Jethro
2017-04-15
The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA), and advantages of MPM are drawn.
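For orientation, a pure fragmentation (breakage) process in the population balance setting reads as follows; this is the standard textbook form, and the notation is not taken from the OSTI record.

% Population balance equation for pure fragmentation:
%   n(v,t): number density of particles of size v,
%   S(v): fragmentation (selection) rate,
%   b(v|v'): size distribution of fragments produced from a parent of size v'.
\[
  \frac{\partial n(v,t)}{\partial t}
  = \int_{v}^{\infty} b(v \mid v')\, S(v')\, n(v',t)\, \mathrm{d}v'
  \;-\; S(v)\, n(v,t).
\]
% Moment methods such as MPM, QMOM and HMOM evolve integral moments of n(v,t)
% rather than the full size distribution.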
Paddock, L E; Veloski, J; Chatterton, M L; Gevirtz, F O; Nash, D B
2000-07-01
To develop a reliable and valid questionnaire to measure patient satisfaction with diabetes disease management programs. Questions related to structure, process, and outcomes were categorized into 14 domains defining the essential elements of diabetes disease management. Health professionals confirmed the content validity. Face validity was established by a patient focus group. The questionnaire was mailed to 711 patients with diabetes who participated in a disease management program. To reduce the number of questionnaire items, a principal components analysis was performed using a varimax rotation. The Scree test was used to select significant components. To further assess reliability and validity, Cronbach's alpha and product-moment correlations were calculated for components having 3 or more items with loadings >0.50. The validated 73-item mailed satisfaction survey had a 34.1% response rate. Principal components analysis yielded 13 components with eigenvalues >1.0. The Scree test proposed a 6-component solution (39 items), which explained 59% of the total variation. Internal consistency reliabilities computed for the first 6 components (alpha = 0.79-0.95) were acceptable. The final questionnaire, the Diabetes Management Evaluation Tool (DMET), was designed to assess patient satisfaction with diabetes disease management programs. Although more extensive testing of the questionnaire is appropriate, preliminary reliability and validity of the DMET have been demonstrated.
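The internal-consistency step mentioned above can be illustrated with a generic Cronbach's alpha computation; the item responses below are toy numbers, not the DMET data.

# Cronbach's alpha for the items belonging to one questionnaire component.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one component."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Toy example: 5 respondents, 3 Likert-scale items in one component.
component = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
print(round(cronbach_alpha(component), 2))  # about 0.92 for these toy data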
Mancilla-Martinez, Jeannette; Gámez, Perla B; Vagh, Shaher Banu; Lesaux, Nonie K
2016-01-01
This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension (Fenson et al., 2000, 2007; Jackson-Maldonado, Marchman, & Fernald, 2013) and the Spanish Vocabulary Extension for use with parents from low-income homes and their 24- to 48-month-old Spanish-English bilingual children. Study participants were drawn from Early Head Start and Head Start collaborative programs in the Northeastern United States in which English was the primary language used in the classroom. All families reported Spanish or Spanish-English as their home language(s). The MacArthur Communicative Development Inventories as well as the researcher-designed Spanish Vocabulary Extension were used as measures of children's English and Spanish productive vocabularies. Findings revealed the forms' concurrent and discriminant validity, on the basis of standardized measures of vocabulary, as measures of productive vocabulary for this growing bilingual population. These findings suggest that parent reports, including our researcher-designed form, represent a valid, cost-effective mechanism for vocabulary monitoring purposes in early childhood education settings.
Mining continuous activity patterns from animal trajectory data
Wang, Y.; Luo, Ze; Baoping, Yan; Takekawa, John Y.; Prosser, Diann J.; Newman, Scott H.
2014-01-01
The increasing availability of animal tracking data brings us opportunities and challenges to intuitively understand the mechanisms of animal activities. In this paper, we aim to discover animal movement patterns from animal trajectory data. In particular, we propose a notion of continuous activity pattern as the concise representation of underlying similar spatio-temporal movements, and develop an extension and refinement framework to discover the patterns. We first preprocess the trajectories into significant semantic locations with time property. Then, we apply a projection-based approach to generate candidate patterns and refine them to generate true patterns. A sequence graph structure and a simple and effective processing strategy is further developed to reduce the computational overhead. The proposed approaches are extensively validated on both real GPS datasets and large synthetic datasets.
Qiu, Tian-Xia; Teo, Ee-Chon; Lee, Kim-Kheng; Ng, Hong-Wan; Yang, Kai
2004-04-01
The purpose of this study was to determine the locations and loci of instantaneous axes of rotation (IARs) of the T10-T11 motion segment in flexion and extension. An anatomically accurate three-dimensional model of thoracic T10-T11 functional spinal unit (FSU) was developed and validated against published experimental data under flexion, extension, lateral bending, and axial rotation loading configurations. The validated model was exercised under six load configurations that produced motions only in the sagittal plane to characterize the loci of IARs for flexion and extension. The IARs for both flexion and extension under these six load types were directly below the geometric center of the moving vertebra, and all the loci of IARs were tracked superoanteriorly for flexion and inferoposteriorly for extension with rotation. These findings may offer an insight to better understanding of the kinematics of the human thoracic spine and provide clinically relevant information for the evaluation of spinal stability and implant device functionality.
Heater Validation for the NEXT-C Hollow Cathodes
NASA Technical Reports Server (NTRS)
Verhey, Timothy R.; Soulas, George C.; Mackey, Jonathan Ar.
2017-01-01
Swaged cathode heaters whose design was successfully demonstrated under a prior flight project are to be provided by the NASA Glenn Research Center for the NEXT-C ion thruster being fabricated by Aerojet Rocketdyne. Extensive requalification activities were performed to validate process controls that had to be re-established or revised because systemic changes prevented reuse of the past approaches. A development batch of heaters was successfully fabricated based on the new process controls. Acceptance and cyclic life testing of multiple discharge- and neutralizer-sized heaters extracted from the development batch was initiated in August 2016, with the last heater completing testing in April 2017. Cyclic life testing results substantially exceeded the NEXT-C thruster requirement as well as all past experience for GRC-fabricated units. The heaters demonstrated an ultimate cyclic life capability of 19,050 to 33,500 cycles. A qualification batch of heaters is now being fabricated using the finalized process controls. A set of six heaters will undergo acceptance and cyclic testing to verify conformance to the behavior observed with the development heaters. The heaters for flight use will then be provided to the contractor. This paper summarizes the fabrication process control activities and the acceptance and life testing of the development heater units.
Terrorism as a process: a critical review of Moghaddam's "Staircase to Terrorism".
Lygre, Ragnhild B; Eid, Jarle; Larsson, Gerry; Ranstorp, Magnus
2011-12-01
This study reviews empirical evidence for Moghaddam's model "Staircase to Terrorism," which portrays terrorism as a process of six consecutive steps culminating in terrorism. An extensive literature search, where 2,564 publications on terrorism were screened, resulted in 38 articles which were subject to further analysis. The results showed that while most of the theories and processes linked to Moghaddam's model are supported by empirical evidence, the proposed transitions between the different steps are not. These results may question the validity of a linear stepwise model and may suggest that a combination of mechanisms/factors could combine in different ways to produce terrorism. © 2011 The Authors. Scandinavian Journal of Psychology © 2011 The Scandinavian Psychological Associations.
Farkas, Daniel H; Miltgen, Nicholas E; Stoerker, Jay; van den Boom, Dirk; Highsmith, W Edward; Cagasan, Lesley; McCullough, Ron; Mueller, Reinhold; Tang, Lin; Tynan, John; Tate, Courtney; Bombard, Allan
2010-09-01
We designed a laboratory developed test (LDT) by using an open platform for mutation/polymorphism detection. Using a 108-member (mutation plus variant) cystic fibrosis carrier screening panel as a model, we completed the last phase of LDT validation by using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Panel customization was accomplished via specific amplification primer and extension probe design. Amplified genomic DNA was subjected to allele-specific, single-base extension endpoint analysis by mass spectrometry for inspection of the cystic fibrosis transmembrane regulator gene (NM_000492.3). The panel of mutations and variants was tested against 386 blinded samples supplied by "authority" laboratories highly experienced in cystic fibrosis transmembrane regulator genotyping; >98% concordance was observed. All discrepant and discordant results were resolved satisfactorily. Taken together, these results describe the concluding portion of the LDT validation process and the use of mass spectrometry to detect a large number of complex reactions within a single run, as well as its suitability as a platform appropriate for the interrogation of scores to hundreds of targets.
In-Flight Thermal Performance of the Lidar In-Space Technology Experiment
NASA Technical Reports Server (NTRS)
Roettker, William
1995-01-01
The Lidar In-Space Technology Experiment (LITE) was developed at NASA's Langley Research Center to explore the applications of lidar operated from an orbital platform. As a technology demonstration experiment, LITE was developed to gain experience designing and building future operational orbiting lidar systems. Since LITE was the first lidar system to be flown in space, an important objective was to validate instrument design principles in such areas as thermal control, laser performance, instrument alignment and control, and autonomous operations. Thermal and structural analysis models of the instrument were developed during the design process to predict the behavior of the instrument during its mission. In order to validate those mathematical models, extensive engineering data was recorded during all phases of LITE's mission. This in-flight engineering data was compared with preflight predictions and, when required, adjustments to the thermal and structural models were made to more accurately match the instrument's actual behavior. The results of this process for the thermal analysis and design of LITE are presented in this paper.
41 CFR 60-3.14 - Technical standards for validity studies.
Code of Federal Regulations, 2010 CFR
2010-07-01
... likely to affect validity differences; or that these factors are included in the design of the study and... construct validity is both an extensive and arduous effort involving a series of research studies, which... validity studies. 60-3.14 Section 60-3.14 Public Contracts and Property Management Other Provisions...
Crack Propagation Analysis Using Acoustic Emission Sensors for Structural Health Monitoring Systems
Kral, Zachary; Horn, Walter; Steck, James
2013-01-01
Aerospace systems are expected to remain in service well beyond their designed life. Consequently, maintenance is an important issue. A novel method of implementing artificial neural networks and acoustic emission sensors to form a structural health monitoring (SHM) system for aerospace inspection routines was the focus of this research. Simple structural elements, consisting of flat aluminum plates of AL 2024-T3, were subjected to increasing static tensile loading. As the loading increased, designed cracks extended in length, releasing strain waves in the process. Strain wave signals, measured by acoustic emission sensors, were further analyzed in post-processing by artificial neural networks (ANN).more » Several experiments were performed to determine the severity and location of the crack extensions in the structure. ANNs were trained on a portion of the data acquired by the sensors and the ANNs were then validated with the remaining data. The combination of a system of acoustic emission sensors, and an ANN could determine crack extension accurately. The difference between predicted and actual crack extensions was determined to be between 0.004 in. and 0.015 in. with 95% confidence. These ANNs, coupled with acoustic emission sensors, showed promise for the creation of an SHM system for aerospace systems.« less
Monte Carlo-based validation of neutronic methodology for EBR-II analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liaw, J.R.; Finck, P.J.
1993-01-01
The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.
Shelf-life extension of gilthead seabream fillets by osmotic treatment and antimicrobial agents.
Tsironi, T N; Taoukis, P S
2012-02-01
The objectives of the study were to evaluate the effect of selected antimicrobial agents on the shelf life of osmotically pretreated gilthead seabream and to establish reliable kinetic equations for shelf-life determination validated in dynamic conditions. Fresh gilthead seabream (Sparus aurata) fillets were osmotically treated with 50% high dextrose equivalent maltodextrin (HDM, DE 47) plus 5% NaCl and 0·5% carvacrol, 0·5% glucono-δ-lactone or 1% Citrox (commercial antimicrobial mix). Untreated and treated slices were aerobically packed and stored isothermally (0-15°C). Microbial growth and quality-related chemical indices were modelled as functions of temperature. Models were validated at dynamic storage conditions. Osmotic pretreatment with the use of antimicrobials led to significant shelf-life extension of fillets, in terms of microbial growth and organoleptic deterioration. The shelf life was 7 days for control samples at 5°C. The osmotic pretreatment with carvacrol, glucono-δ-lactone and Citrox allowed for shelf-life extension by 8, 10 and 5 days at 5°C, respectively. The results of the study show the potential of adding carvacrol, glucono-δ-lactone or Citrox in the osmotic solution to extend the shelf life and improve commercial value of chilled osmotically pretreated fish products. The developed models can be a reliable tool for predicting the shelf life of fresh or minimally processed gilthead seabream fillets in the real chill chain. © 2012 The Authors. Journal of Applied Microbiology © 2012 The Society for Applied Microbiology.
Design of a multiple kernel learning algorithm for LS-SVM by convex programming.
Jian, Ling; Xia, Zhonghang; Liang, Xijun; Gao, Chuanhou
2011-06-01
As a kernel-based method, the performance of the least squares support vector machine (LS-SVM) depends on the selection of the kernel as well as the regularization parameter (Duan, Keerthi, & Poo, 2003). Cross-validation is efficient in selecting a single kernel and the regularization parameter; however, it suffers from heavy computational cost and is not flexible enough to deal with multiple kernels. In this paper, we address the issue of multiple kernel learning for LS-SVM by formulating it as semidefinite programming (SDP). Furthermore, we show that the regularization parameter can be optimized in a unified framework with the kernel, which leads to an automatic process for model selection. Extensive experimental validations are performed and analyzed. Copyright © 2011 Elsevier Ltd. All rights reserved.
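The abstract contrasts its SDP formulation with plain cross-validation over a single kernel and regularization parameter. As a point of reference only, the following sketch (not from the paper) grid-searches an RBF kernel width and the regularization parameter for a simplified LS-SVM by k-fold cross-validation; the toy data, parameter grids, and the function-estimation form of the LS-SVM linear system are illustrative assumptions.

```python
import numpy as np
from itertools import product

def rbf_kernel(X1, X2, sigma):
    """RBF (Gaussian) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(K, y, gamma):
    """Solve the (simplified) LS-SVM linear system for bias and dual coefficients."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def cv_error(X, y, sigma, gamma, folds=5):
    """Mean squared k-fold cross-validation error for one (sigma, gamma) pair."""
    idx = np.array_split(np.random.permutation(len(y)), folds)
    errs = []
    for test in idx:
        train = np.setdiff1d(np.arange(len(y)), test)
        b, alpha = lssvm_fit(rbf_kernel(X[train], X[train], sigma), y[train], gamma)
        pred = rbf_kernel(X[test], X[train], sigma) @ alpha + b
        errs.append(np.mean((pred - y[test]) ** 2))
    return np.mean(errs)

# Grid search over kernel width and regularization parameter (toy data).
X = np.random.randn(60, 3)
y = np.sign(X[:, 0] + 0.3 * np.random.randn(60))
best = min(product([0.5, 1.0, 2.0], [0.1, 1.0, 10.0]),
           key=lambda p: cv_error(X, y, *p))
print("selected (sigma, gamma):", best)
```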
Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model
Hakanen, Jari J.; Westerlund, Hugo
2018-01-01
Aim This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as outcome. Material and methods The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement scale, and the one-item Work Ability Score in combination with a proprietary item. The data was analysed by Structural Equation Modelling. Results This study contributed to the literature by showing that: A) The scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) Job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) Both the health impairment and motivational processes were associated with WA, and the results suggested that leadership may impact WA, in particular by securing task resources. Conclusion In conclusion, the nomological validity of COPSOQ was supported as the JD-R model can be operationalized by the instrument. This may be helpful for transferral of complex survey results and work life theories to practitioners in the field. PMID:29708998
Penloglou, Giannis; Chatzidoukas, Christos; Kiparissides, Costas
2012-01-01
The microbial production of polyhydroxybutyrate (PHB) is a complex process in which the final quantity and quality of the PHB depend on a large number of process operating variables. Consequently, the design and optimal dynamic operation of a microbial process for the efficient production of PHB with tailor-made molecular properties is an extremely interesting problem. The present study investigates how key process operating variables (i.e., nutritional and aeration conditions) affect the biomass production rate and the PHB accumulation in the cells and its associated molecular weight distribution. A combined metabolic/polymerization/macroscopic modelling approach, relating the process performance and product quality with the process variables, was developed and validated using an extensive series of experiments and measurements. The model predicts the dynamic evolution of the biomass growth, the polymer accumulation, the consumption of carbon and nitrogen sources and the average molecular weights of the PHB in a bioreactor, under batch and fed-batch operating conditions. The proposed integrated model was used for the model-based optimization of the production of PHB with tailor-made molecular properties in Azohydromonas lata bacteria. The process optimization led to a high intracellular PHB accumulation (up to 95% g of PHB per g of DCW) and the production of different grades (i.e., different molecular weight distributions) of PHB. Copyright © 2011 Elsevier Inc. All rights reserved.
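The paper's combined metabolic/polymerization/macroscopic model is not reproduced here. Purely to illustrate the kind of macroscopic mass-balance equations such a model integrates, the sketch below couples logistic biomass growth with Luedeking-Piret product formation and substrate consumption; every rate constant and yield is an illustrative assumption, not a value from the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not from the paper): growth, product formation, yields.
mu_max, X_max = 0.25, 8.0      # 1/h, g/L   logistic biomass growth
alpha, beta   = 0.35, 0.01     # growth- and non-growth-associated PHB formation
Y_xs, Y_ps    = 0.45, 0.40     # biomass and product yields on the carbon source

def rhs(t, state):
    X, P, S = state                                  # biomass, PHB, carbon source (g/L)
    mu = mu_max * (1.0 - X / X_max) if S > 0 else 0.0
    dX = mu * X
    dP = (alpha * mu + beta) * X if S > 0 else 0.0   # Luedeking-Piret kinetics
    dS = -(dX / Y_xs + dP / Y_ps)
    return [dX, dP, dS]

sol = solve_ivp(rhs, (0.0, 48.0), [0.2, 0.0, 30.0])
X, P, S = sol.y[:, -1]
print(f"after 48 h: biomass={X:.2f} g/L, PHB={P:.2f} g/L ({100 * P / (X + P):.0f}% of DCW)")
```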
DOT National Transportation Integrated Search
2015-02-01
Although the freeway travel time data has been validated extensively in recent years, the quality of arterial travel time data is not well known. This project presents a comprehensive validation scheme for arterial travel time data based on GPS...
Validity and Reliability of a New Device (WIMU®) for Measuring Hamstring Muscle Extensibility.
Muyor, José M
2017-09-01
The aims of the current study were 1) to evaluate the validity of the WIMU® system for measuring hamstring muscle extensibility in the passive straight leg raise (PSLR) test using an inclinometer for the criterion and 2) to determine the test-retest reliability of the WIMU® system to measure hamstring muscle extensibility during the PSLR test. 55 subjects were evaluated on 2 separate occasions. Data from a Unilever inclinometer and WIMU® system were collected simultaneously. Intraclass correlation coefficients (ICCs) for the validity were very high (0.983-1); a very low systematic bias (-0.21° to -0.42°), random error (0.05°-0.04°) and standard error of the estimate (0.43°-0.34°) were observed (left and right leg, respectively) between the 2 devices (inclinometer and the WIMU® system). The R² between the devices was 0.999 (p<0.001) in both the left and right legs. The test-retest reliability of the WIMU® system was excellent, with ICCs ranging from 0.972-0.995, low coefficients of variation (0.01%), and a low standard error of the estimate (0.19°-0.31°). The WIMU® system showed strong concurrent validity and excellent test-retest reliability for the evaluation of hamstring muscle extensibility in the PSLR test. © Georg Thieme Verlag KG Stuttgart · New York.
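As a worked illustration of the agreement statistics reported above (systematic bias, intraclass correlation, standard error of the estimate), the following sketch computes them for simulated paired readings from a criterion device and a new device. The ICC(3,1) consistency form and the toy data are assumptions; the authors' exact statistical procedure may differ.

```python
import numpy as np

def agreement_stats(a, b):
    """Agreement between paired readings from two devices (degrees)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n = len(a)
    bias = np.mean(a - b)                                 # systematic bias
    # ICC(3,1): two-way mixed, single measures, consistency (k = 2 raters)
    means = (a + b) / 2.0
    msr = 2.0 * np.sum((means - means.mean()) ** 2) / (n - 1)   # between-subjects MS
    d = a - b
    mse = np.sum((d - d.mean()) ** 2) / (2.0 * (n - 1))          # residual MS
    icc = (msr - mse) / (msr + mse)
    # Standard error of the estimate from regressing device b on device a
    slope, intercept = np.polyfit(a, b, 1)
    resid = b - (slope * a + intercept)
    see = np.sqrt(np.sum(resid ** 2) / (n - 2))
    return bias, icc, see

# Illustrative paired hip-angle readings (criterion vs. new device)
rng = np.random.default_rng(0)
crit = rng.uniform(60, 90, 30)
new = crit + rng.normal(-0.3, 0.4, 30)
print("bias=%.2f  ICC=%.3f  SEE=%.2f" % agreement_stats(crit, new))
```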
Cutting the Wires: Modularization of Cellular Networks for Experimental Design
Lang, Moritz; Summers, Sean; Stelling, Jörg
2014-01-01
Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. PMID:24411264
In Defense of an Instrument-Based Approach to Validity
ERIC Educational Resources Information Center
Hood, S. Brian
2012-01-01
Paul E. Newton argues in favor of a conception of validity, viz, "the consensus definition of validity," according to which the extension of the predicate "is valid" is a subset of "assessment-based decision-making procedure[s], which [are] underwritten by an argument that the assessment procedure can be used to measure the attribute entailed by…
NASA Astrophysics Data System (ADS)
Liang, X. S.
2016-02-01
Central to the processes of mean-eddy-turbulence interaction, e.g., mesoscale eddy shedding, relaminarization, etc., is the transfer of energy among different scales. The existing classical transfers, however, do not take into account the issue of energy conservation and, therefore, are not faithful representations of the real interaction processes, which are fundamentally a redistribution of energy among scales. Based on a new analysis machinery, namely the multiscale window transform (Liang and Anderson, 2007), we were able to obtain a formula for this important process, with energy conservation as a naturally embedded property. This formula has a form reminiscent of the Poisson bracket in Hamiltonian dynamics. It has been validated with many benchmark processes and, particularly, has been applied with success to control the eddy shedding behind a bluff body. Presented here will be an application study of the instabilities and mean-eddy interactions in the Kuroshio Extension (KE) region. Generally, it is found that the unstable KE jet fuels the mesoscale eddies, but in the offshore eddy decaying region, the cause-effect relation reverses: it is the latter that drives the former. On the whole, the eddies act to decelerate the jet upstream, while accelerating it downstream.
NASA Technical Reports Server (NTRS)
Vachon, Jacques; Curry, Robert E.
2010-01-01
Program Objectives: 1) Satellite Calibration and Validation: Provide methods to perform the cal/val requirements for Earth Observing System satellites. 2) New Sensor Development: Provide methods to reduce risk for new sensor concepts and algorithm development prior to committing sensors to operations. 3) Process Studies: Facilitate the acquisition of high spatial/temporal resolution focused measurements that are required to understand small atmospheric and surface structures which generate powerful Earth system effects. 4) Airborne Networking: Develop disruption-tolerant networking to enable integrated multiple scale measurements of critical environmental features. Dryden Capabilities include: a) Aeronautics history of aircraft developments and milestones. b) Extensive history and experience in instrument integration. c) Extensive history and experience in aircraft modifications. d) Strong background in international deployments. e) Long history of reliable and dependable execution of projects. f) Varied aircraft types providing different capabilities, performance and duration.
A Snapshot of Organizational Climate: Perceptions of Extension Faculty
ERIC Educational Resources Information Center
Tower, Leslie E.; Bowen, Elaine; Alkadry, Mohamad G.
2011-01-01
This article provides a snapshot of the perceptions of workplace climate of Extension faculty at a land-grant, research-high activity university, compared with the perceptions of non-Extension faculty at the same university. An online survey was conducted with a validated instrument. The response rate for university faculty was 44% (968); the…
Catalysis on Single Supported Atoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeBusk, Melanie Moses; Narula, Chaitanya Kumar
2015-01-01
The highly successful application of supported metals as heterogeneous catalysts in automotive catalysts, fuel cells, and a multitude of other industrial processes has led to extensive efforts to understand catalyst behavior at the nano-scale. The recent discovery of simple wet methods to prepare single supported atoms, the smallest nano-catalyst, has allowed for experimental validation of the catalytic activity of a variety of catalysts and the potential for large-scale production of such catalysts for industrial processes. In this chapter, we summarize the synthetic and structural aspects of single supported atoms. We also present proposed mechanisms for the activity of single supported catalysts where conventional mechanisms cannot operate due to the lack of M-M bonds in the catalysts.
Validation of biomarkers of food intake-critical assessment of candidate biomarkers.
Dragsted, L O; Gao, Q; Scalbert, A; Vergères, G; Kolehmainen, M; Manach, C; Brennan, L; Afman, L A; Wishart, D S; Andres Lacueva, C; Garcia-Aloy, M; Verhagen, H; Feskens, E J M; Praticò, G
2018-01-01
Biomarkers of food intake (BFIs) are a promising tool for limiting misclassification in nutrition research where more subjective dietary assessment instruments are used. They may also be used to assess compliance to dietary guidelines or to a dietary intervention. Biomarkers therefore hold promise for direct and objective measurement of food intake. However, the number of comprehensively validated biomarkers of food intake is limited to just a few. Many new candidate biomarkers emerge from metabolic profiling studies and from advances in food chemistry. Furthermore, candidate food intake biomarkers may also be identified based on extensive literature reviews such as described in the guidelines for Biomarker of Food Intake Reviews (BFIRev). To systematically and critically assess the validity of candidate biomarkers of food intake, it is necessary to outline and streamline an optimal and reproducible validation process. A consensus-based procedure was used to provide and evaluate a set of the most important criteria for systematic validation of BFIs. As a result, a validation procedure was developed including eight criteria, plausibility, dose-response, time-response, robustness, reliability, stability, analytical performance, and inter-laboratory reproducibility. The validation has a dual purpose: (1) to estimate the current level of validation of candidate biomarkers of food intake based on an objective and systematic approach and (2) to pinpoint which additional studies are needed to provide full validation of each candidate biomarker of food intake. This position paper on biomarker of food intake validation outlines the second step of the BFIRev procedure but may also be used as such for validation of new candidate biomarkers identified, e.g., in food metabolomic studies.
Dry Volume Fracturing Simulation of Shale Gas Reservoir
NASA Astrophysics Data System (ADS)
Xu, Guixi; Wang, Shuzhong; Luo, Xiangrong; Jing, Zefeng
2017-11-01
Application of CO2 dry fracturing technology to shale gas reservoir development in China has advantages of no water consumption, little reservoir damage and promoting CH4 desorption. This paper uses Meyer simulation to study complex fracture network extension and the distribution characteristics of shale gas reservoirs in the CO2 dry volume fracturing process. The simulation results prove the validity of the modified CO2 dry fracturing fluid used in shale volume fracturing and provides a theoretical basis for the following study on interval optimization of the shale reservoir dry volume fracturing.
Lessons Learned in Developing and Validating Models of Visual Search and Target Acquisition
2000-03-01
Only fragments of this abstract are recoverable: they refer to the texture transition that distinguishes the center of the display (Figure 5), to the model simulating the performance of experienced human observers on displays of blue and red squares, and to Neisser's finding that, after extensive training, observers can learn to rapidly pick out target features (pop-out).
Targeted exploration and analysis of large cross-platform human transcriptomic compendia
Zhu, Qian; Wong, Aaron K; Krishnan, Arjun; Aure, Miriam R; Tadych, Alicja; Zhang, Ran; Corney, David C; Greene, Casey S; Bongo, Lars A; Kristensen, Vessela N; Charikar, Moses; Li, Kai; Troyanskaya, Olga G.
2016-01-01
We present SEEK (http://seek.princeton.edu), a query-based search engine across very large transcriptomic data collections, including thousands of human data sets from almost 50 microarray and next-generation sequencing platforms. SEEK uses a novel query-level cross-validation-based algorithm to automatically prioritize data sets relevant to the query and a robust search approach to identify query-coregulated genes, pathways, and processes. SEEK provides cross-platform handling, multi-gene query search, iterative metadata-based search refinement, and extensive visualization-based analysis options. PMID:25581801
A Premiere example of the illusion of harm reduction cigarettes in the 1990s.
Pollay, R W; Dewhirst, T
2003-09-01
To use the product launch of Player's Premiere as a case study for understanding the new cigarette product development process during the 1990s. We determine the (in)validity of industry claims that: (1) development of the physical product preceded the promotional promise of "less irritation"; (2) "less irritation" was actually realised; (3) advertising informed consumers; and (4) advertising regulations caused the product's failure in the marketplace. Court proceedings assessing the constitutionality of Canada's Tobacco Act, which substantially restricts cigarette advertising. The 2002 Quebec Superior Court trial yielded a new collection of internal documents from Imperial Tobacco Ltd (ITL), including several about the development and marketing of Player's Premiere. Trial testimony and corporate documents were reviewed to determine the validity of the industry representations about the new cigarette product development process, focusing on the case history of Player's Premiere. In direct contradiction to industry testimony, the documentary evidence demonstrates that (1) communications for Player's Premiere, which claimed less irritation, were developed long before finding a product that could deliver on the promise; (2) ITL did not sell a "less irritating" product that matched its promotional promise; (3) the advertising and other communications for Player's Premiere were extensive, relying on the hi-tech appearances ("tangible credibility") of a "unique" filter, yet were uninformative and vague; and (4) Player's Premiere failed in the marketplace, despite extensive advertising and retail support, because it was an inferior product that did not live up to its promotional promise, not because of regulation of commercial speech. New product development entails extensive consumer research to craft all communications tools in fine detail. In the case of Player's Premiere, this crafting created a false and misleading impression of technological advances producing a "less irritating" cigarette. This product was solely a massive marketing ploy with neither consumer benefits, nor public health benefits. The industry attempted to deceive both consumers and the court.
Valid Knowledge: The Economy and the Academy
ERIC Educational Resources Information Center
Williams, Peter John
2007-01-01
The future of Western universities as public institutions is the subject of extensive continuing debate, underpinned by the issue of what constitutes "valid knowledge". Where in the past only propositional knowledge codified by academics was considered valid, in the new economy enabled by information and communications technology, the procedural…
Initial Development and Validation of the Global Citizenship Scale
ERIC Educational Resources Information Center
Morais, Duarte B.; Ogden, Anthony C.
2011-01-01
The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…
Initial Teacher Licensure Testing in Tennessee: Test Validation.
ERIC Educational Resources Information Center
Bowman, Harry L.; Petry, John R.
In 1988 a study was conducted to determine the validity of candidate teacher licensure examinations for use in Tennessee under the 1984 Comprehensive Education Reform Act. The Department of Education conducted a study to determine the validity of 11 previously unvalidated or extensively revised tests for certification and to make recommendations…
Empirical Validation and Application of the Computing Attitudes Survey
ERIC Educational Resources Information Center
Dorn, Brian; Elliott Tew, Allison
2015-01-01
Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…
NASA Astrophysics Data System (ADS)
Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.
2016-12-01
Accurate representation of ice sheets and glaciers is essential for robust predictions of arctic climate within Earth System models. Verification and Validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence, and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to easily integrate into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible Python package for V&V, which has components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, surface mass balance, etc. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework that lets ice-sheet modelers easily extend the model V&V analyses. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website allowing evaluations to be easily shared, published, and analysed throughout the arctic and Earth system communities.
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
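permGPU's CUDA implementation is not reproduced here; the sketch below is a plain NumPy version of the underlying idea, permutation resampling of a per-gene two-sample t statistic, in which each permutation is independent and therefore trivially parallelizable. The test statistic, the add-one p-value correction, and the toy data are illustrative choices.

```python
import numpy as np

def permutation_pvalues(expr, labels, n_perm=1000, seed=0):
    """Permutation p-values for per-gene two-sample t statistics.

    expr: (n_genes, n_samples) expression matrix; labels: 0/1 group labels.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)

    def tstats(lab):
        g0, g1 = expr[:, lab == 0], expr[:, lab == 1]
        se = np.sqrt(g0.var(axis=1, ddof=1) / g0.shape[1] +
                     g1.var(axis=1, ddof=1) / g1.shape[1])
        return (g1.mean(axis=1) - g0.mean(axis=1)) / se

    obs = np.abs(tstats(labels))
    exceed = np.zeros(expr.shape[0])
    for _ in range(n_perm):                  # each permutation is independent,
        perm = rng.permutation(labels)       # hence embarrassingly parallel
        exceed += (np.abs(tstats(perm)) >= obs)
    return (exceed + 1) / (n_perm + 1)       # add-one correction

# Toy data: 500 genes, 20 samples in 2 groups of 10
expr = np.random.default_rng(1).normal(size=(500, 20))
labels = np.repeat([0, 1], 10)
print(permutation_pvalues(expr, labels, n_perm=200)[:5])
```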
Applying the multivariate time-rescaling theorem to neural population models
Gerhard, Felipe; Haslinger, Robert; Pipa, Gordon
2011-01-01
Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based upon the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models which neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem, and provide a practical step-by-step procedure for applying it towards testing the sufficiency of neural population models. Using several simple analytically tractable models and also more complex simulated and real data sets, we demonstrate that important features of the population activity can only be detected using the multivariate extension of the test. PMID:21395436
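A minimal univariate version of the procedure described above can be sketched as follows: integrate the model's conditional intensity, rescale the inter-spike intervals, map them to (0,1), and compare against the uniform distribution with a KS test. The multivariate extension presented in the paper is not shown; the trapezoidal integration and the homogeneous-Poisson toy example are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def time_rescaling_ks(spike_times, intensity, t_grid):
    """Univariate time-rescaling goodness-of-fit (KS) test.

    spike_times: sorted spike times; intensity: model conditional intensity
    evaluated on t_grid. Returns the KS statistic and p-value against U(0,1).
    """
    # Cumulative integral of the intensity (trapezoidal rule).
    Lambda = np.concatenate(([0.0], np.cumsum(
        np.diff(t_grid) * 0.5 * (intensity[1:] + intensity[:-1]))))
    Lam_at_spikes = np.interp(spike_times, t_grid, Lambda)
    z = np.diff(Lam_at_spikes)        # rescaled ISIs: Exp(1) under a correct model
    u = 1.0 - np.exp(-z)              # should be Uniform(0,1)
    return stats.kstest(u, "uniform")

# Toy check: homogeneous Poisson spikes against the true constant-rate model.
rng = np.random.default_rng(0)
rate, T = 5.0, 200.0
spikes = np.cumsum(rng.exponential(1.0 / rate, size=int(rate * T * 2)))
spikes = spikes[spikes < T]
t_grid = np.linspace(0.0, T, 20001)
print(time_rescaling_ks(spikes, np.full_like(t_grid, rate), t_grid))
```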
Tousignant, Michel; Smeesters, Cécil; Breton, Anne-Marie; Breton, Emilie; Corriveau, Hélène
2006-04-01
This study compared range of motion (ROM) measurements using a cervical range of motion device (CROM) and an optoelectronic system (OPTOTRAK). To examine the criterion validity of the CROM for the measurement of cervical ROM on healthy adults. Whereas measurements of cervical ROM are recognized as part of the assessment of patients with neck pain, few devices are available in clinical settings. Two papers published previously showed excellent criterion validity for measurements of cervical flexion/extension and lateral flexion using the CROM. Subjects performed neck rotation, flexion/extension, and lateral flexion while sitting on a wooden chair. The ROM values were measured by the CROM as well as the OPTOTRAK. The cervical rotational ROM values using the CROM demonstrated a good to excellent linear relationship with those using the OPTOTRAK: right rotation, r = 0.89 (95% confidence interval, 0.81-0.94), and left rotation, r = 0.94 (95% confidence interval, 0.90-0.97). Similar results were also obtained for flexion/extension and lateral flexion ROM values. The CROM showed excellent criterion validity for measurements of cervical rotation. We propose using ROM values measured by the CROM as outcome measures for patients with neck pain.
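The criterion-validity figures quoted above are Pearson correlations with 95% confidence intervals. As a generic illustration of how such an interval can be obtained (via the Fisher z-transform), consider the hedged sketch below; the paired angles are simulated, and this is not necessarily the exact procedure used by the authors.

```python
import numpy as np
from scipy import stats

def pearson_with_ci(x, y, conf=0.95):
    """Pearson r with a confidence interval via the Fisher z-transform."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    z, se = np.arctanh(r), 1.0 / np.sqrt(len(x) - 3)
    zcrit = stats.norm.ppf(0.5 + conf / 2.0)
    return r, (np.tanh(z - zcrit * se), np.tanh(z + zcrit * se))

# Illustrative paired cervical rotation angles from two devices (degrees)
rng = np.random.default_rng(2)
crom = rng.uniform(40, 80, 30)
opto = crom + rng.normal(0, 4, 30)
r, ci = pearson_with_ci(crom, opto)
print(f"r = {r:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```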
Mena, Marisa; Lloveras, Belen; Tous, Sara; Bogers, Johannes; Maffini, Fausto; Gangane, Nitin; Kumar, Rekha Vijay; Somanathan, Thara; Lucas, Eric; Anantharaman, Devasena; Gheit, Tarik; Castellsagué, Xavier; Pawlita, Michael; de Sanjosé, Silvia; Alemany, Laia; Tommasino, Massimo
2017-01-01
Worldwide use of formalin-fixed paraffin-embedded blocks (FFPE) is extensive in diagnosis and research. Yet, there is a lack of optimized/standardized protocols to process the blocks and verify the quality and presence of the targeted tissue. In the context of an international study on head and neck cancer (HNC)-HPV-AHEAD, a standardized protocol for optimizing the use of FFPEs in molecular epidemiology was developed and validated. First, a protocol for sectioning the FFPE was developed to prevent cross-contamination and distributed between participating centers. Before processing blocks, all sectioning centers underwent a quality control to guarantee a satisfactory training process. The first and last sections of the FFPEs were used for histopathological assessment. A consensus histopathology evaluation form was developed by an international panel of pathologists and evaluated for four indicators in a pilot analysis in order to validate it: 1) presence/type of tumor tissue, 2) identification of other tissue components that could affect the molecular diagnosis and 3) quality of the tissue. No HPV DNA was found in sections from empty FFPE generated in any histology laboratories of HPV-AHEAD consortium and all centers passed quality assurance for processing after quality control. The pilot analysis to validate the histopathology form included 355 HNC cases. The form was filled by six pathologists and each case was randomly assigned to two of them. Most samples (86%) were considered satisfactory. Presence of >50% of invasive carcinoma was observed in all sections of 66% of cases. Substantial necrosis (>50%) was present in <2% of samples. The concordance for the indicators targeted to validate the histopathology form was very high (kappa > 0.85) between first and last sections and fair to high between pathologists (kappa/pabak 0.21-0.72). The protocol allowed to correctly process without signs of contamination all FFPE of the study. The histopathology evaluation of the cases assured the presence of the targeted tissue, identified the presence of other tissues that could disturb the molecular diagnosis and allowed the assessment of tissue quality.
Sasaki, Eiji; Sasaki, Shizuka; Chiba, Daisuke; Yamamoto, Yuji; Nawata, Atsushi; Tsuda, Eiichi; Nakaji, Shigeyuki; Ishibashi, Yasuyuki
2018-01-01
Trunk muscle weakness and imbalance are risk factors for postural instability, low back pain, and poor postoperative outcomes. The association between trunk muscle strength and aging is poorly understood, and establishing normal reference values is difficult. We aimed to establish the validity of a novel portable trunk muscle torque measurement instrument (PTMI). We then estimated reference data for healthy young adults and elucidated age-related weakness in trunk muscle strength. Twenty-four university students were enrolled to validate values for PTMI, and 816 volunteers from the general population who were recruited to the Iwaki Health Promotion Project were included to estimate reference data for trunk muscle strength. Trunk flexion and extension torque were measured with PTMI and KinCom, and interclass correlation coefficients (ICC) were estimated to evaluate the reliability of PTMI values. Furthermore, from the young adult reference, the age-related reduction in trunk muscle torque and the prevalence of sarcopenia among age-sex groups were estimated. The ICC in flexion and extension torque were 0.807 (p<0.001) and 0.789 (p<0.001), respectively. The prevalence of sarcopenia increased with age, and the prevalence due to flexion torque was double that of extension torque. Flexion torque decreased significantly after 60 years of age, and extension torque decreased after 70 years of age. In males over age 80, trunk muscle torque decreased to 49.1% in flexion and 63.5% in extension. In females over age 80, trunk muscle torque decreased to 60.7% in flexion and 68.4% in extension. The validity of PTMI was confirmed by correlation with KinCom. PTMI produced reference data for healthy young adults, and demonstrated age-related reduction in trunk muscle torque. Trunk sarcopenia progressed with aging, and the loss of flexion torque began earlier than extension torque. At age 80, trunk muscle torque had decreased 60% compared with healthy young adults.
NASA Astrophysics Data System (ADS)
Talagani, Mohamad R.; Abdi, Frank; Saravanos, Dimitris; Chrysohoidis, Nikos; Nikbin, Kamran; Ragalini, Rose; Rodov, Irena
2013-05-01
The paper proposes the diagnostic and prognostic modeling and test validation of a Wireless Integrated Strain Monitoring and Simulation System (WISMOS). The effort verifies a hardware and web-based software tool that is able to evaluate and optimize sensorized aerospace composite structures for the purpose of Structural Health Monitoring (SHM). The tool is an extension of an existing suite of an SHM system, based on a diagnostic-prognostic system (DPS) methodology. The goal of the extended SHM-DPS is to apply multi-scale nonlinear physics-based progressive failure analyses to the "as-is" structural configuration to determine residual strength, remaining service life, and future inspection intervals and maintenance procedures. The DPS solution meets the JTI Green Regional Aircraft (GRA) goals towards low-weight, durable and reliable commercial aircraft. It will take advantage of the methodologies currently being developed within the European Clean Sky JTI project WISMOS, with the capability to transmit, store and process strain data from a network of wireless sensors (e.g. strain gages, FBGA) and utilize a DPS-based methodology, based on multi-scale progressive failure analysis (MS-PFA), to determine structural health and to advise with respect to condition-based inspection and maintenance. As part of the validation of the diagnostic and prognostic system, carbon/epoxy ASTM coupons were fabricated and tested to extract the mechanical properties. Subsequently, two composite stiffened panels were manufactured, instrumented and tested under compressive loading: 1) an undamaged stiffened buckling panel; and 2) a damaged stiffened buckling panel including an initial diamond cut. Next, numerical finite element models of the two panels were developed and analyzed under test conditions using multi-scale progressive failure analysis (an extension of FEM) to evaluate the damage/fracture evolution process, as well as the identification of contributing failure modes. The comparisons between predictions and test results were within 10% accuracy.
He, Guoxi; Liang, Yongtu; Li, Yansong; Wu, Mengyu; Sun, Liying; Xie, Cheng; Li, Feng
2017-06-15
The accidental leakage of long-distance pressurized oil pipelines is a major area of risk, capable of causing extensive damage to human health and the environment. However, the complexity of the leaking process, with its complex boundary conditions, leads to difficulty in calculating the leakage volume. In this study, the leaking process is divided into 4 stages based on the strength of transient pressure. 3 models are established to calculate the leaking flowrate and volume. First, a negative pressure wave propagation attenuation model is applied to calculate the sizes of orifices. Second, a transient oil leaking model, consisting of continuity, momentum conservation, energy conservation and orifice flow equations, is built to calculate the leakage volume. Third, a steady-state oil leaking model is employed to calculate the leakage after valves and pumps shut down. Moreover, sensitive factors that affect the leak coefficient of orifices and the leakage volume are analyzed to determine the most influential one. To validate the numerical simulation, two types of leakage test with different sizes of leakage holes were conducted on Sinopec product pipelines. Further validation was carried out by applying commercial software to supplement the experimental insufficiency. Thus, the leaking process under different leaking conditions is described and analyzed. Copyright © 2017 Elsevier B.V. All rights reserved.
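The study's leak models themselves are not reproduced here. As a rough illustration of the orifice-flow component mentioned above, the sketch below evaluates the standard sharp-edged orifice equation for an incompressible liquid, Q = Cd·A·sqrt(2·ΔP/ρ); the discharge coefficient, hole size and fluid properties are illustrative assumptions rather than values from the paper.

```python
import math

def orifice_leak_rate(d_orifice_m, p_in_pa, p_out_pa, rho, cd=0.62):
    """Steady leak flowrate through a small orifice (incompressible liquid).

    Sharp-edged orifice equation Q = Cd * A * sqrt(2 * dP / rho);
    cd = 0.62 is a typical discharge coefficient, not a fitted value.
    """
    area = math.pi * d_orifice_m ** 2 / 4.0
    dp = max(p_in_pa - p_out_pa, 0.0)
    return cd * area * math.sqrt(2.0 * dp / rho)      # m^3/s

# Example: 10 mm hole, 2.0 MPa line pressure vs. atmospheric, diesel (~840 kg/m^3)
q = orifice_leak_rate(0.010, 2.0e6, 101_325.0, 840.0)
print(f"leak rate ~ {q * 3600:.1f} m^3/h")
```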
Heater Validation for the NEXT-C Hollow Cathodes
NASA Technical Reports Server (NTRS)
Verhey, Timothy R.; Soulas, George C.; Mackey, Jonathan A.
2018-01-01
Swaged cathode heaters whose design was successfully demonstrated under a prior flight project are to be provided by the NASA Glenn Research Center for the NEXT-C ion thruster being fabricated by Aerojet Rocketdyne. Extensive requalification activities were performed to validate process controls that had to be re-established or revised because systemic changes prevented reuse of the past approaches. A development batch of heaters was successfully fabricated based on the new process controls. Acceptance and cyclic life testing of multiple discharge- and neutralizer-sized heaters extracted from the development batch was initiated in August 2016, with the last heater completing testing in April 2017. Cyclic life testing results substantially exceeded the NEXT-C thruster requirement as well as all past experience for GRC-fabricated units. The heaters demonstrated an ultimate cyclic life capability of 19,050 to 33,500 cycles. A qualification batch of heaters is now being fabricated using the finalized process controls. A set of six heaters will be acceptance and cyclic tested to verify conformance to the behavior observed with the development heaters. The heaters for flight use will then be provided to the contractor from the remainder of the qualification batch. This paper summarizes the fabrication process control activities and the acceptance and life testing of the development heater units.
Enhanced TCAS 2/CDTI traffic Sensor digital simulation model and program description
NASA Technical Reports Server (NTRS)
Goka, T.
1984-01-01
Digital simulation models of enhanced TCAS 2/CDTI traffic sensors are developed, based on actual or projected operational and performance characteristics. Two enhanced Traffic (or Threat) Alert and Collision Avoidance Systems are considered. A digital simulation program is developed in FORTRAN. The program contains an executive with a semireal time batch processing capability. The simulation program can be interfaced with other modules with a minimum requirement. Both the traffic sensor and CAS logic modules are validated by means of extensive simulation runs. Selected validation cases are discussed in detail, and capabilities and limitations of the actual and simulated systems are noted. The TCAS systems are not specifically intended for Cockpit Display of Traffic Information (CDTI) applications. These systems are sufficiently general to allow implementation of CDTI functions within the real systems' constraints.
LACIE performance predictor final operational capability program description, volume 1
NASA Technical Reports Server (NTRS)
1976-01-01
The program EPHEMS computes the orbital parameters for up to two vehicles orbiting the earth for up to 549 days. The data represents a continuous swath about the earth, producing tables which can be used to determine when and if certain land segments will be covered. The program GRID processes NASA's climatology tape to obtain the weather indices along with associated latitudes and longitudes. The program LUMP takes substrata historical data and sample segment ID, crop window, crop window error and statistical data, checks for valid input parameters and generates the segment ID file, crop window file and the substrata historical file. Finally, the System Error Executive (SEE) Program checks YES error and truth data, CAMS error data, and signature extension data for validity and missing elements. A message is printed for each error found.
Entropy from State Probabilities: Hydration Entropy of Cations
2013-01-01
Entropy is an important energetic quantity determining the progression of chemical processes. We propose a new approach to obtain hydration entropy directly from probability density functions in state space. We demonstrate the validity of our approach for a series of cations in aqueous solution. Extensive validation of simulation results was performed. Our approach does not make prior assumptions about the shape of the potential energy landscape and is capable of calculating accurate hydration entropy values. Sampling times in the low nanosecond range are sufficient for the investigated ionic systems. Although the presented strategy is at the moment limited to systems for which a scalar order parameter can be derived, this is not a principal limitation of the method. The strategy presented is applicable to any chemical system where sufficient sampling of conformational space is accessible, for example, by computer simulations. PMID:23651109
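The paper's state-space entropy estimator is not reproduced here. The sketch below only illustrates the elementary step of turning sampled probabilities of discretized states of a scalar order parameter into a Gibbs entropy, S = -k_B Σ p_i ln p_i; the binning, the toy trajectory, and the omission of any reference-state or correlation corrections are simplifying assumptions.

```python
import numpy as np

K_B = 1.380649e-23      # J/K, Boltzmann constant

def gibbs_entropy_from_samples(order_param, bins=50):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i from sampled state probabilities.

    order_param: time series of a scalar order parameter from a simulation;
    states are defined by binning it (the bin count is an illustrative choice).
    """
    counts, _ = np.histogram(order_param, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -K_B * np.sum(p * np.log(p))

# Toy trajectory: a Gaussian-distributed order parameter (e.g. an ion-water distance)
x = np.random.default_rng(0).normal(loc=2.3, scale=0.1, size=100_000)
print(f"S ~ {gibbs_entropy_from_samples(x):.3e} J/K per particle")
```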
Chandrasekaran, Sivapragasam; Sankararajan, Vanitha; Neelakandhan, Nampoothiri; Ram Kumar, Mahalakshmi
2017-11-04
This study, through extensive experiments and mathematical modeling, reveals that other than retention time and wastewater temperature (T_w), atmospheric parameters also play an important role in the effective functioning of aquatic macrophyte-based treatment systems. The duckweed species Lemna minor is considered in this study. It is observed that the combined effect of atmospheric temperature (T_atm), wind speed (U_w), and relative humidity (RH) can be reflected through one parameter, namely the "apparent temperature" (T_a). A total of eight different models are considered based on the combination of input parameters, and the best mathematical model is arrived at, which is validated through a new experimental set-up outside the modeling period. The validation results are highly encouraging. Genetic programming (GP)-based models are found to reveal deeper understandings of the wetland process.
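The study's GP-derived relationships are not reproduced here. For orientation only, the sketch below evaluates one common Steadman-type apparent-temperature formulation combining air temperature, relative humidity and wind speed; the coefficients are those of the widely used non-radiation form and are not taken from the paper.

```python
import math

def apparent_temperature(t_atm_c, rh_percent, wind_ms):
    """Steadman-type apparent temperature (non-radiation form).

    T_a = T + 0.33*e - 0.70*ws - 4.00, with e the water-vapour pressure (hPa)
    derived from relative humidity. One common formulation; the study's own
    fitted relationship between these inputs may differ.
    """
    e = (rh_percent / 100.0) * 6.105 * math.exp(17.27 * t_atm_c / (237.7 + t_atm_c))
    return t_atm_c + 0.33 * e - 0.70 * wind_ms - 4.00

# Example: 30 deg C air temperature, 60% RH, 2 m/s wind
print(f"T_a = {apparent_temperature(30.0, 60.0, 2.0):.1f} deg C")
```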
Theuer, D; Dillschneider, J; Mieth, M; Büchler, M W
2012-01-01
The spectacular increase in liability proceedings in the field of surgery, and in particular in visceral surgery, necessitates an objectification of the conflict between surgical medical professionals and medico-legal institutions, firms of solicitors and courts. Out-of-court settlements assisted by expert opinion commissions of the Medical Council can avoid many legal conflicts. To improve the legal standpoint of a defendant medical professional, the following are necessary: unambiguous, extensive and detailed documentation of the medical examination findings and of the indications for the planned operative intervention; extensive and detailed documentation of the patient's disclosure and informed consent for the planned operative intervention; an extensive, detailed, careful and responsibly written operation report; and systematic, orderly, well-planned postoperative complication management. Together these counter the accusation of an organizational failure on the part of the medical professionals and the accused hospital. The mutual building of confidence between surgical medical professionals and legal institutions is safeguarded by comprehensive documentation and an unambiguous description and formulation of the medical discharge report on termination of inpatient treatment.
FDA perspective on specifications for biotechnology products--from IND to PLA.
Murano, G
1997-01-01
Quality standards are obligatory throughout development, approval and post-marketing phases of biotechnology-derived products, thus assuring product identity, purity, and potency/strength. The process of developing and setting specifications should be based on sound science and should represent a logical progression of actions based on the use of experiential data spanning manufacturing process validation, consistency in production, and characterization of relevant product properties/attributes, by multiple analytical means. This interactive process occurs in phases, varying in rigour. It is best described as encompassing a framework which starts with the implementation of realistic/practical operational quality limits, progressing to the establishment/adoption of more stringent specifications. The historical database is generated from preclinical, toxicology and early clinical lots. This supports the clinical development programme which, as it progresses, allows for further assay method validation/refinement, adoption/addition due to relevant or newly recognized product attributes or rejection due to irrelevance. In the next phase, (licensing/approval) specifications are set through extended experience and validation of both the preparative and analytical processes, to include availability of suitable reference standards and extensive product characterization throughout its proposed dating period. Subsequent to product approval, the incremental database of test results serves as a natural continuum for further evolving/refining specifications. While there is considerable latitude in the kinds of testing modalities finally adopted to establish product quality on a routine basis, for both drugs and drug products, it is important that the selection takes into consideration relevant (significant) product characteristics that appropriately reflect on identity, purity and potency.
Wild, Diane; Furtado, Tamzin; Angalakuditi, Mallik
2012-01-01
Background The Child Behavior Checklist (CBCL) is a caregiver rating scale for assessing the behavioral profile of children. It was developed in the US, and has been extensively translated and used in a large number of studies internationally. Objective The objective of this study was to translate the CBCL into six languages using a rigorous translation methodology, placing particular emphasis on cultural adaptation and ensuring that the measure has content validity with carers of children with epilepsy. Methods A rigorous translation and cultural adaptation methodology was used. This is a process which includes two forward translations, reconciliation, two back-translations, and cognitive debriefing interviews with five carers of children with epilepsy in each country. In addition, a series of open-ended questions were asked of the carers in order to provide evidence of content validity. Results A number of cultural adaptations were made during the translation process. This included adaptations to the examples of sports and hobbies. An addition of “milk delivery” was made to the job examples in the Malayalam translation. In addition, two sexual problem items were removed from the Hebrew translation for Israel. Conclusion An additional six translations of the CBCL are now available for use in multinational studies. These translations have evidence of content validity for use with parents of children with epilepsy and have been appropriately culturally adapted so that they are acceptable for use in the target countries. The study highlights the importance of a rigorous translation process and the process of cultural adaptation. PMID:22715318
1985-03-01
conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. The approach: 1) Identify needs & requirements for IAT. 2) Develop IAT conceptual framework. 3) Validate IAT methods. 4) Develop applications materials.
77 FR 46750 - Agency Information Collection Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-06
... Questionnaire Testing, Evaluation, and Research.'' The proposed collection will utilize qualitative and quantitative methodologies to pretest questionnaires and validate EIA survey forms data quality, including..., Evaluation, and Research; (3) Type of Request: Extension, Without Change, of a Previously Approved Collection...
Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory
ERIC Educational Resources Information Center
Long, Haiying
2017-01-01
Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…
Developing and Validating a Survey of Korean Early Childhood English Teachers' Knowledge
ERIC Educational Resources Information Center
Kim, Jung In
2015-01-01
The main purpose of this study is to develop and validate a valid measure of the early childhood (EC) English teacher knowledge. Through extensive literature review on second/foreign language (L2/FL) teacher knowledge, early childhood teacher knowledge and early childhood language teacher knowledge, and semi-structured interviews from current…
Using ontologies to improve semantic interoperability in health data.
Liyanage, Harshana; Krause, Paul; De Lusignan, Simon
2015-07-10
The present-day health data ecosystem comprises a wide array of complex heterogeneous data sources. A wide range of clinical, health care, social and other clinically relevant information is stored in these data sources. These data exist either as structured data or as free-text. These data are generally individual person-based records, but social care data are generally case based and less formal data sources may be shared by groups. The structured data may be organised in a proprietary way or be coded using one of many coding, classification, or terminology systems that have often evolved in isolation and were designed to meet the needs of the context in which they were developed. This has resulted in a wide range of semantic interoperability issues that make the integration of data held on these different systems challenging. We present semantic interoperability challenges and describe a classification of these. We propose a four-step process and a toolkit for those wishing to work more ontologically, progressing from the identification and specification of concepts to validating a final ontology. The four steps are: (1) the identification and specification of data sources; (2) the conceptualisation of semantic meaning; (3) defining to what extent routine data can be used as a measure of the process or outcome of care required in a particular study or audit; and (4) the formalisation and validation of the final ontology. The toolkit is an extension of a previous schema created to formalise the development of ontologies related to chronic disease management. The extensions are focused on facilitating rapid building of ontologies for time-critical research studies.
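For illustration only, a minimal sketch of how steps (2) and (4) might look in code, using the rdflib library; the concept names (e.g. SmokingStatus) and the namespace URL are hypothetical placeholders, not part of the authors' toolkit.

```python
# Minimal sketch (not the authors' toolkit): encoding a candidate concept and a
# simple consistency check with rdflib. All names and the namespace are hypothetical.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/study-ontology#")
g = Graph()
g.bind("ex", EX)

# Step 2: conceptualise semantic meaning as classes and relations
g.add((EX.SmokingStatus, RDF.type, RDFS.Class))
g.add((EX.CurrentSmoker, RDFS.subClassOf, EX.SmokingStatus))
g.add((EX.CurrentSmoker, RDFS.label, Literal("Current smoker")))

# Step 4: a crude validation pass - every subclass must point to a defined class
defined = {s for s, _, _ in g.triples((None, RDF.type, RDFS.Class))}
for child, _, parent in g.triples((None, RDFS.subClassOf, None)):
    if parent not in defined:
        print(f"Undefined parent concept: {parent}")
```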
Six sigma critical success factors in manufacturing industries
NASA Astrophysics Data System (ADS)
Mustafa, Zainol; Jamaluddin, Z.
2017-04-01
The success of Six Sigma implementation is known to depend on a number of contributing factors. The purpose of this paper is to explore Six Sigma critical success factors (CSFs) in the context of Malaysian manufacturing organizations. Although Six Sigma success factors have been abundantly researched in the global context, in this paper, a maiden attempt is made to identify, through an extensive literature review, the CSFs for Six Sigma implementation followed by their validation using primary data collection from Malaysian manufacturing companies. A total of 33 indicators have thus been compiled through an extensive literature review, which were then grouped into 6 contributing factors. These contributing success factors are then validated through empirical research on selected Malaysian manufacturing companies at various stages of implementation of the Six Sigma process improvement methodology. There has been an overemphasis on the role and commitment of the management in the success of a Six Sigma program. Though its importance is undoubted, certain other factors also play an equally important role in ensuring that Six Sigma programs are successful. The factor analysis of CSFs of the Malaysian manufacturing organizations selected in this study demonstrates that the top factor is a composite factor combining the ability of project teams to apply process management to quality initiatives with training in the use of proper analysis for problem solving. The CSFs extracted through the factor analysis could provide a basis for manufacturing organizations embarking on the Six Sigma journey to look beyond just management involvement. Thus, one can develop an integrated framework of other factors as outlined and give them appropriate priority and focus.
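As a rough sketch of the analysis pipeline described above (not the study's data or instrument), 33 survey indicators could be reduced to six latent factors as follows, assuming scikit-learn is available:

```python
# Minimal sketch (synthetic survey responses, not the study's data):
# reduce 33 Likert-scale indicators to 6 latent success factors.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 33)).astype(float)  # 200 firms x 33 indicators

scores = FactorAnalysis(n_components=6, random_state=0).fit_transform(
    StandardScaler().fit_transform(responses)
)
print(scores.shape)  # (200, 6) -> one score per firm on each contributing factor
```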
Adaptation of clinical prediction models for application in local settings.
Kappen, Teus H; Vergouwe, Yvonne; van Klei, Wilton A; van Wolfswinkel, Leo; Kalkman, Cor J; Moons, Karel G M
2012-01-01
When planning to use a validated prediction model in new patients, adequate performance is not guaranteed. For example, changes in clinical practice over time or a different case mix than the original validation population may result in inaccurate risk predictions. To demonstrate how clinical information can direct updating a prediction model and development of a strategy for handling missing predictor values in clinical practice. A previously derived and validated prediction model for postoperative nausea and vomiting was updated using a data set of 1847 patients. The update consisted of 1) changing the definition of an existing predictor, 2) reestimating the regression coefficient of a predictor, and 3) adding a new predictor to the model. The updated model was then validated in a new series of 3822 patients. Furthermore, several imputation models were considered to handle real-time missing values, so that possible missing predictor values could be anticipated during actual model use. Differences in clinical practice between our local population and the original derivation population guided the update strategy of the prediction model. The predictive accuracy of the updated model was better (c statistic, 0.68; calibration slope, 1.0) than the original model (c statistic, 0.62; calibration slope, 0.57). Inclusion of logistical variables in the imputation models, besides observed patient characteristics, contributed to a strategy to deal with missing predictor values at the time of risk calculation. Extensive knowledge of local, clinical processes provides crucial information to guide the process of adapting a prediction model to new clinical practices.
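The three update steps can be illustrated with a logistic regression sketch; the predictors and data below are hypothetical and do not reproduce the original postoperative nausea and vomiting model (the imputation strategy for missing predictors is not covered here):

```python
# Minimal sketch of the three update steps on a logistic regression model.
# Column meanings and data are hypothetical, not the original PONV predictors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1847
X_old = np.column_stack([rng.integers(0, 2, n),          # e.g. female sex
                         rng.integers(0, 2, n)])         # e.g. redefined history predictor
X_new = np.column_stack([X_old, rng.integers(0, 2, n)])  # step 3: add a new predictor
y = rng.integers(0, 2, n)

# Steps 1-2: refit with the redefined predictor, re-estimating its coefficient
updated = sm.Logit(y, sm.add_constant(X_old)).fit(disp=0)
# Step 3: extend the model with the newly added predictor
extended = sm.Logit(y, sm.add_constant(X_new)).fit(disp=0)
print(updated.params, extended.params)
```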
Diagnosing the impact of alternative calibration strategies on coupled hydrologic models
NASA Astrophysics Data System (ADS)
Smith, T. J.; Perera, C.; Corrigan, C.
2017-12-01
Hydrologic models represent a significant tool for understanding, predicting, and responding to the impacts of water on society and of society on water resources and, as such, are used extensively in water resources planning and management. Given this important role, the validity and fidelity of hydrologic models are imperative. While extensive attention has been paid to improving hydrologic models through better process representation, better parameter estimation, and better uncertainty quantification, significant challenges remain. In this study, we explore a number of competing model calibration scenarios for simple, coupled snowmelt-runoff models to better understand the sensitivity and variability of parameterizations and their impact on model performance, robustness, fidelity, and transferability. Our analysis highlights the sensitivity of coupled snowmelt-runoff model parameterizations to alterations in calibration approach, underscores the concept of information content in hydrologic modeling, and provides insight into potential strategies for improving model robustness and fidelity.
Leveraging the BPEL Event Model to Support QoS-aware Process Execution
NASA Astrophysics Data System (ADS)
Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf
Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault handling mechanisms to handle functional faults like invalid message types, it still lacks a flexible native mechanism to handle non-functional exceptions associated with violations of QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, where expected QoS levels and necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach by an extension to an open source BPEL engine.
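A rough Python sketch of the Event-Condition-Action idea follows; the event names and QoS thresholds are invented for illustration and do not correspond to the authors' BPEL extension, whose rules are declared against BPEL's own event model.

```python
# Rough sketch of Event-Condition-Action (ECA) rules keyed to process events.
# Event names and thresholds are invented for illustration.
eca_rules = [
    {
        "event": "invoke_completed",
        "condition": lambda e: e.get("response_time_ms", 0) > 2000,  # assumed SLA threshold
        "action": lambda e: print(f"QoS violation on {e['activity']}: retry with backup service"),
    },
]

def on_event(event):
    """Dispatch a process-engine event against the registered ECA rules."""
    for rule in eca_rules:
        if rule["event"] == event["type"] and rule["condition"](event):
            rule["action"](event)

on_event({"type": "invoke_completed", "activity": "BookFlight", "response_time_ms": 3500})
```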
A System for Cost and Reimbursement Control in Hospitals
Fetter, Robert B.; Thompson, John D.; Mills, Ronald E.
1976-01-01
This paper approaches the design of a regional or statewide hospital rate-setting system as the underpinning of a larger system which permits a regulatory agency to satisfy the requirements of various public laws now on the books or in process. It aims to generate valid interinstitutional monitoring on the three parameters of cost, utilization, and quality review. Such an approach requires the extension of the usual departmental cost and budgeting system to include consideration of the mix of patients treated and the utilization of various resources, including patient days, in the treatment of these patients. A sampling framework for the application of process-based quality studies and the generation of selected performance measurements is also included. PMID:941461
Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R
2017-01-01
Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
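As a toy illustration of a synchronous Boolean gene regulatory network update (made-up genes and rules, not a published plant development network):

```python
# Toy synchronous Boolean gene regulatory network (made-up genes and rules).
rules = {
    "A": lambda s: not s["C"],          # A is repressed by C
    "B": lambda s: s["A"] and s["C"],   # B needs both A and C
    "C": lambda s: s["A"] or s["B"],    # C is activated by A or B
}

def step(state):
    """One synchronous update of all genes."""
    return {gene: rule(state) for gene, rule in rules.items()}

state = {"A": True, "B": False, "C": False}
for _ in range(6):                       # iterate a few steps; the trajectory settles into an attractor (here a cycle)
    state = step(state)
    print(state)
```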
Measurement of Laser Weld Temperatures for 3D Model Input
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dagel, Daryl; Grossetete, Grant; Maccallum, Danny O.
Laser welding is a key joining process used extensively in the manufacture and assembly of critical components for several weapons systems. Sandia National Laboratories advances the understanding of the laser welding process through coupled experimentation and modeling. This report summarizes the experimental portion of the research program, which focused on measuring temperatures and thermal history of laser welds on steel plates. To increase confidence in measurement accuracy, researchers utilized multiple complementary techniques to acquire temperatures during laser welding. This data serves as input to and validation of 3D laser welding models aimed at predicting microstructure and the formation of defects and their impact on weld-joint reliability, a crucial step in rapid prototyping of weapons components.
Camera-tracking gaming control device for evaluation of active wrist flexion and extension.
Shefer Eini, Dalit; Ratzon, Navah Z; Rizzo, Albert A; Yeh, Shih-Ching; Lange, Belinda; Yaffe, Batia; Daich, Alexander; Weiss, Patrice L; Kizony, Rachel
Cross sectional. Measuring wrist range of motion (ROM) is an essential procedure in hand therapy clinics. To test the reliability and validity of a dynamic ROM assessment, the Camera Wrist Tracker (CWT). Wrist flexion and extension ROM of 15 patients with distal radius fractures and 15 matched controls were assessed with the CWT and with a universal goniometer. One-way model intraclass correlation coefficient analysis indicated high test-retest reliability for extension (ICC = 0.92) and moderate reliability for flexion (ICC = 0.49). Standard error for extension was 2.45° and for flexion was 4.07°. Repeated-measures analysis revealed a significant main effect for group; ROM was greater in the control group (F[1, 28] = 47.35; P < .001). The concurrent validity of the CWT was partially supported. The results indicate that the CWT may provide highly reliable scores for dynamic wrist extension ROM, and moderately reliable scores for flexion, in people recovering from a distal radius fracture. N/A. Copyright © 2016 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.
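For readers unfamiliar with the statistic, a one-way model ICC can be computed from a simple ANOVA decomposition; the sketch below uses made-up test-retest values, not the study's measurements:

```python
# Minimal sketch of a one-way model ICC (test-retest) from an ANOVA decomposition.
# The wrist-extension ROM values below are made up, not the study's data.
import numpy as np

ratings = np.array([      # rows = subjects, columns = sessions 1 and 2 (degrees)
    [52.0, 54.0], [61.0, 60.0], [48.0, 50.0], [70.0, 68.0], [55.0, 57.0],
])
n, k = ratings.shape
grand = ratings.mean()
ms_between = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
ms_within = ((ratings - ratings.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(round(icc_1_1, 2))
```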
Improving the Plasticity of LIMS Implementation: LIMS Extension through Microsoft Excel
NASA Technical Reports Server (NTRS)
Culver, Mark
2017-01-01
A Laboratory Information Management System (LIMS) is database software with many built-in tools ideal for handling and documenting most laboratory processes in an accurate and consistent manner, making it an indispensable tool for the modern laboratory. However, many LIMS end users will find that, in the performance of analyses that have unique considerations such as standard curves, multiple-stage incubations, or logical considerations, a base LIMS distribution may not ideally suit their needs. These considerations bring about the need for extension languages, which can extend the functionality of a LIMS. While these languages do provide the implementation team with the functionality required to accommodate these special laboratory analyses, they are usually too complex for the end user to modify to compensate for natural changes in laboratory operations. The LIMS utilized by our laboratory offers a unique and easy-to-use choice for an extension language, one that is already heavily relied upon not only in science but also in most academic and business pursuits: Microsoft Excel. The validity of Microsoft Excel as a pseudo programming language and its usability and versatility as a LIMS extension language will be discussed. The NELAC implications and overall drawbacks of this LIMS configuration will also be discussed.
Cutting the wires: modularization of cellular networks for experimental design.
Lang, Moritz; Summers, Sean; Stelling, Jörg
2014-01-07
Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Does the U.S. exercise contagion on Italy? A theoretical model and empirical evidence
NASA Astrophysics Data System (ADS)
Cerqueti, Roy; Fenga, Livio; Ventura, Marco
2018-06-01
This paper deals with the theme of contagion in financial markets. To this end, we develop a model based on Mixed Poisson Processes to describe the abnormal returns of the financial markets of the two countries considered. In so doing, the article defines the theoretical conditions to be satisfied in order to state that one of them - the so-called leader - exercises contagion on the others - the followers. Specifically, we employ an invariant probabilistic result stating that a suitable transformation of a Mixed Poisson Process is still a Mixed Poisson Process. The theoretical claim is validated by implementing an extensive simulation analysis grounded in empirical data. The countries considered are the U.S. (as the leader) and Italy (as the follower), and the period under scrutiny is long, ranging from 1970 to 2014.
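One standard formulation of this kind of invariance, with independent thinning as an example of a "suitable transformation", is sketched below in generic notation; this is not necessarily the authors' exact construction.

```latex
% A mixed Poisson process N(t) has a random rate \Lambda with mixing distribution F:
P\{N(t) = k\} = \int_0^{\infty} e^{-\lambda t} \frac{(\lambda t)^k}{k!}\, dF(\lambda),
\qquad k = 0, 1, 2, \ldots
% Independent thinning with retention probability p yields another mixed Poisson
% process M(t), now with mixing variable p\Lambda:
P\{M(t) = k\} = \int_0^{\infty} e^{-p\lambda t} \frac{(p\lambda t)^k}{k!}\, dF(\lambda).
```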
Barratt, M D; Langowski, J J
1999-01-01
The DEREK knowledge-based computer system contains a subset of approximately 50 rules describing chemical substructures (toxophores) responsible for skin sensitization. This rulebase, based originally on Unilever historical in-house guinea pig maximization test data, has been subject to extensive validation and is undergoing refinement as the next stage of its development. As part of an ongoing program of validation and testing, the predictive ability of the sensitization rule set has been assessed by processing the structures of the 84 chemical substances in the list of contact allergens issued by the BgVV (German Federal Institute for Health Protection of Consumers). This list of chemicals is important because the biological data for each of the chemicals have been carefully scrutinized and peer reviewed, a key consideration in an area of toxicology in which much unreliable and potentially misleading data have been published. The existing DEREK rulebase for skin sensitization identified toxophores for skin sensitization in the structures of 71 out of the 84 chemicals (85%). The exercise highlighted areas of chemistry where further development of the rulebase was required, either by extension of the scope of existing rules or by generation of new rules where a sound mechanistic rationale for the biological activity could be established. Chemicals likely to be acting as photoallergens were identified, and new rules for photoallergenicity have subsequently been written. At the end of the exercise, the refined rulebase was able to identify toxophores for skin sensitization for 82 of the 84 chemicals in the BgVV list.
Fluorescence In Situ Hybridization Probe Validation for Clinical Use.
Gu, Jun; Smith, Janice L; Dowling, Patricia K
2017-01-01
In this chapter, we provide a systematic overview of the published guidelines and validation procedures for fluorescence in situ hybridization (FISH) probes for clinical diagnostic use. FISH probes-which are classified as molecular probes or analyte-specific reagents (ASRs)-have been extensively used in vitro for both clinical diagnosis and research. Most commercially available FISH probes in the United States are strictly regulated by the U.S. Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), the Centers for Medicare & Medicaid Services (CMS), the Clinical Laboratory Improvement Amendments (CLIA), and the College of American Pathologists (CAP). Although home-brewed FISH probes-defined as probes made in-house or acquired from a source that does not supply them to other laboratories-are not regulated by these agencies, they too must undergo the same individual validation process prior to clinical use as their commercial counterparts. Validation of a FISH probe involves initial validation and ongoing verification of the test system. Initial validation includes assessment of a probe's technical specifications, establishment of its standard operational procedure (SOP), determination of its clinical sensitivity and specificity, development of its cutoff, baseline, and normal reference ranges, gathering of analytics, confirmation of its applicability to a specific research or clinical setting, testing of samples with or without the abnormalities that the probe is meant to detect, staff training, and report building. Ongoing verification of the test system involves testing additional normal and abnormal samples using the same method employed during the initial validation of the probe.
On the Validity of Student Evaluation of Teaching: The State of the Art
ERIC Educational Resources Information Center
Spooren, Pieter; Brockx, Bert; Mortelmans, Dimitri
2013-01-01
This article provides an extensive overview of the recent literature on student evaluation of teaching (SET) in higher education. The review is based on the SET meta-validation model, drawing upon research reports published in peer-reviewed journals since 2000. Through the lens of validity, we consider both the more traditional research themes in…
Validation of the FEA of a deep drawing process with additional force transmission
NASA Astrophysics Data System (ADS)
Behrens, B.-A.; Bouguecha, A.; Bonk, C.; Grbic, N.; Vucetic, M.
2017-10-01
In order to meet requirements from the automotive industry, such as decreasing CO2 emissions, which translates into reducing vehicle mass in the car body, the chassis and the powertrain, continuous innovation and further development of existing production processes are required. In sheet metal forming processes, the process limits and component characteristics are defined through the process-specific loads. When the load limits are exceeded, failure occurs in the material; this can be avoided by additional force transmission activated in the deep drawing process before the process limit is reached. This contribution deals with experimental investigations of a forming process with additional force transmission regarding the extension of the process limits. Based on FEA, a tool system is designed and developed by IFUM. For this purpose, the steel material HCT600 is analyzed numerically. Within the experimental investigations, the deep drawing processes with and without the additional force transmission are carried out, and the produced rectangular cups are compared. Subsequently, the identical deep drawing processes are investigated numerically. Thereby, the values of the punch reaction force and displacement are estimated and compared with experimental results. Thus, the validation of the material model is successfully carried out on the process scale. For further quantitative verification of the FEA results, the experimentally determined geometry of the rectangular cup is measured optically with the ATOS system of the company GOM mbH and digitally compared using the external software Geomagic®QualifyTM. The goal of this paper is the verification of the transferability of the FEA model for a conventional deep drawing process to a deep drawing process with additional force transmission with a counter punch.
Application of ideal pressure distribution in development process of automobile seats.
Kilincsoy, U; Wagner, A; Vink, P; Bubb, H
2016-07-19
In designing a car seat, the ideal pressure distribution is important because the seat is the largest contact surface between the human and the car. Because of obstacles hindering a more general application of the ideal pressure distribution in seating design, multidimensional measuring techniques are necessary together with extensive user tests. The objective of this study is to apply and integrate the knowledge about the ideal pressure distribution in the seat design process for a car manufacturer in an efficient way. Ideal pressure distribution was combined with pressure measurement, in this case pressure mats. In order to integrate this theoretical knowledge of seating comfort in the seat development process for a car manufacturer, a special user interface was defined and developed. The mapping of the measured pressure distribution in real time, accurately scaled to actual seats during test setups, led directly to design implications for seat design even during the test situation. Detailed analysis of the subject's feedback was correlated with objective measurements of the subject's pressure distribution in real time. Therefore existing seating characteristics were taken into account as well. A user interface can incorporate theoretical and validated 'state of the art' models of comfort. Consequently, this information can reduce extensive testing and lead to more detailed results in a shorter time period.
Materials Safety - Not just Flammability and Toxic Offgassing
NASA Technical Reports Server (NTRS)
Pedley, Michael D.
2007-01-01
For many years, the safety community has focused on a limited subset of materials and processes requirements as key to safety: Materials flammability, Toxic offgassing, Propellant compatibility, Oxygen compatibility, and Stress-corrosion cracking. All these items are important, but the exclusive focus on these items neglects many other items that are equally important to materials safety. Examples include (but are not limited to): 1. Materials process control -- proper qualification and execution of manufacturing processes such as structural adhesive bonding, welding, and forging are crucial to materials safety. Limiting discussions of materials process control to an arbitrary subset of processes, known as "critical processes," is a mistake, because any process where the quality of the product cannot be verified by inspection can potentially result in unsafe hardware. 2. Materials structural design allowables -- development of valid design allowables when none exist in the literature requires extensive testing of multiple lots of materials and is extremely expensive. But, without valid allowables, structural analysis cannot verify structural safety. 3. Corrosion control -- All forms of corrosion, not just stress corrosion, can affect the structural integrity of hardware. 4. Contamination control during ground processing -- contamination control is critical to manufacturing processes such as adhesive bonding and also to eliminating foreign objects and debris (FOD) that are hazardous to the crew of manned spacecraft in microgravity environments. 5. Fasteners -- Fastener design, the use of verifiable secondary locking features, and proper verification of fastener torque are essential for proper structural performance. This presentation discusses some of these key factors and the importance of considering them in ensuring the safety of space hardware.
López-Miñarro, Pedro Ángel; Vaquero-Cristóbal, Raquel; Muyor, José María; Espejo-Antúnez, Luis
2015-07-01
Lumbo-sacral posture and the sit-and-reach score have been proposed as measures of hamstring extensibility. However, their validity is influenced by sample characteristics. The aim was to determine the validity of the lumbo-horizontal angle and the score in the sit-and-reach test as measures of hamstring extensibility in older women. A hundred and twenty older women performed the straight leg raise test with both legs, and the sit-and-reach test (SR), in a random order. For the sit-and-reach test, the score and the lumbo-sacral posture in bending (lumbo-horizontal angle, L-Hfx) were measured. The mean values of the straight leg raise in the left and right legs were 81.70 ± 13.83º and 82.10 ± 14.36º, respectively. The mean straight leg raise value of both legs was 81.90 ± 12.70º. The mean values of the SR score and L-Hfx were -1.54 ± 8.09 cm and 91.08º ± 9.32º, respectively. The correlations of the mean straight leg raise with lumbo-sacral posture and with the SR score were moderate (L-Hfx: r = -0.72, p < 0.01; SR: r = 0.70, p < 0.01). Both variables independently explained about 50% of the variance (L-Hfx: R2 = 0.52, p < 0.001; SR: R2 = 0.49, p < 0.001). The validity of lumbo-sacral posture in bending as a measure of hamstring muscle extensibility in older women is moderate, with values similar to the SR score. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
NASA Astrophysics Data System (ADS)
Marion, Giles M.; Kargel, Jeffrey S.
Implementation of the Pitzer approach is through the FREZCHEM (FREEZING CHEMISTRY) model, which is at the core of this work. This model was originally designed to simulate salt chemistries and freezing processes at low temperatures (-54 to 25°C) and 1 atm pressure. Over the years, this model has been broadened to include more chemistries (from 16 to 58 solid phases), a broader temperature range for some chemistries (to 113°C), and incorporation of a pressure dependence (1 to 1000 bars) into the model. Implementation, parameterization, validation, and limitations of the FREZCHEM model are extensively discussed in Chapter 3.
A non-orthogonal material model of woven composites in the preforming process
Zhang, Weizhao; Ren, Huaqing; Liang, Biao; ...
2017-05-04
Woven composites are considered as a promising material choice for lightweight applications. An improved non-orthogonal material model that can decouple the strong tension and weak shear behaviour of the woven composite under large shear deformation is proposed for simulating the preforming of woven composites. The tension, shear and compression moduli in the model are calibrated using the tension, bias-extension and bending experiments, respectively. The interaction between the composite layers is characterized by a sliding test. The newly developed material model is implemented in the commercial finite element software LS-DYNA® and validated by a double dome study.
Oral Biofluid Biomarker Research: Current Status and Emerging Frontiers
Wang, Austin; Wang, Chris P.; Tu, Michael; Wong, David T.W.
2016-01-01
Salivary diagnostics is a rapidly advancing field that offers clinicians and patients the potential of rapid, noninvasive diagnostics with excellent accuracy. In order for the complete realization of the potential of saliva, however, extensive profiling of constituents must be conducted and diagnostic biomarkers must be thoroughly validated. This article briefly overviews the process of conducting a study of salivary biomarkers in a patient cohort and highlights the studies that have been conducted on different classes of molecules in the saliva. Emerging frontiers in salivary diagnostics research that may significantly advance the field will also be highlighted. PMID:27999326
Hospital Based Customization of a Medical Information System
Rath, Marilyn A.; Ferguson, Julie C.
1983-01-01
A Medical Information System must be current if it is to be a viable adjunct to patient care within a hospital setting. Hospital-based customization provides a means of achieving this timeliness with maximum user satisfaction. It, however, requires a major commitment in personnel time as well as additional software and training expenses. The enhanced control of system modifications and overall flexibility in planning the change process result in enthusiastic support of this approach by many hospitals. The key factors for success include careful selection of local personnel with adequate vendor support, extensive QA control, thorough auditing/validation and direct user involvement.
Multiscale GPS tomography during COPS: validation and applications
NASA Astrophysics Data System (ADS)
Champollion, Cédric; Flamant, Cyrille; Masson, Frédéric; Gégout, Pascal; Boniface, Karen; Richard, Evelyne
2010-05-01
Accurate 3D description of the water vapour field is of interest for process studies such as convection initiation. None of the current techniques (LIDAR, satellite, radio soundings, GPS) can provide an all-weather, continuous 3D field of moisture. The combination of GPS tomography with radio-soundings (and/or LIDAR) has been used for such process studies, exploiting both the advantages of vertically resolved soundings and the high temporal density of GPS measurements. GPS tomography has been used at short scale (10 km horizontal resolution but in a 50 km² area) for process studies such as the ESCOMPTE experiment (Bastin et al., 2005) and at larger scale (50 km horizontal resolution) during IHOP_2002. But no extensive statistical validation has been done so far. The overarching goal of the COPS field experiment is to advance the quality of forecasts of orographically induced convective precipitation by four-dimensional observations and modeling of its life cycle for identifying the physical and chemical processes responsible for deficiencies in QPF over low-mountain regions. During the COPS field experiment, a GPS network of about 100 GPS stations operated continuously for three months in an area of 500 km² in the East of France (Vosges Mountains) and West of Germany (Black Forest). While the mean spacing between the GPS stations is about 50 km, an East-West GPS profile with a spacing of about 10 km is dedicated to high-resolution tomography. One major goal of the GPS COPS experiment is to validate the GPS tomography with different spatial resolutions. Validation is based on additional radio-soundings and airborne and ground-based LIDAR measurements. The number and the high quality of vertically resolved water vapor observations provide a unique data set for GPS tomography validation. Numerous tests have been done on real data to show the types of water vapor structures that can be imaged by GPS tomography, depending on the assimilation of additional data (radio soundings), the resolution of the tomography grid, and the density of the GPS network. Finally, some applications to different case studies will be briefly presented.
Shackleton, David; Pagram, Jenny; Ives, Lesley; Vanhinsbergh, Des
2018-06-02
The RapidHIT™ 200 System is a fully automated sample-to-DNA-profile system designed to produce high quality DNA profiles within 2 h. The use of the RapidHIT™ 200 System within the United Kingdom Criminal Justice System (UKCJS) has required extensive development and validation of methods, with a focus on the AmpFℓSTR® NGMSElect™ Express PCR kit, to comply with specific regulations for loading to the UK National DNA Database (NDNAD). These studies have been carried out using single source reference samples to simulate live reference samples taken from arrestees and victims for elimination. The studies have shown that the system is capable of generating high quality profiles and has achieved the accreditations necessary to load to the NDNAD; a first for the UK. Copyright © 2018 Elsevier B.V. All rights reserved.
Goyens, C; Jamet, C; Ruddick, K G
2013-09-09
The present study provides an extensive overview of red and near infra-red (NIR) spectral relationships found in the literature and used to constrain red or NIR-modeling schemes in current atmospheric correction (AC) algorithms with the aim to improve water-leaving reflectance retrievals, ρw(λ), in turbid waters. However, most of these spectral relationships have been developed with restricted datasets and, subsequently, may not be globally valid, explaining the need of an accurate validation exercise. Spectral relationships are validated here with turbid in situ data for ρw(λ). Functions estimating ρw(λ) in the red were only valid for moderately turbid waters (ρw(λNIR) < 3.10(-3)). In contrast, bounding equations used to limit ρw(667) retrievals according to the water signal at 555 nm, appeared to be valid for all turbidity ranges presented in the in situ dataset. In the NIR region of the spectrum, the constant NIR reflectance ratio suggested by Ruddick et al. (2006) (Limnol. Oceanogr. 51, 1167-1179), was valid for moderately to very turbid waters (ρw(λNIR) < 10(-2)) while the polynomial function, initially developed by Wang et al. (2012) (Opt. Express 20, 741-753) with remote sensing reflectances over the Western Pacific, was also valid for extremely turbid waters (ρw(λNIR) > 10(-2)). The results of this study suggest to use the red bounding equations and the polynomial NIR function to constrain red or NIR-modeling schemes in AC processes with the aim to improve ρw(λ) retrievals where current AC algorithms fail.
Mootanah, R.; Imhauser, C.W.; Reisse, F.; Carpanen, D.; Walker, R.W.; Koff, M.F.; Lenhoff, M.W.; Rozbruch, S.R.; Fragomen, A.T.; Dewan, Z.; Kirane, Y.M.; Cheah, Pamela A.; Dowell, J.K.; Hillstrom, H.J.
2014-01-01
A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated by measured intra-articular force and pressure measurements. Percent full scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning. PMID:24786914
Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J
2014-01-01
A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated by measured intra-articular force and pressure measurements. Percent full scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning.
Validating spatial structure in canopy water content using geostatistics
NASA Technical Reports Server (NTRS)
Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.
1995-01-01
Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m² or less (e.g., field spectra, foliage and soil samples, etc.). As image analyses become more sophisticated, such as those for detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics have been extensively used in geological or hydrogeological studies but have received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients like soil moisture, nutrient availability, or topography.
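A minimal sketch of the empirical semivariogram that underlies kriging is shown below, using synthetic point measurements rather than the AVIRIS field data:

```python
# Minimal sketch: empirical semivariogram from synthetic ground measurements.
# Coordinates and canopy water content values are synthetic, not the field data.
import numpy as np

rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, size=(50, 2))          # sample locations (m)
z = 0.02 * xy[:, 0] + rng.normal(0, 0.5, 50)    # canopy water content with a spatial trend

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances
sq = 0.5 * (z[:, None] - z[None, :]) ** 2                      # semivariance per pair

bins = np.arange(0, 100, 10)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d > lo) & (d <= hi)
    if mask.any():
        print(f"lag {lo:3.0f}-{hi:3.0f} m: gamma = {sq[mask].mean():.3f}")
```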
Integrating technologies for oil spill response in the SW Iberian coast
NASA Astrophysics Data System (ADS)
Janeiro, J.; Neves, A.; Martins, F.; Relvas, P.
2017-09-01
An operational oil spill modelling system developed for the SW Iberia Coast is used to investigate the relative importance of the different components and technologies integrating an oil spill monitoring and response structure. A backtrack of a CleanSeaNet oil detection in the region is used to demonstrate the concept. Taking advantage of the regional operational products available, the system provides the necessary resolution to go from regional to coastal scales using a downscaling approach, while a multi-grid methodology allows the oil spill model to span across model domains, taking full advantage of the increasing resolution between the model grids. An extensive validation procedure using a multiplicity of sensors, with good spatial and temporal coverage, strengthens the operational system's ability to accurately solve coastal scale processes. The model is validated using available trajectories from satellite-tracked drifters. Finally, a methodology for identifying potential origins of the CleanSeaNet oil detection is proposed, combining model backtrack results with ship trajectories supplied by AIS and taking into account the error estimations found in the backtrack validation.
NASA Astrophysics Data System (ADS)
Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang
2018-02-01
Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10⁵-10⁶/s) on the surface of target materials. Although LSP processes have been extensively studied experimentally, little effort has been devoted to elucidating the underlying process mechanisms through developing a physics-based process model. In particular, development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.
2016-02-08
DDML Schema Validation, RCC 126-16, February 2016. Acronyms: DDML, Data Display Markup Language; HUD, heads-up display; IRIG, Inter-Range Instrumentation Group; RCC, Range Commanders Council; SVG, Scalable Vector Graphics; T&E, test and evaluation; TMATS, Telemetry Attributes Transfer Standard; XML, eXtensible Markup Language.
The Resilience Scale for Adults: Construct Validity and Measurement in a Belgian Sample
ERIC Educational Resources Information Center
Hjemdal, Odin; Friborg, Oddgeir; Braun, Stephanie; Kempenaers, Chantal; Linkowski, Paul; Fossion, Pierre
2011-01-01
The Resilience Scale for Adults (RSA) was developed and has been extensively validated in Norwegian samples. The purpose of this study was to explore the construct validity of the Resilience Scale for Adults in a French-speaking Belgian sample and test measurement invariance between the Belgian and a Norwegian sample. A Belgian student sample (N =…
Extension, validation and application of the NASCAP code
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.
1979-01-01
Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculation were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.
ERIC Educational Resources Information Center
Seboka, B.; Deressa, A.
2000-01-01
Indigenous social networks of Ethiopian farmers participate in seed exchange based on mutual interdependence and trust. A government-imposed extension program must validate the role of local seed systems in developing a national seed industry. (SK)
Syntactic and semantic restrictions on morphological recomposition: MEG evidence from Greek.
Neophytou, K; Manouilidou, C; Stockall, L; Marantz, A
2018-05-16
Complex morphological processing has been extensively studied in the past decades. However, most of this work has either focused on only certain steps involved in this process, or it has been conducted on a few languages, like English. The purpose of the present study is to investigate the spatiotemporal cortical processing profile of the distinct steps previously reported in the literature, from decomposition to re-composition of morphologically complex items, in a relatively understudied language, Greek. Using magnetoencephalography, we confirm the role of the fusiform gyrus in early, form-based morphological decomposition, we relate the syntactic licensing of stem-suffix combinations to the ventral visual processing stream, somewhat independent from lexical access for the stem, and we further elucidate the role of orbitofrontal regions in semantic composition. Thus, the current study offers the most comprehensive test to date of visual morphological processing and additional, crosslinguistic validation of the steps involved in it. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Prunescu, Remus Mihail; Sin, Gürkan
2013-12-01
The enzymatic hydrolysis process is one of the key steps in second generation biofuel production. After being thermally pretreated, the lignocellulosic material is liquefied by enzymes prior to fermentation. The scope of this paper is to evaluate a dynamic model of the hydrolysis process on a demonstration scale reactor. The following novel features are included: the application of the Convection-Diffusion-Reaction equation to a hydrolysis reactor to assess transport and mixing effects; the extension of a competitive kinetic model with enzymatic pH dependency and hemicellulose hydrolysis; a comprehensive pH model; and viscosity estimations during the course of reaction. The model is evaluated against real data extracted from a demonstration scale biorefinery throughout several days of operation. All measurements are within predictions uncertainty and, therefore, the model constitutes a valuable tool to support process optimization, performance monitoring, diagnosis and process control at full-scale studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
Vranckx, Pascal; McFadden, Eugene; Cutlip, Donald E; Mehran, Roxana; Swart, Michael; Kint, P P; Zijlstra, Felix; Silber, Sigmund; Windecker, Stephan; Serruys, Patrick W C J
2013-01-01
Globalisation in coronary stent research calls for harmonization of clinical endpoint definitions and event adjudication. Little has been published about the various processes used for event adjudication or their impact on outcome reporting. We performed a validation of the clinical event committee (CEC) adjudication process on 100 suspected events in the RESOLUTE All-comers trial (Resolute-AC). Two experienced Clinical Research Organisations (CRO) that had already extensive internal validation processes in place, participated in the study. After initial adjudication by the primary-CEC, events were cross-adjudicated by an external-CEC using the same definitions. Major discrepancies affecting the primary end point of target-lesion failure (TLF), a composite of cardiac death, target vessel myocardial infarction (TV-MI), or clinically-indicated target-lesion revascularization (CI-TLR), were analysed by an independent oversight committee who provided recommendations for harmonization. Discordant adjudications were reconsidered by the primary CEC. Subsequently, the RAC database was interrogated for cases that based on these recommendations merited re-adjudication and these cases were also re-adjudicated by the primary CEC. Final discrepancies in adjudication of individual components of TLF occurred in 7 out of 100 events in 5 patients. Discrepancies for the (hierarchical) primary endpoint occurred in 5 events (2 cardiac deaths and 3 TV-MI). After application of harmonization recommendations to the overall RAC population (n=2292), the primary CEC adjudicated 3 additional clinical-TLRs and considered 1 TV-MI as no event. A harmonization process provided a high level of concordance for event adjudication and improved accuracy for final event reporting. These findings suggest it is feasible to pool clinical event outcome data across clinical trials even when different CECs are responsible for event adjudication. Copyright © 2012 Elsevier Inc. All rights reserved.
Motor assessment using the NIH Toolbox
Magasi, Susan; McCreath, Heather E.; Bohannon, Richard W.; Wang, Ying-Chih; Bubela, Deborah J.; Rymer, William Z.; Beaumont, Jennifer; Rine, Rose Marie; Lai, Jin-Shei; Gershon, Richard C.
2013-01-01
Motor function involves complex physiologic processes and requires the integration of multiple systems, including neuromuscular, musculoskeletal, and cardiopulmonary, and neural motor and sensory-perceptual systems. Motor-functional status is indicative of current physical health status, burden of disease, and long-term health outcomes, and is integrally related to daily functioning and quality of life. Given its importance to overall neurologic health and function, motor function was identified as a key domain for inclusion in the NIH Toolbox for Assessment of Neurological and Behavioral Function (NIH Toolbox). We engaged in a 3-stage developmental process to: 1) identify key subdomains and candidate measures for inclusion in the NIH Toolbox, 2) pretest candidate measures for feasibility across the age span of people aged 3 to 85 years, and 3) validate candidate measures against criterion measures in a sample of healthy individuals aged 3 to 85 years (n = 340). Based on extensive literature review and input from content experts, the 5 subdomains of dexterity, strength, balance, locomotion, and endurance were recommended for inclusion in the NIH Toolbox motor battery. Based on our validation testing, valid and reliable measures that are simultaneously low-cost and portable have been recommended to assess each subdomain, including the 9-hole peg board for dexterity, grip dynamometry for upper-extremity strength, standing balance test, 4-m walk test for gait speed, and a 2-minute walk test for endurance. PMID:23479547
Transport phenomena in solidification processing of functionally graded materials
NASA Astrophysics Data System (ADS)
Gao, Juwen
A combined numerical and experimental study of the transport phenomena during solidification processing of metal matrix composite functionally graded materials (FGMs) is conducted in this work. A multiphase transport model for the solidification of metal-matrix composite FGMs has been developed that accounts for macroscopic particle segregation due to liquid-particle flow and particle-solid interactions. An experimental study has also been conducted to gain physical insight as well as to validate the model. A novel method to in-situ measure the particle volume fraction using fiber optic probes is developed for transparent analogue solidification systems. The model is first applied to one-dimensional pure matrix FGM solidification under gravity or centrifugal field and is extensively validated against the experimental results. The mechanisms for the formation of particle concentration gradient are identified. Two-dimensional solidification of pure matrix FGM with convection is then studied using the model as well as experiments. The interaction among convection flow, solidification process and the particle transport is demonstrated. The results show the importance of convection in the particle concentration gradient formation. Then, simulations for alloy FGM solidification are carried out for unidirectional solidification as well as two-dimensional solidification with convection. The interplay among heat and species transport, convection and particle motion is investigated. Finally, future theoretical and experimental work is outlined.
NASA Technical Reports Server (NTRS)
Starr, David
1999-01-01
The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community. Detailed information about the EOS Terra Validation Program can be found on the EOS Validation program homepage (i.e., http://ospso.gsfc.nasa.gov/validation/valpage.html).
NASA Technical Reports Server (NTRS)
Nieten, Joseph L.; Burke, Roger
1992-01-01
The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system using some number of different data sets, such as a simulator. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.
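Purely as an illustration of combining crisp rules with fuzzy sets (not the SDB implementation), a triangular membership function can soften a classification threshold, as in this sketch with invented parameter names and limits:

```python
# Minimal sketch of the rule-plus-fuzzy-set idea (illustrative only; not the SDB
# implementation). A triangular membership function softens a crisp threshold.
def triangular(x, a, b, c):
    """Membership in a triangular fuzzy set with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(altitude_error):
    """Hypothetical trained rule: a small altitude error is 'nominal'; the fuzzy set handles the boundary."""
    mu_nominal = triangular(altitude_error, -50.0, 0.0, 50.0)
    return ("nominal" if mu_nominal >= 0.5 else "anomalous"), mu_nominal

print(classify(12.0))   # clearly nominal
print(classify(35.0))   # boundary case resolved by the fuzzy membership value
```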
Containerless Processing on ISS: Ground Support Program for EML
NASA Technical Reports Server (NTRS)
Diefenbach, Angelika; Schneider, Stephan; Willnecker, Rainer
2012-01-01
EML is an electromagnetic levitation facility planned for the ISS, aimed at processing and investigating liquid metals or semiconductors using the electromagnetic levitation technique under microgravity, with reduced electromagnetic fields and convection conditions. Its diagnostics and processing methods allow the measurement of thermophysical properties in the liquid state over an extended temperature range and the investigation of solidification phenomena in undercooled melts. The EML project is a joint effort of the European Space Agency (ESA) and the German Space Agency DLR. The Microgravity User Support Centre MUSC at Cologne, Germany, has been assigned the responsibility for EML operations. For EML experiment preparation, an extensive scientific ground support program has been established at MUSC, providing scientific and technical services in the preparation, performance and evaluation of the experiments. Its final output is the transcription of the scientific goals and requirements into validated facility control parameters for the experiment execution onboard the ISS.
The value and validation of broad spectrum biosensors for diagnosis and biodefense
Metzgar, David; Sampath, Rangarajan; Rounds, Megan A; Ecker, David J
2013-01-01
Broad spectrum biosensors capable of identifying diverse organisms are transitioning from the realm of research into the clinic. These technologies simultaneously capture signals from a wide variety of biological entities using universal processes. Specific organisms are then identified through bioinformatic signature-matching processes. This is in contrast to currently accepted molecular diagnostic technologies, which utilize unique reagents and processes to detect each organism of interest. This paradigm shift greatly increases the breadth of molecular diagnostic tools with little increase in biochemical complexity, enabling simultaneous diagnostic, epidemiologic, and biothreat surveillance capabilities at the point of care. This, in turn, offers the promise of increased biosecurity and better antimicrobial stewardship. Efficient realization of these potential gains will require novel regulatory paradigms reflective of the generalized, information-based nature of these assays, allowing extension of empirical data obtained from readily available organisms to support broader reporting of rare, difficult to culture, or extremely hazardous organisms. PMID:24128433
Hounsome, J; Whittington, R; Brown, A; Greenhill, B; McGuire, J
2018-01-01
While structured professional judgement approaches to assessing and managing the risk of violence have been extensively examined in mental health/forensic settings, the application of the findings to people with an intellectual disability is less extensively researched and reviewed. This review aimed to assess whether risk assessment tools have adequate predictive validity for violence in adults with an intellectual disability. Standard systematic review methodology was used to identify and synthesize appropriate studies. A total of 14 studies were identified as meeting the inclusion criteria. These studies assessed the predictive validity of 18 different risk assessment tools, mainly in forensic settings. All studies concluded that the tools assessed were successful in predicting violence. Studies were generally of a high quality. There is good-quality evidence that risk assessment tools are valid for people with an intellectual disability who offend, but further research is required to validate individual tools for use with this population. © 2016 John Wiley & Sons Ltd.
Morf, Carolyn C; Schürch, Eva; Küfner, Albrecht; Siegrist, Philip; Vater, Aline; Back, Mitja; Mestel, Robert; Schröder-Abé, Michela
2017-06-01
The Pathological Narcissism Inventory (PNI) is a multidimensional measure for assessing grandiose and vulnerable features in narcissistic pathology. The aim of the present research was to construct and validate a German translation of the PNI and to provide further information on the PNI's nomological net. Findings from a first study confirm the psychometric soundness of the PNI and replicate its seven-factor first-order structure. A second-order structure was also supported, but with several equivalent models. A second study investigating associations with a broad range of measures (DSM Axis I and II constructs, emotions, personality traits, interpersonal and dysfunctional behaviors, and well-being) supported the concurrent validity of the PNI. Discriminant validity with the Narcissistic Personality Inventory was also shown. Finally, a third study extending the work to a clinical inpatient sample provided further evidence that the PNI is a useful tool for assessing the more pathological end of narcissism.
Ensuring Data Quality in Extension Research and Evaluation Studies
ERIC Educational Resources Information Center
Radhakrishna, Rama; Tobin, Daniel; Brennan, Mark; Thomson, Joan
2012-01-01
This article presents a checklist as a guide for Extension professionals to use in research and evaluation studies they carry out. A total of 40 statements grouped under eight data quality components--relevance, objectivity, validity, reliability, integrity, generalizability, completeness, and utility--are identified to ensure that research…
ERIC Educational Resources Information Center
Bolton, Elizabeth B.; White, Lynn
Nineteen papers are included in this document: "Potential and Impact: Assessment and Validation in Leadership Development" (Boatman); "Using an Organizational Diagnostic Instrument to Analyze Perceptions of the Virginia Extension Homemakers Council" (Newhouse, Chandler, Tuckwiller); "Image: Who Needs It?" (Hendricks,…
Validation of the ULCEAT methodology by applying it in retrospect to the Roboticbed.
Nakamura, Mio; Suzurikawa, Jun; Tsukada, Shohei; Kume, Yohei; Kawakami, Hideo; Inoue, Kaoru; Inoue, Takenobu
2015-01-01
In answer to the increasing demand for care among the oldest segment of the Japanese population, an extensive programme of life support robots is under development, advocated by the Japanese government. Roboticbed® (RB) was developed to help patients make independent transfers from and to the bed in their daily lives. The bed is intended both for elderly persons and for persons with a disability. The purpose of this study is to examine the validity of the user and user's life centred clinical evaluation of assistive technology (ULCEAT) methodology. The ULCEAT method was developed to support user-centred development of life support robots. By means of the ULCEAT method, the target users and the use environment were re-established in an earlier study. The validity of the method is tested by re-evaluating the development of RB in retrospect. Six participants used the first prototype of RB (RB1) and eight participants used the second prototype of RB (RB2). The results indicated that the functionality was improved owing to the end-user evaluations. We therefore confirmed the content validity of the proposed ULCEAT method. In this study we confirmed the validity of the ULCEAT methodology by applying it in retrospect to the RB development process. The method will be used for the development of life-support robots and prototype assistive technologies.
Zullig, Keith J; Collins, Rani; Ghani, Nadia; Patton, Jon M; Scott Huebner, E; Ajamie, Jean
2014-02-01
The School Climate Measure (SCM) was developed and validated in 2010 in response to a dearth of psychometrically sound school climate instruments. This study sought to further validate the SCM on a large, diverse sample of Arizona public school adolescents (N = 20,953). Four SCM domains (positive student-teacher relationships, academic support, order and discipline, and physical environment) were available for the analysis. Confirmatory factor analysis and structural equation modeling were used to establish construct validity, and criterion-related validity was assessed via selected Youth Risk Behavior Survey (YRBS) school safety items and self-reported grade point average (GPA). Analyses confirmed that the 4 SCM school climate domains explained approximately 63% of the variance (factor loading range .45-.92). Structural equation models fit the data well (χ² = 14,325, df = 293, p < .001; comparative fit index (CFI) = .951; Tucker-Lewis index (TLI) = .952; root mean square error of approximation (RMSEA) = .05). The goodness-of-fit index was .940. Coefficient alphas ranged from .82 to .93. Analyses of variance with post hoc comparisons suggested the SCM domains related in hypothesized directions with the school safety items and GPA. Additional evidence supports the validity and reliability of the SCM. Measures such as the SCM can facilitate data-driven decisions and may be incorporated into evidence-based processes designed to improve student outcomes. © 2014, American School Health Association.
Multi-agent cooperation pursuit based on an extension of AALAADIN organisational model
NASA Astrophysics Data System (ADS)
Souidi, Mohammed El Habib; Songhao, Piao; Guo, Li; Lin, Chang
2016-11-01
An approach to cooperative pursuit of multiple mobile targets based on a multi-agent system is discussed. In this kind of problem, the pursuit process is divided into two tasks. The first (the coalition problem) addresses the formation of the pursuit team. To achieve this, we use an innovative method based on dynamic organisation and reorganisation of the pursuers' groups. We introduce our coalition strategy, extended from the organisational agent, group, role model, by assigning to the groups an access mechanism inspired by fuzzy logic principles. The second task (the motion problem) is the treatment of the pursuers' motion strategy. To manage this problem we apply the principles of the Markov decision process. Simulation results show the feasibility and validity of the given proposal.
Enhanced diffusion on oscillating surfaces through synchronization
NASA Astrophysics Data System (ADS)
Wang, Jin; Cao, Wei; Ma, Ming; Zheng, Quanshui
2018-02-01
The diffusion of molecules and clusters under nanoscale confinement or absorbed on surfaces is the key controlling factor in dynamical processes such as transport, chemical reaction, or filtration. Enhancing diffusion could benefit these processes by increasing their transport efficiency. Using a nonlinear Langevin equation with an extensive number of simulations, we find a large enhancement in diffusion through surface oscillation. For helium confined in a narrow carbon nanotube, the diffusion enhancement is estimated to be over three orders of magnitude. A synchronization mechanism between the kinetics of the particles and the oscillating surface is revealed. Interestingly, a highly nonlinear negative correlation between diffusion coefficient and temperature is predicted based on this mechanism, and further validated by simulations. Our results provide a general and efficient method for enhancing diffusion, especially at low temperatures.
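The mechanism described above couples particle kinetics to a periodically oscillating substrate. As a hedged illustration only (not the authors' code, and with all parameter values chosen arbitrarily), the sketch below integrates a one-dimensional overdamped Langevin equation in an oscillating periodic potential and estimates the diffusion coefficient from the mean-squared displacement:

```python
# Sketch: overdamped Langevin dynamics of particles in a periodic substrate potential
# that oscillates laterally; the diffusion coefficient is estimated from the
# long-time mean-squared displacement. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

kT = 0.2                  # thermal energy (reduced units, assumed)
gamma = 1.0               # friction coefficient (assumed)
U0 = 1.0                  # corrugation amplitude of the substrate potential (assumed)
a = 1.0                   # substrate lattice period (assumed)
A, omega = 0.3, 2.0       # amplitude and angular frequency of the surface oscillation (assumed)
dt, n_steps, n_particles = 1e-3, 100_000, 64

x = np.zeros(n_particles)                 # particle positions, all starting at the origin
noise_scale = np.sqrt(2.0 * kT * dt / gamma)

for step in range(n_steps):
    t = step * dt
    surface = A * np.sin(omega * t)       # oscillating lateral offset of the substrate
    force = -U0 * (2*np.pi/a) * np.sin(2*np.pi*(x - surface)/a)
    x += force * dt / gamma + noise_scale * rng.standard_normal(n_particles)

# Einstein relation in 1D: MSD(t) ~ 2 D t at long times
D_est = np.mean(x**2) / (2.0 * n_steps * dt)
D_free = kT / gamma                        # free-particle diffusion for comparison
print(f"estimated D = {D_est:.4f}, free-particle D = {D_free:.4f}")
```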
A glacier runoff extension to the Precipitation Runoff Modeling System
Van Beusekom, Ashley E.; Viger, Roland
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while maintaining model usability. PRMSglacier is validated on two basins in Alaska, the Wolverine and Gulkana Glacier basins, which have been studied since 1966 and have a substantial amount of data with which to test model performance over a long period of time covering a wide range of climatic and hydrologic conditions. When error in field measurements is considered, the Nash-Sutcliffe efficiencies of streamflow are 0.87 and 0.86, the absolute bias fractions of the winter mass balance simulations are 0.10 and 0.08, and the absolute bias fractions of the summer mass balances are 0.01 and 0.03, all computed over 42 years for the Wolverine and Gulkana Glacier basins, respectively. Without taking into account measurement error, the values are still within the range achieved by more computationally expensive codes tested over shorter time periods.
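For readers unfamiliar with the statistics quoted above, the sketch below shows how a Nash-Sutcliffe efficiency and an absolute bias fraction can be computed from observed and simulated series. The arrays are placeholders, not PRMSglacier output, and the bias-fraction definition is an assumption about the paper's metric:

```python
# Sketch: Nash-Sutcliffe efficiency and absolute bias fraction for simulated vs.
# observed series; the data below are placeholders.
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def abs_bias_fraction(obs, sim):
    """|mean(sim - obs)| relative to the mean magnitude of the observations (assumed definition)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return abs(np.mean(sim - obs)) / np.mean(np.abs(obs))

obs = np.array([1.2, 3.4, 5.6, 4.1, 2.2])   # observed streamflow (placeholder)
sim = np.array([1.0, 3.6, 5.2, 4.4, 2.1])   # simulated streamflow (placeholder)
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}, bias fraction = {abs_bias_fraction(obs, sim):.3f}")
```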
Structural aspects of Lorentz-violating quantum field theory
NASA Astrophysics Data System (ADS)
Cambiaso, M.; Lehnert, R.; Potting, R.
2018-01-01
In the last couple of decades the Standard Model Extension has emerged as a fruitful framework for analyzing the empirical and theoretical extent of the validity of cornerstones of modern particle physics, namely Special Relativity and the discrete symmetries C, P and T (or combinations of these). The Standard Model Extension allows one to contrast high-precision experimental tests with posited alterations representing minute Lorentz and/or CPT violations. To date no violation of these symmetry principles has been observed in experiments, mostly prompted by the Standard-Model Extension. From the latter, bounds on the extent of departures from Lorentz and CPT symmetries can be obtained with ever-increasing accuracy. These analyses have mostly focused on tree-level processes. In this presentation I would like to comment on structural aspects of perturbative Lorentz-violating quantum field theory. I will show that some insight coming from radiative corrections demands a careful reassessment of perturbation theory. Specifically, I will argue that both the standard renormalization procedure and the Lehmann-Symanzik-Zimmermann reduction formalism need to be adapted, given that the asymptotic single-particle states can receive quantum corrections from Lorentz-violating operators that are not present in the original Lagrangian.
Real-Time Prediction of Temperature Elevation During Robotic Bone Drilling Using the Torque Signal.
Feldmann, Arne; Gavaghan, Kate; Stebinger, Manuel; Williamson, Tom; Weber, Stefan; Zysset, Philippe
2017-09-01
Bone drilling is a surgical procedure commonly required in many surgical fields, particularly orthopedics, dentistry, and head and neck surgery. While the long-term effects of thermal bone necrosis are unknown, thermal damage to nerves in spinal or otolaryngological surgery might lead to partial paralysis. Previous models to predict the temperature elevation have been suggested, but were either not validated or suffer from computation time and complexity that do not allow real-time prediction. Within this study, an analytical temperature prediction model is proposed which uses the torque signal of the drilling process to model the heat production of the drill bit. A simple Green's disk source function is used to solve the three-dimensional heat equation along the drilling axis. Additionally, an extensive experimental study was carried out to validate the model. A custom CNC setup with a load cell and a thermal camera was used to measure the axial drilling torque and force as well as temperature elevations. Bones with different bone volume fractions were drilled with two drill bits (1.8 mm and 2.5 mm diameter), with eight repetitions each. The model was calibrated with 5 of 40 measurements and successfully validated with the rest of the data. It was also found that the temperature elevation can be predicted using only the torque signal of the drilling process. In the future, the model could be used to monitor and control the drilling process in surgeries close to vulnerable structures.
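The approach above converts the measured torque into a heat source and propagates it with a Green's function along the drilling axis. A deliberately simplified sketch of that idea follows: it superposes instantaneous point sources rather than the disk source of the paper, and every material and process parameter is an assumed placeholder, so it is not the authors' model:

```python
# Sketch (simplified): superpose instantaneous point heat sources along the drilling
# axis, with the heat released each time step proportional to torque times spindle
# speed. The paper uses a Green's *disk* source; the point source below only
# illustrates the superposition idea. All values are assumptions.
import numpy as np

rho, c, k = 1800.0, 1300.0, 0.4            # bone density [kg/m^3], heat capacity, conductivity (assumed)
alpha = k / (rho * c)                      # thermal diffusivity [m^2/s]
omega = 2*np.pi * 1000/60                  # spindle speed: 1000 rpm (assumed)
eta = 0.5                                  # fraction of mechanical power converted to heat (assumed)
feed = 1e-3                                # feed rate [m/s] (assumed)
dt = 0.01                                  # time step [s]

torque = 0.02 + 0.005*np.random.rand(300)  # drilling torque signal [N m] (placeholder)
t = np.arange(torque.size) * dt
tip_z = feed * t                           # drill tip depth over time
obs = np.array([0.002, 0.0, 0.005])        # observation point: 2 mm lateral, 5 mm deep (assumed)

def delta_T(t_eval):
    """Temperature rise at `obs` at time t_eval from all earlier heat releases."""
    dT = 0.0
    for ti, zi, tq in zip(t, tip_z, torque):
        tau = t_eval - ti
        if tau <= 0:
            continue
        q = eta * tq * omega * dt                              # heat released in this step [J]
        r2 = obs[0]**2 + obs[1]**2 + (obs[2] - zi)**2
        dT += q / (rho*c*(4*np.pi*alpha*tau)**1.5) * np.exp(-r2/(4*alpha*tau))
    return dT

print(f"predicted temperature elevation: {delta_T(t[-1]):.3f} K")
```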
NASA Astrophysics Data System (ADS)
Rack, Wolfgang; Haas, Christian; Langhorne, Pat; Leonard, Greg; Price, Dan; Barnsdale, Kelvin; Soltanzadeh, Iman
2014-05-01
Melting and freezing processes in the ice shelf cavities of the Ross and McMurdo Ice Shelves significantly influence sea ice formation in McMurdo Sound. Between 2009 and 2013 we used a helicopter-borne laser and electromagnetic induction sounder (EM bird) to measure thickness and freeboard profiles across the ice shelf and the landfast sea ice, accompanied by extensive field validation and coordinated with satellite altimeter overpasses. Using freeboard and thickness, the bulk density of all ice types was calculated assuming hydrostatic equilibrium. Significant density steps were detected between first-year and multi-year sea ice, with higher values for the younger sea ice. Values are overestimated in areas with an abundance of sub-ice platelets because of overestimation in both ice thickness and freeboard. On the ice shelf, bulk ice densities were sometimes higher than that of pure ice, which can be explained by both the accretion of marine ice and glacial sediments. For thin ice, the freeboard-to-thickness conversion depends critically on knowledge of snow properties. Our measurements allow tuning and validation of snow cover simulations using the Weather Research and Forecasting (WRF) model. The simulated snow cover is used to calculate ice thickness from satellite-derived freeboard. The results of our measurements, which are supported by the New Zealand Antarctic programme, draw a picture of how oceanographic processes influence the ice shelf morphology and sea ice formation in McMurdo Sound, and how satellite-derived freeboard from ICESat and CryoSat, together with information on snow cover, can potentially capture the signature of these processes.
ERIC Educational Resources Information Center
Lane, Kathleen Lynne; Oakes, Wendy P.; Ennis, Robin Parks; Cox, Meredith Lucille; Schatschneider, Christopher; Lambert, Warren
2013-01-01
This study reports findings from a validation study of the Student Risk Screening Scale for use with 9th- through 12th-grade students (N = 1854) attending a rural fringe school. Results indicated high internal consistency, test-retest stability, and inter-rater reliability. Predictive validity was established across two academic years, with Spring…
ERIC Educational Resources Information Center
Kennealy, Patrick J.; Hicks, Brian M.; Patrick, Christopher J.
2007-01-01
The validity of the Psychopathy Checklist-Revised (PCL-R) has been examined extensively in men, but its validity for women remains understudied. Specifically, the correlates of the general construct of psychopathy and its components as assessed by PCL-R total, factor, and facet scores have yet to be examined in depth. Based on previous research…
ERIC Educational Resources Information Center
Benner, Gregory J.; Uhing, Brad M.; Pierce, Corey D.; Beaudoin, Kathleen M.; Ralston, Nicole C.; Mooney, Paul
2009-01-01
We sought to extend instrument validation research for the Systematic Screening for Behavior Disorders (SSBD) (Walker & Severson, 1990) using convergent validation techniques. Associations between Critical Events, Adaptive Behavior, and Maladaptive Behavior indices of the SSBD were examined in relation to syndrome, broadband, and total scores…
Neil, Sarah E; Myring, Alec; Peeters, Mon Jef; Pirie, Ian; Jacobs, Rachel; Hunt, Michael A; Garland, S Jayne; Campbell, Kristin L
2013-11-01
Muscular strength is a key parameter of rehabilitation programs and a strong predictor of functional capacity. Traditional methods to measure strength, such as manual muscle testing (MMT) and hand-held dynamometry (HHD), are limited by the strength and experience of the tester. The Performance Recorder 1 (PR1) is a strength assessment tool attached to resistance training equipment and may be a time- and cost-effective tool to measure strength in clinical practice that overcomes some limitations of MMT and HHD. However, reliability and validity of the PR1 have not been reported. Test-retest and inter-rater reliability was assessed using the PR1 in healthy adults (n = 15) during isometric knee flexion and extension. Criterion-related validity was assessed through comparison of values obtained from the PR1 and Biodex® isokinetic dynamometer. Test-retest reliability was excellent for peak knee flexion (intra-class correlation coefficient [ICC] of 0.96, 95% CI: 0.85, 0.99) and knee extension (ICC = 0.96, 95% CI: 0.87, 0.99). Inter-rater reliability was also excellent for peak knee flexion (ICC = 0.95, 95% CI: 0.85, 0.99) and peak knee extension (ICC = 0.97, 95% CI: 0.91, 0.99). Validity was moderate for peak knee flexion (ICC = 0.75, 95% CI: 0.38, 0.92) but poor for peak knee extension (ICC = 0.37, 95% CI: 0, 0.73). The PR1 provides a reliable measure of isometric knee flexor and extensor strength in healthy adults that could be used in the clinical setting, but absolute values may not be comparable to strength assessment by gold-standard measures.
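The reliability figures quoted above are intraclass correlation coefficients. As a hedged illustration (placeholder ratings, not PR1 data; ICC(2,1) is assumed here, the study may have used another ICC form), a subjects-by-raters matrix can be reduced to an ICC as follows:

```python
# Sketch: ICC(2,1) (two-way random effects, absolute agreement, single rater) from a
# subjects-by-raters matrix of peak torque values; the data below are placeholders.
import numpy as np

def icc_2_1(x):
    """Shrout & Fleiss ICC(2,1) for an (n subjects x k raters) array."""
    x = np.asarray(x, float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)       # between-subject sum of squares
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)       # between-rater sum of squares
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols     # residual sum of squares
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

ratings = np.array([[102.0, 100.5],     # rater 1 vs. rater 2 peak knee flexion (placeholder)
                    [ 88.3,  90.1],
                    [120.4, 118.9],
                    [ 95.0,  97.2]])
print(f"ICC(2,1) = {icc_2_1(ratings):.3f}")
```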
Validation of a photography-based goniometry method for measuring joint range of motion.
Blonna, Davide; Zarkadas, Peter C; Fitzsimmons, James S; O'Driscoll, Shawn W
2012-01-01
A critical component of evaluating the outcomes after surgery to restore lost elbow motion is the range of motion (ROM) of the elbow. This study examined if digital photography-based goniometry is as accurate and reliable as clinical goniometry for measuring elbow ROM. Instrument validity and reliability for photography-based goniometry were evaluated for a consecutive series of 50 elbow contractures by 4 observers with different levels of elbow experience. Goniometric ROM measurements were taken with the elbows in full extension and full flexion directly in the clinic (once) and from digital photographs (twice in a blinded random manner). Instrument validity for photography-based goniometry was extremely high (intraclass correlation coefficient: extension = 0.98, flexion = 0.96). For extension and flexion measurements by the expert surgeon, systematic error was negligible (0° and 1°, respectively). Limits of agreement were 7° (95% confidence interval [CI], 5° to 9°) and -7° (95% CI, -5° to -9°) for extension and 8° (95% CI, 6° to 10°) and -7° (95% CI, -5° to -9°) for flexion. Interobserver reliability for photography-based goniometry was better than that for clinical goniometry. The least experienced observer's photographic goniometry measurements were closer to the reference measurements than the clinical goniometry measurements. Photography-based goniometry is accurate and reliable for measuring elbow ROM. The photography-based method relied less on observer expertise than clinical goniometry. This validates an objective measure of patient outcome without requiring doctor-patient contact at a tertiary care center, where most contracture surgeries are done. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
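The systematic error and limits of agreement reported above follow the usual Bland-Altman logic: the mean difference plus or minus 1.96 standard deviations of the differences. A minimal sketch with placeholder angle readings (not the study's data):

```python
# Sketch: systematic error (mean difference) and 95% limits of agreement between
# photography-based and clinical goniometry readings; the arrays are placeholders.
import numpy as np

photo  = np.array([35.0, 20.0, 42.0, 15.0, 28.0, 33.0])   # photographic extension angles [deg]
clinic = np.array([34.0, 22.0, 40.0, 16.0, 27.0, 35.0])   # clinical goniometer angles [deg]

diff = photo - clinic
bias = diff.mean()                        # systematic error
half_width = 1.96 * diff.std(ddof=1)      # half-width of the 95% limits of agreement
print(f"bias = {bias:.1f} deg, "
      f"limits of agreement = {bias - half_width:.1f} to {bias + half_width:.1f} deg")
```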
Validation of the Leap Motion Controller using markered motion capture technology.
Smeragliuolo, Anna H; Hill, N Jeremy; Disla, Luis; Putrino, David
2016-06-14
The Leap Motion Controller (LMC) is a low-cost, markerless motion capture device that tracks hand, wrist and forearm position. Integration of this technology into healthcare applications has begun to occur rapidly, making validation of the LMC's data output an important research goal. Here, we perform a detailed evaluation of the kinematic data output from the LMC, and validate this output against gold-standard, markered motion capture technology. We instructed subjects to perform three clinically-relevant wrist (flexion/extension, radial/ulnar deviation) and forearm (pronation/supination) movements. The movements were simultaneously tracked using both the LMC and a marker-based motion capture system from Motion Analysis Corporation (MAC). Adjusting for known inconsistencies in the LMC sampling frequency, we compared simultaneously acquired LMC and MAC data by performing Pearson's correlation (r) and root mean square error (RMSE). Wrist flexion/extension and radial/ulnar deviation showed good overall agreement (r=0.95; RMSE=11.6°, and r=0.92; RMSE=12.4°, respectively) with the MAC system. However, when tracking forearm pronation/supination, there were serious inconsistencies in reported joint angles (r=0.79; RMSE=38.4°). Hand posture significantly influenced the quality of wrist deviation (P<0.005) and forearm supination/pronation (P<0.001), but not wrist flexion/extension (P=0.29). We conclude that the LMC is capable of providing data that are clinically meaningful for wrist flexion/extension, and perhaps wrist deviation. It cannot yet return clinically meaningful data for measuring forearm pronation/supination. Future studies should continue to validate the LMC as updated versions of their software are developed. Copyright © 2016 Elsevier Ltd. All rights reserved.
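The agreement statistics used in this validation (Pearson's r and RMSE between simultaneously acquired joint-angle traces) are straightforward to reproduce. The sketch below uses synthetic signals standing in for LMC and MAC recordings:

```python
# Sketch: agreement between two simultaneously acquired joint-angle time series,
# summarized with Pearson's r and RMSE; the signals are synthetic.
import numpy as np

t = np.linspace(0.0, 10.0, 500)
mac = 40.0 * np.sin(0.8 * t)                           # marker-based reference angle (synthetic)
lmc = mac + np.random.normal(0.0, 5.0, t.size)         # LMC-like noisy estimate (synthetic)

r = np.corrcoef(mac, lmc)[0, 1]
rmse = np.sqrt(np.mean((mac - lmc) ** 2))
print(f"r = {r:.2f}, RMSE = {rmse:.1f} deg")
```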
Detection of brain tumor margins using optical coherence tomography
NASA Astrophysics Data System (ADS)
Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier
2018-02-01
In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, non-cancerous regions. Various technological advances have made major contributions to imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction (BEAE) method is used. BEAE is a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this method, we are able to discriminate between cancerous and non-cancerous tissues, using logistic regression as a classifier for automatic brain tumor margin detection. With this technique, we achieve excellent performance in an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again on an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well-suited to differentiating brain tissue and could support guidance during surgical tissue resection.
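The classification step described above, logistic regression on OCT-derived features with cross-validated sensitivity and specificity, can be sketched as follows. The features and labels are synthetic stand-ins for the BEAE abundances, so the numbers carry no clinical meaning:

```python
# Sketch: logistic regression on per-region feature vectors with cross-validated
# sensitivity and specificity; features and labels are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 3)),      # non-cancer feature vectors (synthetic)
               rng.normal(1.0, 1.0, (200, 3))])     # cancer-infiltrated feature vectors (synthetic)
y = np.r_[np.zeros(200), np.ones(200)]

sens, spec = [], []
for train, test in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    clf = LogisticRegression().fit(X[train], y[train])
    tn, fp, fn, tp = confusion_matrix(y[test], clf.predict(X[test])).ravel()
    sens.append(tp / (tp + fn))
    spec.append(tn / (tn + fp))

print(f"sensitivity = {np.mean(sens):.3f}, specificity = {np.mean(spec):.3f}")
```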
Nalluri, Joseph J; Rana, Pratip; Barh, Debmalya; Azevedo, Vasco; Dinh, Thang N; Vladimirov, Vladimir; Ghosh, Preetam
2017-08-15
In recent studies, miRNAs have been found to be extremely influential in many of the essential biological processes. They exhibit a self-regulatory mechanism through which they act as positive/negative regulators of expression of genes and other miRNAs. This has direct implications in the regulation of various pathophysiological conditions, signaling pathways and different types of cancers. Studying miRNA-disease associations has been an extensive area of research; however deciphering miRNA-miRNA network regulatory patterns in several diseases remains a challenge. In this study, we use information diffusion theory to quantify the influence diffusion in a miRNA-miRNA regulation network across multiple disease categories. Our proposed methodology determines the critical disease specific miRNAs which play a causal role in their signaling cascade and hence may regulate disease progression. We extensively validate our framework using existing computational tools from the literature. Furthermore, we implement our framework on a comprehensive miRNA expression data set for alcohol dependence and identify the causal miRNAs for alcohol-dependency in patients which were validated by the phase-shift in their expression scores towards the early stages of the disease. Finally, our computational framework for identifying causal miRNAs implicated in diseases is available as a free online tool for the greater scientific community.
Shu, Ting; Zhang, Bob; Yan Tang, Yuan
2017-04-01
Researchers have recently discovered that Diabetes Mellitus can be detected through non-invasive computerized methods. However, the focus has been on facial block color features. In this paper, we extensively study the effects of texture features extracted from specific facial regions for detecting Diabetes Mellitus, using eight texture extractors. The eight methods come from four texture feature families: (1) statistical texture features: Image Gray-scale Histogram, Gray-level Co-occurrence Matrix, and Local Binary Pattern; (2) structural texture features: Voronoi Tessellation; (3) signal-processing-based texture features: Gaussian, Steerable, and Gabor filters; and (4) model-based texture features: Markov Random Field. In order to determine the most appropriate extractor with optimal parameter(s), various parameter settings of each extractor were tested. For each extractor, the same dataset (284 Diabetes Mellitus and 231 Healthy samples), classifiers (k-Nearest Neighbors and Support Vector Machines), and validation method (10-fold cross-validation) are used. According to the experiments, the first and third families achieved a better outcome at detecting Diabetes Mellitus than the other two. The best texture feature extractor for Diabetes Mellitus detection is the Image Gray-scale Histogram with bin number = 256, obtaining an accuracy of 99.02%, a sensitivity of 99.64%, and a specificity of 98.26% using SVM. Copyright © 2017 Elsevier Ltd. All rights reserved.
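The winning pipeline reported above (256-bin gray-scale histogram features classified with an SVM under 10-fold cross-validation) can be sketched in a few lines. The image patches below are random arrays standing in for the facial-region data, so the printed accuracy is meaningful only as a demonstration of the workflow:

```python
# Sketch: 256-bin gray-scale histogram features classified with an SVM under
# 10-fold cross-validation; the image patches are random stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def histogram_features(patch, bins=256):
    """Normalized gray-scale histogram of an 8-bit image patch."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / hist.sum()

# Synthetic stand-ins: 'healthy' patches drawn darker than 'DM' patches on average.
healthy = [rng.integers(0, 200, (64, 64)) for _ in range(100)]
dm      = [rng.integers(50, 256, (64, 64)) for _ in range(100)]
X = np.array([histogram_features(p) for p in healthy + dm])
y = np.r_[np.zeros(len(healthy)), np.ones(len(dm))]

scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.3f}")
```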
Pourahmadi, Mohammad Reza; Bagheri, Rasool; Taghipour, Morteza; Takamjani, Ismail Ebrahimi; Sarrafzadeh, Javad; Mohseni-Bandpei, Mohammad Ali
2018-03-01
Measurement of cervical spine range of motion (ROM) is often considered to be an essential component of cervical spine physiotherapy assessment. This study aimed to investigate the reliability and validity of an iPhone application (app) (Goniometer Pro) for measuring active craniocervical ROM (ACCROM) in patients with non-specific neck pain. A cross-sectional study was conducted at the musculoskeletal biomechanics laboratory located at Iran University of Medical Sciences. Forty non-specific neck pain patients participated in this study. The outcome measure was the ACCROM, including flexion, extension, lateral flexion, and rotation. Following the recruitment process, ACCROM was measured using a universal goniometer (UG) and an iPhone 7 app. Two blinded examiners each used the UG and iPhone to measure ACCROM in the following sequence: flexion, extension, lateral flexion, and rotation. The second (2 hours later) and third (48 hours later) sessions were carried out in the same manner as the first session. Intraclass correlation coefficient (ICC) models were used to determine the intra-rater and inter-rater reliability. Pearson correlation coefficients were used to establish concurrent validity of the iPhone app. Minimum detectable change at the 95% confidence level (MDC95) was also computed. Good intra-rater and inter-rater reliability was demonstrated for the goniometer, with ICC values of ≥0.66 and ≥0.70, and for the iPhone app, with ICC values of ≥0.62 and ≥0.65, respectively. The MDC95 ranged from 2.21° to 12.50° for the intra-rater analysis and from 3.40° to 12.61° for the inter-rater analysis. The concurrent validity between the two instruments was high, with r values of ≥0.63. The magnitude of the differences between the UG and iPhone app values (effect sizes) was small, with Cohen d values of ≤0.17. The iPhone app possesses good reliability and high validity, and it appears that this app can be used for measuring ACCROM. Copyright © 2017 Elsevier Inc. All rights reserved.
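The MDC95 values quoted above are conventionally derived from the reliability coefficient and the between-subject standard deviation. A minimal sketch, assuming the standard relation MDC95 = 1.96 × √2 × SEM with SEM = SD × √(1 − ICC) and illustrative numbers rather than the study's data:

```python
# Sketch: minimum detectable change at the 95% confidence level from an ICC and the
# between-subject standard deviation; the input numbers are illustrative.
import math

def mdc95(sd_between_subjects, icc):
    sem = sd_between_subjects * math.sqrt(1.0 - icc)   # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem

print(f"MDC95 = {mdc95(sd_between_subjects=8.0, icc=0.66):.2f} deg")
```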
PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation
NASA Astrophysics Data System (ADS)
España, S; Herraiz, J L; Vicente, E; Vaquero, J J; Desco, M; Udias, J M
2009-03-01
Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.
Fitmunk: improving protein structures by accurate, automatic modeling of side-chain conformations.
Porebski, Przemyslaw Jerzy; Cymborowski, Marcin; Pasenkiewicz-Gierula, Marta; Minor, Wladek
2016-02-01
Improvements in crystallographic hardware and software have allowed automated structure-solution pipelines to approach a near 'one-click' experience for the initial determination of macromolecular structures. However, in many cases the resulting initial model requires a laborious, iterative process of refinement and validation. A new method has been developed for the automatic modeling of side-chain conformations that takes advantage of rotamer-prediction methods in a crystallographic context. The algorithm, which is based on deterministic dead-end elimination (DEE) theory, uses new dense conformer libraries and a hybrid energy function derived from experimental data and prior information about rotamer frequencies to find the optimal conformation of each side chain. In contrast to existing methods, which incorporate the electron-density term into protein-modeling frameworks, the proposed algorithm is designed to take advantage of the highly discriminatory nature of electron-density maps. This method has been implemented in the program Fitmunk, which uses extensive conformational sampling. This improves the accuracy of the modeling and makes it a versatile tool for crystallographic model building, refinement and validation. Fitmunk was extensively tested on over 115 new structures, as well as a subset of 1100 structures from the PDB. It is demonstrated that the ability of Fitmunk to model more than 95% of side chains accurately is beneficial for improving the quality of crystallographic protein models, especially at medium and low resolutions. Fitmunk can be used for model validation of existing structures and as a tool to assess whether side chains are modeled optimally or could be better fitted into electron density. Fitmunk is available as a web service at http://kniahini.med.virginia.edu/fitmunk/server/ or at http://fitmunk.bitbucket.org/.
Wang, Huey-Yuh; Chen, Yueh-Chih; Lin, Dong-Tsamn; Gau, Bih-Shya
2005-06-01
The purpose of this article is to describe the process of designing an Infection Control Health Education Program (ICP) for adolescents with cancer, to describe the content of that program, and to evaluate its validity. The program consisted of an audiovisual "Infection Control Health Education Program in Video Compact Disc (VCD)" and a "Self-Care Daily Checklist (SCDC)". The VCD was developed from systematic literature reviews and consultations with experts in pediatric oncology care. It addresses the main issues of infection control among adolescents. The content of the SCDC was designed to enhance adolescents' self-care capabilities by means of twice-daily self-recording. The response format for content validity of the VCD and SCDC was a 5-point Likert scale. The mean score for content validity was 4.72 for the VCD and 4.82 for the SCDC. The percentage of expert agreement was 99% for the VCD and 98% for the SCDC. In summary, the VCD was effective in improving adolescents' capacity for self-care, and the SCDC, used as extensive reinforcement, was also shown to be useful. In a subsequent pilot study, the authors used this program to increase adolescent cancer patients' self-care knowledge and behavior and to decrease their levels of secondary infection.
Validation of a program for supercritical power plant calculations
NASA Astrophysics Data System (ADS)
Kotowicz, Janusz; Łukowicz, Henryk; Bartela, Łukasz; Michalski, Sebastian
2011-12-01
This article describes the validation of a supercritical steam cycle model. The model was created with the commercial program GateCycle and validated using in-house code of the Institute of Power Engineering and Turbomachinery. The Institute's in-house code has been used extensively for industrial power plant calculations with good results. In the first step of the validation process, assumptions were made about the live steam temperature and pressure, net power, characteristic quantities for high- and low-pressure regenerative heat exchangers, and pressure losses in heat exchangers. These assumptions were then used to develop a steam cycle model in GateCycle and a model based on the code developed in-house at the Institute of Power Engineering and Turbomachinery. Properties such as thermodynamic parameters at characteristic points of the steam cycle, net power values and efficiencies, heat provided to the steam cycle, and heat taken from the steam cycle were compared. The last step of the analysis was the calculation of relative errors of the compared values. The method used for relative error calculation is presented in the paper. The resulting relative errors are very small, generally not exceeding 0.1%. Based on our analysis, it can be concluded that using the GateCycle software for supercritical power plant calculations is feasible.
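The relative-error comparison described above reduces to dividing the absolute difference between corresponding model outputs by the reference value. A minimal sketch with placeholder quantities, not the paper's actual cycle results:

```python
# Sketch: relative error between corresponding quantities from two cycle models,
# computed as |GateCycle - in-house| / |in-house|; the sample values are placeholders.
quantities = {
    # name: (in-house reference value, GateCycle value) -- placeholder numbers
    "net power [MW]":       (900.00, 900.45),
    "cycle efficiency [%]": (48.10, 48.13),
    "heat input [MW]":      (1871.0, 1870.1),
}

for name, (reference, gatecycle) in quantities.items():
    rel_err = abs(gatecycle - reference) / abs(reference) * 100.0
    print(f"{name}: relative error = {rel_err:.3f} %")
```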
77 FR 25469 - Applications for New Awards; Investing in Innovation Fund, Validation
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-30
... DEPARTMENT OF EDUCATION Applications for New Awards; Investing in Innovation Fund, Validation... Innovation and Improvement, Department of Education. ACTION: Notice; extension of deadline date and correction. SUMMARY: On March 27, 2012, the Office of Innovation and Improvement in the U.S. Department of...
Point cloud modeling using the homogeneous transformation for non-cooperative pose estimation
NASA Astrophysics Data System (ADS)
Lim, Tae W.
2015-06-01
A modeling process to simulate point cloud range data that a lidar (light detection and ranging) sensor produces is presented in this paper in order to support the development of non-cooperative pose (relative attitude and position) estimation approaches which will help improve proximity operation capabilities between two adjacent vehicles. The algorithms in the modeling process were based on the homogeneous transformation, which has been employed extensively in robotics and computer graphics, as well as in recently developed pose estimation algorithms. Using a flash lidar in a laboratory testing environment, point cloud data of a test article was simulated and compared against the measured point cloud data. The simulated and measured data sets match closely, validating the modeling process. The modeling capability enables close examination of the characteristics of point cloud images of an object as it undergoes various translational and rotational motions. Relevant characteristics that will be crucial in non-cooperative pose estimation were identified such as shift, shadowing, perspective projection, jagged edges, and differential point cloud density. These characteristics will have to be considered in developing effective non-cooperative pose estimation algorithms. The modeling capability will allow extensive non-cooperative pose estimation performance simulations prior to field testing, saving development cost and providing performance metrics of the pose estimation concepts and algorithms under evaluation. The modeling process also provides "truth" pose of the test objects with respect to the sensor frame so that the pose estimation error can be quantified.
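The core operation in the modeling process described above is the 4×4 homogeneous transformation that maps object points into the sensor frame for a given pose. A hedged sketch with arbitrary pose values and a random stand-in point cloud:

```python
# Sketch: applying a 4x4 homogeneous transformation (rotation + translation) to a
# point cloud; the pose values and points are arbitrary examples.
import numpy as np

def homogeneous_transform(yaw, pitch, roll, translation):
    """Build a 4x4 transform from Z-Y-X Euler angles [rad] and a translation [m]."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = translation
    return T

points = np.random.rand(1000, 3)                                 # stand-in point cloud [m]
T = homogeneous_transform(0.1, -0.05, 0.2, [0.5, 0.0, 2.0])      # example pose
points_h = np.hstack([points, np.ones((points.shape[0], 1))])    # to homogeneous coordinates
transformed = (T @ points_h.T).T[:, :3]                          # points in the sensor frame
print(transformed.shape)
```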
Non-destructive inspection in industrial equipment using robotic mobile manipulation
NASA Astrophysics Data System (ADS)
Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah
2016-05-01
The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment arranged horizontally (using ground robots) or vertically (using climbing robots). The industrial objective has been to provide a means of measuring several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extent (e.g., a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.
van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W
2016-10-01
Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for the prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI), with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RF), support vector machines (SVM), and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC, 0.757), followed by the RF and support vector machine models (median validated AUC, 0.735 and 0.732, respectively). With each predictor set, the classification and regression tree models showed poor performance (median validated AUC < 0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
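The cross-study external validation scheme above (develop a model on one study, validate it on each of the remaining ones, and summarize discrimination with the AUC) can be sketched for the core predictor set with logistic regression. The data frame below is a synthetic stand-in for the 15 studies, so the resulting AUCs are illustrative only:

```python
# Sketch: leave-one-study-in development with external validation on every other
# study, summarized by the median AUC; the data are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "study": rng.integers(0, 15, n),
    "age": rng.normal(40, 18, n),
    "motor_score": rng.integers(1, 7, n),
    "pupils": rng.integers(0, 3, n),
})
logit = -4 + 0.05*df["age"] - 0.4*df["motor_score"] + 0.6*df["pupils"]
df["mortality"] = rng.random(n) < 1/(1 + np.exp(-logit))          # synthetic outcome

predictors = ["age", "motor_score", "pupils"]
aucs = []
for dev in sorted(df["study"].unique()):
    train = df[df["study"] == dev]
    if train["mortality"].nunique() < 2:
        continue                                                  # cannot fit with a single class
    model = LogisticRegression(max_iter=1000).fit(train[predictors], train["mortality"])
    for val in sorted(df["study"].unique()):
        if val == dev:
            continue
        test = df[df["study"] == val]
        if test["mortality"].nunique() < 2:
            continue                                              # AUC undefined without both outcomes
        proba = model.predict_proba(test[predictors])[:, 1]
        aucs.append(roc_auc_score(test["mortality"], proba))

print(f"{len(aucs)} validations, median AUC = {np.median(aucs):.3f}")
```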
An overview of mesoscale aerosol processes, comparisons, and validation studies from DRAGON networks
NASA Astrophysics Data System (ADS)
Holben, Brent N.; Kim, Jhoon; Sano, Itaru; Mukai, Sonoyo; Eck, Thomas F.; Giles, David M.; Schafer, Joel S.; Sinyuk, Aliaksandr; Slutsker, Ilya; Smirnov, Alexander; Sorokin, Mikhail; Anderson, Bruce E.; Che, Huizheng; Choi, Myungje; Crawford, James H.; Ferrare, Richard A.; Garay, Michael J.; Jeong, Ukkyo; Kim, Mijin; Kim, Woogyung; Knox, Nichola; Li, Zhengqiang; Lim, Hwee S.; Liu, Yang; Maring, Hal; Nakata, Makiko; Pickering, Kenneth E.; Piketh, Stuart; Redemann, Jens; Reid, Jeffrey S.; Salinas, Santo; Seo, Sora; Tan, Fuyi; Tripathi, Sachchida N.; Toon, Owen B.; Xiao, Qingyang
2018-01-01
Over the past 24 years, the AErosol RObotic NETwork (AERONET) program has provided highly accurate remote-sensing characterization of aerosol optical and physical properties for an increasingly extensive geographic distribution including all continents and many oceanic island and coastal sites. The measurements and retrievals from the AERONET global network have addressed satellite and model validation needs very well, but there have been challenges in making comparisons to similar parameters from in situ surface and airborne measurements. Additionally, with improved spatial and temporal satellite remote sensing of aerosols, there is a need for higher spatial-resolution ground-based remote-sensing networks. An effort to address these needs resulted in a number of field campaign networks called Distributed Regional Aerosol Gridded Observation Networks (DRAGONs) that were designed to provide a database for in situ and remote-sensing comparison and analysis of local to mesoscale variability in aerosol properties. This paper describes the DRAGON deployments that will continue to contribute to the growing body of research related to meso- and microscale aerosol features and processes. The research presented in this special issue illustrates the diversity of topics that has resulted from the application of data from these networks.
42 CFR 137.426 - May an Indian Tribe get an extension of time to file a notice of appeal?
Code of Federal Regulations, 2010 CFR
2010-10-01
...-GOVERNANCE Appeals Pre-Award Disputes § 137.426 May an Indian Tribe get an extension of time to file a notice... time period. If the Indian Tribe has a valid reason for not filing its notice of appeal on time, it may...
Using Evaluation to Guide and Validate Improvements to the Utah Master Naturalist Program
ERIC Educational Resources Information Center
Larese-Casanova, Mark
2015-01-01
Integrating evaluation into an Extension program offers multiple opportunities to understand program success through achieving program goals and objectives, delivering programming using the most effective techniques, and refining program audiences. It is less common that evaluation is used to guide and validate the effectiveness of program…
The Open Curriculum and Selection of Qualified Staff: Instrument Validation.
ERIC Educational Resources Information Center
Greene, John F.; And Others
The impact of open education on today's curriculum has been extensive. Of the many requests for research in this area, none is more important than instrument validation. This study examines the internal structure of Barth's Assumptions about Learning and Knowledge scale and explores its relationship to established "progressivism" and…
USDA-ARS?s Scientific Manuscript database
Diet composition of free roaming livestock and wildlife in extensive rangelands are difficult to quantify. Recent technological advances now allow us to reconstruct plant species-specific dietary protein composition using fecal samples. However, it has been suggested that validation of the method i...
Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew
2015-01-01
Assurance GDS™ for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of Analysis℠ 2009.03, Performance Tested Method℠ No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feedstuffs, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability studies, and ruggedness studies were also conducted. Assurance GDS showed 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To extend the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method to selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.
NASA Astrophysics Data System (ADS)
Anastasopoulos, Dimitrios; Moretti, Patrizia; Geernaert, Thomas; De Pauw, Ben; Nawrot, Urszula; De Roeck, Guido; Berghmans, Francis; Reynders, Edwin
2017-03-01
The presence of damage in a civil structure alters its stiffness and consequently its modal characteristics. The identification of these changes can provide engineers with useful information about the condition of a structure and constitutes the basic principle of vibration-based structural health monitoring. While eigenfrequencies and mode shapes are the most commonly monitored modal characteristics, their sensitivity to structural damage may be low relative to their sensitivity to environmental influences. Modal strains or curvatures could offer an attractive alternative, but current measurement techniques encounter difficulties in capturing, with sufficient accuracy, the very small strain (sub-microstrain) levels occurring during ambient or operational excitation. This paper investigates the ability to obtain sub-microstrain accuracy with standard fiber-optic Bragg gratings using a novel optical signal processing algorithm that identifies the wavelength shift with high accuracy and precision. The novel technique is validated in an extensive experimental modal analysis test on a steel I-beam instrumented with FBG sensors at its top and bottom flanges. The raw wavelength FBG data are processed into strain values using both a novel correlation-based processing technique and a conventional peak-tracking technique. Subsequently, the strain time series are used to identify the beam's modal characteristics. Finally, the accuracy of both algorithms in identifying modal characteristics is extensively investigated.
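The abstract does not detail the correlation-based processing, but the general idea of estimating a Bragg-wavelength shift by correlating a measured reflection spectrum against a reference, with sub-sample refinement of the correlation peak, can be sketched as follows. This is an assumed, generic variant with synthetic Gaussian spectra, not the authors' algorithm:

```python
# Sketch: correlation-based estimation of an FBG peak-wavelength shift -- cross-
# correlate a measured reflection spectrum with a reference spectrum and refine the
# peak location by parabolic interpolation; the spectra are synthetic.
import numpy as np

wl = np.linspace(1549.0, 1551.0, 2001)                 # wavelength grid [nm]
dw = wl[1] - wl[0]

def fbg_spectrum(center, fwhm=0.2):
    """Gaussian stand-in for an FBG reflection spectrum."""
    return np.exp(-4*np.log(2)*((wl - center)/fwhm)**2)

reference = fbg_spectrum(1550.000)
measured = fbg_spectrum(1550.0137) + 0.01*np.random.randn(wl.size)   # shifted + noise

corr = np.correlate(measured - measured.mean(), reference - reference.mean(), mode="full")
lag = np.arange(-(wl.size - 1), wl.size)
i = np.argmax(corr)
# parabolic interpolation around the correlation maximum for sub-sample precision
denom = corr[i-1] - 2*corr[i] + corr[i+1]
frac = 0.5 * (corr[i-1] - corr[i+1]) / denom
shift_nm = (lag[i] + frac) * dw
print(f"estimated wavelength shift: {shift_nm*1e3:.1f} pm")
```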
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.
NASA Technical Reports Server (NTRS)
Lutz, B. L.; Owen, T.; Cess, R. D.
1982-01-01
Lutz et al. (1976) have reported the first quantitative analyses of the strengths of the blue-green bands of methane which dominate the visible spectra of the outer planets. The present investigation represents an extension of the first study to include a number of bands between 6000 and 7500 A. The objective of this extension is to establish the validity of the scaled numerical curve of growth of the first study further into the saturated region and to test the apparent pressure independence of the high-overtone bands over a large pressure range. In addition, it is desired to provide a set of homogeneously determined band strengths and curves of growth over a large spectral region and over a large range of band strengths. This will make it possible to investigate feasible apparent dependences of planetary methane abundances on wavelength and band strength as a probe of the scattering processes in the planetary atmospheres.
Pienaar, Andries W; Barnard, Justhinus G
2017-04-01
This study describes the development of a new portable muscle testing device, using air pressure as a biofeedback and strength testing tool. For this purpose, a pressure air biofeedback device (PAB®) was developed to measure and record the isometric extension strength of the lumbar multifidus muscle in asymptomatic and low back pain (LBP) persons. A total of 42 subjects (age 47.58 years, ±18.58) participated in this study. The validity of PAB® was assessed by comparing a selected measure, air pressure force in millibar (mb), to a standard criterion, calibrated weights in kilograms (kg), during day-to-day tests. Furthermore, clinical trial-to-trial and day-to-day tests of maximum voluntary isometric contraction (MVIC) of the L5 lumbar multifidus were done to compare air pressure force (mb) to electromyography (EMG) in microvolts (μV) and to measure the reliability of PAB®. A highly significant relationship was found between air pressure output (mb) and calibrated weights (kg). In addition, Pearson correlation calculations showed a significant relationship between PAB® force (mb) and EMG activity (μV) for all subjects (n = 42) examined, as well as for the asymptomatic group (n = 24). No relationship was detected for the LBP group (n = 18). In terms of lumbar extension strength, we found that asymptomatic subjects were significantly stronger than LBP subjects. The results of the PAB® test differentiated between LBP and asymptomatic subjects' lumbar isometric extension strength without any risk to the subjects, and also indicate that the lumbar isometric extension test with the new PAB® device is reliable and valid.
NASA Astrophysics Data System (ADS)
Das, A.; Bang, H. S.; Bang, H. S.
2018-05-01
Multi-material combinations of aluminium alloy and carbon-fiber-reinforced plastics (CFRP) have gained attention in the automotive and aerospace industries as a way to enhance fuel efficiency and the strength-to-weight ratio of components. Various limitations of laser beam welding, adhesive bonding and mechanical fasteners make these processes inefficient for joining metal and CFRP sheets. Friction lap joining is an alternative choice for this purpose. Comprehensive studies of friction lap joining of aluminium to CFRP sheets are essential and scarce in the literature. The present work reports a combined theoretical and experimental study of the joining of AA5052 and CFRP sheets using the friction lap joining process. A three-dimensional finite-element-based heat transfer model is developed to compute the temperature fields and thermal cycles. The computed results are validated extensively against the corresponding experimentally measured results.
ERIC Educational Resources Information Center
Liu, I-Fan; Young, Shelley S. -C.
2017-01-01
The purpose of this study is to describe an online community-based English extensive reading contest to investigate whether the participants' intrinsic, extrinsic, and interpersonal motivations and learning results show significant gender differences. A total of 501 valid questionnaires (285 females and 216 males) from Taiwanese high school…
Cavallero, Serena; Bruno, Alessandro; Arletti, Enrico; Caffara, Monica; Fioravanti, Maria Letizia; Costa, Antonella; Cammilleri, Gaetano; Graci, Stefania; Ferrantelli, Vincenzo; D'Amelio, Stefano
2017-09-18
Anisakids are parasitic nematodes responsible for a zoonosis that occurs following the ingestion of fish and fish products infected with larvae belonging to the genera Anisakis and Pseudoterranova. Rarely, Contracaecum is found in association with gastric/intestinal illness, while Hysterothylacium is commonly considered not pathogenic. Although real-time PCR assays have recently been used with the aim of detecting and quantifying these parasites in food products, the methods applied did not undergo an extensive validation process, a feature that is highly desirable or even mandatory for testing laboratories accredited to ISO EN 17025:2005. Here, a comprehensive study has been performed to validate a commercial kit based on multiplex real-time PCR for the qualitative detection of Anisakis and Pseudoterranova. Inclusivity/exclusivity trials were carried out on DNA from species of the genera Anisakis, Pseudoterranova, Contracaecum, Hysterothylacium, and Ascaris, on fish intentionally contaminated with Anisakis spp. and Pseudoterranova spp., and on marine organisms such as fish, crustaceans, and squid to test the commercial kit on a large sample. The assay gave positive amplification for several Anisakis and Pseudoterranova species, while providing no signal for members of the remaining genera. Each sample was correctly assigned either to Anisakis or Pseudoterranova, thus indicating that no cross-reaction occurred. The LOD was determined using two independent standard curves. Robustness was assayed by using two different thermocyclers in three distinct laboratories with different operators. The establishment of a validation dossier will permit the use of the commercial kit for the detection of Anisakis and Pseudoterranova DNA in fish and fish products intended for human consumption by public or private laboratories, following the requirements regarding the quality assurance processes described in ISO EN 17025:2005. Copyright © 2017 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Cousans, Fran; Patterson, Fiona; Edwards, Helena; Walker, Kim; McLachlan, John C.; Good, David
2017-01-01
Although there is extensive evidence confirming the predictive validity of situational judgement tests (SJTs) in medical education, there remains a shortage of evidence for their predictive validity for performance of postgraduate trainees in their first role in clinical practice. Moreover, to date few researchers have empirically examined the…
Two-Method Planned Missing Designs for Longitudinal Research
ERIC Educational Resources Information Center
Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.
2014-01-01
We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…
Perception of Competence in Middle School Physical Education: Instrument Development and Validation
ERIC Educational Resources Information Center
Scrabis-Fletcher, Kristin; Silverman, Stephen
2010-01-01
Perception of Competence (POC) has been studied extensively in physical activity (PA) research with similar instruments adapted for physical education (PE) research. Such instruments do not account for the unique PE learning environment. Therefore, an instrument was developed and the scores validated to measure POC in middle school PE. A…
Peissig, Peggy L; Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B
2012-01-01
There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries.
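The validation arithmetic can be illustrated with a minimal sketch, assuming invented field names rather than the eMERGE schema: evidence from the structured query, NLP, and OCR passes is combined into a single multi-modal case flag, and the positive predictive value is computed against manual chart review.

```python
# Hedged sketch of the PPV bookkeeping only; field names and the "any modality"
# rule are illustrative assumptions, not the published algorithm logic.
from dataclasses import dataclass

@dataclass
class Subject:
    icd_code_hit: bool       # structured database query
    nlp_mention: bool        # natural language processing on free text
    ocr_mention: bool        # optical character recognition on scanned images
    chart_review_case: bool  # manual gold standard

def multimodal_case(s: Subject) -> bool:
    # Flag a subject if any modality finds evidence of cataract.
    return s.icd_code_hit or s.nlp_mention or s.ocr_mention

def ppv(subjects):
    flagged = [s for s in subjects if multimodal_case(s)]
    if not flagged:
        return float("nan")
    true_pos = sum(s.chart_review_case for s in flagged)
    return true_pos / len(flagged)

cohort = [
    Subject(True, False, False, True),
    Subject(False, True, True, True),
    Subject(False, False, True, False),
    Subject(False, False, False, False),
]
print(f"PPV = {ppv(cohort):.2f}")   # 2 of 3 flagged subjects confirmed -> 0.67
```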
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
NASA Astrophysics Data System (ADS)
Ragno, Rino; Ballante, Flavio; Pirolli, Adele; Wickersham, Richard B.; Patsilinakos, Alexandros; Hesse, Stéphanie; Perspicace, Enrico; Kirsch, Gilbert
2015-08-01
Vascular endothelial growth factor receptor-2 (VEGFR-2) is a key element in angiogenesis, the process by which new blood vessels are formed, and is thus an important pharmaceutical target. Here, 3-D quantitative structure-activity relationship (3-D QSAR) models were used to build a quantitative screening and pharmacophore model of the VEGFR-2 receptor for the design of inhibitors with improved activities. Most of the available experimental data were used as a training set to derive eight optimized and fully cross-validated mono-probe models and one multi-probe quantitative model. Notable is the use of 262 molecules, aligned following both structure-based and ligand-based protocols, as an external test set confirming the 3-D QSAR models' predictive capability and their usefulness in designing new VEGFR-2 inhibitors. From a survey of the literature, this is the first wide-ranging computational medicinal chemistry application on VEGFR-2 inhibitors.
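A common internal-validation statistic behind phrases such as "fully cross-validated" is the leave-one-out q². The sketch below computes it for a plain linear model on made-up descriptor data; it is not the authors' 3-D QSAR software, only an illustration of the cross-validation bookkeeping.

```python
# Leave-one-out cross-validated q^2 on synthetic descriptor data (assumed).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))                        # hypothetical molecular descriptors
y = X @ np.array([1.2, -0.7, 0.4, 0.0, 0.9]) + rng.normal(scale=0.3, size=40)
A = np.hstack([np.ones((40, 1)), X])                # design matrix with intercept

press = 0.0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    coef, *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)   # refit without molecule i
    press += (y[i] - A[i] @ coef) ** 2              # predictive residual sum of squares

q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
print(f"leave-one-out cross-validated q2 = {q2:.3f}")
```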
Semmler, Egmont; Novak, Wenzel; Allinson, Wilf; Wallis, Darren; Wood, Nigel; Awakowicz, Peter; Wunderlich, Joachim
2016-01-01
A technology new to the pharmaceutical field is presented: surface decontamination by plasmas. The technology is comparable to established barrier systems like e-beam, volatile hydrogen peroxide, or radiation inactivation of microbiological contaminations. This plasma technology is part of a fully automated and validated syringe filling line at a major pharmaceutical company and is in production operation. Incoming pre-sterilized syringe containers ("tubs") are processed by plasma, solely on the outside, and passed into the aseptic filling isolator upon successful decontamination. The objective of this article is to present the operating principles and to develop and establish a validation routine on the basis of standard commercial biological indicators. Their decontamination efficacies are determined and correlated to the actual inactivation efficacy on the pharmaceutical packaging material. The reference setup is explained in detail and a short presentation of the cycle development and the relevant plasma control parameters is given, with a special focus on the in-process monitor determining cycle validity. Different microbial inactivation mechanisms are also discussed and evaluated for their contribution and interaction in enhancing plasma decontamination. A material-dependent inactivation behavior was observed. In order to be able to correlate the tub surface inactivation of Geobacillus stearothermophilus endospores to metallic biological indicators, a comparative study was performed. By consistently demonstrating the linear inactivation behavior between the different materials, it becomes possible to develop an effective and time-saving validation scheme. The challenge in new decontamination systems lies in a thorough validation of the inactivation efficacy under different operating regimes. With plasma, as an ionized gas, a new barrier concept is introduced into pharmaceutical aseptic processing of syringes. The presented system operates in vacuum and only decontaminates the outer surface of pre-sterilized syringe containers ("tubs") before they are transferred into the aseptic area. The plasma does not penetrate into the tub. This article discusses the phases from development and test germ selection, across the identified sporicidal mechanisms, to a proposal for a validation scheme on the basis of commercially available biological indicators. A special focus is placed on an extensive investigation to establish a link between the tub surface microbial kill (polystyrene and Tyvek®) and biological indicator inactivation (stainless steel). Additionally, a rationale is developed on how an optical in-process monitor can be applied to establish a validatable limit on the basis of the predetermined inactivation data of Geobacillus stearothermophilus endospores. © PDA, Inc. 2016.
NASA Astrophysics Data System (ADS)
Farrell, Kathryn; Oden, J. Tinsley
2014-07-01
Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture, the choice of interaction potentials and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework in this work by canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and methods through applications to representative atomic structures and we discuss extensions to the validation process for molecular models of polymer structures encountered in certain semiconductor nanomanufacturing processes. The powerful method of model plausibility as a means for selecting interaction potentials for coarse-grained models is discussed in connection with a coarse-grained hexane molecule. Discussions of how all-atom information is used to construct priors are contained in an appendix.
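A minimal sketch of the model-plausibility idea, under stated assumptions and not the authors' coarse-graining code: two hypothetical one-parameter models are compared through their Bayesian evidence, computed by simple quadrature with a uniform (maximum-entropy on an interval) prior and a Gaussian likelihood for the parameter-to-observation error.

```python
# Toy Bayesian model selection: the model with higher evidence (marginal
# likelihood) is the more plausible one given the "all-atom" data.
import numpy as np

data = np.array([1.05, 0.98, 1.10, 1.02, 0.95])   # made-up observations of a quantity of interest
sigma = 0.1                                        # assumed observation-error standard deviation

def predict_A(theta):                              # candidate coarse-grained model A
    return theta

def predict_B(theta):                              # candidate coarse-grained model B
    return np.tanh(theta)

def log_like(theta, predict):
    r = data - predict(theta)
    return -0.5 * np.sum(r**2) / sigma**2 - data.size * np.log(sigma * np.sqrt(2 * np.pi))

def evidence(predict, lo=0.0, hi=3.0, n=2001):
    theta = np.linspace(lo, hi, n)
    prior = 1.0 / (hi - lo)                        # uniform prior density on [lo, hi]
    like = np.exp([log_like(t, predict) for t in theta])
    return np.sum(like) * prior * (theta[1] - theta[0])   # simple quadrature

zA, zB = evidence(predict_A), evidence(predict_B)
# Posterior model plausibilities, assuming equal prior model probabilities.
print(f"plausibility of A: {zA / (zA + zB):.3f}, of B: {zB / (zA + zB):.3f}")
```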
Complete Report on the Development of Welding Parameters for Irradiated Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frederick, Greg; Sutton, Benjamin J.; Tatman, Jonathan K.
The advanced welding facility at the Radiochemical Engineering Development Center of Oak Ridge National Laboratory, which was conceived to enable research and development of weld repair techniques for nuclear power plant life extension, is now operational. The development of the facility and its advanced welding capabilities, along with the model materials for initial welding trials, were funded jointly by the U.S. Department of Energy, Office of Nuclear Energy, Light Water Reactor Sustainability Program, the Electric Power Research Institute, Long Term Operations Program and the Welding and Repair Technology Center, with additional support from Oak Ridge National Laboratory. Welding of irradiated materials was initiated on November 17, 2017, which marked a significant step in the development of the facility and the beginning of extensive welding research and development campaigns on irradiated materials that will eventually produce validated techniques and guidelines for weld repair activities carried out to extend the operational lifetimes of nuclear power plants beyond 60 years. This report summarizes the final steps that were required to complete weld process development, initial irradiated materials welding activities, near-term plans for irradiated materials welding, and plans for post-weld analyses that will be carried out to assess the ability of the advanced welding processes to make repairs on irradiated materials.
Generalized model for k -core percolation and interdependent networks
NASA Astrophysics Data System (ADS)
Panduranga, Nagendra K.; Gao, Jianxi; Yuan, Xin; Stanley, H. Eugene; Havlin, Shlomo
2017-09-01
Cascading failures in complex systems have been studied extensively using two different models: k -core percolation and interdependent networks. We combine the two models into a general model, solve it analytically, and validate our theoretical results through extensive simulations. We also study the complete phase diagram of the percolation transition as we tune the average local k -core threshold and the coupling between networks. We find that the phase diagram of the combined processes is very rich and includes novel features that do not appear in the models studying each of the processes separately. For example, the phase diagram consists of first- and second-order transition regions separated by two tricritical lines that merge and enclose a two-stage transition region. In the two-stage transition, the size of the giant component undergoes a first-order jump at a certain occupation probability followed by a continuous second-order transition at a lower occupation probability. Furthermore, at certain fixed interdependencies, the percolation transition changes from first-order → second-order → two-stage → first-order as the k -core threshold is increased. The analytic equations describing the phase boundaries of the two-stage transition region are set up, and the critical exponents for each type of transition are derived analytically.
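The simulation side of such a study can be sketched as follows; this is an illustrative Monte Carlo for plain k-core percolation on an Erdos-Renyi graph (the interdependent-network coupling and the authors' analytic solution are omitted): nodes are occupied with probability p, then nodes with fewer than k occupied neighbours are pruned repeatedly and the surviving fraction is reported.

```python
# k-core percolation by iterative pruning on a random graph (illustrative only).
import random
from collections import defaultdict

def er_graph(n, mean_degree, rng):
    p_edge = mean_degree / (n - 1)
    adj = defaultdict(set)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                adj[i].add(j); adj[j].add(i)
    return adj

def k_core_fraction(adj, n, k, p_occupy, rng):
    alive = {v for v in range(n) if rng.random() < p_occupy}
    changed = True
    while changed:                        # prune until no node has fewer than k alive neighbours
        changed = False
        for v in list(alive):
            if sum(1 for u in adj[v] if u in alive) < k:
                alive.discard(v); changed = True
    return len(alive) / n

rng = random.Random(1)
g = er_graph(1000, mean_degree=10.0, rng=rng)
for p in (0.4, 0.6, 0.8):
    print(p, round(k_core_fraction(g, 1000, k=3, p_occupy=p, rng=rng), 3))
```

Scanning p on a finer grid with this kind of pruning loop is what reveals the abrupt (first-order-like) jumps in the surviving fraction that the analytic phase diagram describes.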
A Bayesian Model for Highly Accelerated Phase-Contrast MRI
Rich, Adam; Potter, Lee C.; Jin, Ning; Ash, Joshua; Simonetti, Orlando P.; Ahmad, Rizwan
2015-01-01
Purpose: Phase-contrast magnetic resonance imaging (PC-MRI) is a noninvasive tool to assess cardiovascular disease by quantifying blood flow; however, low data acquisition efficiency limits the spatial and temporal resolutions, real-time application, and extensions to 4D flow imaging in clinical settings. We propose a new data processing approach called Reconstructing Velocity Encoded MRI with Approximate message passing aLgorithms (ReVEAL) that accelerates the acquisition by exploiting data structure unique to PC-MRI. Theory and Methods: ReVEAL models physical correlations across space, time, and velocity encodings. The proposed Bayesian approach exploits the relationships in both magnitude and phase among velocity encodings. A fast iterative recovery algorithm is introduced based on message passing. For validation, prospectively undersampled data are processed from a pulsatile flow phantom and five healthy volunteers. Results: ReVEAL is in good agreement, quantified by peak velocity and stroke volume (SV), with reference data for acceleration rates R ≤ 10. For SV, Pearson r ≥ 0.996 for phantom imaging (n = 24) and r ≥ 0.956 for prospectively accelerated in vivo imaging (n = 10) for R ≤ 10. Conclusion: ReVEAL enables accurate quantification of blood flow from highly undersampled data. The technique is extensible to 4D flow imaging, where higher acceleration may be possible due to additional redundancy. PMID:26444911
Extension Handbook. Processes and Practices. Second Edition.
ERIC Educational Resources Information Center
Blackburn, Donald J., Ed.
This book contains the following papers about processes and practices in extension education in Canada: "Historical Roots" (Blackburn, Flaherty); "Transitions and Directions in Extension" (Blackburn, Flaherty); "Applying Learning Theory in Extension Work" (Griffith); "Understanding and Applying Motivation…
Lynn, Scott K.; Watkins, Casey M.; Wong, Megan A.; Balfany, Katherine; Feeney, Daniel F.
2018-01-01
The Athos® wearable system integrates surface electromyography (sEMG) electrodes into the construction of compression athletic apparel. The Athos system reduces the complexity and increases the portability of collecting EMG data and provides processed data to the end user. The objective of the study was to determine the reliability and validity of Athos as compared with a research grade sEMG system. Twelve healthy subjects performed 7 trials on separate days (1 baseline trial and 6 repeated trials). In each trial subjects wore the wearable sEMG system and had a research grade sEMG system's electrodes placed just distal on the same muscle, as close as possible to the wearable system's electrodes. The muscles tested were the vastus lateralis (VL), vastus medialis (VM), and biceps femoris (BF). All testing was done on an isokinetic dynamometer. Baseline testing involved performing isometric 1 repetition maximum tests for the knee extensors and flexors and three repetitions of concentric-concentric knee flexion and extension at MVC for each testing speed: 60, 180, and 300 deg/sec. Repeated trials 2-7 each comprised 9 sets, where each set included three repetitions of concentric-concentric knee flexion-extension. Each repeated trial (2-7) comprised one set at each speed and percent MVC (50%, 75%, 100%) combination. The wearable system and research grade sEMG data were processed using the same methods and aligned in time. The amplitude metrics calculated from the sEMG for each repetition were the peak amplitude, sum of the linear envelope, and 95th percentile. Validity results comprise two main findings. First, there is not a significant effect of system (Athos or research grade system) on the repetition amplitude metrics (95th percentile, peak, or sum). Second, the relationship between torque and sEMG is not significantly different between Athos and the research grade system. For reliability testing, the variation across trials, averaged across speeds, was 0.8%, 7.3%, and 0.2% higher for Athos for the BF, VL, and VM, respectively. Also, using the standard deviation of the MVC-normalized repetition amplitude, the research grade system showed 10.7% variability while Athos showed 12%. The wearable technology (Athos) provides sEMG measures that are consistent with controlled, research grade technologies and data collection procedures. Key points: surface EMG embedded into athletic garments (Athos) had similar validity and reliability when compared with a research grade system; there was no difference in the torque-EMG relationship between the two systems; there was no statistically significant difference in reliability across 6 trials between the two systems; the validity and reliability of Athos demonstrate the potential for sEMG to be applied in dynamic rehabilitation and sports settings. PMID:29769821
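The per-repetition amplitude metrics named above (peak, sum of the linear envelope, and 95th percentile) can be sketched as follows; the exact filtering used by Athos or the research grade pipeline is not given here, so a simple rectify-and-moving-average envelope on a synthetic signal is assumed.

```python
# Amplitude metrics from a toy sEMG burst (illustrative processing only).
import numpy as np

fs = 1000                                    # sampling rate [Hz], assumed
t = np.arange(0, 3.0, 1 / fs)
rng = np.random.default_rng(3)
raw = rng.normal(scale=0.1, size=t.size) * (1 + np.sin(2 * np.pi * t / 3.0))  # toy sEMG burst

def linear_envelope(signal, window_s=0.1):
    rectified = np.abs(signal)
    win = int(window_s * fs)
    kernel = np.ones(win) / win              # moving-average smoothing
    return np.convolve(rectified, kernel, mode="same")

env = linear_envelope(raw)
metrics = {
    "peak_amplitude": float(env.max()),
    "sum_linear_envelope": float(env.sum()),
    "95th_percentile": float(np.percentile(env, 95)),
}
print(metrics)
```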
McKinney, Bill; Meyer, Peter A; Crosas, Mercè; Sliz, Piotr
2017-01-01
Access to experimental X-ray diffraction image data is important for validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. In response to the evolving needs of the structural biology community, we recently established a diffraction data publication system, the Structural Biology Data Grid (SBDG, data.sbgrid.org), to preserve primary experimental datasets supporting scientific publications. All datasets published through the SBDG are freely available to the research community under a public domain dedication license, with metadata compliant with the DataCite Schema (schema.datacite.org). A proof-of-concept study demonstrated community interest and utility. Publication of large datasets is a challenge shared by several fields, and the SBDG has begun collaborating with the Institute for Quantitative Social Science at Harvard University to extend the Dataverse (dataverse.org) open-source data repository system to structural biology datasets. Several extensions are necessary to support the size and metadata requirements for structural biology datasets. In this paper, we describe one such extension-functionality supporting preservation of file system structure within Dataverse-which is essential for both in-place computation and supporting non-HTTP data transfers. © 2016 New York Academy of Sciences.
Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi
2017-07-21
Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step golden-gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon-based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for the production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.
Semi-automated ontology generation within OBO-Edit.
Wächter, Thomas; Schroeder, Michael
2010-06-15
Ontologies and taxonomies have proven highly beneficial for biocuration. The Open Biomedical Ontology (OBO) Foundry alone lists over 90 ontologies mainly built with OBO-Edit. Creating and maintaining such ontologies is a labour-intensive, difficult, manual process. Automating parts of it is of great importance for the further development of ontologies and for biocuration. We have developed the Dresden Ontology Generator for Directed Acyclic Graphs (DOG4DAG), a system which supports the creation and extension of OBO ontologies by semi-automatically generating terms, definitions and parent-child relations from text in PubMed, the web and PDF repositories. DOG4DAG is seamlessly integrated into OBO-Edit. It generates terms by identifying statistically significant noun phrases in text. For definitions and parent-child relations it employs pattern-based web searches. We systematically evaluate each generation step using manually validated benchmarks. The term generation leads to high-quality terms also found in manually created ontologies. Up to 78% of definitions are valid and up to 54% of child-ancestor relations can be retrieved. There is no other validated system that achieves comparable results. By combining the prediction of high-quality terms, definitions and parent-child relations with the ontology editor OBO-Edit we contribute a thoroughly validated tool for all OBO ontology engineers. DOG4DAG is available within OBO-Edit 2.1 at http://www.oboedit.org. Supplementary data are available at Bioinformatics online.
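A toy stand-in for the term-generation step (not the DOG4DAG implementation): candidate words are ranked by how over-represented they are in a small domain corpus relative to a background corpus, a crude proxy for the statistically significant noun phrases described above.

```python
# Rank candidate terms by log-ratio of smoothed relative frequencies
# (domain corpus vs background corpus); corpora here are tiny, made-up examples.
from collections import Counter
import math

domain = "cell membrane transport of ion channels regulates membrane potential".split()
background = ("the report describes general results of transport policy and "
              "channels of communication in the potential market").split()

def rel_freq(counter, total, word):
    return (counter[word] + 1) / (total + len(counter))   # add-one smoothing

dc, bc = Counter(domain), Counter(background)
scores = {
    w: math.log(rel_freq(dc, len(domain), w) / rel_freq(bc, len(background), w))
    for w in dc
}
for term, score in sorted(scores.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{term:12s} {score:+.2f}")
```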
M.A. Lefsky; D.P. Turner; M. Guzy; W.B. Cohen
2005-01-01
Extensive estimates of forest productivity are required to understand the relationships between shifting land use, changing climate and carbon storage and fluxes. Aboveground net primary production of wood (NPPAw) is a major component of total NPP and of net ecosystem production (NEP). Remote sensing of NPP and NPPAw is...
USDA-ARS?s Scientific Manuscript database
In this study, optimization, extension, and validation of a streamlined, qualitative and quantitative multiclass, multiresidue method was conducted to monitor greater than 100 veterinary drug residues in meat using ultrahigh-performance liquid chromatography – tandem mass spectrometry (UHPLC-MS/MS). I...
Status and plans for the ANOPP/HSR prediction system
NASA Technical Reports Server (NTRS)
Nolan, Sandra K.
1992-01-01
ANOPP is a comprehensive prediction system which was developed and validated by NASA. Because ANOPP is a system prediction program, it allows aerospace industry researchers to create trade-off studies with a variety of aircraft noise problems. The extensive validation of ANOPP allows the program results to be used as a benchmark for testing other prediction codes.
Measuring the Effect of Tourism Services on Travelers' Quality of Life: Further Validation.
ERIC Educational Resources Information Center
Neal, Janet D.; Sirgy, M. Joseph; Uysal, Muzaffer
2004-01-01
This replication and extension study provided additional validational support of the original tourism services satisfaction measure in relation to QOL-related measures. Neal, Sirgy, and Uysal (1999) developed a model and a measure to capture the effect of tourism services on travelers' quality of life (QOL). They hypothesized that travelers' overall life…
ERIC Educational Resources Information Center
Moore, Brooke A.; Klingner, Janette K.
2014-01-01
This article synthesizes reading intervention research studies intended for use with struggling or at-risk students to determine which studies adequately address population validity, particularly in regard to the diverse reading needs of English language learners. An extensive search of the professional literature between 2001 and 2010 yielded a…
ERIC Educational Resources Information Center
Mancilla-Martinez, Jeannette; Gámez, Perla B.; Vagh, Shaher Banu; Lesaux, Nonie K.
2016-01-01
Purpose: This 2-phase study aims to extend research on parent report measures of children's productive vocabulary by investigating the development (n = 38) of the Spanish Vocabulary Extension and validity (n = 194) of the 100-item Spanish and English MacArthur-Bates Communicative Development Inventories Toddler Short Forms and Upward Extension…
DOT National Transportation Integrated Search
2017-02-08
The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...
Optimization of an asymmetric thin-walled tube in rotary draw bending process
NASA Astrophysics Data System (ADS)
Xue, Xin; Liao, Juan; Vincze, Gabriela; Gracio, Jose J.
2013-12-01
Rotary draw bending is one of the advanced thin-walled tube forming processes, offering high efficiency, low consumption, and good flexibility in several industries such as automotive, aerospace, and shipping. However, it may cause undesirable deformations such as over-thinning and ovalization, which weaken the strength and create difficulties in the assembly process, respectively. Accurate modeling and effective optimization design to eliminate or reduce undesirable deformations in the tube bending process have been a challenging topic. In this paper, in order to study the deformation behaviors of an asymmetric thin-walled tube in the rotary draw bending process, a 3D elastic-plastic finite element model has been built in the ABAQUS environment, and the reliability of the model is validated by comparison with experiment. Then, the deformation mechanism of the thin-walled tube in the bending process was briefly analyzed, and the effects of wall thickness ratio, section height-width ratio, and mandrel extension on wall thinning and ovalization in the bending process were investigated using Response Surface Methodology. Finally, a multi-objective optimization method was used to obtain an optimum solution for the design variables based on the simulation results.
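The Response Surface Methodology step can be sketched as follows, with an invented design table standing in for the finite element results: quadratic response surfaces are fitted for wall thinning and ovalization, and a weighted-sum compromise between the two objectives is minimized within the factor bounds.

```python
# Response-surface fit and weighted-sum multi-objective optimization (illustrative data).
import numpy as np
from scipy.optimize import minimize

# Factors: wall thickness ratio, section height/width ratio, mandrel extension (normalized).
X = np.array([
    [0.02, 1.0, 0.2], [0.02, 1.5, 0.8], [0.04, 1.0, 0.8],
    [0.04, 1.5, 0.2], [0.03, 1.25, 0.5], [0.02, 1.25, 0.5],
    [0.04, 1.25, 0.5], [0.03, 1.0, 0.5], [0.03, 1.5, 0.5],
])
thinning = np.array([8.1, 6.5, 5.2, 6.9, 6.0, 7.2, 5.8, 6.3, 6.1])   # % (made up)
ovality  = np.array([3.0, 4.6, 3.9, 2.7, 3.2, 3.8, 3.1, 3.4, 3.5])   # % (made up)

def quad_features(x):
    x1, x2, x3 = x
    # pure-quadratic surface (no interactions) kept small for this tiny design table
    return np.array([1, x1, x2, x3, x1**2, x2**2, x3**2])

A = np.array([quad_features(x) for x in X])
beta_thin, *_ = np.linalg.lstsq(A, thinning, rcond=None)
beta_oval, *_ = np.linalg.lstsq(A, ovality, rcond=None)

def objective(x, w=0.5):
    f = quad_features(x)
    return w * (f @ beta_thin) + (1 - w) * (f @ beta_oval)   # weighted-sum compromise

res = minimize(objective, x0=[0.03, 1.25, 0.5],
               bounds=[(0.02, 0.04), (1.0, 1.5), (0.2, 0.8)])
print("optimum (thickness ratio, height/width, mandrel ext.):", np.round(res.x, 3))
```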
Maurage, Pierre; Campanella, Salvatore
2013-01-01
Crossmodal processing (i.e., the construction of a unified representation stemming from distinct sensorial modalities inputs) constitutes a crucial ability in humans' everyday life. It has been extensively explored at cognitive and cerebral levels during the last decade among healthy controls. Paradoxically however, and while difficulties to perform this integrative process have been suggested in a large range of psychopathological states (e.g., schizophrenia and autism), these crossmodal paradigms have been very rarely used in the exploration of psychiatric populations. The main aim of the present paper is thus to underline the experimental and clinical usefulness of exploring crossmodal processes in psychiatry. We will illustrate this proposal by means of the recent data obtained in the crossmodal exploration of emotional alterations in alcohol-dependence. Indeed, emotional decoding impairments might have a role in the development and maintenance of alcohol-dependence, and have been extensively investigated by means of experiments using separated visual or auditory stimulations. Besides these unimodal explorations, we have recently conducted several studies using audio-visual crossmodal paradigms, which has allowed us to improve the ecological validity of the unimodal experimental designs and to offer new insights on the emotional alterations among alcohol-dependent individuals. We will show how these preliminary results can be extended to develop a coherent and ambitious research program using crossmodal designs in various psychiatric populations and sensory modalities. We will finally end the paper by underlining the various potential clinical applications and the fundamental implications that can be raised by this emerging project. PMID:23898250
The MiPACQ Clinical Question Answering System
Cairns, Brian L.; Nielsen, Rodney D.; Masanz, James J.; Martin, James H.; Palmer, Martha S.; Ward, Wayne H.; Savova, Guergana K.
2011-01-01
The Multi-source Integrated Platform for Answering Clinical Questions (MiPACQ) is a QA pipeline that integrates a variety of information retrieval and natural language processing systems into an extensible question answering system. We present the system’s architecture and an evaluation of MiPACQ on a human-annotated evaluation dataset based on the Medpedia health and medical encyclopedia. Compared with our baseline information retrieval system, the MiPACQ rule-based system demonstrates 84% improvement in Precision at One and the MiPACQ machine-learning-based system demonstrates 134% improvement. Other performance metrics including mean reciprocal rank and area under the precision/recall curves also showed significant improvement, validating the effectiveness of the MiPACQ design and implementation. PMID:22195068
Higher Throughput Calorimetry: Opportunities, Approaches and Challenges
Recht, Michael I.; Coyle, Joseph E.; Bruce, Richard H.
2010-01-01
Higher throughput thermodynamic measurements can provide value in structure-based drug discovery during fragment screening, hit validation, and lead optimization. Enthalpy can be used to detect and characterize ligand binding, and changes that affect the interaction of protein and ligand can sometimes be detected more readily from changes in the enthalpy of binding than from the corresponding free-energy changes or from protein-ligand structures. Newer, higher throughput calorimeters are being incorporated into the drug discovery process. Improvements in titration calorimeters come from extensions of a mature technology and face limitations in scaling. Conversely, array calorimetry, an emerging technology, shows promise for substantial improvements in throughput and material utilization, but improved sensitivity is needed. PMID:20888754
Image encryption using a synchronous permutation-diffusion technique
NASA Astrophysics Data System (ADS)
Enayatifar, Rasul; Abdullah, Abdul Hanan; Isnin, Ismail Fauzi; Altameem, Ayman; Lee, Malrey
2017-03-01
In the past decade, interest in digital image security has increased among scientists. A synchronous permutation and diffusion technique is designed in order to protect gray-level image content while sending it over the internet. To implement the proposed method, the two-dimensional plain image is converted to one dimension. Afterward, in order to reduce the sending process time, the permutation and diffusion steps for each pixel are performed at the same time. The permutation step uses a chaotic map and deoxyribonucleic acid (DNA) to permute a pixel, while diffusion employs a DNA sequence and a DNA operator to encrypt the pixel. Experimental results and extensive security analyses have been conducted to demonstrate the feasibility and validity of this proposed image encryption method.
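A simplified sketch of a synchronous permutation-diffusion pass is given below; the DNA encoding rules of the actual scheme are omitted, and a logistic chaotic map is assumed to drive both the permutation order and an XOR keystream so that each pixel is permuted and diffused in the same loop iteration.

```python
# Synchronous permutation-diffusion with a logistic map (simplified, no DNA coding).
import numpy as np

def logistic_stream(x0, r, n):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return np.array(out)

def encrypt(image, key=(0.3456, 3.99)):
    flat = image.flatten().astype(np.uint8)          # 2-D plain image -> 1-D
    chaos = logistic_stream(*key, flat.size)
    perm = np.argsort(chaos)                         # permutation order from the chaotic map
    keystream = (chaos * 255).astype(np.uint8)       # diffusion keystream
    cipher = np.empty_like(flat)
    prev = np.uint8(0)
    for i, src in enumerate(perm):                   # permute and diffuse in one pass
        cipher[i] = flat[src] ^ keystream[i] ^ prev  # XOR diffusion chained to the previous pixel
        prev = cipher[i]
    return cipher.reshape(image.shape), perm

img = np.arange(16, dtype=np.uint8).reshape(4, 4) * 16
enc, perm = encrypt(img)
print(enc)
```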
Drug-loaded erythrocytes: on the road toward marketing approval
Bourgeaux, Vanessa; Lanao, José M; Bax, Bridget E; Godfrin, Yann
2016-01-01
Erythrocyte drug encapsulation is one of the most promising therapeutic alternative approaches for the administration of toxic or rapidly cleared drugs. Drug-loaded erythrocytes can operate through one of the three main mechanisms of action: extension of circulation half-life (bioreactor), slow drug release, or specific organ targeting. Although the clinical development of erythrocyte carriers is confronted with regulatory and development process challenges, industrial development is expanding. The manufacture of this type of product can be either centralized or bedside based, and different procedures are employed for the encapsulation of therapeutic agents. The major challenges for successful industrialization include production scalability, process validation, and quality control of the released therapeutic agents. Advantages and drawbacks of the different manufacturing processes as well as success key points of clinical development are discussed. Several entrapment technologies based on osmotic methods have been industrialized. Companies have already achieved many of the critical clinical stages, thus providing the opportunity in the future to cover a wide range of diseases for which effective therapies are not currently available. PMID:26929599
"Lacking warmth": Alexithymia trait is related to warm-specific thermal somatosensory processing.
Borhani, Khatereh; Làdavas, Elisabetta; Fotopoulou, Aikaterini; Haggard, Patrick
2017-09-01
Alexithymia is a personality trait involving deficits in emotional processing. The personality construct has been extensively validated, but the underlying neural and physiological systems remain controversial. One theory suggests that low-level somatosensory mechanisms act as somatic markers of emotion, underpinning cognitive and affective impairments in alexithymia. In two separate samples (total N=100), we used an established Quantitative Sensory Testing (QST) battery to probe multiple neurophysiological submodalities of somatosensation, and investigated their associations with the widely-used Toronto Alexithymia Scale (TAS-20). Experiment one found reduced sensitivity to warmth in people with higher alexithymia scores, compared to individuals with lower scores, without deficits in other somatosensory submodalities. Experiment two replicated this result in a new group of participants using a full-sample correlation between threshold for warm detection and TAS-20 scores. We discuss the relations between low-level thermoceptive function and cognitive processing of emotion. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Strong monogamy inequalities for four qubits
NASA Astrophysics Data System (ADS)
Regula, Bartosz; Osterloh, Andreas; Adesso, Gerardo
2016-05-01
We investigate possible generalizations of the Coffman-Kundu-Wootters monogamy inequality to four qubits, accounting for multipartite entanglement in addition to the bipartite terms. We show that the most natural extension of the inequality does not hold in general, and we describe the violations of this inequality in detail. We investigate alternative ways to extend the monogamy inequality to express a constraint on entanglement sharing valid for all four-qubit states, and perform an extensive numerical analysis of randomly generated four-qubit states to explore the properties of such extensions.
Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2000-01-01
A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
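The comparison can be illustrated on a single-degree-of-freedom Duffing oscillator rather than the clamped-clamped beam (all parameter values assumed): equivalent linearization iterates a fixed point for the linearized stiffness, while a direct Euler-Maruyama integration of the nonlinear equation under white-noise forcing provides the numerical-simulation reference.

```python
# Equivalent linearization vs. direct simulation for a Duffing oscillator (illustrative).
import numpy as np

m, c, k, eps = 1.0, 0.2, 1.0, 0.5     # mass, damping, linear stiffness, cubic nonlinearity (assumed)
S0 = 0.01                              # two-sided PSD of the white-noise force (assumed)

# Equivalent linearization: replace k*(x + eps*x^3) by k_eq*x with
# k_eq = k*(1 + 3*eps*sigma^2), and iterate sigma^2 = pi*S0/(k_eq*c) to a fixed point.
sigma2 = np.pi * S0 / (k * c)
for _ in range(200):
    k_eq = k * (1 + 3 * eps * sigma2)
    sigma2 = np.pi * S0 / (k_eq * c)
print(f"equivalent linearization RMS displacement: {np.sqrt(sigma2):.3f}")

# Direct numerical simulation of the nonlinear equation (Euler-Maruyama).
rng = np.random.default_rng(0)
dt, n_steps = 2e-3, 500_000
force = rng.normal(scale=np.sqrt(2 * np.pi * S0 / dt), size=n_steps)   # discretized white noise
x = v = 0.0
xs = []
for i in range(n_steps):
    a = (force[i] - c * v - k * (x + eps * x**3)) / m
    v += a * dt
    x += v * dt
    if i > n_steps // 10:              # discard the initial transient
        xs.append(x)
print(f"numerical simulation RMS displacement:    {np.std(xs):.3f}")
```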
Pourahmadi, Mohammad Reza; Taghipour, Morteza; Jannati, Elham; Mohseni-Bandpei, Mohammad Ali; Ebrahimi Takamjani, Ismail; Rajabzadeh, Fatemeh
2016-01-01
Measurement of lumbar spine range of motion (ROM) is often considered to be an essential component of lumbar spine physiotherapy and orthopedic assessment. The measurement can be carried out with various instruments such as inclinometers, goniometers, and so on. Recent smartphones have been equipped with accelerometers and magnetometers, which, through specific software applications (apps), can be used for inclinometric functions. The main purpose was to investigate the reliability and validity of an iPhone® app (TiltMeter© advanced level and inclinometer) for measuring standing lumbar spine flexion-extension ROM in asymptomatic subjects. A cross-sectional study was carried out. This study was conducted in a physiotherapy clinic located at the School of Rehabilitation Sciences, Iran University of Medical Science and Health Services, Tehran, Iran. A convenience sample of 30 asymptomatic adults (15 males; 15 females; age range = 18-55 years) was recruited between August 2015 and December 2015. Following a 2-minute warm-up, the subjects were asked to stand in a relaxed position and their skin was marked at the T12-L1 and S1-S2 spinal levels. From this position, they were asked to perform maximum lumbar flexion followed by maximum lumbar extension with their knees straight. Two blinded raters each used an inclinometer and the iPhone® app to measure lumbar spine flexion-extension ROM. A third rater read the measured angles. To calculate total lumbar spine flexion-extension ROM, the measurement from S1-S2 was subtracted from T12-L1. The second (2 hours later) and third (48 hours later) sessions were carried out in the same manner as the first session. All of the measurements were conducted 3 times and the mean value of 3 repetitions for each measurement was used for analysis. Intraclass correlation coefficient (ICC) models (3, k) and (2, k) were used to determine the intra-rater and inter-rater reliability, respectively. The Pearson correlation coefficients were used to establish the concurrent validity of the iPhone® app. Furthermore, the minimum detectable change at the 95% confidence level (MDC95) was computed as 1.96 × standard error of measurement × √2. Good to excellent intra-rater and inter-rater reliability were demonstrated for both the gravity-based inclinometer, with ICC values of ≥0.84 and ≥0.77, and the iPhone® app, with ICC values of ≥0.85 and ≥0.85, respectively. The MDC95 ranged from 5.82° to 8.18° for the intra-rater analysis and from 7.38° to 8.66° for the inter-rater analysis. The concurrent validity for flexion and extension between the 2 instruments was 0.85 and 0.91, respectively. The iPhone® app possesses good to excellent intra-rater and inter-rater reliability and concurrent validity. It seems that the iPhone® app can be used for the measurement of lumbar spine flexion-extension ROM. Level of evidence: IIb.
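The reliability indices quoted above follow standard formulas; the sketch below shows the arithmetic with illustrative numbers (not the study's raw data), where SEM = SD × √(1 − ICC) and MDC95 = 1.96 × SEM × √2.

```python
# Standard error of measurement and minimal detectable change (illustrative values).
import math

sd_between_subjects = 8.0   # degrees, assumed spread of lumbar ROM scores
icc = 0.90                  # assumed test-retest reliability

sem = sd_between_subjects * math.sqrt(1 - icc)   # standard error of measurement
mdc95 = 1.96 * sem * math.sqrt(2)                # minimal detectable change at 95% confidence
print(f"SEM = {sem:.2f} deg, MDC95 = {mdc95:.2f} deg")
```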
Piezoresistive Cantilever Performance—Part II: Optimization
Park, Sung-Jin; Doll, Joseph C.; Rastegar, Ali J.; Pruitt, Beth L.
2010-01-01
Piezoresistive silicon cantilevers fabricated by ion implantation are frequently used for force, displacement, and chemical sensors due to their low cost and electronic readout. However, the design of piezoresistive cantilevers is not a straightforward problem due to coupling between the design parameters, constraints, process conditions, and performance. We systematically analyzed the effect of design and process parameters on force resolution and then developed an optimization approach to improve force resolution while satisfying various design constraints using simulation results. The combined simulation and optimization approach is extensible to other doping methods beyond ion implantation in principle. The optimization results were validated by fabricating cantilevers with the optimized conditions and characterizing their performance. The measurement results demonstrate that the analytical model accurately predicts force and displacement resolution, and sensitivity and noise tradeoff in optimal cantilever performance. We also performed a comparison between our optimization technique and existing models and demonstrated eight times improvement in force resolution over simplified models. PMID:20333323
Some Findings Concerning Requirements in Agile Methodologies
NASA Astrophysics Data System (ADS)
Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan
Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions from conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.
Adaptive control and noise suppression by a variable-gain gradient algorithm
NASA Technical Reports Server (NTRS)
Merhav, S. J.; Mehta, R. S.
1987-01-01
An adaptive control system based on normalized LMS filters is investigated. The finite impulse response of the nonparametric controller is adaptively estimated using a given reference model. Specifically, the following issues are addressed: The stability of the closed loop system is analyzed and heuristically established. Next, the adaptation process is studied for piecewise constant plant parameters. It is shown that by introducing a variable gain in the gradient algorithm, a substantial reduction in the LMS adaptation rate can be achieved. Finally, process noise at the plant output generally causes a biased estimate of the controller. By introducing a noise suppression scheme, this bias can be substantially reduced and the response of the adapted system becomes very close to that of the reference model. Extensive computer simulations validate these assertions and demonstrate that the system can rapidly adapt to random jumps in plant parameters.
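A minimal sketch of a normalized LMS update with a variable gain is shown below; it identifies an assumed FIR plant rather than reproducing the authors' full closed-loop controller, and the gain schedule (step size shrinking with smoothed error power) is one simple choice among many.

```python
# Variable-gain normalized LMS identification of an assumed FIR plant.
import numpy as np

rng = np.random.default_rng(0)
plant = np.array([0.6, -0.3, 0.1])         # unknown FIR plant (assumed)
n_taps, n_samples = 3, 4000
w = np.zeros(n_taps)                        # adaptive FIR estimate
u = rng.normal(size=n_samples)              # excitation signal
err_power, mu_max, eps = 1.0, 0.5, 1e-6

for n in range(n_taps, n_samples):
    x = u[n - n_taps + 1:n + 1][::-1]       # regressor (most recent sample first)
    d = plant @ x                           # plant output (noise-free here)
    e = d - w @ x                           # a-priori error
    err_power = 0.99 * err_power + 0.01 * e**2      # smoothed error power
    mu = mu_max * err_power / (err_power + 0.1)     # variable gain: shrinks as the error decays
    w += mu * e * x / (x @ x + eps)         # normalized LMS update

print("estimated plant:", np.round(w, 3))   # should approach [0.6, -0.3, 0.1]
```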
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data for both steady-in-time and unsteady-in-time metrics. Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for future, more widely acceptable investigations into the complexities of coupled atmospheric and wildland fire behavior.
The ESA FRM4DOAS project: Towards a quality-controlled MAXDOAS Centralized Processing System
NASA Astrophysics Data System (ADS)
Hendrick, Francois; Fayt, Caroline; Friess, Udo; Kreher, Karin; Piters, Ankie; Richter, Andreas; Wagner, Thomas; Cede, Alexander; Spinei, Elena; von Bismarck, Jonas; Fehr, Thorsten; Van Roozendael, Michel
2017-04-01
The Fiducial Reference Measurements for Ground-Based DOAS Air-Quality Observations (FRM4DOAS) is a two-year project funded by the European Space Agency (ESA). Started in July 2016, FRM4DOAS aims at further harmonizing MAXDOAS measurements and data sets, through (1) the specification of best practices for instrument operation, (2) the selection of state-of-the art retrieval algorithms, procedures, and settings, (3) the demonstration of a centralised rapid-delivery (6-24h latency) processing system for MAXDOAS instruments to be operated within the international Network for the Detection of Atmospheric Composition Change (NDACC). The project also links with the Pandonia initiative. In a first phase, the system concentrates on the development of 3 key products: NO2 vertical profiles, total O3 and tropospheric HCHO profiles, which will be retrieved at 11 MAXDOAS pilot stations. The system will also be tested and validated on data from the CINDI-2 campaign, and designed to allow further extension after commissioning. These activities will help and guarantee that homogenous, fully traceable, and quality-controlled datasets are generated from reference ground-based UV-vis instruments, which will play a crucial role in the validation of future ESA/Copernicus Sentinel satellite missions S-5P, S-4, and S-5.
SMOS L1C and L2 Validation in Australia
NASA Technical Reports Server (NTRS)
Rudiger, Christoph; Walker, Jeffrey P.; Kerr, Yann H.; Mialon, Arnaud; Merlin, Olivier; Kim, Edward J.
2012-01-01
Extensive airborne field campaigns (Australian Airborne Cal/val Experiments for SMOS - AACES) were undertaken during the 2010 summer and winter seasons of the southern hemisphere. The purpose of those campaigns was the validation of the Level 1c (brightness temperature) and Level 2 (soil moisture) products of the ESA-led Soil Moisture and Ocean Salinity (SMOS) mission. As SMOS is the first satellite to globally map L-band (1.4GHz) emissions from the Earth's surface, and the first 2-dimensional interferometric microwave radiometer used for Earth observation, large scale and long-term validation campaigns have been conducted world-wide, of which AACES is the most extensive. AACES combined large scale medium-resolution airborne L-band and spectral observations, along with high-resolution in-situ measurements of soil moisture across a 50,000km2 area of the Murrumbidgee River catchment, located in south-eastern Australia. This paper presents a qualitative assessment of the SMOS brightness temperature and soil moisture products.
ERIC Educational Resources Information Center
Benner, Gregory J.; Beaudoin, Kathleen; Mooney, Paul; Uhing, Brad M.; Pierce, Corey D.
2008-01-01
In the present study, we sought to extend instrument validation research for a strength-based emotional and behavior rating scale, the "Teacher Rating Scale of the Behavior and Emotional Rating Scale-Second Edition" (BERS-2; Epstein, M. H. (2004). "Behavioral and emotional rating scale" (2nd ed.). Austin, TX: PRO-ED) through…
ERIC Educational Resources Information Center
Zullig, Keith J.; Collins, Rani; Ghani, Nadia; Patton, Jon M.; Huebner, E. Scott; Ajamie, Jean
2014-01-01
Background: The School Climate Measure (SCM) was developed and validated in 2010 in response to a dearth of psychometrically sound school climate instruments. This study sought to further validate the SCM on a large, diverse sample of Arizona public school adolescents (N = 20,953). Methods: Four SCM domains (positive student-teacher relationships,…
ERIC Educational Resources Information Center
Aquino, Cesar A.
2014-01-01
This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…
USDA-ARS?s Scientific Manuscript database
Physical activity (PA) correlates have not been extensively studied in Hong Kong children. The aim of this study is to assess the validity and reliability of translated scales to measure PA related self-efficacy, enjoyment and social support in Hong Kong Chinese children. Sample 1 (n=273, aged 8–12 ...
ERIC Educational Resources Information Center
Mak, Jennifer Y.; Cheung, Siu-Yin; King, Carina C.; Lam, Eddie T. C.
2016-01-01
There have been extensive studies of local residents' perception and reaction to the impacts of mega events. However, there is limited empirical research on the social impacts that shape foreign attitudes toward the host country. The purpose of this study was to develop and validate the Olympic Games Attitude Scale (OGAS) to examine viewers'…
Quality of College Life (QCL) of Students: Further Validation of a Measure of Well-Being
ERIC Educational Resources Information Center
Sirgy, M. Joseph; Lee, Dong-Jin; Grzeskowiak, Stephan; Yu, Grace B.; Webb, Dave; El-Hasan, Karma; Vega, Jose Jesus Garcia; Ekici, Ahmet; Johar, J. S.; Krishen, Anjala; Kangal, Ayca; Swoboda, Bernhard; Claiborne, C. B.; Maggino, Filomena; Rahtz, Don; Canton, Alicia; Kuruuzum, Ayse
2010-01-01
This paper reports a study designed to further validate a measure of quality of college life (QCL) of university students (Sirgy, Grzeskowiak, Rahtz, "Soc Indic Res" 80(2), 343-360, 2007). Two studies were conducted: a replication study and an extension study. The replication study involved surveys of 10 different college campuses in different…
The PKRC's Value as a Professional Development Model Validated
ERIC Educational Resources Information Center
Larson, Dale
2013-01-01
After a brief review of the 4-H professional development standards, a new model for determining the value of continuing professional development is introduced and applied to the 4-H standards. The validity of the 4-H standards is affirmed. 4-H Extension professionals are encouraged to celebrate the strength of their standards and to engage the…
Strong monogamy conjecture for multiqubit entanglement: the four-qubit case.
Regula, Bartosz; Di Martino, Sara; Lee, Soojoon; Adesso, Gerardo
2014-09-12
We investigate the distribution of bipartite and multipartite entanglement in multiqubit states. In particular, we define a set of monogamy inequalities sharpening the conventional Coffman-Kundu-Wootters constraints, and we provide analytical proofs of their validity for relevant classes of states. We present extensive numerical evidence validating the conjectured strong monogamy inequalities for arbitrary pure states of four qubits.
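For orientation, the conventional Coffman-Kundu-Wootters (CKW) constraint that the strengthened inequalities build on can be stated in terms of the tangle (squared concurrence); the sharpened four-qubit inequalities themselves are given in the paper and are not reproduced here:

```latex
\tau_{A_1 | A_2 A_3 \cdots A_n} \;\ge\; \sum_{j=2}^{n} \tau_{A_1 A_j}
```

where \tau_{A_1 | A_2 \cdots A_n} is the tangle between qubit A_1 and the remaining qubits, and \tau_{A_1 A_j} are the pairwise tangles.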
Energy-Efficient Bioalcohol Recovery by Gel Stripping
NASA Astrophysics Data System (ADS)
Godbole, Rutvik; Ma, Lan; Hedden, Ronald
2014-03-01
Design of energy-efficient processes for recovering butanol and ethanol from dilute fermentations is a key challenge facing the biofuels industry due to the high energy consumption of traditional multi-stage distillation processes. Gel stripping is an alternative purification process by which a dilute alcohol is stripped from the fermentation product by passing it through a packed bed containing particles of a selectively absorbent polymeric gel material. The gel must be selective for the alcohol, while swelling to a reasonable degree in dilute alcohol-water mixtures. To accelerate materials optimization, a combinatorial approach is taken to screen a matrix of copolymer gels having orthogonal gradients in crosslinker concentration and hydrophilicity. Using a combination of swelling measurements in pure solvents, the selectivity and distribution coefficients of alcohols in the gels can be predicted based upon multi-component extensions of Flory-Rehner theory. Predictions can be validated by measuring swelling in water/alcohol mixtures and conducting HPLC analysis of the external liquid. Removal of more than 95% of butanol from dilute aqueous solutions has been demonstrated, and a mathematical model of the unsteady-state gel stripping process has been developed. NSF CMMI Award 1335082.
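For reference, the single-solvent Flory-Rehner swelling equilibrium, of which the multi-component description used above is an extension, balances mixing and elastic contributions to the solvent chemical potential (generic notation, not taken from the paper):

```latex
\ln(1-\phi_p) + \phi_p + \chi\,\phi_p^{2} + V_1\,\nu_e\!\left(\phi_p^{1/3} - \tfrac{\phi_p}{2}\right) = 0
```

where \phi_p is the polymer volume fraction at swelling equilibrium, \chi the polymer-solvent interaction parameter, V_1 the solvent molar volume, and \nu_e the effective crosslink density of the network.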
Perceptual load interacts with stimulus processing across sensory modalities.
Klemen, J; Büchel, C; Rose, M
2009-06-01
According to perceptual load theory, processing of task-irrelevant stimuli is limited by the perceptual load of a parallel attended task if both the task and the irrelevant stimuli are presented to the same sensory modality. However, it remains a matter of debate whether the same principles apply to cross-sensory perceptual load and, more generally, what form cross-sensory attentional modulation in early perceptual areas takes in humans. Here we addressed these questions using functional magnetic resonance imaging. Participants undertook an auditory one-back working memory task of low or high perceptual load, while concurrently viewing task-irrelevant images at one of three object visibility levels. The processing of the visual and auditory stimuli was measured in the lateral occipital cortex (LOC) and auditory cortex (AC), respectively. Cross-sensory interference with sensory processing was observed in both the LOC and AC, in accordance with previous results of unisensory perceptual load studies. The present neuroimaging results therefore warrant the extension of perceptual load theory from a unisensory to a cross-sensory context: a validation of this cross-sensory interference effect through behavioural measures would consolidate the findings.
NASA Technical Reports Server (NTRS)
Starr, David
2000-01-01
The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS), and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.
RRegrs: an R package for computer-aided model selection with multiple regression models.
Tsiliki, Georgia; Munteanu, Cristian R; Seoane, Jose A; Fernandez-Lozano, Carlos; Sarimveis, Haralambos; Willighagen, Egon L
2015-01-01
Predictive regression models can be created with many different modelling approaches. Choices need to be made for data set splitting, cross-validation methods, specific regression parameters and best model criteria, as they all affect the accuracy and efficiency of the produced predictive models, thereby raising model reproducibility and comparison issues. Cheminformatics and bioinformatics make extensive use of predictive modelling and exhibit a need for standardization of these methodologies in order to assist model selection and speed up the process of predictive model development. A tool accessible to all users, irrespective of their statistical knowledge, would be valuable if it tests several simple and complex regression models and validation schemes, produces unified reports, and offers the option to be integrated into more extensive studies. Additionally, such a methodology should be implemented as a free programming package, in order to be continuously adapted and redistributed by others. We propose an integrated framework for creating multiple regression models, called RRegrs. The tool offers the option of ten simple and complex regression methods combined with repeated 10-fold and leave-one-out cross-validation. Methods include Multiple Linear regression, Generalized Linear Model with Stepwise Feature Selection, Partial Least Squares regression, Lasso regression, and Support Vector Machines Recursive Feature Elimination. The new framework is an automated, fully validated procedure which produces standardized reports to quickly oversee the impact of choices in modelling algorithms and assess the model and cross-validation results. The methodology was implemented as an open source R package, available at https://www.github.com/enanomapper/RRegrs, by reusing and extending the caret package. The universality of the new methodology is demonstrated using five standard data sets from different scientific fields. Its efficiency in cheminformatics and QSAR modelling is shown with three use cases: proteomics data for surface-modified gold nanoparticles, nano-metal oxides descriptor data, and molecular descriptors for acute aquatic toxicity data. The results show that for all data sets RRegrs reports models with equal or better performance for both training and test sets than those reported in the original publications. Its good performance as well as its adaptability in terms of parameter optimization could make RRegrs a popular framework to assist the initial exploration of predictive models, and with that, the design of more comprehensive in silico screening applications. Graphical abstract: RRegrs is a computer-aided model selection framework for R multiple regression models; this is a fully validated procedure with application to QSAR modelling.
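As a rough illustration of the workflow the abstract describes (several regression methods compared under repeated 10-fold cross-validation with a unified summary), the following Python sketch uses scikit-learn on a stand-in data set; it is an analogy only, not the RRegrs R package itself.

```python
# Compare several regression models with repeated 10-fold cross-validation and
# print a unified summary, mirroring the model-selection workflow described above.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression, LassoCV
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import RepeatedKFold, cross_val_score

X, y = load_diabetes(return_X_y=True)          # stand-in data set
cv = RepeatedKFold(n_splits=10, n_repeats=5, random_state=0)

models = {
    "Multiple linear regression": LinearRegression(),
    "Lasso": LassoCV(cv=5),
    "Partial least squares": PLSRegression(n_components=5),
    "Support vector regression": SVR(kernel="rbf", C=10.0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
    print(f"{name:30s} mean R2 = {scores.mean():.3f} +/- {scores.std():.3f}")
```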
Development and validation of spray models for investigating diesel engine combustion and emissions
NASA Astrophysics Data System (ADS)
Som, Sibendu
Diesel engines intrinsically generate NOx and particulate matter which need to be reduced significantly in order to comply with the increasingly stringent regulations worldwide. This motivates diesel engine manufacturers to gain fundamental understanding of the spray and combustion processes so as to optimize these processes and reduce engine emissions. Strategies being investigated to reduce the engine's raw emissions include advancements in fuel injection systems, efficient nozzle orifice design, injection and combustion control strategies, exhaust gas recirculation, and use of alternative fuels such as biodiesel. This thesis explores several of these approaches (such as nozzle orifice design, injection control strategy, and biodiesel use) by performing computer modeling of diesel engine processes. Fuel atomization characteristics are known to have a significant effect on the combustion and emission processes in diesel engines. Primary fuel atomization is induced by aerodynamics in the near-nozzle region as well as cavitation and turbulence from the injector nozzle. The breakup models that are currently used in diesel engine simulations generally consider aerodynamically induced breakup using the Kelvin-Helmholtz (KH) instability model, but do not account for inner nozzle flow effects. An improved primary breakup (KH-ACT) model incorporating cavitation and turbulence effects along with aerodynamically induced breakup is developed and incorporated in the computational fluid dynamics code CONVERGE. The spray simulations using the KH-ACT model are "quasi-dynamically" coupled with inner nozzle flow computations (using FLUENT). This presents a novel tool to capture the influence of inner nozzle flow effects such as cavitation and turbulence on spray, combustion, and emission processes. Extensive validation is performed against the non-evaporating spray data from Argonne National Laboratory. Performance of the KH and KH-ACT models is compared against the evaporating and combusting data from Sandia National Laboratory. The KH-ACT model is observed to provide better predictions for spray dispersion, axial velocity decay, Sauter mean diameter, and the interplay between liquid length and lift-off length, which is attributed to the enhanced primary breakup predicted by this model. In addition, experimentally observed trends with changing nozzle conicity could only be captured by the KH-ACT model. Results further indicate that combustion under diesel engine conditions is characterized by a double-flame structure with a rich premixed reaction zone near the flame stabilization region and a non-premixed reaction zone further downstream. Finally, the differences in inner nozzle flow and spray characteristics of petrodiesel and biodiesel are quantified. The improved modeling capability developed in this work can be used for extensive diesel engine simulations to further optimize injection, spray, combustion, and emission processes.
Search for Spatially Extended Fermi-LAT Sources Using Two Years of Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lande, Joshua; Ackermann, Markus; Allafort, Alice
2012-07-13
Spatial extension is an important characteristic for correctly associating γ-ray-emitting sources with their counterparts at other wavelengths and for obtaining an unbiased model of their spectra. We present a new method for quantifying the spatial extension of sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi). We perform a series of Monte Carlo simulations to validate this tool and calculate the LAT threshold for detecting the spatial extension of sources. We then test all sources in the second Fermi-LAT catalog (2FGL) for extension. We report the detection of seven new spatially extended sources.
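One common way to quantify such extension in a maximum-likelihood framework, and the kind of statistic that Monte Carlo simulations of this sort are used to calibrate, is a likelihood-ratio test comparing an extended-source model with a point-source model (generic notation, not necessarily the exact definition used in the catalog analysis):

```latex
\mathrm{TS}_{\mathrm{ext}} = 2 \ln\!\left(\frac{\mathcal{L}_{\mathrm{ext}}}{\mathcal{L}_{\mathrm{point}}}\right)
```

where the two likelihoods are maximized over the respective source models; large values of TS_ext favor spatial extension.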
Learning optimal embedded cascades.
Saberian, Mohammad Javad; Vasconcelos, Nuno
2012-10-01
The problem of automatic and optimal design of embedded object detector cascades is considered. Two main challenges are identified: optimization of the cascade configuration and optimization of individual cascade stages, so as to achieve the best tradeoff between classification accuracy and speed, under a detection rate constraint. Two novel boosting algorithms are proposed to address these problems. The first, RCBoost, formulates boosting as a constrained optimization problem which is solved with a barrier penalty method. The constraint is the target detection rate, which is met at all iterations of the boosting process. This enables the design of embedded cascades of known configuration without extensive cross validation or heuristics. The second, ECBoost, searches over cascade configurations to achieve the optimal tradeoff between classification risk and speed. The two algorithms are combined into an overall boosting procedure, RCECBoost, which optimizes both the cascade configuration and its stages under a detection rate constraint, in a fully automated manner. Extensive experiments in face, car, pedestrian, and panda detection show that the resulting detectors achieve an accuracy versus speed tradeoff superior to those of previous methods.
Fall 2012 Graduate Engineering Internship Summary
NASA Technical Reports Server (NTRS)
Ehrlich, Joshua
2013-01-01
In the fall of 2012, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my second internship opportunity with NASA, a consecutive extension from a summer 2012 internship. During my four-month tenure, I gained valuable knowledge and extensive hands-on experience with payload design and testing as well as composite fabrication for repair design on future space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with the testing of scientific payloads such as the Vegetable Production System (Veggie). Verification and validation (V&V) of the Veggie was carried out prior to qualification testing of the payload, which incorporated a lengthy process of confirming design requirements that were verified through one or more validation methods: inspection, analysis, demonstration, and testing. Additionally, I provided assistance in verifying design requirements outlined in the V&V plan against the requirements outlined by the scientists in the Science Requirements Envelope Document (SRED). The purpose of the SRED was to define experiment requirements intended for the payload to meet and carry out.
Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B
2012-01-01
Objective: There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. Materials and Methods: We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. Results: An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. Discussion: A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. Conclusion: We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries. PMID:22319176
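A hypothetical sketch of the multi-modal idea described above follows: a subject is flagged as a case if any of the three data sources (structured codes, NLP on free text, OCR on scanned images) supports it, and the result is checked against manual chart review to estimate the positive predictive value. The field names, the OR-combination rule, and the data are illustrative assumptions only, not the eMERGE algorithm itself.

```python
# Toy multi-modal case-finding sketch with PPV estimated against chart review.
from dataclasses import dataclass

@dataclass
class Subject:
    has_cataract_code: bool      # structured database query (e.g., diagnosis codes)
    nlp_mentions_cataract: bool  # natural language processing on free-text notes
    ocr_mentions_cataract: bool  # optical character recognition on scanned images
    chart_review_positive: bool  # gold standard from manual review

def algorithm_positive(s: Subject) -> bool:
    # Illustrative combination rule: any modality may flag the subject.
    return s.has_cataract_code or s.nlp_mentions_cataract or s.ocr_mentions_cataract

def positive_predictive_value(subjects) -> float:
    flagged = [s for s in subjects if algorithm_positive(s)]
    true_pos = sum(s.chart_review_positive for s in flagged)
    return true_pos / len(flagged) if flagged else float("nan")

cohort = [
    Subject(True, False, False, True),
    Subject(False, True, True, True),
    Subject(False, False, True, False),
    Subject(False, False, False, False),
]
print(f"PPV = {positive_predictive_value(cohort):.2f}")
```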
Diffusive deposition of aerosols in Phebus containment during FPT-2 test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontautas, A.; Urbonavicius, E.
2012-07-01
At present, lumped-parameter codes are the main tool to investigate the complex response of the containment of a Nuclear Power Plant in case of an accident. Continuous development and validation of the codes is required to perform realistic investigation of the processes that determine the possible source term of radioactive products to the environment. Validation of the codes is based on the comparison of the calculated results with the measurements performed in experimental facilities. The most extensive experimental program to investigate fission product release from the molten fuel, transport through the cooling circuit, and deposition in the containment is performed in the PHEBUS test facility. Test FPT-2 performed in this facility is considered for analysis of processes taking place in the containment. Earlier investigations using the COCOSYS code showed that the code could be successfully used for analysis of thermal-hydraulic processes and deposition of aerosols, but it was also noticed that diffusive deposition on the vertical walls does not fit well with the measured results. The CPA module of the ASTEC code implements a different model for diffusive deposition; therefore, the PHEBUS containment model was transferred from COCOSYS to ASTEC-CPA to investigate the influence of the diffusive deposition modelling. Analysis was performed using a PHEBUS containment model of 16 nodes. The calculated thermal-hydraulic parameters are in good agreement with measured results, which gives a basis for realistic simulation of aerosol transport and deposition processes. The performed investigations showed that the diffusive deposition model influences the aerosol deposition distribution on different surfaces in the test facility. (authors)
Lin, Steve; Turgulov, Anuar; Taher, Ahmed; Buick, Jason E; Byers, Adam; Drennan, Ian R; Hu, Samantha; J Morrison, Laurie
2016-10-01
Cardiopulmonary resuscitation (CPR) process measures research and quality assurance has traditionally been limited to the first 5 minutes of resuscitation due to significant costs in time, resources, and personnel from manual data abstraction. CPR performance may change over time during prolonged resuscitations, which represents a significant knowledge gap. Moreover, the currently available commercial software output of CPR process measures is difficult to analyze. The objective was to develop and validate a software program to help automate the abstraction and transfer of CPR process measures data from electronic defibrillators for complete episodes of cardiac arrest resuscitation. We developed a software program to facilitate and help automate CPR data abstraction and transfer from electronic defibrillators for entire resuscitation episodes. Using an intermediary Extensible Markup Language export file, the automated software transfers CPR process measures data (electrocardiogram [ECG] number, CPR start time, number of ventilations, number of chest compressions, compression rate per minute, compression depth per minute, compression fraction, and end-tidal CO2 per minute). We performed an internal validation of the software program on 50 randomly selected cardiac arrest cases with resuscitation durations between 15 and 60 minutes. CPR process measures were manually abstracted and transferred independently by two trained data abstractors and by the automated software program, followed by manual interpretation of raw ECG tracings, treatment interventions, and patient events. Error rates and the time needed for data abstraction, transfer, and interpretation were measured for both manual and automated methods, compared to an additional independent reviewer. A total of 9,826 data points were each abstracted by the two abstractors and by the software program. Manual data abstraction resulted in a total of six errors (0.06%) compared to zero errors by the software program. The mean ± SD time measured per case for manual data abstraction was 20.3 ± 2.7 minutes compared to 5.3 ± 1.4 minutes using the software program (p = 0.003). We developed and validated an automated software program that efficiently abstracts and transfers CPR process measures data from electronic defibrillators for complete cardiac arrest episodes. This software will enable future cardiac arrest studies and quality assurance programs to evaluate the impact of CPR process measures during prolonged resuscitations. © 2016 by the Society for Academic Emergency Medicine.
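The following Python fragment illustrates the kind of automated abstraction described above, parsing an intermediary XML export and summarising chest compressions per minute; the element and attribute names are hypothetical and do not reflect the actual defibrillator export schema.

```python
# Parse a toy XML export and count chest compressions per minute of the episode.
import xml.etree.ElementTree as ET
from collections import defaultdict

xml_export = """
<episode ecg="12345">
  <compression time="2.1"/><compression time="2.7"/>
  <compression time="61.0"/><compression time="61.6"/><compression time="62.2"/>
  <ventilation time="30.0"/>
</episode>
"""

root = ET.fromstring(xml_export)
per_minute = defaultdict(int)
for c in root.iter("compression"):
    minute = int(float(c.get("time")) // 60)   # time stamp in seconds from CPR start
    per_minute[minute] += 1

for minute in sorted(per_minute):
    print(f"minute {minute}: {per_minute[minute]} compressions")
print("ventilations:", len(list(root.iter("ventilation"))))
```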
Chamorro, Claudio; Armijo-Olivo, Susan; De la Fuente, Carlos; Fuentes, Javiera; Javier Chirosa, Luis
2017-01-01
The purpose of the study is to establish the absolute reliability and concurrent validity between hand-held dynamometers (HHDs) and isokinetic dynamometers (IDs) in lower extremity peak torque assessment. The Medline, Embase, and CINAHL databases were searched for studies related to psychometric properties in muscle dynamometry. Studies reporting the standard error of measurement SEM (%) or limits of agreement LOA (%), expressed as percentage of the mean, were considered to establish absolute reliability, while studies using the intra-class correlation coefficient (ICC) were considered to establish concurrent validity between dynamometers. In total, 17 studies were included in the meta-analysis. The COSMIN checklist classified them between fair and poor. Using HHDs, knee extension LOA (%) was 33.59%, 95% confidence interval (CI) 23.91 to 43.26, and ankle plantar flexion LOA (%) was 48.87%, CI 35.19 to 62.56. Using IDs, hip adduction and extension, knee flexion and extension, and ankle dorsiflexion showed LOA (%) under 15%. Lower hip, knee, and ankle LOA (%) were obtained using an ID compared to an HHD. ICC between devices ranged from 0.62, CI (0.37 to 0.87), for ankle dorsiflexion to 0.94, CI (0.91 to 0.98), for hip adduction. Very high correlations were found for hip adductors and hip flexors, and moderate correlations for knee flexors/extensors and ankle plantar/dorsiflexors. PMID:29071305
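For readers unfamiliar with the absolute-reliability statistics pooled above, the quantities are conventionally defined as follows (the exact formulations may differ slightly across the included studies):

```latex
\mathrm{SEM} = SD\sqrt{1-\mathrm{ICC}}, \qquad
\mathrm{SEM}(\%) = 100\,\frac{\mathrm{SEM}}{\bar{x}}, \qquad
\mathrm{LOA} = \bar{d} \pm 1.96\,SD_d, \qquad
\mathrm{LOA}(\%) = 100\,\frac{1.96\,SD_d}{\bar{x}}
```

where \bar{d} and SD_d are the mean and standard deviation of the between-device (or between-session) differences and \bar{x} is the mean measured torque.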
Towards interoperable and reproducible QSAR analyses: Exchange of datasets.
Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es
2010-06-30
QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies the setup of QSAR datasets and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates the addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.
Towards interoperable and reproducible QSAR analyses: Exchange of datasets
2010-01-01
Background: QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results: We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies the setup of QSAR datasets and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates the addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions: Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community. PMID:20591161
NASA Technical Reports Server (NTRS)
Waters, Eric D.
2013-01-01
Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero-level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve, including relevant loss terms, can be estimated. This foresight into expected losses allows for more specific assumptions relating to the initial estimates of thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using it. Also, expected outputs, which depend on the type of small launch vehicle being sized, will be displayed. The method of validation will be discussed as well as where the sizing tool fits into the vehicle design process.
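A minimal sketch of such a zero-level sizing pass is shown below, assuming a two-stage vehicle and working upward from the payload with the ideal rocket equation; the payload, delta-v split, Isp, and pmf values are illustrative assumptions, and this is not the NASA tool itself.

```python
# Estimate GLOW from payload, per-stage delta-v (including losses), Isp, and pmf.
import math

G0 = 9.80665  # m/s^2

def stage_mass(payload_kg, dv_m_s, isp_s, pmf):
    """Total mass of one stage (propellant + dry) needed to push payload_kg by dv_m_s."""
    mass_ratio = math.exp(dv_m_s / (isp_s * G0))     # m0 / mf from the rocket equation
    # m0 = payload + m_stage, mf = payload + (1 - pmf) * m_stage  =>  solve for m_stage
    denom = 1.0 - mass_ratio * (1.0 - pmf)
    if denom <= 0:
        raise ValueError("stage cannot achieve this delta-v with the given Isp and pmf")
    return payload_kg * (mass_ratio - 1.0) / denom

payload = 150.0                  # kg to orbit (assumed)
dv_split = [4600.0, 4900.0]      # m/s per stage, losses included (assumed)
isp = [290.0, 340.0]             # s, stage 1 and stage 2 (assumed)
pmf = [0.88, 0.90]               # propellant mass fractions (assumed)

mass = payload
for dv, isp_i, pmf_i in reversed(list(zip(dv_split, isp, pmf))):  # size upper stage first
    mass += stage_mass(mass, dv, isp_i, pmf_i)
print(f"Estimated GLOW: {mass:.0f} kg")
```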
Buschmann, Dominik; Haberberger, Anna; Kirchner, Benedikt; Spornraft, Melanie; Riedmaier, Irmgard; Schelling, Gustav; Pfaffl, Michael W.
2016-01-01
Small RNA-Seq has emerged as a powerful tool in transcriptomics, gene expression profiling and biomarker discovery. Sequencing cell-free nucleic acids, particularly microRNA (miRNA), from liquid biopsies additionally provides exciting possibilities for molecular diagnostics, and might help establish disease-specific biomarker signatures. The complexity of the small RNA-Seq workflow, however, bears challenges and biases that researchers need to be aware of in order to generate high-quality data. Rigorous standardization and extensive validation are required to guarantee reliability, reproducibility and comparability of research findings. Hypotheses based on flawed experimental conditions can be inconsistent and even misleading. Comparable to the well-established MIQE guidelines for qPCR experiments, this work aims at establishing guidelines for experimental design and pre-analytical sample processing, standardization of library preparation and sequencing reactions, as well as facilitating data analysis. We highlight bottlenecks in small RNA-Seq experiments, point out the importance of stringent quality control and validation, and provide a primer for differential expression analysis and biomarker discovery. Following our recommendations will encourage better sequencing practice, increase experimental transparency and lead to more reproducible small RNA-Seq results. This will ultimately enhance the validity of biomarker signatures, and allow reliable and robust clinical predictions. PMID:27317696
Hara, Tomohiko; Nakanishi, Hiroyuki; Nakagawa, Tohru; Komiyama, Motokiyo; Kawahara, Takashi; Manabe, Tomoko; Miyake, Mototaka; Arai, Eri; Kanai, Yae; Fujimoto, Hiroyuki
2013-10-01
Recent studies have shown an improvement in prostate cancer diagnosis with the use of 3.0-Tesla magnetic resonance imaging. We retrospectively assessed the ability of this imaging technique to predict side-specific extracapsular extension of prostate cancer. From October 2007 to August 2011, prostatectomy was carried out in 396 patients after preoperative 3.0-Tesla magnetic resonance imaging. Among these, 132 (primary sample) and 134 patients (validation sample) underwent 12-core prostate biopsy at the National Cancer Center Hospital of Tokyo, Japan, and at other institutions, respectively. In the primary dataset, univariate and multivariate analyses were carried out to predict side-specific extracapsular extension using variables determined preoperatively, including 3.0-Tesla magnetic resonance imaging findings (T2-weighted and diffusion-weighted imaging). A prediction model was then constructed and applied to the validation study sample. Multivariate analysis identified four significant independent predictors (P < 0.05), including a biopsy Gleason score of ≥8, positive 3.0-Tesla diffusion-weighted magnetic resonance imaging findings, ≥2 positive biopsy cores on each side, and a maximum percentage of positive cores ≥31% on each side. The negative predictive value was 93.9% in the combination model with these four predictors, whereas the positive predictive value was 33.8%. Good reproducibility of these four significant predictors and the combination model was observed in the validation study sample. Side-specific extracapsular extension prediction based on the biopsy Gleason score and factors associated with tumor location, including a positive 3.0-Tesla diffusion-weighted magnetic resonance imaging finding, has a high negative predictive value but a low positive predictive value. © 2013 The Japanese Urological Association.
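As a reminder of how the predictive values quoted above are obtained from a 2x2 table of predicted side-specific extracapsular extension versus pathology, a short sketch follows; the counts are arbitrary illustrative numbers, not the study data.

```python
# PPV and NPV from true/false positives and negatives of a side-specific prediction.
def predictive_values(tp, fp, tn, fn):
    ppv = tp / (tp + fp)   # probability of extension given a positive prediction
    npv = tn / (tn + fn)   # probability of no extension given a negative prediction
    return ppv, npv

ppv, npv = predictive_values(tp=20, fp=40, tn=150, fn=10)  # made-up counts
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
```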
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
21 CFR 820.75 - Process validation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...
Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment
NASA Technical Reports Server (NTRS)
Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)
2017-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.
Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment
NASA Technical Reports Server (NTRS)
Storey, Jed; Kirk, Daniel (Editor); Marsell, Brandon (Editor); Schallhorn, Paul (Editor)
2017-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.
ERIC Educational Resources Information Center
Heneman, Herbert G., III; Kimball, Steven; Milanowski, Anthony
2006-01-01
The present study contributes to knowledge of the construct validity of the short form of the Teacher Sense of Efficacy Scale (and by extension, given their similar content and psychometric properties, to the long form). The authors' research involves: (1) examining the psychometric properties of the TSES on a large sample of elementary, middle,…
1986-04-29
COMPILER VALIDATION SUMMARY REPORT: International Business Machines Corporation IBM Development System for the Ada Language for VM/CMS, Version 1.0 IBM 4381...tested using command scripts provided by International Business Machines Corporation. These scripts were reviewed by the validation team. Tests were run...s): IBM 4381 (System/370) Operating System: VM/CMS, release 3.6. International Business Machines Corporation has made no deliberate extensions to the
Conversion of Radiology Reporting Templates to the MRRT Standard.
Kahn, Charles E; Genereaux, Brad; Langlotz, Curtis P
2015-10-01
In 2013, the Integrating the Healthcare Enterprise (IHE) Radiology workgroup developed the Management of Radiology Report Templates (MRRT) profile, which defines both the format of radiology reporting templates using an extension of Hypertext Markup Language version 5 (HTML5), and the transportation mechanism to query, retrieve, and store these templates. Of 200 English-language report templates published by the Radiological Society of North America (RSNA), initially encoded as text and in an XML schema language, 168 have been converted successfully into MRRT using a combination of automated processes and manual editing; conversion of the remaining 32 templates is in progress. The automated conversion process applied Extensible Stylesheet Language Transformation (XSLT) scripts, an XML parsing engine, and a Java servlet. The templates were validated for proper HTML5 and MRRT syntax using web-based services. The MRRT templates allow radiologists to share best-practice templates across organizations and have been uploaded to the template library to supersede the prior XML-format templates. By using MRRT transactions and MRRT-format templates, radiologists will be able to directly import and apply templates from the RSNA Report Template Library in their own MRRT-compatible vendor systems. The availability of MRRT-format reporting templates will stimulate adoption of the MRRT standard and is expected to advance the sharing and use of templates to improve the quality of radiology reports.
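The automated part of such a conversion can be sketched as follows in Python with lxml, applying a toy XSLT stylesheet to a toy XML template and emitting an HTML document; the stylesheet and template are placeholders, not the actual RSNA/MRRT conversion scripts.

```python
# Apply an XSLT transform to an XML-encoded report template to produce HTML.
from lxml import etree

xslt_doc = etree.XML(b"""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/template">
    <html><head><title><xsl:value-of select="@title"/></title></head>
      <body><xsl:for-each select="field">
        <p><xsl:value-of select="."/></p>
      </xsl:for-each></body></html>
  </xsl:template>
</xsl:stylesheet>
""")

source_doc = etree.XML(b"""
<template title="Chest Radiograph">
  <field>Lungs: </field>
  <field>Heart: </field>
</template>
""")

transform = etree.XSLT(xslt_doc)
result = transform(source_doc)
print(etree.tostring(result, pretty_print=True).decode())
```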
A Bayesian model for highly accelerated phase-contrast MRI.
Rich, Adam; Potter, Lee C; Jin, Ning; Ash, Joshua; Simonetti, Orlando P; Ahmad, Rizwan
2016-08-01
Phase-contrast magnetic resonance imaging is a noninvasive tool to assess cardiovascular disease by quantifying blood flow; however, low data acquisition efficiency limits the spatial and temporal resolutions, real-time application, and extensions to four-dimensional flow imaging in clinical settings. We propose a new data processing approach called Reconstructing Velocity Encoded MRI with Approximate message passing aLgorithms (ReVEAL) that accelerates the acquisition by exploiting data structure unique to phase-contrast magnetic resonance imaging. The proposed approach models physical correlations across space, time, and velocity encodings. The proposed Bayesian approach exploits the relationships in both magnitude and phase among velocity encodings. A fast iterative recovery algorithm is introduced based on message passing. For validation, prospectively undersampled data are processed from a pulsatile flow phantom and five healthy volunteers. The proposed approach is in good agreement, quantified by peak velocity and stroke volume (SV), with reference data for acceleration rates R≤10. For SV, Pearson r≥0.99 for phantom imaging (n = 24) and r≥0.96 for prospectively accelerated in vivo imaging (n = 10) for R≤10. The proposed approach enables accurate quantification of blood flow from highly undersampled data. The technique is extensible to four-dimensional flow imaging, where higher acceleration may be possible due to additional redundancy. Magn Reson Med 76:689-701, 2016. © 2015 Wiley Periodicals, Inc.
The Doe Water Cycle Pilot Study.
NASA Astrophysics Data System (ADS)
Miller, N. L.; King, A. W.; Miller, M. A.; Springer, E. P.; Wesely, M. L.; Bashford, K. E.; Conrad, M. E.; Costigan, K.; Foster, P. N.; Gibbs, H. K.; Jin, J.; Klazura, J.; Lesht, B. M.; Machavaram, M. V.; Pan, F.; Song, J.; Troyan, D.; Washington-Allen, R. A.
2005-03-01
A Department of Energy (DOE) multilaboratory Water Cycle Pilot Study (WCPS) investigated components of the local water budget at the Walnut River watershed in Kansas to study the relative importance of various processes and to determine the feasibility of observational water budget closure. An extensive database of local meteorological time series and land surface characteristics was compiled. Numerical simulations of water budget components were generated and, to the extent possible, validated for three nested domains within the Southern Great Plains: the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Cloud Atmospheric Radiation Testbed (CART), the Walnut River watershed (WRW), and the Whitewater watershed (WW), in Kansas. A 2-month intensive observation period (IOP) was conducted to gather extensive observations relevant to specific details of the water budget, including fine-scale precipitation, streamflow, and soil moisture measurements that were not made routinely by other programs. Event and seasonal water isotope (δ18O, δD) sampling in rainwater, streams, soils, lakes, and wells provided a means of tracing sources and sinks within and external to the WW, WRW, and the ARM CART domains. The WCPS measured changes in the leaf area index for several vegetation types, deep groundwater variations at two wells, and meteorological variables at a number of sites in the WRW. Additional activities of the WCPS include code development toward a regional climate model that includes water isotope processes, soil moisture transect measurements, and water-level measurements in groundwater wells.
NASA Astrophysics Data System (ADS)
Nagothu, U. S.
2016-12-01
Agricultural extension services, among others, contribute to improving rural livelihoods and enhancing economic development. Knowledge development and transfer, from the cognitive science point of view, is about how farmers use and apply their experiential knowledge as well as acquired new knowledge to solve new problems. This depends on the models adopted and the way knowledge is generated and delivered. New extension models based on ICT platforms and smartphones are promising. Results from a 5-year project (www.climaadapt.org) in India show that farmer-led on-farm validations of technologies and knowledge exchange through ICT-based platforms outperformed state-operated linear extension programs. Innovation here depends on the connectivity and networking between stakeholders that are involved in generating, transferring and using the knowledge. Key words: Smallholders, Knowledge, Extension, Innovation, India
Beating Landauer's Bound: Tradeoff between Accuracy and Heat Dissipation
NASA Astrophysics Data System (ADS)
Talukdar, Saurav; Bhaban, Shreyas; Salapaka, Murti
Landauer's principle states that erasing one bit of stored information is necessarily accompanied by heat dissipation of at least k_B T ln 2 per bit. However, this is true only if the erasure process is always successful. We demonstrate that if the erasure process has a success probability p, the minimum heat dissipation per bit is given by k_B T (p ln p + (1 - p) ln(1 - p) + ln 2), referred to as the Generalized Landauer Bound, which equals k_B T ln 2 if the erasure process is always successful and decreases to zero as p reduces to 0.5. We present a model for a one-bit memory based on a Brownian particle in a double-well potential motivated by optical tweezers and achieve erasure by manipulation of the optical fields. The method uniquely provides a handle on the success proportion of the erasure. The thermodynamics framework for Langevin dynamics developed by Sekimoto is used for computation of heat dissipation in each realization of the erasure process. Using extensive Monte Carlo simulations, we demonstrate that the Landauer Bound of k_B T ln 2 is violated by compromising on the success of the erasure process, while validating the existence of the Generalized Landauer Bound.
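A quick numerical check of the quoted bound, showing that it recovers k_B T ln 2 for a perfectly reliable erasure (p = 1) and vanishes as p approaches 0.5, can be written as:

```python
# Evaluate the Generalized Landauer Bound quoted above at several success probabilities.
import math

def generalized_landauer_bound(p, kT=1.0):
    """Minimum dissipated heat per bit (in units of k_B*T) for success probability p."""
    if not 0.5 <= p <= 1.0:
        raise ValueError("success probability must lie in [0.5, 1]")
    entropy_term = 0.0 if p == 1.0 else p * math.log(p) + (1 - p) * math.log(1 - p)
    return kT * (entropy_term + math.log(2))

for p in (1.0, 0.9, 0.75, 0.6, 0.5):
    print(f"p = {p:.2f}: bound = {generalized_landauer_bound(p):.4f} k_B*T")
```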
Validation & Safety Constraints: What We Want to Do… What We Can Do
NASA Astrophysics Data System (ADS)
Yepez, Amaya Atenicia; Peiro, Belen Martin; Bory, Stephane
2010-09-01
Autonomous safety-critical systems require exhaustive validation in order to guarantee robustness from different perspectives (software, hardware, and algorithm design). In this paper we present a performance validation approach that deals with an extensive list of difficulties, drawing on lessons learnt from the space projects developed by GMV (e.g., within the EGNOS and Galileo programmes). We strongly recommend that the validation strategy be selected from the early stages of the system definition and carried out taking into account the opinions and demands of all parties. In fact, to agree on the final solution, a trade-off will be needed in order to validate the requirements with the available means, in terms of amount of data and resources.
An RL10A-3-3A rocket engine model using the rocket engine transient simulator (ROCETS) software
NASA Technical Reports Server (NTRS)
Binder, Michael
1993-01-01
Steady-state and transient computer models of the RL10A-3-3A rocket engine have been created using the Rocket Engine Transient Simulation (ROCETS) code. These models were created for several purposes. The RL10 engine is a critical component of past, present, and future space missions; the model will give NASA an in-house capability to simulate the performance of the engine under various operating conditions and mission profiles. The RL10 simulation activity is also an opportunity to further validate the ROCETS program. The ROCETS code is an important tool for modeling rocket engine systems at NASA Lewis. ROCETS provides a modular and general framework for simulating the steady-state and transient behavior of any desired propulsion system. Although the ROCETS code is being used in a number of different analysis and design projects within NASA, it has not been extensively validated for any system using actual test data. The RL10A-3-3A has a ten year history of test and flight applications; it should provide sufficient data to validate the ROCETS program capability. The ROCETS models of the RL10 system were created using design information provided by Pratt & Whitney, the engine manufacturer. These models are in the process of being validated using test-stand and flight data. This paper includes a brief description of the models and comparison of preliminary simulation output against flight and test-stand data.
Law, Lily N. C.; Zentner, Marcel
2012-01-01
A common approach for determining musical competence is to rely on information about individuals’ extent of musical training, but relying on musicianship status fails to identify musically untrained individuals with musical skill, as well as those who, despite extensive musical training, may not be as skilled. To counteract this limitation, we developed a new test battery (Profile of Music Perception Skills; PROMS) that measures perceptual musical skills across multiple domains: tonal (melody, pitch), qualitative (timbre, tuning), temporal (rhythm, rhythm-to-melody, accent, tempo), and dynamic (loudness). The PROMS has satisfactory psychometric properties for the composite score (internal consistency and test-retest r > .85) and fair to good coefficients for the individual subtests (.56 to .85). Convergent validity was established with the relevant dimensions of Gordon’s Advanced Measures of Music Audiation and Musical Aptitude Profile (melody, rhythm, tempo), the Musical Ear Test (rhythm), and sample instrumental sounds (timbre). Criterion validity was evidenced by consistently sizeable and significant relationships between test performance and external musical proficiency indicators in all three studies (.38 to .62, p < .05 to p < .01). An absence of correlations between test scores and a nonmusical auditory discrimination task supports the battery’s discriminant validity (−.05, ns). The interrelationships among the various subtests could be accounted for by two higher-order factors, sequential and sensory music processing. A brief version of the PROMS is introduced as a time-efficient approximation of the full version of the battery. PMID:23285071
Factor structure of the Hooper Visual Organization Test: a cross-cultural replication and extension.
Merten, Thomas
2005-01-01
To investigate construct validity of the Hooper Visual Organization Test (VOT), a principal-axis analysis was performed on the neuropsychological test results of 200 German-speaking neurological patients who received a comprehensive battery, encompassing tests of visuospatial functions, memory, attention, executive functions, naming ability, and vocabulary. A four-factor solution was obtained with substantial loadings of the VOT only on the first factor, interpreted as a global dimension of non-verbal cognitive functions. This factor loaded significantly on numerous measures of visuospatial processing and attention (with particularly high loadings on WAIS-R Block Design, Trails A and B, and Raven's Standard Progressive Matrices). The remaining three factors were interpreted as memory, verbal abilities (vocabulary), and a separate factor of naming abilities.
Formulating Spatially Varying Performance in the Statistical Fusion Framework
Landman, Bennett A.
2012-01-01
To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513
Model selection for anomaly detection
NASA Astrophysics Data System (ADS)
Burnaev, E.; Erofeev, P.; Smolyakov, D.
2015-12-01
Anomaly detection based on one-class classification algorithms is broadly used in many applied domains such as image processing (e.g. detecting whether a patient is "cancerous" or "healthy" from a mammography image), network intrusion detection, etc. The performance of an anomaly detection algorithm crucially depends on the kernel used to measure similarity in a feature space. The standard approaches (e.g. cross-validation) for kernel selection, used in two-class classification problems, cannot be applied directly due to the specific nature of the data (the absence of data from a second, abnormal class). In this paper we generalize several kernel selection methods from the binary-class case to the case of one-class classification and perform an extensive comparison of these approaches using both synthetic and real-world data.
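To make the problem concrete, the naive sketch below tunes the RBF kernel width of a one-class SVM using only normal data, preferring the width whose held-out acceptance rate is closest to the nominal 1 − ν level; this is a simple illustrative heuristic, not one of the selection methods proposed in the paper.

```python
# Naive kernel-width selection for a one-class SVM trained on normal data only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 2))                     # synthetic "normal" class
train, heldout = train_test_split(normal, test_size=0.3, random_state=0)

nu = 0.05
best_gamma, best_gap = None, np.inf
for gamma in (0.01, 0.1, 1.0, 10.0):
    model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(train)
    accept_rate = np.mean(model.predict(heldout) == 1)  # fraction of held-out normals accepted
    gap = abs(accept_rate - (1 - nu))
    if gap < best_gap:
        best_gamma, best_gap = gamma, gap
print(f"selected gamma = {best_gamma} (held-out acceptance gap {best_gap:.3f})")
```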
ERIC Educational Resources Information Center
Mustian, R. David; And Others
This module is the second in an inservice education series for extension professionals that consists of seven independent training modules. It is an introduction to, and guided practice in, the premises, concepts, and processes of nonformal extension education--planning, designing and implementing, and evaluating and accounting for extension…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-25
... elements of process validation for the manufacture of human and animal drug and biological products... process validation for the manufacture of human and animal drug and biological products, including APIs. This guidance describes process validation activities in three stages: In Stage 1, Process Design, the...
Additional extensions to the NASCAP computer code, volume 2
NASA Technical Reports Server (NTRS)
Stannard, P. R.; Katz, I.; Mandell, M. J.
1982-01-01
Particular attention is given to comparison of the actual response of the SCATHA (Spacecraft Charging AT High Altitudes) P78-2 satellite with theoretical (NASCAP) predictions. Extensive comparisons for a variety of environmental conditions confirm the validity of the NASCAP model. A summary of the capabilities and range of validity of NASCAP is presented, with extensive reference to previously published applications. It is shown that NASCAP is capable of providing quantitatively accurate results when the object and environment are adequately represented and fall within the range of conditions for which NASCAP was intended. Three-dimensional electric field effects play an important role in determining the potential of dielectric surfaces and electrically isolated conducting surfaces, particularly in the presence of artificially imposed high voltages. A theory for such phenomena is presented and applied to the active control experiments carried out on SCATHA, as well as other space and laboratory experiments. Finally, some preliminary work toward modeling large spacecraft in polar Earth orbit is presented. An initial physical model is presented, including charge emission. A simple code based upon the model is described along with code test results.
Pourahmadi, Mohammad Reza; Jannati, Elham; Mohseni-Bandpei, Mohammad Ali; Ebrahimi Takamjani, Ismail; Rajabzadeh, Fatemeh
2016-01-01
Background Measurement of lumbar spine range of motion (ROM) is often considered to be an essential component of lumbar spine physiotherapy and orthopedic assessment. The measurement can be carried out with various instruments such as inclinometers and goniometers. Recent smartphones have been equipped with accelerometers and magnetometers, which, through specific software applications (apps), can be used for inclinometric functions. Purpose The main purpose was to investigate the reliability and validity of an iPhone® app (TiltMeter© - advanced level and inclinometer) for measuring standing lumbar spine flexion–extension ROM in asymptomatic subjects. Design A cross-sectional study was carried out. Setting This study was conducted in a physiotherapy clinic located at the School of Rehabilitation Sciences, Iran University of Medical Science and Health Services, Tehran, Iran. Subjects A convenience sample of 30 asymptomatic adults (15 males; 15 females; age range = 18–55 years) was recruited between August 2015 and December 2015. Methods Following a 2-minute warm-up, the subjects were asked to stand in a relaxed position and their skin was marked at the T12–L1 and S1–S2 spinal levels. From this position, they were asked to perform maximum lumbar flexion followed by maximum lumbar extension with their knees straight. Two blinded raters each used an inclinometer and the iPhone® app to measure lumbar spine flexion–extension ROM. A third rater read the measured angles. To calculate total lumbar spine flexion–extension ROM, the measurement at S1–S2 was subtracted from that at T12–L1. The second (2 hours later) and third (48 hours later) sessions were carried out in the same manner as the first session. All of the measurements were conducted 3 times and the mean value of the 3 repetitions for each measurement was used for analysis. Intraclass correlation coefficient (ICC) models (3, k) and (2, k) were used to determine the intra-rater and inter-rater reliability, respectively. Pearson correlation coefficients were used to establish the concurrent validity of the iPhone® app. Furthermore, the minimum detectable change at the 95% confidence level (MDC95) was computed as 1.96 × standard error of measurement × √2. Results Good to excellent intra-rater and inter-rater reliability were demonstrated for both the gravity-based inclinometer, with ICC values of ≥0.84 and ≥0.77, and the iPhone® app, with ICC values of ≥0.85 and ≥0.85, respectively. The MDC95 ranged from 5.82° to 8.18° for the intra-rater analysis and from 7.38° to 8.66° for the inter-rater analysis. The concurrent validity for flexion and extension between the 2 instruments was 0.85 and 0.91, respectively. Conclusions The iPhone® app possesses good to excellent intra-rater and inter-rater reliability and concurrent validity. It seems that the iPhone® app can be used for the measurement of lumbar spine flexion–extension ROM. Level of evidence IIb. PMID:27635328
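For reference, the reported MDC95 follows from the ICC and the between-subject variability via the standard error of measurement. A minimal sketch with illustrative numbers (the SD value and the SEM = SD·√(1−ICC) convention are assumptions, not figures taken from the study):

```python
import numpy as np

icc = 0.85   # assumed intra-rater ICC for flexion ROM
sd = 7.5     # assumed between-subject SD of ROM, in degrees

sem = sd * np.sqrt(1.0 - icc)          # standard error of measurement
mdc95 = 1.96 * sem * np.sqrt(2.0)      # minimum detectable change, 95% level
print(f"SEM = {sem:.2f} deg, MDC95 = {mdc95:.2f} deg")
```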
Improved patch-based learning for image deblurring
NASA Astrophysics Data System (ADS)
Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng
2015-05-01
Most recent image deblurring methods use only the valid information found in the input image as the clue for restoring the deblurred region. These methods usually suffer from insufficient prior information and relatively poor adaptiveness. The patch-based method not only uses the valid information of the input image itself but also exploits the prior information of sample images to improve adaptiveness. However, the cost function of this method is quite time-consuming to evaluate, and the method may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On one hand, we consider the effect of the Gaussian mixture model components with different weights and normalize the weight values, which optimizes the cost function and reduces running time. On the other hand, a post-processing method is proposed to remove the ringing artifacts produced by the traditional patch-based method. Extensive experiments are performed. The experimental results verify that our method can effectively reduce the execution time, suppress the ringing artifacts, and preserve the quality of the deblurred image.
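The patch-prior side of this kind of approach can be sketched with a Gaussian mixture fitted to image patches and renormalised component weights. The snippet below illustrates only the prior term on synthetic data; it is not the paper's cost function, optimisation, or post-processing step.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.feature_extraction.image import extract_patches_2d

# Fit a GMM on 8x8 patches from a stand-in training image, renormalise its
# component weights, and score patches of a candidate restoration.
rng = np.random.default_rng(2)
sample_image = rng.random((64, 64))                  # stand-in for training images
patches = extract_patches_2d(sample_image, (8, 8), max_patches=500, random_state=0)
P = patches.reshape(len(patches), -1)
P -= P.mean(axis=1, keepdims=True)                   # remove patch DC component

gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(P)
gmm.weights_ = gmm.weights_ / gmm.weights_.sum()     # explicit weight normalisation

candidate = rng.random((64, 64))                     # stand-in for a deblurred estimate
Q = extract_patches_2d(candidate, (8, 8), max_patches=200, random_state=0).reshape(200, -1)
Q -= Q.mean(axis=1, keepdims=True)
print("patch prior mean log-likelihood:", gmm.score(Q))
```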
Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
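In practice, the validation rules referred to above can be exercised programmatically. A minimal sketch using the python-libsbml bindings (assuming they are installed and that "model.xml" is a local SBML file to be checked):

```python
import libsbml

doc = libsbml.readSBML("model.xml")       # parse the SBML document
doc.checkConsistency()                    # apply the SBML validation rules

# Report every read or consistency problem found in the document.
for i in range(doc.getNumErrors()):
    err = doc.getError(i)
    print(err.getSeverityAsString(), err.getMessage())
```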
Development of an Expressed Sequence Tag (EST) Resource for Wheat (Triticum aestivum L.)
Lazo, G. R.; Chao, S.; Hummel, D. D.; Edwards, H.; Crossman, C. C.; Lui, N.; Matthews, D. E.; Carollo, V. L.; Hane, D. L.; You, F. M.; Butler, G. E.; Miller, R. E.; Close, T. J.; Peng, J. H.; Lapitan, N. L. V.; Gustafson, J. P.; Qi, L. L.; Echalier, B.; Gill, B. S.; Dilbirligi, M.; Randhawa, H. S.; Gill, K. S.; Greene, R. A.; Sorrells, M. E.; Akhunov, E. D.; Dvořák, J.; Linkiewicz, A. M.; Dubcovsky, J.; Hossain, K. G.; Kalavacharla, V.; Kianian, S. F.; Mahmoud, A. A.; Miftahudin; Ma, X.-F.; Conley, E. J.; Anderson, J. A.; Pathan, M. S.; Nguyen, H. T.; McGuire, P. E.; Qualset, C. O.; Anderson, O. D.
2004-01-01
This report describes the rationale, approaches, organization, and resource development leading to a large-scale deletion bin map of the hexaploid (2n = 6x = 42) wheat genome (Triticum aestivum L.). Accompanying reports in this issue detail results from chromosome bin-mapping of expressed sequence tags (ESTs) representing genes onto the seven homoeologous chromosome groups and a global analysis of the entire mapped wheat EST data set. Among the resources developed were the first extensive public wheat EST collection (113,220 ESTs). Described are protocols for sequencing, sequence processing, EST nomenclature, and the assembly of ESTs into contigs. These contigs plus singletons (unassembled ESTs) were used for selection of distinct sequence motif unigenes. Selected ESTs were rearrayed, validated by 5′ and 3′ sequencing, and amplified for probing a series of wheat aneuploid and deletion stocks. Images and data for all Southern hybridizations were deposited in databases and were used by the coordinators for each of the seven homoeologous chromosome groups to validate the mapping results. Results from this project have established the foundation for future developments in wheat genomics. PMID:15514037
Validation and Verification (V&V) of Safety-Critical Systems Operating Under Off-Nominal Conditions
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.
2012-01-01
Loss of control (LOC) remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft LOC accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. Research is underway at the National Aeronautics and Space Administration (NASA) in the development of advanced onboard system technologies for preventing or recovering from loss of vehicle control and for assuring safe operation under off-nominal conditions associated with aircraft LOC accidents. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses highly significant technical challenges and is the subject of a parallel research effort at NASA. This chapter summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft LOC accidents. A summary of recent research accomplishments in this effort is referenced.
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.
2010-01-01
Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.
NASA Technical Reports Server (NTRS)
Aiken, James; Hooker, Stanford
1997-01-01
Twice a year, the Royal Research Ship (RRS) James Clark Ross (JCR) steams a meridional transect of the Atlantic Ocean between Grimsby (UK) and Stanley (Falkland Islands) with a port call in Montevideo (Uruguay), as part of the annual research activities of the British Antarctic Survey (BAS). In September, the JCR sails from the UK, and the following April it makes the return trip. The ship is operated by the BAS for the Natural Environment Research Council (NERC). The Atlantic Meridional Transect (AMT) Program exploits the passage of the JCR from approximately 50 deg. N to 50 deg. S with a primary objective to investigate physical and biological processes, as well as to measure the meso- to basin-scale bio-optical properties of the Atlantic Ocean. The calibration and validation of remotely sensed observations of ocean colour is an inherent objective of these studies: first, by relating in situ measurements of water-leaving radiance to satellite measurements, and second, by measuring the bio-optically active constituents of the water.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core
Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Jarrett, P Gary
2006-01-01
The primary purpose of this study is to undertake a diagnostic investigation of the international health care logistical environment and determine whether regulatory policies or industry procedures have hindered the implementation of just-in-time (JIT) systems, and then to recommend operational improvements to be achieved by implementing JIT systems. The analysis was conducted in a systematic manner and compared the anticipated benefits with benefits validated in other industries from the implementation of JIT. An extensive literature review was conducted. In this particular study the cost and benefit outcomes achieved from a health care JIT implementation were compared with those achieved by the manufacturing, service, and retail industries. Chiefly, it was found that the health service market must be restructured to encourage greater price competition among providers. A new standardization process should eliminate duplication of products and realize substantial savings.
An XML-based interchange format for genotype-phenotype data.
Whirl-Carrillo, M; Woon, M; Thorn, C F; Klein, T E; Altman, R B
2008-02-01
Recent advances in high-throughput genotyping and phenotyping have accelerated the creation of pharmacogenomic data. Consequently, the community requires standard formats to exchange large amounts of diverse information. To facilitate the transfer of pharmacogenomics data between databases and analysis packages, we have created a standard XML (eXtensible Markup Language) schema that describes both genotype and phenotype data as well as associated metadata. The schema accommodates information regarding genes, drugs, diseases, experimental methods, genomic/RNA/protein sequences, subjects, subject groups, and literature. The Pharmacogenetics and Pharmacogenomics Knowledge Base (PharmGKB; www.pharmgkb.org) has used this XML schema for more than 5 years to accept and process submissions containing more than 1,814,139 SNPs on 20,797 subjects using 8,975 assays. Although developed in the context of pharmacogenomics, the schema is of general utility for the exchange of genotype and phenotype data. We have written syntactic and semantic validators to check documents using this format. The schema and the code for validation are available to the community at http://www.pharmgkb.org/schema/index.html (last accessed: 8 October 2007). (c) 2007 Wiley-Liss, Inc.
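A syntactic check of the kind described can be reproduced with any XML Schema validator. The sketch below uses lxml with placeholder file names; the actual PharmGKB schema is the one distributed at the URL above, and a real submission document would follow that schema.

```python
from lxml import etree

# Placeholder file names; substitute the downloaded PharmGKB XSD and a
# submission document that is meant to conform to it.
schema = etree.XMLSchema(etree.parse("genotype_phenotype.xsd"))
doc = etree.parse("submission.xml")

if schema.validate(doc):
    print("document is schema-valid")
else:
    for error in schema.error_log:
        print(error.line, error.message)
```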
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
De Groef, An; Van Kampen, Marijke; Moortgat, Peter; Anthonissen, Mieke; Van den Kerckhove, Eric; Christiaens, Marie-Rose; Neven, Patrick; Geraerts, Inge; Devoogdt, Nele
2018-01-01
To investigate the concurrent, face and content validity of an evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool). 1) Concurrent validity of the MAP-BC evaluation tool was investigated by exploring correlations (Spearman's rank correlation coefficient) between the subjective scores (0 = no adhesions to 3 = very strong adhesions) of the skin level using the MAP-BC evaluation tool and objective elasticity parameters (maximal skin extension and gross elasticity) generated by the Cutometer Dual MPA 580. Nine different examination points on and around the mastectomy scar were evaluated. 2) Face and content validity were explored by questioning therapists experienced with myofascial therapy in breast cancer patients about the comprehensibility and comprehensiveness of the MAP-BC evaluation tool. 1) Only three meaningful correlations were found on the mastectomy scar. For the most lateral examination point on the mastectomy scar, a moderate negative correlation (-0.44, p = 0.01) with the maximal skin extension and a moderate positive correlation with the resistance versus ability of returning, or 'gross elasticity' (0.42, p = 0.02), were found. For the middle point on the mastectomy scar, an almost moderate positive correlation with gross elasticity was found as well (0.38, p = 0.04). 2) Content and face validity were found to be good. Eighty-nine percent of the respondents found the instructions understandable and 98% found the scoring system obvious. Thirty-seven percent of the therapists suggested adding the possibility of evaluating additional anatomical locations in cases of reconstructive and/or bilateral surgery. The MAP-BC evaluation tool for myofascial adhesions in breast cancer patients has good face and content validity. Evidence for good concurrent validity of the skin level was found only on the mastectomy scar itself.
Larsson, Helena; Tegern, Matthias; Monnier, Andreas; Skoglund, Jörgen; Helander, Charlotte; Persson, Emelie; Malm, Christer; Broman, Lisbet; Aasa, Ulrika
2015-01-01
The objective of this study was to examine the content validity of commonly used muscle performance tests in military personnel and to investigate the reliability of a proposed test battery. For the content validity investigation, thirty tests were selected from those described in the literature and/or commonly used in the Nordic and North Atlantic Treaty Organization (NATO) countries. Nine selected experts rated, on a four-point Likert scale, the relevance of these tests in relation to five different work tasks: lifting, carrying equipment on the body or in the hands, climbing, and digging. Thereafter, a content validity index (CVI) was calculated for each work task. The results showed excellent CVI (≥0.78) for sixteen tests, each covering one or more of the military work tasks. Three of the tests, namely the functional lower-limb loading test (the Ranger test), dead-lift with kettlebells, and back extension, showed excellent content validity for four of the work tasks. For the development of a new muscle strength/endurance test battery, these three tests were further supplemented with two other tests, namely the chins and the side-bridge test. The inter-rater reliability was high (intraclass correlation coefficient, ICC2,1 = 0.99) for all five tests. The intra-rater reliability was good to high (ICC3,1 = 0.82–0.96), with an acceptable standard error of measurement (SEM), except for the side-bridge test (SEM% > 15). Thus, the final suggested test battery for a valid and reliable evaluation of soldiers' muscle performance comprised the following four tests: the Ranger test, dead-lift with kettlebells, chins, and the back extension test. The criterion-related validity of the test battery should be further evaluated for soldiers exposed to varying physical workloads. PMID:26177030
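The item-level content validity index used here is simply the proportion of experts rating a test as relevant (3 or 4 on the 4-point scale); with nine experts, values of at least 0.78 are conventionally treated as excellent, matching the threshold in the abstract. A minimal sketch with invented ratings:

```python
import numpy as np

ratings = np.array([4, 3, 4, 4, 3, 4, 2, 4, 3])   # one test, nine experts (invented)
cvi = np.mean(ratings >= 3)                        # proportion rating 3 or 4
print(f"I-CVI = {cvi:.2f}", "(excellent)" if cvi >= 0.78 else "(needs review)")
```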
Concept analysis and validation of the nursing diagnosis, delayed surgical recovery.
Appoloni, Aline Helena; Herdman, T Heather; Napoleão, Anamaria Alves; Campos de Carvalho, Emilia; Hortense, Priscilla
2013-10-01
To analyze the human response of delayed surgical recovery, approved by NANDA-I, and to validate its defining characteristics (DCs) and related factors (RFs). This was a two-part study using a concept analysis based on the method of Walker and Avant, and diagnostic content validation based on Fehring's model. Three of the original DCs, and three proposed DCs identified from the concept analysis, were validated in this study; five of the original RFs and four proposed RFs were validated. A revision of the concept studied is suggested, incorporating the validation of some of the DCs and RFs presented by NANDA-I, and the insertion of new, validated DCs and RFs. This study may enable the extension of the use of this diagnosis and contribute to quality surgical care of clients. © 2013, The Authors. International Journal of Nursing Knowledge © 2013, NANDA International.
Extension of the firefly algorithm and preference rules for solving MINLP problems
NASA Astrophysics Data System (ADS)
Costa, M. Fernanda P.; Francisco, Rogério B.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.
2017-07-01
An extension of the firefly algorithm (FA) for solving mixed-integer nonlinear programming (MINLP) problems is presented. Although penalty functions are nowadays frequently used to handle integrality conditions and inequality and equality constraints, this paper proposes the implementation within the FA of a simple rounding-based heuristic and four preference rules to find, and converge to, MINLP-feasible solutions. Preliminary numerical experiments are carried out to validate the proposed methodology.
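The rounding-based feasibility idea can be sketched independently of the full firefly algorithm: continuous positions drive the attraction move, and integrality is imposed only when a trial point is evaluated. The toy objective, bounds, and step sizes below are invented, and the paper's four preference rules are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
int_mask = np.array([True, False])            # x0 integer, x1 continuous (assumed)
lower, upper = np.array([0.0, 0.0]), np.array([10.0, 10.0])

def evaluate(x):
    z = x.copy()
    z[int_mask] = np.round(z[int_mask])       # rounding-based feasibility heuristic
    return (z[0] - 3) ** 2 + (z[1] - 2.5) ** 2, z

# One illustrative "move towards a brighter firefly" step on continuous positions.
x_i, x_j = rng.uniform(lower, upper, 2), rng.uniform(lower, upper, 2)
x_new = x_i + 0.5 * (x_j - x_i) + 0.1 * rng.normal(size=2)
x_new = np.clip(x_new, lower, upper)
f_new, z_new = evaluate(x_new)
print("trial point", z_new, "objective", f_new)
```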
Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed
NASA Technical Reports Server (NTRS)
Tian, Ye; Song, Qi; Cattafesta, Louis
2005-01-01
This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work summarized consists primarily of two parts. The first part summarizes our previous work and the extensions to adaptive ID and control algorithms. The second part concentrates on the validation of adaptive algorithms by applying them to a vibration beam test bed. Extensions to flow control problems are discussed.
NASA Technical Reports Server (NTRS)
Deshler, Terry; Hervig, Mark E.
1998-01-01
The efforts envisioned within the original proposal (accepted February 1994) and the extension of this proposal (accepted February 1997) included measurement validations, the retrieval of aerosol size distributions and distribution moments, aerosol correction studies, and investigations of polar stratospheric clouds. A majority of the results from this grant have been published. The principal results from this grant are discussed.
Zhanqing Li; Feng Niu; Kwon-Ho Lee; Jinyuan Xin; Wei Min Hao; Bryce L. Nordgren; Yuesi Wang; Pucai Wang
2007-01-01
The Moderate Resolution Imaging Spectroradiometer (MODIS) currently provides the most extensive aerosol retrievals on a global basis, but validation is limited to a small number of ground stations. This study presents a comprehensive evaluation of Collection 4 and 5 MODIS aerosol products using ground measurements from the Chinese Sun Hazemeter Network (CSHNET). The...
A Cost Analysis Model for Army Sponsored Graduate Dental Education Programs.
1997-04-01
characteristics of a good measurement tool? Cooper and Emory, in their textbook Business Research Methods, state there are three major criteria for evaluating ... a measurement tool: validity, reliability, and practicality (Cooper and Emory 1995). Validity can be compartmentalized into internal and external ... tremendous expense? The AEGD-1 year program is used extensively as a recruiting tool to encourage senior dental students to join the Army Dental Corps. The
Impact of imaging measurements on response assessment in glioblastoma clinical trials
Reardon, David A.; Ballman, Karla V.; Buckner, Jan C.; Chang, Susan M.; Ellingson, Benjamin M.
2014-01-01
We provide historical and scientific guidance on imaging response assessment for incorporation into clinical trials to stimulate effective and expedited drug development for recurrent glioblastoma by addressing 3 fundamental questions: (i) What is the current validation status of imaging response assessment, and when are we confident assessing response using today's technology? (ii) What imaging technology and/or response assessment paradigms can be validated and implemented soon, and how will these technologies provide benefit? (iii) Which imaging technologies need extensive testing, and how can they be prospectively validated? Assessment of T1 +/− contrast, T2/FLAIR, diffusion, and perfusion-imaging sequences is routine and provides important insight into underlying tumor activity. Nonetheless, the utility of these data within and across patients, as well as across institutions, is limited by challenges in quantifying measurements accurately and by the lack of consistent and standardized image acquisition parameters. Currently, there exists a critical need to generate guidelines optimizing and standardizing MRI sequences for neuro-oncology patients. Additionally, more accurate differentiation of confounding factors (pseudoprogression or pseudoresponse) may be valuable. Although promising, diffusion MRI, perfusion MRI, MR spectroscopy, and amino acid PET require extensive standardization and validation. Finally, additional techniques to enhance response assessment, such as digital T1 subtraction maps, warrant further investigation. PMID:25313236
Anderst, William; Baillargeon, Emma; Donaldson, William; Lee, Joon; Kang, James
2013-01-01
Study Design Case-control. Objective To characterize the motion path of the instant center of rotation (ICR) at each cervical motion segment from C2 to C7 during dynamic flexion-extension in asymptomatic subjects. To compare asymptomatic and single-level arthrodesis patient ICR paths. Summary of Background Data The ICR has been proposed as an alternative to range of motion (ROM) for evaluating the quality of spine movement and for identifying abnormal midrange kinematics. The motion path of the ICR during dynamic motion has not been reported. Methods 20 asymptomatic controls, 12 C5/C6 and 5 C6/C7 arthrodesis patients performed full ROM flexion-extension while biplane radiographs were collected at 30 Hz. A previously validated tracking process determined three-dimensional vertebral position with sub-millimeter accuracy. The finite helical axis method was used to calculate the ICR between adjacent vertebrae. A linear mixed-model analysis identified differences in the ICR path among motion segments and between controls and arthrodesis patients. Results From C2/C3 to C6/C7, the mean ICR location moved superior for each successive motion segment (p < .001). The AP change in ICR location per degree of flexion-extension decreased from the C2/C3 motion segment to the C6/C7 motion segment (p < .001). Asymptomatic subject variability (95% CI) in the ICR location averaged ±1.2 mm in the SI direction and ±1.9 mm in the AP direction over all motion segments and flexion-extension angles. Asymptomatic and arthrodesis groups were not significantly different in terms of average ICR position (all p ≥ .091) or in terms of the change in ICR location per degree of flexion-extension (all p ≥ .249). Conclusions To replicate asymptomatic in vivo cervical motion, disc replacements should account for level-specific differences in the location and motion path of ICR. Single-level anterior arthrodesis does not appear to affect cervical motion quality during flexion-extension. PMID:23429677
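The geometric idea behind locating an ICR from tracked vertebral poses can be shown in its planar form: the ICR is the fixed point of the rigid transform relating one frame's relative vertebral pose to the next. The 3-D study uses the finite helical axis; the 2-D sketch below, with an invented rotation and translation, is the flexion-extension analogue.

```python
import numpy as np

# Planar sketch: given the intervertebral rotation and translation between two
# frames, the ICR is the point left unchanged by that transform.
theta = np.deg2rad(4.0)                       # invented rotation between frames
t = np.array([1.2, -0.4])                     # invented translation, in mm
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Solve R @ c + t = c, i.e. (I - R) @ c = t.
icr = np.linalg.solve(np.eye(2) - R, t)
print("ICR (mm):", icr)
```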
How Many Batches Are Needed for Process Validation under the New FDA Guidance?
Yang, Harry
2013-01-01
The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
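One simple Beta-Binomial reading of this idea (a sketch under assumed numbers, not necessarily the paper's exact formulation) treats Stage 1 knowledge as a Beta prior on the probability that a batch conforms, assumes all Stage 2 PPQ batches conform, and picks the smallest number of batches for which the posterior gives the required assurance that the conformance probability exceeds a target.

```python
from scipy import stats

a0, b0 = 18, 2              # assumed Beta prior from Stage 1 Process Design
p_target = 0.80             # required long-run probability of a conforming batch
assurance = 0.95            # required posterior probability that p > p_target

for n in range(1, 31):
    posterior = stats.beta(a0 + n, b0)      # posterior after n conforming PPQ batches
    if posterior.sf(p_target) >= assurance:
        print("number of PPQ batches:", n)
        break
```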
Szymańska, Anna; Szymański, Marcin; Czekajska-Chehab, Elżbieta; Szczerbo-Trojanowska, Małgorzata
2015-01-01
Juvenile nasopharyngeal angiofibroma is a benign, locally aggressive nasopharyngeal tumor. Apart from anterior lateral extension to the pterygopalatine fossa, it may spread laterally posterior to the pterygoid process, showing posterior lateral growth pattern, which is less common and more difficult to identify during surgery. We analyzed the routes of lateral spread, modalities useful in its diagnosis, the incidence of lateral extension and its influence on outcomes of surgical treatment. The records of 37 patients with laterally extending JNA treated at our institution between 1987 and 2011 were retrospectively evaluated. Computed tomography was performed in all patients and magnetic resonance imaging in 17 (46 %) patients. CT and MRI were evaluated to determine routes and extension of JNA lateral spread. Anterior lateral extension to the pterygopalatine fossa occurred in 36 (97 %) patients and further to the infratemporal fossa in 20 (54 %) patients. In 16 (43 %) cases posterior lateral spread was observed: posterior to the pterygoid process and/or between its plates. The recurrence rate was 29.7 % (11/37). The majority of residual lesions was located behind the pterygoid process (7/11). Recurrent disease occurred in 3/21 patients with anterior lateral extension, in 7/15 patients with both types of lateral extensions and in 1 patient with posterior lateral extension. JNA posterior lateral extension may spread behind the pterygoid process or between its plates. The recurrence rate in patients with anterior and/or posterior lateral extension is significantly higher than in patients with anterior lateral extension only. Both CT and MRI allow identification of the anterior and posterior lateral extensions.
NASA Technical Reports Server (NTRS)
Zenie, Alexandre; Luguern, Jean-Pierre
1987-01-01
The specification, verification, validation, and evaluation, which make up the different steps of the CS-PN software, are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request, or couple (transaction, granule), treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately to deduce a verification, validation, and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules, and messages.
Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K
2012-01-01
The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and completed the questionnaire again two weeks later. The questionnaire was subjected to factor, reliability, and validity analyses at both measurement times for purposes of cross-validating the results. A two-factor solution was observed, representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.
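Scoring the two subscales and checking their internal consistency is straightforward once the item keys are known. The sketch below simulates responses from two latent factors and computes Cronbach's alpha per subscale; the 8/7 split of items is a placeholder, not the published item key.

```python
import numpy as np

# Simulate correlated item responses: each subscale driven by one latent factor.
rng = np.random.default_rng(4)
n = 200
sys_items = rng.normal(size=(n, 1)) + 0.7 * rng.normal(size=(n, 8))
heu_items = rng.normal(size=(n, 1)) + 0.7 * rng.normal(size=(n, 7))

def cronbach_alpha(X):
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print("systematic subscale alpha:", round(cronbach_alpha(sys_items), 2))
print("heuristic subscale alpha:", round(cronbach_alpha(heu_items), 2))
```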
The New Millenium Program: Serving Earth and Space Sciences
NASA Technical Reports Server (NTRS)
Li, Fuk K.
2000-01-01
NASA has exciting plans for space science and Earth observations during the next decade. A broad range of advanced spacecraft and measurement technologies will be needed to support these plans within the existing budget and schedule constraints. Many of these technology needs are common to both NASA's Office of Earth Science (OES) and Office of Space Sciences (OSS). Even though some breakthrough technologies have been identified to address these needs, project managers have traditionally been reluctant to incorporate them into flight programs because of their inherent development risk. To accelerate the infusion of new technologies into its OES and OSS missions, NASA established the New Millennium Program (NMP). This program analyzes the capability needs of these enterprises, identifies candidate technologies to address these needs, incorporates advanced technology suites into validation flights, validates them in the relevant space environment, and then proactively infuses the validated technologies into future missions to enhance their capabilities while reducing their life cycle cost. The NMP employs a cross-enterprise Science Working Group and the NASA Enterprise science and technology roadmaps to define the capabilities needed by future Earth and Space science missions. Additional input from the science community is gathered through open workshops and peer-reviewed NASA Research Announcements (NRAs) for advanced measurement concepts. Technology development inputs from the technology organizations within NASA, other government agencies, federally funded research and development centers (FFRDCs), U.S. industry, and academia are sought to identify breakthrough technologies that might address these needs. This approach significantly extends NASA's technology infrastructure. To complement other flight test programs that develop or validate individual components, the NMP places its highest priority on system-level validations of technology suites in the relevant space environment. This approach is not needed for all technologies, but it is usually essential to validate advanced system architectures or new measurement concepts. The NMP has recently revised its processes for defining candidate validation flights and selecting technologies for these flights. The NMP now employs integrated project formulation teams, which include scientists, technologists, and mission planners, to incorporate technology suites into candidate validation flights. These teams develop competing concepts, which can be rigorously evaluated prior to selection for flight. The technology providers for each concept are selected through an open, competitive process during the project formulation phase. If their concept is selected for flight, they are incorporated into the Project Implementation Team, which develops, integrates, tests, launches, and operates the technology validation flight. Throughout the project implementation phase, the Implementation Team will document and disseminate their validation results to facilitate the infusion of their validated technologies into future OSS and OES science missions. The NMP has successfully launched its first two Deep Space flights for the OSS, and is currently implementing its first two Earth Orbiting flights for the OES. The next OSS and OES flights are currently being defined.
Even though these flights are focused on specific Space Science and Earth Science themes, they are designed to validate a range of technologies that could benefit both enterprises, including advanced propulsion, communications, autonomous operations and navigation, multifunctional structures, microelectronics, and advanced instruments. Specific examples of these technologies will be provided in our presentation. The processes developed by the NMP also provide benefits across the Space and Earth Science enterprises. In particular, the extensive, nation-wide technology infrastructure developed by the NMP enhances the access to breakthrough technologies for both enterprises.
UTCI-Fiala multi-node model of human heat transfer and temperature regulation
NASA Astrophysics Data System (ADS)
Fiala, Dusan; Havenith, George; Bröde, Peter; Kampmann, Bernhard; Jendritzky, Gerd
2012-05-01
The UTCI-Fiala mathematical model of human temperature regulation forms the basis of the new Universal Thermal Climate Index (UTCI). Following extensive validation tests, adaptations and extensions, such as the inclusion of an adaptive clothing model, the model was used to predict human temperature and regulatory responses for combinations of the prevailing outdoor climate conditions. This paper provides an overview of the underlying algorithms and methods that constitute the multi-node dynamic UTCI-Fiala model of human thermal physiology and comfort. Treated topics include modelling heat and mass transfer within the body, numerical techniques, modelling environmental heat exchanges, thermoregulatory reactions of the central nervous system, and perceptual responses. Other contributions of this special issue describe the validation of the UTCI-Fiala model against measured data and the development of the adaptive clothing model for outdoor climates.
Use of a hardware token for Grid authentication by the MICE data distribution framework
NASA Astrophysics Data System (ADS)
Nebrensky, JJ; Martyniak, J.
2017-10-01
The international Muon Ionization Cooling Experiment (MICE) is designed to demonstrate the principle of muon ionisation cooling for the first time. Data distribution and archiving, batch reprocessing, and simulation are all carried out using the EGI Grid infrastructure, in particular the facilities provided by GridPP in the UK. To prevent interference - especially accidental data deletion - these activities are separated by different VOMS roles. Data acquisition, in particular, can involve 24/7 operation for a number of weeks and so for moving the data out of the MICE Local Control Room at the experiment a valid, VOMS-enabled, Grid proxy must be made available continuously over that time. The MICE "Data Mover" agent is now using a robot certificate stored on a hardware token (Feitian ePass2003) from which a cron job generates a “plain” proxy to which the VOMS authorisation extensions are added in a separate transaction. A valid short-lifetime proxy is thus continuously available to the Data Mover process. The Feitian ePass2003 was chosen because it was both significantly cheaper and easier to actually purchase than the token commonly referred to in the community at that time; however there was no software support for the hardware. This paper describes the software packages, process and commands used to deploy the token into production.
Neural network modeling of the kinetics of SO2 removal by fly ash-based sorbent.
Raymond-Ooi, E H; Lee, K T; Mohamed, A R; Chu, K H
2006-01-01
The mechanistic modeling of the sulfation reaction between fly ash-based sorbent and SO2 is a challenging task due to a variety of reasons, including the complexity of the reaction itself and the inability to measure some of the key parameters of the reaction. In this work, the possibility of modeling the sulfation reaction kinetics using a purely data-driven neural network was investigated. Experiments on SO2 removal by a sorbent prepared from coal fly ash/CaO/CaSO4 were conducted using a fixed bed reactor to generate a database to train and validate the neural network model. Extensive SO2 removal data points were obtained by varying three process variables, namely, SO2 inlet concentration (500-2000 mg/L), reaction temperature (60-80 °C), and relative humidity (50-70%), as a function of reaction time (0-60 min). Modeling results show that the neural network can provide excellent fits to the SO2 removal data after considerable training and can be successfully used to predict the extent of SO2 removal as a function of time even when the process variables are outside the training domain. From a modeling standpoint, the suitably trained and validated neural network with excellent interpolation and extrapolation properties could have immediate practical benefits in the absence of a theoretical model.
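The data-driven setup described here maps the three process variables plus reaction time to the extent of SO2 removal. A minimal sketch with the same inputs but synthetic training data (a scikit-learn multilayer perceptron stands in for the network, whose architecture is not specified in the abstract):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the fixed-bed reactor data set.
rng = np.random.default_rng(5)
n = 300
X = np.column_stack([rng.uniform(500, 2000, n),    # SO2 inlet concentration, mg/L
                     rng.uniform(60, 80, n),       # reaction temperature, deg C
                     rng.uniform(50, 70, n),       # relative humidity, %
                     rng.uniform(0, 60, n)])       # reaction time, min
y = 0.8 * (1 - np.exp(-X[:, 3] / 20)) * (X[:, 2] / 70)   # invented removal curve

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X, y)
print("predicted removal at 1000 mg/L, 70 degC, 60 %RH, 30 min:",
      model.predict([[1000, 70, 60, 30]])[0])
```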
LLCEDATA and LLCECALC for Windows version 1.0, Volume 3: Software verification and validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFadden, J.G.
1998-09-04
LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of databases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank from which it originates. LLCECALC reads the EDF and the gamma assay file (AV2) that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, which discusses system limitations and provides recommendations for the LLCE process. Volume 3 documents LLCEDATA and LLCECALC's verification and validation. Two of the three installation test cases from Volume 1 are independently confirmed. Databases used in LLCEDATA are verified and referenced. Both phases of LLCECALC, gamma processing and characterization, are extensively tested to verify that the methodology and algorithms used are correct.
Etiological and Clinical Features of Childhood Psychotic Symptoms
Polanczyk, Guilherme; Moffitt, Terrie E.; Arseneault, Louise; Cannon, Mary; Ambler, Antony; Keefe, Richard S. E.; Houts, Renate; Odgers, Candice L.; Caspi, Avshalom
2013-01-01
Context It has been reported that childhood psychotic symptoms are common in the general population and may signal neurodevelopmental processes that lead to schizophrenia. However, it is not clear whether these symptoms are associated with the same extensive risk factors established for adult schizophrenia. Objective To examine the construct validity of children’s self-reported psychotic symptoms by testing whether these symptoms share the risk factors and clinical features of adult schizophrenia. Design Prospective, longitudinal cohort study of a nationally representative birth cohort in Great Britain. Participants A total of 2232 twelve-year-old children followed up since age 5 years (retention, 96%). Main Outcome Measure Children’s self-reported hallucinations and delusions. Results Children’s psychotic symptoms are familial and heritable and are associated with social risk factors (eg, urbanicity); cognitive impairments at age 5; home-rearing risk factors (eg, maternal expressed emotion); behavioral, emotional, and educational problems at age 5; and comorbid conditions, including self-harm. Conclusions The results provide a comprehensive picture of the construct validity of children’s self-reported psychotic symptoms. For researchers, the findings indicate that children who have psychotic symptoms can be recruited for neuroscience research to determine the pathogenesis of schizophrenia. For clinicians, the findings indicate that psychotic symptoms in childhood are often a marker of an impaired developmental process and should be actively assessed. PMID:20368509
Ravi, Daniele; Fabelo, Himar; Callic, Gustavo Marrero; Yang, Guang-Zhong
2017-09-01
Recent advances in hyperspectral imaging have made it a promising solution for intra-operative tissue characterization, with the advantages of being non-contact, non-ionizing, and non-invasive. Working with hyperspectral images in vivo, however, is not straightforward, as the high dimensionality of the data makes real-time processing challenging. In this paper, a novel dimensionality reduction scheme and a new processing pipeline are introduced to obtain a detailed tumor classification map for intra-operative margin definition during brain surgery. However, existing approaches to dimensionality reduction based on manifold embedding can be time-consuming and may not guarantee a consistent result, thus hindering final tissue classification. The proposed framework aims to overcome these problems through a process divided into two steps: dimensionality reduction based on an extension of the t-distributed stochastic neighbor embedding (t-SNE) approach is first performed, and then a semantic segmentation technique is applied to the embedded results by using a Semantic Texton Forest for tissue classification. Detailed in vivo validation of the proposed method has been performed to demonstrate the potential clinical value of the system.
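The two-step structure of the pipeline (embed, then classify the embedded pixels) can be sketched with stock components: a standard t-SNE embedding and a random forest standing in for the paper's modified embedding and Semantic Texton Forest, applied here to synthetic spectra.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.ensemble import RandomForestClassifier

# Synthetic "spectra" for two tissue classes; real inputs would be the
# calibrated hyperspectral pixel vectors.
rng = np.random.default_rng(6)
spectra = np.vstack([rng.normal(0.0, 1.0, size=(200, 128)),
                     rng.normal(0.8, 1.0, size=(200, 128))])
labels = np.repeat([0, 1], 200)

# Step 1: dimensionality reduction; Step 2: classify in the embedded space.
embedded = TSNE(n_components=2, random_state=0).fit_transform(spectra)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(embedded, labels)
print("training accuracy on embedded spectra:", clf.score(embedded, labels))
```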
NASA Astrophysics Data System (ADS)
Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian
2018-02-01
This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic techniques. We discuss the development of a pipeline that achieves fully automatic and parameter-independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered GeoTIFF image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of Mars' static and dynamic features. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require the extensive use of human resources.
Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code
NASA Astrophysics Data System (ADS)
Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.
2015-08-01
MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on c++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single particle radiation (including both direct ionization and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.
Rashid, Haroon; Sheikh, Zeeshan; Misbahuddin, Syed; Kazmi, Murtaza Raza; Qureshi, Sameer; Uddin, Muhammad Zuhaib
2016-01-01
Tooth wear is a process that usually results from tooth-to-tooth and/or tooth-restoration contact. The process of wear is essentially accelerated by the introduction of restorations into the oral cavity, especially in the case of opposing ceramic restorations. The newest materials have vastly contributed toward the interest in esthetic dental restorations and have been extensively studied in laboratories. However, despite recent technological advancements, there is no validated in vivo method for evaluating the clinical wear caused by ceramics on restored teeth and the natural dentition. The aim of this paper is to review the latest advancements in all-ceramic materials and their effect on the wear of the opposing dentition. This descriptive review has been written after a thorough MEDLINE/PubMed search by the authors. It is imperative that clinicians are aware of recent advancements and that they always consider the type of ceramic restorative material used in order to maintain a stable occlusal relation. Ceramic restorations should be adequately finished and polished after chair-side adjustment of the occlusal surfaces. PMID:28042280
Song, Mingkai; Cui, Linlin; Kuang, Han; Zhou, Jingwei; Yang, Pengpeng; Zhuang, Wei; Chen, Yong; Liu, Dong; Zhu, Chenjie; Chen, Xiaochun; Ying, Hanjie; Wu, Jinglan
2018-08-10
An intermittent simulated moving bed (3F-ISMB) operation scheme, an extension of the 3W-ISMB to the non-linear adsorption region, has been introduced for the separation of a glucose, lactic acid and acetic acid ternary mixture. This work focuses on exploring the feasibility of the proposed process theoretically and experimentally. Firstly, a real 3F-ISMB model coupling the transport dispersive model (TDM) with the Modified-Langmuir isotherm was established to build up the separation parameter plane. Subsequently, three operating conditions were selected from the plane to run the 3F-ISMB unit. The experimental results were used to verify the model. Afterwards, the influences of the various flow rates on the separation performance were investigated systematically by means of the validated 3F-ISMB model. The intermittently retained component, lactic acid, was finally obtained with a purity of 98.5%, a recovery of 95.5% and an average concentration of 38 g/L. The proposed 3F-ISMB process can efficiently separate a mixture with low selectivity into three fractions. Copyright © 2018 Elsevier B.V. All rights reserved.
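A minimal sketch of two ingredients named in this record, assuming a linear-plus-Langmuir ("modified Langmuir") isotherm form with placeholder coefficients; the purity and recovery definitions are the usual fraction-based ones, not fitted values from the study.

    def modified_langmuir(c, henry=0.2, q_sat=50.0, b=0.05):
        """Linear-plus-Langmuir isotherm, q = H*c + q_sat*b*c/(1 + b*c).
        Coefficients are placeholders, not the study's fitted values."""
        return henry * c + q_sat * b * c / (1.0 + b * c)

    def purity_and_recovery(target_mass, other_mass, target_fed):
        """Separation performance indicators for a collected fraction."""
        purity = target_mass / (target_mass + other_mass)
        recovery = target_mass / target_fed
        return purity, recovery

    print(modified_langmuir(10.0))
    print(purity_and_recovery(38.0, 0.6, 40.0))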
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voltolini, Marco; Kwon, Tae-Hyuk; Ajo-Franklin, Jonathan
2017-10-21
Pore-scale distribution of supercritical CO2 (scCO2) exerts significant control on a variety of key hydrologic as well as geochemical processes, including residual trapping and dissolution. Despite such importance, only a small number of experiments have directly characterized the three-dimensional distribution of scCO2 in geologic materials during the invasion (drainage) process. Here, we present a study which couples dynamic high-resolution synchrotron X-ray micro-computed tomography imaging of a scCO2/brine system at in situ pressure/temperature conditions with quantitative pore-scale modeling to allow direct validation of a pore-scale description of scCO2 distribution. The experiment combines high-speed synchrotron radiography with tomography to characterize the brine-saturated sample, the scCO2 breakthrough process, and the partially saturated state of a sandstone sample from the Domengine Formation, a regionally extensive unit within the Sacramento Basin (California, USA). The availability of a 3D dataset allowed us to examine correlations between grain and pore morphometric parameters and the actual distribution of scCO2 in the sample, including the examination of the role of small-scale sedimentary structure on CO2 distribution. The segmented scCO2/brine volume was also used to validate a simple computational model based on the local thickness concept, able to accurately simulate the distribution of scCO2 after drainage. The same method was also used to simulate Hg capillary pressure curves with satisfactory results when compared to the measured ones. Finally, this predictive approach, requiring only a tomographic scan of the dry sample, proved to be an effective route for studying processes related to CO2 invasion structure in geological samples at the pore scale.
Predictive Validation of an Influenza Spread Model
Hyder, Ayaz; Buckeridge, David L.; Leung, Brian
2013-01-01
Background: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability. PMID:23755236
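A minimal sketch of the kind of deviation metrics used when comparing a simulated epidemic curve against an observed one (peak-week offset, peak-intensity error, RMSE); the weekly counts below are illustrative only.

    import numpy as np

    def epidemic_forecast_error(observed, simulated):
        """Compare an observed weekly case-count curve with a simulated one.
        Returns peak-week offset, peak-intensity error and RMSE of the curves."""
        observed = np.asarray(observed, float)
        simulated = np.asarray(simulated, float)
        peak_offset = int(np.argmax(simulated)) - int(np.argmax(observed))
        intensity_error = float(simulated.max() - observed.max())
        rmse = float(np.sqrt(np.mean((simulated - observed) ** 2)))
        return peak_offset, intensity_error, rmse

    obs = [2, 5, 12, 30, 55, 40, 22, 9, 3]
    sim = [1, 4, 10, 26, 58, 45, 20, 8, 2]
    print(epidemic_forecast_error(obs, sim))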
Modeling of phosphorus loads in sugarcane in a low-relief landscape using ontology-based simulation.
Kwon, Ho-Young; Grunwald, Sabine; Beck, Howard W; Jung, Yunchul; Daroub, Samira H; Lang, Timothy A; Morgan, Kelly T
2010-01-01
Water flow and P dynamics in a low-relief landscape manipulated by extensive canal and ditch drainage systems were modeled utilizing an ontology-based simulation model. In the model, soil water flux and the processes between three soil inorganic P pools (labile, active, and stable) and organic P are represented as database objects, and user-defined relationships among objects are used to automatically generate computer code (Java) for running the simulation of discharge and P loads. Our objectives were to develop ontology-based descriptions of soil P dynamics within sugarcane- (Saccharum officinarum L.) grown farm basins of the Everglades Agricultural Area (EAA) and to calibrate and validate such processes with water quality monitoring data collected at one farm basin (1244 ha). In the calibration phase (water year [WY] 99-00), observed discharge totaled 11,114 m3 ha(-1) and dissolved P 0.23 kg P ha(-1); in the validation phase (WY 02-03), discharge was 10,397 m3 ha(-1) and dissolved P 0.11 kg P ha(-1). During WY 99-00 the root mean square error (RMSE) for monthly discharge was 188 m3 ha(-1) and for monthly dissolved P 0.0077 kg P ha(-1), whereas during WY 02-03 the RMSE for monthly discharge was 195 m3 ha(-1) and for monthly dissolved P 0.0022 kg P ha(-1). These results were corroborated by Nash-Sutcliffe coefficients of 0.69 (calibration) and 0.81 (validation) comparing measured and simulated P loads. The good model performance suggests that the model has promise for simulating P dynamics and may be useful as a management tool to reduce P loads in other similar low-relief areas.
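The goodness-of-fit statistics quoted in this record can be computed as follows; the monthly loads in the example are hypothetical, not the EAA monitoring data.

    import numpy as np

    def rmse(observed, simulated):
        o, s = np.asarray(observed, float), np.asarray(simulated, float)
        return float(np.sqrt(np.mean((s - o) ** 2)))

    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - sum((o - s)^2) / sum((o - mean(o))^2)."""
        o, s = np.asarray(observed, float), np.asarray(simulated, float)
        return float(1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2))

    # Hypothetical monthly dissolved-P loads (kg P/ha), not the study data.
    obs = [0.02, 0.05, 0.01, 0.03, 0.04]
    sim = [0.03, 0.04, 0.01, 0.03, 0.05]
    print(rmse(obs, sim), nash_sutcliffe(obs, sim))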
Characterization and Validation of Transiting Planets in the TESS SPOC Pipeline
NASA Astrophysics Data System (ADS)
Twicken, Joseph D.; Caldwell, Douglas A.; Davies, Misty; Jenkins, Jon Michael; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Wohler, Bill
2018-06-01
Light curves for Transiting Exoplanet Survey Satellite (TESS) target stars will be extracted and searched for transiting planet signatures in the Science Processing Operations Center (SPOC) Science Pipeline at NASA Ames Research Center. Targets for which the transiting planet detection threshold is exceeded will be processed in the Data Validation (DV) component of the Pipeline. The primary functions of DV are to (1) characterize planets identified in the transiting planet search, (2) search for additional transiting planet signatures in light curves after modeled transit signatures have been removed, and (3) perform a comprehensive suite of diagnostic tests to aid in discrimination between true transiting planets and false positive detections. DV data products include extensive reports by target, one-page summaries by planet candidate, and tabulated transit model fit and diagnostic test results. DV products may be employed by humans and automated systems to vet planet candidates identified in the Pipeline. TESS will launch in 2018 and survey the full sky for transiting exoplanets over a period of two years. The SPOC pipeline was ported from the Kepler Science Operations Center (SOC) codebase and extended for TESS after the mission was selected for flight in the NASA Astrophysics Explorer program. We describe the Data Validation component of the SPOC Pipeline. The diagnostic tests exploit the flux (i.e., light curve) and pixel time series associated with each target to support the determination of the origin of each purported transiting planet signature. We also highlight the differences between the DV components for Kepler and TESS. Candidate planet detections and data products will be delivered to the Mikulski Archive for Space Telescopes (MAST); the MAST URL is archive.stsci.edu/tess. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
Chatlapalli, S; Nazeran, H; Melarkod, V; Krishnam, R; Estrada, E; Pamula, Y; Cabrera, S
2004-01-01
The electrocardiogram (ECG) signal is used extensively as a low-cost diagnostic tool to provide information concerning the heart's state of health. Accurate determination of the QRS complex, in particular reliable detection of the R wave peak, is essential in computer-based ECG analysis. ECG data from Physionet's Sleep-Apnea database were used to develop, test, and validate a robust heart rate variability (HRV) signal derivation algorithm. The HRV signal was derived from pre-processed ECG signals by developing an enhanced Hilbert transform (EHT) algorithm with built-in missing-beat detection capability for reliable QRS detection. The performance of the EHT algorithm was then compared against that of a popular Hilbert transform-based (HT) QRS detection algorithm. Autoregressive (AR) modeling of the HRV power spectrum for both EHT- and HT-derived HRV signals was performed, and different parameters from their power spectra as well as approximate entropy were derived for comparison. Poincaré plots were then used as a visualization tool to highlight the detection of the missing beats in the EHT method. After validation of the EHT algorithm on ECG data from Physionet, the algorithm was further tested and validated on a dataset obtained from children undergoing polysomnography for detection of sleep disordered breathing (SDB). Sensitive measures of accurate HRV signals were then derived to be used in detecting and diagnosing sleep disordered breathing in children. All signal processing algorithms were implemented in MATLAB. We present a description of the EHT algorithm and analyze pilot data for eight children undergoing nocturnal polysomnography. The pilot data demonstrated that the EHT method provides an accurate way of deriving the HRV signal and plays an important role in the extraction of reliable measures to distinguish between periods of normal and sleep disordered breathing (SDB) in children.
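The study's algorithms were implemented in MATLAB; the following Python sketch only illustrates the basic idea of Hilbert-envelope R-peak detection and RR-interval (HRV) derivation, without the enhanced missing-beat logic of the EHT method. All signal parameters are illustrative.

    import numpy as np
    from scipy.signal import hilbert, find_peaks

    def hrv_from_ecg(ecg, fs):
        """Simplified Hilbert-envelope R-peak detector and RR-interval series."""
        envelope = np.abs(hilbert(ecg - np.mean(ecg)))
        # R peaks: prominent envelope maxima separated by a refractory period (~250 ms)
        peaks, _ = find_peaks(envelope, distance=int(0.25 * fs),
                              height=0.5 * envelope.max())
        rr_intervals = np.diff(peaks) / fs      # seconds between successive beats
        return peaks, rr_intervals

    fs = 250
    t = np.arange(0, 10, 1 / fs)
    ecg = np.zeros_like(t)
    ecg[::fs] = 1.0                             # toy "beats", one per second
    peaks, rr = hrv_from_ecg(ecg, fs)
    print(len(peaks), rr[:3])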
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jerban, Saeed, E-mail: saeed.jerban@usherbrooke.ca
2016-08-15
The pore interconnection size of β-tricalcium phosphate scaffolds plays an essential role in the bone repair process. Although the μCT technique is widely used in the biomaterial community, it is rarely used to measure the interconnection size because of the lack of suitable algorithms. In addition, the discrete nature of μCT introduces large systematic errors due to the convex geometry of interconnections. We proposed, verified and validated a novel pore-level algorithm to accurately characterize the individual pores and interconnections. Specifically, pores and interconnections were isolated, labeled, and individually analyzed with high accuracy. The technique was verified thoroughly by visually inspecting and verifying over 3474 properties of randomly selected pores. This extensive verification process passed a one-percent accuracy criterion. Scanning errors inherent in the discretization, which lead to both dummy and significantly overestimated interconnections, were examined using computer-based simulations and additional high-resolution scanning. Accurate correction charts were then developed and used to reduce the scanning errors. Only after these corrections did the μCT- and SEM-based results converge, and the novel algorithm was validated. Materials scientists with access to all geometrical properties of individual pores and interconnections, using the novel algorithm, will have a more detailed and accurate description of the substitute architecture and a potentially deeper understanding of the link between geometric and biological interactions. - Highlights: •An algorithm is developed to analyze individually all pores and interconnections. •After pore isolation, the discretization errors in interconnections were corrected. •Dummy interconnections and overestimated sizes were due to thin material walls. •The isolating algorithm was verified through visual inspection (99% accurate). •After correcting for the systematic errors, the algorithm was validated successfully.
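A minimal sketch of the pore-isolation step, using connected-component labeling on a segmented micro-CT volume; the published algorithm goes much further (interconnection isolation and discretization-error correction), and the voxel size here is hypothetical.

    import numpy as np
    from scipy import ndimage

    def label_pores(binary_volume, voxel_size_um=5.0):
        """Isolate and measure individual pores in a segmented (boolean) micro-CT volume."""
        labels, n_pores = ndimage.label(binary_volume)
        voxel_counts = ndimage.sum(binary_volume, labels, index=range(1, n_pores + 1))
        volumes_um3 = voxel_counts * voxel_size_um ** 3
        return labels, volumes_um3

    # Tiny synthetic volume with two disconnected pores.
    vol = np.zeros((10, 10, 10), bool)
    vol[1:3, 1:3, 1:3] = True
    vol[6:9, 6:9, 6:9] = True
    labels, volumes = label_pores(vol)
    print(len(volumes), volumes)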
Validating modelled variable surface saturation in the riparian zone with thermal infrared images
NASA Astrophysics Data System (ADS)
Glaser, Barbara; Klaus, Julian; Frei, Sven; Frentress, Jay; Pfister, Laurent; Hopp, Luisa
2015-04-01
Variable contributing areas and hydrological connectivity have become prominent new concepts for hydrologic process understanding in recent years. The dynamic connectivity within the hillslope-riparian-stream (HRS) system is known to have a first order control on discharge generation and especially the riparian zone functions as runoff buffering or producing zone. However, despite their importance, the highly dynamic processes of contraction and extension of saturation within the riparian zone and its impact on runoff generation still remain not fully understood. In this study, we analysed the potential of a distributed, fully coupled and physically based model (HydroGeoSphere) to represent the spatial and temporal water flux dynamics of a forested headwater HRS system (6 ha) in western Luxembourg. The model was set up and parameterised under consideration of experimentally-derived knowledge of catchment structure and was run for a period of four years (October 2010 to August 2014). For model evaluation, we especially focused on the temporally varying spatial patterns of surface saturation. We used ground-based thermal infrared (TIR) imagery to map surface saturation with a high spatial and temporal resolution and collected 20 panoramic snapshots of the riparian zone (ca. 10 by 20 m) under different hydrologic conditions. These TIR panoramas were used in addition to several classical discharge and soil moisture time series for a spatially-distributed model validation. In a manual calibration process we optimised model parameters (e.g. porosity, saturated hydraulic conductivity, evaporation depth) to achieve a better agreement between observed and modelled discharges and soil moistures. The subsequent validation of surface saturation patterns by a visual comparison of processed TIR panoramas and corresponding model output panoramas revealed an overall good accordance for all but one region that was always too dry in the model. However, quantitative comparisons of modelled and observed saturated pixel percentages and of their modelled and measured relationships to concurrent discharges revealed remarkable similarities. During the calibration process we observed that surface saturation patterns were mostly affected by changing the soil properties of the topsoil in the riparian zone, but that the discharge behaviour did not change substantially at the same time. This effect of various spatial patterns occurring concomitant to a nearly unchanged integrated response demonstrates the importance of spatially distributed validation data. Our study clearly benefited from using different kinds of data - spatially integrated and distributed, temporally continuous and discrete - for the model evaluation procedure.
Using a Delphi process to establish consensus on emergency medicine clerkship competencies.
Penciner, Rick; Langhan, Trevor; Lee, Richard; McEwen, Jill; Woods, Robert A; Bandiera, Glen
2011-01-01
Currently, there is no consensus on the core competencies required for emergency medicine (EM) clerkships in Canada. Existing EM curricula have been developed through informal consensus or local efforts. The Delphi process has been used extensively as a means for establishing consensus. The purpose of this project was to define core competencies for EM clerkships in Canada, to validate a Delphi process in the context of national curriculum development, and to demonstrate the adoption of the CanMEDS physician competency paradigm in the undergraduate medical education realm. Using a modified Delphi process, we developed a consensus amongst a panel of expert emergency physicians from across Canada utilizing the CanMEDS 2005 Physician Competency Framework. Thirty experts from nine different medical schools across Canada participated on the panel. The initial list consisted of 152 competencies organized in the seven domains of the CanMEDS 2005 Physician Competency Framework. After the second round of the Delphi process, the list of competencies was reduced to 62 (59% reduction). This study demonstrated that a modified Delphi process can result in a strong consensus around a realistic number of core competencies for EM clerkships. We propose that such a method could be used by other medical specialties and health professions to develop rotation-specific core competencies.
Xiao, Xinqing; Fu, Zetian; Qi, Lin; Mira, Trebar; Zhang, Xiaoshuan
2015-10-01
The main export varieties in China are brand-name, high-quality bred aquatic products. Among them, tilapia has become the most important and fastest-growing species, since extensive consumer markets in North America and Europe have evolved as a result of commodity prices, year-round availability and the quality of fresh and frozen products. As the largest tilapia farming country, China has over one-third of its tilapia production devoted to further processing and meeting foreign market demand. Using tilapia fillet processing as a case, this paper introduces the efforts for developing and evaluating ITS-TF: an intelligent traceability system integrated with statistical process control (SPC) and fault tree analysis (FTA). Observations, literature review and expert questionnaires were used for system requirement and knowledge acquisition; scenario simulation was applied to evaluate and validate ITS-TF performance. The results show that the traceability requirement has evolved from a firefighting model to a proactive model for enhancing process management capacity for food safety, and that ITS-TF functions as an intelligent system providing early warning and process management functions through the integrated SPC and FTA. A valuable suggestion for further system optimization, refinement and performance improvement is that automatic data acquisition and communication technology should be integrated into ITS-TF. © 2014 Society of Chemical Industry.
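As a sketch of the statistical process control ingredient, the following computes Shewhart individuals-chart limits for a monitored variable; the temperature values are hypothetical and are not part of ITS-TF.

    import numpy as np

    def shewhart_limits(samples):
        """Individuals (X) chart limits: centre +/- 3*sigma, with sigma estimated from
        the mean moving range (MRbar / 1.128, i.e. the classical 2.66*MRbar rule)."""
        x = np.asarray(samples, float)
        moving_range = np.abs(np.diff(x))
        center = x.mean()
        sigma_est = moving_range.mean() / 1.128     # d2 constant for subgroup size 2
        return center - 3 * sigma_est, center, center + 3 * sigma_est

    # Hypothetical fillet core temperatures (deg C) logged along the processing line.
    temps = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 4.3, 5.2]
    lcl, cl, ucl = shewhart_limits(temps)
    print(lcl, cl, ucl, [t for t in temps if t > ucl or t < lcl])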
NASA Astrophysics Data System (ADS)
Tian, Yingtao; Robson, Joseph D.; Riekehr, Stefan; Kashaev, Nikolai; Wang, Li; Lowe, Tristan; Karanika, Alexandra
2016-07-01
Laser welding of advanced Al-Li alloys has been developed to meet the increasing demand for light-weight and high-strength aerospace structures. However, welding of high-strength Al-Li alloys can be problematic due to the tendency for hot cracking. Finding suitable welding parameters and filler material for this combination currently requires extensive and costly trial and error experimentation. The present work describes a novel coupled model to predict hot crack susceptibility (HCS) in Al-Li welds. Such a model can be used to shortcut the weld development process. The coupled model combines finite element process simulation with a two-level HCS model. The finite element process model predicts thermal field data for the subsequent HCS hot cracking prediction. The model can be used to predict the influences of filler wire composition and welding parameters on HCS. The modeling results have been validated by comparing predictions with results from fully instrumented laser welds performed under a range of process parameters and analyzed using high-resolution X-ray tomography to identify weld defects. It is shown that the model is capable of accurately predicting the thermal field around the weld and the trend of HCS as a function of process parameters.
Ushiyama, Naoko; Kurobe, Yasushi; Momose, Kimito
2017-11-01
[Purpose] To determine the validity of knee extension muscle strength measurements using belt-stabilized hand-held dynamometry with and without body stabilization compared with the gold standard isokinetic dynamometry in healthy adults. [Subjects and Methods] Twenty-nine healthy adults (mean age, 21.3 years) were included. Study parameters involved right side measurements of maximal isometric knee extension strength obtained using belt-stabilized hand-held dynamometry with and without body stabilization and the gold standard. Measurements were performed in all subjects. [Results] A moderate correlation and fixed bias were found between measurements obtained using belt-stabilized hand-held dynamometry with body stabilization and the gold standard. No significant correlation and proportional bias were found between measurements obtained using belt-stabilized hand-held dynamometry without body stabilization and the gold standard. The strength identified using belt-stabilized hand-held dynamometry with body stabilization may not be commensurate with the maximum strength individuals can generate; however, it reflects such strength. In contrast, the strength identified using belt-stabilized hand-held dynamometry without body stabilization does not reflect the maximum strength. Therefore, a chair should be used to stabilize the body when performing measurements of maximal isometric knee extension strength using belt-stabilized hand-held dynamometry in healthy adults. [Conclusion] Belt-stabilized hand-held dynamometry with body stabilization is more convenient than the gold standard in clinical settings.
Two-Stage Categorization in Brand Extension Evaluation: Electrophysiological Time Course Evidence
Wang, Xiaoyi
2014-01-01
A brand name can be considered a mental category. Similarity-based categorization theory has been used to explain how consumers judge a new product as a member of a known brand, a process called brand extension evaluation. This study was an event-related potential study conducted in two experiments. The study found a two-stage categorization process reflected by the P2 and N400 components in brand extension evaluation. In experiment 1, a prime–probe paradigm was used in which pairs consisting of a brand name and a product name were presented in three conditions, i.e., in-category extension, similar-category extension, and out-of-category extension. Although the task was unrelated to brand extension evaluation, P2 distinguished out-of-category extensions from similar-category and in-category ones, and N400 distinguished similar-category extensions from in-category ones. In experiment 2, a prime–probe paradigm with a related task was used, in which product names included subcategory and major-category product names. The N400 elicited by subcategory products was significantly more negative than that elicited by major-category products, with no salient difference in P2. We speculated that P2 reflects the early, low-level and similarity-based processing of the first stage, whereas N400 reflects the late, analytic and category-based processing of the second stage. PMID:25438152
Ruschel, Caroline; Haupenthal, Alessandro; Jacomel, Gabriel Fernandes; Fontana, Heiliane de Brito; Santos, Daniela Pacheco dos; Scoz, Robson Dias; Roesler, Helio
2015-05-20
Isometric muscle strength of the knee extensors has been assessed for estimating performance, evaluating progress during physical training, and investigating the relationship between isometric and dynamic/functional performance. To assess the validity and reliability of an adapted leg-extension machine for measuring isometric knee extensor force. Validity (concurrent approach) and reliability (test and retest approach) study. University laboratory. 70 healthy men and women aged between 20 and 30 y (39 in the validity study and 31 in the reliability study). Intraclass correlation coefficient (ICC) values calculated for the maximum voluntary isometric torque of the knee extensors at 30°, 60°, and 90°, measured with the prototype and with an isokinetic dynamometer (ICC2,1, validity study) and measured with the prototype in test and retest sessions scheduled from 48 h to 72 h apart (ICC1,1, reliability study). In the validity analysis, the prototype showed good agreement for measurements at 30° (ICC2,1 = .75, SEM = 18.2 Nm) and excellent agreement for measurements at 60° (ICC2,1 = .93, SEM = 9.6 Nm) and at 90° (ICC2,1 = .94, SEM = 8.9 Nm). Regarding the reliability analysis, between-day ICC1,1 values were good to excellent, ranging from .88 to .93. The standard error of measurement and minimal detectable difference based on test-retest ranged from 11.7 Nm to 18.1 Nm and from 32.5 Nm to 50.1 Nm, respectively, for the 3 analyzed knee angles. The validity and repeatability of the prototype for measuring isometric muscle strength were shown to be good or excellent, depending on the knee joint angle analyzed. The new instrument, which has a relatively low cost and is easy to transport compared with an isokinetic dynamometer, is valid and provides consistent data concerning the isometric strength of the knee extensors and, for this reason, can be used for practical, clinical, and research purposes.
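A minimal sketch of the agreement statistic reported here, ICC(2,1) for a subjects-by-devices matrix (Shrout and Fleiss two-way random effects, absolute agreement, single measure); the torque values are hypothetical.

    import numpy as np

    def icc_2_1(data):
        """ICC(2,1) from an (n subjects) x (k raters/devices) array."""
        x = np.asarray(data, float)
        n, k = x.shape
        grand = x.mean()
        row_means = x.mean(axis=1)
        col_means = x.mean(axis=0)
        msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
        msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between devices
        sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
        mse = sse / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Hypothetical knee-extension torques (Nm): column 0 prototype, column 1 dynamometer.
    torques = np.array([[180, 185], [150, 148], [200, 210], [170, 172], [160, 158]])
    print(round(icc_2_1(torques), 3))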
NASA Astrophysics Data System (ADS)
Riveiro, B.; DeJong, M.; Conde, B.
2016-06-01
Despite the tremendous advantages of laser scanning technology for the geometric characterization of built constructions, there are important limitations preventing more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information to perform structural assessment and health monitoring, many practitioners are resistant to the technology due to the processing times involved. Thus, new methods that can automatically process LiDAR data and subsequently provide an automatic and organized interpretation are required. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, each of which contains the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of the different vertical walls, after which image processing tools adapted to voxel structures allow the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.
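A minimal sketch of the voxelization step on which such segmentation pipelines operate; the voxel size and the random point cloud are placeholders, and the heuristic wall separation of the published method is not reproduced.

    import numpy as np

    def voxelize(points, voxel_size=0.05):
        """Assign each LiDAR point to a cubic voxel and return the occupied voxel
        indices plus per-voxel point counts (a pre-processing step only)."""
        pts = np.asarray(points, float)
        idx = np.floor(pts / voxel_size).astype(int)
        occupied, counts = np.unique(idx, axis=0, return_counts=True)
        return occupied, counts

    cloud = np.random.default_rng(0).uniform(0, 1, size=(1000, 3))
    vox, counts = voxelize(cloud, voxel_size=0.2)
    print(vox.shape[0], counts.max())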
Alternative to Nitric Acid for Passivation of Stainless Steel Alloys
NASA Technical Reports Server (NTRS)
Lewis, Pattie L.; Kolody, Mark; Curran, Jerry
2013-01-01
Corrosion is an extensive problem that affects the Department of Defense (DoD) and National Aeronautics and Space Administration (NASA). The deleterious effects of corrosion result in steep costs, asset downtime affecting mission readiness, and safety risks to personnel. Consequently, it is vital to reduce corrosion costs and risks in a sustainable manner. The DoD and NASA have numerous structures and equipment that are fabricated from stainless steel. The standard practice for protection of stainless steel is a process called passivation. Typical passivation procedures call for the use of nitric acid; however, there are a number of environmental, worker safety, and operational issues associated with its use. Citric acid offers a variety of benefits including increased safety for personnel, reduced environmental impact, and reduced operational cost. DoD and NASA agreed to collaborate to validate citric acid as an acceptable passivating agent for stainless steel. This paper details our investigation of prior work developing the citric acid passivation process, development of the test plan, optimization of the process for specific stainless steel alloys, ongoing and planned testing to elucidate the process' resistance to corrosion in comparison to nitric acid, and preliminary results.
Low-cost single-crystal turbine blades, volume 2
NASA Technical Reports Server (NTRS)
Strangman, T. E.; Dennis, R. E.; Heath, B. R.
1984-01-01
The overall objectives of Project 3 were to develop the exothermic casting process to produce uncooled single-crystal (SC) HP turbine blades in MAR-M 247 and higher-strength derivative alloys and to validate the materials, process, and components through extensive mechanical property testing, rig testing, and 200 hours of endurance engine testing. These program objectives were achieved. The exothermic casting process was successfully developed into a low-cost, nonproprietary method for producing single-crystal castings. Single-crystal MAR-M 247 and two derivative alloys developed during this project, NASAIR 100 and SC Alloy 3, were fully characterized through mechanical property testing. SC MAR-M 247 shows no significant improvement in strength over directionally solidified (DS) MAR-M 247, but the derivative alloys, NASAIR 100 and Alloy 3, show significant tensile and fatigue improvements. Firtree testing, holography, and strain-gauge rig testing were used to determine the effects of the anisotropic characteristics of single-crystal materials. No undesirable characteristics were found. In general, the single-crystal material behaved similarly to DS MAR-M 247. Two complete engine sets of SC HP turbine blades were cast using the exothermic casting process and fully machined. These blades were successfully engine-tested.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process, by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, versus a requirement, which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process from infancy to verification and certification are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-22
... Special Claims Processing. Description of Information Collection: This is an extension of a currently... Extension of a Currently Approved Information Collection to OMB; Comment Request Applications for Housing Assistance Payments and Special Claims Processing AGENCY: Office of Program Systems Management. ACTION...
Extension in Planned Social Change, the Indian Experience.
ERIC Educational Resources Information Center
Rudramoorthy, B.
Extension, the process of extending the knowledge of recent advances in science and technology to the people who need it, has been emphasized in India since the introduction of the Community Development Programme in 1952. Community development involves two distinct processes--extension education and community organization--and has had four…
NASA Astrophysics Data System (ADS)
Schrön, Martin; Köhli, Markus; Scheiffele, Lena; Iwema, Joost; Bogena, Heye R.; Lv, Ling; Martini, Edoardo; Baroni, Gabriele; Rosolem, Rafael; Weimar, Jannis; Mai, Juliane; Cuntz, Matthias; Rebmann, Corinna; Oswald, Sascha E.; Dietrich, Peter; Schmidt, Ulrich; Zacharias, Steffen
2017-10-01
In the last few years the method of cosmic-ray neutron sensing (CRNS) has gained popularity among hydrologists, physicists, and land-surface modelers. The sensor provides continuous soil moisture data, averaged over several hectares and tens of decimeters in depth. However, the signal may still contain unidentified features of hydrological processes, and many calibration datasets are often required in order to find reliable relations between neutron intensity and water dynamics. Recent insights into environmental neutrons accurately described the spatial sensitivity of the sensor and thus allowed one to quantify the contribution of individual sample locations to the CRNS signal. Consequently, it is suggested that data points of calibration and validation datasets be averaged using a more physically based weighting approach. In this work, a revised sensitivity function is used to calculate weighted averages of point data. The function differs from the conventional simple exponential in its extraordinary sensitivity to the first few meters around the probe, and in its dependence on air pressure, air humidity, soil moisture, and vegetation. The approach is extensively tested at six distinct monitoring sites: two sites with multiple calibration datasets and four sites with continuous time series datasets. In all cases, the revised averaging method improved the performance of the CRNS products. The revised approach further helped to reveal hidden hydrological processes which otherwise remained unexplained in the data or were lost in the process of overcalibration. The presented weighting approach increases the overall accuracy of CRNS products and will have an impact on all their applications in agriculture, hydrology, and modeling.
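A minimal sketch of weighting point soil-moisture samples by distance from the probe; the exponential weight is a placeholder, not the revised sensitivity function of the study, which additionally depends on air pressure, humidity, soil moisture, and vegetation.

    import numpy as np

    def weighted_field_average(theta_points, distances_m, length_scale_m=100.0):
        """Horizontally weighted average of point soil-moisture samples around a probe.
        The weight function here is a simple placeholder."""
        theta = np.asarray(theta_points, float)
        r = np.asarray(distances_m, float)
        w = np.exp(-r / length_scale_m)
        return float(np.sum(w * theta) / np.sum(w))

    theta = [0.22, 0.25, 0.30, 0.28]        # volumetric soil moisture at sample points
    dist = [5.0, 30.0, 80.0, 150.0]         # distance of each sample from the probe (m)
    print(weighted_field_average(theta, dist))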
NASA Astrophysics Data System (ADS)
Belfort, Benjamin; Weill, Sylvain; Lehmann, François
2017-07-01
A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. The method directly relates digitally measured intensities to the water content of the porous medium and requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. Its main advantages are that no calibration experiment is needed, because the calibration curve relating water content to reflected light intensity is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis and numerical simulations with a state-of-the-art computational code that solves Richards' equation. Comparison of the cumulative mass leaving and entering the flow tank and of the water content maps produced by the photographic measurement technique and the numerical simulations demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly with a view to its extension to heterogeneous media. Other processes may be investigated through different laboratory experiments, which will serve as benchmarks for the validation of numerical codes.
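A minimal sketch of the calibration step, mapping normalized reflected-light intensity to water content through a monotonic calibration curve; the calibration pairs below are hypothetical.

    import numpy as np

    def intensity_to_water_content(intensity, calib_intensity, calib_theta):
        """Map normalized reflected-light intensities to water content using a
        calibration curve built during the monitoring phase."""
        i = np.clip(np.asarray(intensity, float),
                    min(calib_intensity), max(calib_intensity))
        return np.interp(i, calib_intensity, calib_theta)

    # Calibration: darker (lower-intensity) pixels correspond to wetter sand.
    calib_i = [0.2, 0.4, 0.6, 0.8, 1.0]
    calib_theta = [0.35, 0.25, 0.15, 0.08, 0.03]
    pixels = np.array([0.25, 0.55, 0.75, 0.95])   # normalized pixel intensities
    print(intensity_to_water_content(pixels, calib_i, calib_theta))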
Do lightning positive leaders really "step"?
NASA Astrophysics Data System (ADS)
Petersen, D.
2015-12-01
It has been known for some time that positive leaders exhibit impulsive charge motion and optical emissions as they extend. However, laboratory and field observations have not produced any evidence of a process analogous to the space leader mechanism of negative leader extension. Instead, observations have suggested that the positive leader tip undergoes a continuous to intermittent series of corona streamer bursts, each burst resulting in a small forward extension of the positive leader channel. Traditionally, it has been held that lightning positive leaders extend in a continuous or quasi-continuous fashion. Lately, however, many have become concerned that this position is incongruous with observations of impulsive activity during lightning positive leader extension. It is increasingly suggested that this impulsive activity is evidence that positive leaders also undergo "stepping". There are two issues that must be addressed. The first issue concerns whether or not the physical processes underlying impulsive extension in negative and positive leaders are distinct. We argue that these processes are in fact physically distinct, and offer new high-speed video evidence to support this position. The second issue regards the proper use of the term "step" as an identifier for the impulsive forward extension of a leader. Traditional use of this term has been applied only to negative leaders, due primarily to their stronger impulsive charge motions and photographic evidence of clearly discontinuous forward progression of the luminous channel. Recently, due to the increasing understanding of the distinct "space leader" process of negative leader extension, the term "step" has increasingly come to be associated with the space leader process itself. Should this emerging association, "step" = space leader attachment, be canonized? If not, then it seems reasonable to use the term "step" to describe impulsive positive leader extension. If, however, we do wish to associate the term "step" with space leader attachment, a process unique to negative leaders, should we devise a term for those process(es) that underlie impulsive positive leader extension?
Emulsion droplet interactions: a front-tracking treatment
NASA Astrophysics Data System (ADS)
Mason, Lachlan; Juric, Damir; Chergui, Jalel; Shin, Seungwon; Craster, Richard V.; Matar, Omar K.
2017-11-01
Emulsion coalescence influences a multitude of industrial applications including solvent extraction, oil recovery and the manufacture of fast-moving consumer goods. Droplet interaction models are vital for the design and scale-up of processing systems; however, predictive modelling at the droplet scale remains a research challenge. This study simulates industrially relevant moderate-inertia collisions for which a high degree of droplet deformation occurs. A hybrid front-tracking/level-set approach is used to automatically account for interface merging without the need for 'bookkeeping' of interface connectivity. The model is implemented in Code BLUE using a parallel multi-grid solver, allowing both film- and droplet-scale dynamics to be resolved efficiently. Droplet interaction simulations are validated using experimental sequences from the literature in the presence and absence of background turbulence. The framework is readily extensible for modelling the influence of surfactants and non-Newtonian fluids on droplet interaction processes. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM), PETRONAS.
An Integrative Theory of Psychotherapy: Research and Practice
Epstein, Seymour; Epstein, Martha L.
2016-01-01
A dual-process personality theory and supporting research are presented. The dual processes comprise an experiential system and a rational system. The experiential system is an adaptive, associative learning system that humans share with other higher-order animals. The rational system is a uniquely human, primarily verbal, reasoning system. It is assumed that when humans developed language they did not abandon their previous ways of adapting, they simply added language to their experiential system. The two systems are assumed to operate in parallel and are bi-directionally interactive. The validity of these assumptions is supported by extensive research. Of particular relevance for psychotherapy, the experiential system, which is compatible with evolutionary theory, replaces the Freudian maladaptive unconscious system that is indefensible from an evolutionary perspective, as sub-human animals would then have only a single system that is maladaptive. The aim of psychotherapy is to produce constructive changes in the experiential system. Changes in the rational system are useful only to the extent that they contribute to constructive changes in the experiential system. PMID:27672302
A CANDLE for a deeper in vivo insight
Coupé, Pierrick; Munz, Martin; Manjón, Jose V; Ruthazer, Edward S; Louis Collins, D.
2012-01-01
A new Collaborative Approach for eNhanced Denoising under Low-light Excitation (CANDLE) is introduced for the processing of 3D laser scanning multiphoton microscopy images. CANDLE is designed to be robust for low signal-to-noise ratio (SNR) conditions typically encountered when imaging deep in scattering biological specimens. Based on an optimized non-local means filter involving the comparison of filtered patches, CANDLE locally adapts the amount of smoothing in order to deal with the noise inhomogeneity inherent to laser scanning fluorescence microscopy images. An extensive validation on synthetic data, images acquired on microspheres and in vivo images is presented. These experiments show that the CANDLE filter obtained competitive results compared to a state-of-the-art method and a locally adaptive optimized nonlocal means filter, especially under low SNR conditions (PSNR<8dB). Finally, the deeper imaging capabilities enabled by the proposed filter are demonstrated on deep tissue in vivo images of neurons and fine axonal processes in the Xenopus tadpole brain. PMID:22341767
An improved exceedance theory for combined random stresses
NASA Technical Reports Server (NTRS)
Lester, H. C.
1974-01-01
An extension is presented of Rice's classic solution for the exceedances of a constant level by a single random process to its counterpart for an n-dimensional vector process. An interaction boundary, analogous to the constant level considered by Rice for the one-dimensional case, is assumed in the form of a hypersurface. The theory for the numbers of boundary exceedances is developed by using a joint statistical approach which fully accounts for all cross-correlation effects. An exact expression is derived for the n-dimensional exceedance density function, which is valid for an arbitrary interaction boundary. For application to biaxial states of combined random stress, the general theory is reduced to the two-dimensional case. An elliptical stress interaction boundary is assumed and the exact expression for the density function is presented. The equations are expressed in a format which facilitates calculating the exceedances by numerically evaluating a line integral. The behavior of the density function for the two-dimensional case is briefly discussed.
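For orientation, the one-dimensional result that this report generalizes is Rice's expected rate of upcrossings of a fixed level a by a stationary random process X(t) with joint density p_{X\dot X}(x, \dot x); in LaTeX form (a standard result quoted here for context, not taken from the report itself):

    \nu_a^{+} \;=\; \int_{0}^{\infty} \dot{x}\, p_{X\dot{X}}(a,\dot{x})\, \mathrm{d}\dot{x},
    \qquad
    \nu_a^{+} \;=\; \frac{1}{2\pi}\,\frac{\sigma_{\dot{X}}}{\sigma_{X}}\,
    \exp\!\left(-\frac{a^{2}}{2\sigma_{X}^{2}}\right)
    \quad \text{(zero-mean Gaussian case).}

The n-dimensional exceedance density derived in the report reduces to this form in one dimension with a constant-level boundary.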
Tashjian, Sarah M; Weissman, David G; Guyer, Amanda E; Galván, Adriana
2018-04-01
Adolescence is characterized by extensive neural development and sensitivity to social context, both of which contribute to engaging in prosocial behaviors. Although it is established that prosocial behaviors are linked to positive outcomes in adulthood, little is known about the neural correlates of adolescents' prosociality. Identifying whether the brain is differentially responsive to varying types of social input may be important for fostering prosocial behavior. We report pilot results using new stimuli and an ecologically valid donation paradigm indicating (1) brain regions typically recruited during socioemotional processing evinced differential activation when adolescents evaluated prosocial compared with social or noninteractive scenes (N = 20, ages 13-17 years, M Age = 15.30 years), and (2) individual differences in temporoparietal junction recruitment when viewing others' prosocial behaviors were related to adolescents' own charitable giving. These novel findings have significant implications for understanding how the adolescent brain processes prosocial acts and for informing ways to support adolescents to engage in prosocial behaviors in their daily lives.
A methodology for extending domain coverage in SemRep.
Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C
2013-12-01
We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.
Organic Scintillator Detector Response Simulations with DRiFT
Andrews, Madison Theresa; Bates, Cameron Russell; Mckigney, Edward Allen; ...
2016-06-11
This work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.
Converting CSV Files to RKSML Files
NASA Technical Reports Server (NTRS)
Trebi-Ollennu, Ashitey; Liebersbach, Robert
2009-01-01
A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Exploration Rovers (MERs). The raw downlinked data files are in comma-separated-value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
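A minimal sketch of the CSV-to-XML conversion idea in Python; the column names and XML tags are hypothetical and do not reproduce the actual RKSML schema consumed by RSVP.

    import csv
    import xml.etree.ElementTree as ET

    def csv_to_xml(csv_path, xml_path, root_tag="ArmStates", row_tag="State"):
        """Convert a comma-separated telemetry file into a simple XML document.
        Tag and column names are illustrative placeholders only."""
        root = ET.Element(root_tag)
        with open(csv_path, newline="") as fh:
            for row in csv.DictReader(fh):
                state = ET.SubElement(root, row_tag, attrib={"time": row.get("time", "")})
                for name, value in row.items():
                    if name != "time":
                        field = ET.SubElement(state, "Value", attrib={"name": name})
                        field.text = value
        ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

    # Example: a CSV with columns time,joint1,joint2 becomes <ArmStates><State time="...">...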
Organic scintillator detector response simulations with DRiFT
NASA Astrophysics Data System (ADS)
Andrews, M. T.; Bates, C. R.; McKigney, E. A.; Solomon, C. J.; Sood, A.
2016-09-01
This work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.
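A minimal sketch of the post-processing idea: converting simulated energy depositions to scintillation light with Birks-law quenching and adding a simple resolution term. The Birks constant and the resolution model are placeholders, not DRiFT's calibrated values.

    import numpy as np

    def light_output(energy_depositions_mev, stopping_powers_mev_cm,
                     kB_cm_mev=0.0126, S=1.0):
        """Light per step via Birks' law, L = S*dE / (1 + kB*dE/dx), summed over the
        track, then smeared with a toy Gaussian resolution term."""
        dE = np.asarray(energy_depositions_mev, float)
        dEdx = np.asarray(stopping_powers_mev_cm, float)
        light = np.sum(S * dE / (1.0 + kB_cm_mev * dEdx))
        sigma = 0.05 * np.sqrt(max(light, 1e-12))   # illustrative resolution only
        return float(np.random.default_rng(0).normal(light, sigma))

    # One event with three energy-deposition steps (e.g. a recoil proton track).
    print(light_output([0.5, 0.3, 0.1], [80.0, 120.0, 300.0]))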
NASA Astrophysics Data System (ADS)
Bao, Yi; Valipour, Mahdi; Meng, Weina; Khayat, Kamal H.; Chen, Genda
2017-08-01
This study develops a delamination detection system for smart ultra-high-performance concrete (UHPC) overlays using a fully distributed fiber optic sensor. Three 450 mm (length) × 200 mm (width) × 25 mm (thickness) UHPC overlays were cast over an existing 200 mm thick concrete substrate. The initiation and propagation of delamination due to early-age shrinkage of the UHPC overlay were detected as sudden increases, and the extension of those increases, in the spatial distribution of shrinkage-induced strains measured by the sensor, which is based on pulse pre-pump Brillouin optical time domain analysis. The distributed sensor is demonstrated to be effective in detecting delamination openings from microns to hundreds of microns. A three-dimensional finite element model with experimental material properties is proposed to understand the complete delamination process measured from the distributed sensor. The model is validated using the distributed sensor data. The finite element model with cohesive elements for the overlay-substrate interface can predict the complete delamination process.
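As a rough illustration of how delamination might be flagged from such distributed strain data, the sketch below marks locations where the measured strain jumps well above its spatial neighborhood; the threshold and window size are arbitrary assumptions, not values from the study.

```python
# Flag delamination-like anomalies in a distributed strain profile: locations
# where strain exceeds the local median by a chosen margin. Threshold (in
# microstrain) and neighborhood window are illustrative assumptions.
import numpy as np

def find_strain_jumps(strain_microeps, threshold=100.0, window=5):
    s = np.asarray(strain_microeps, dtype=float)
    jumps = []
    for i in range(len(s)):
        lo, hi = max(0, i - window), min(len(s), i + window + 1)
        neigh = np.concatenate([s[lo:i], s[i + 1:hi]])   # neighborhood excluding point i
        if neigh.size == 0:
            continue
        if s[i] - np.median(neigh) > threshold:
            jumps.append(i)
    return jumps
```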
GapFiller: a de novo assembly approach to fill the gap within paired reads
2012-01-01
Background Next Generation Sequencing technologies are able to provide high genome coverages at a relatively low cost. However, due to the limited length of reads (from 30 bp up to 200 bp), specific bioinformatics problems have become even more difficult to solve. De novo assembly with short reads, for example, is more complicated for at least two reasons: first, the overall amount of "noisy" data to cope with has increased and, second, as read length decreases, the number of unsolvable repeats grows. Our aim is to address the root of the problem by providing a pre-processing tool capable of producing (in-silico) longer and highly accurate sequences from a collection of Next Generation Sequencing reads. Results In this paper a seed-and-extend local assembler is presented. The kernel algorithm is a loop that, starting from a read used as seed, keeps extending it using heuristics whose main goal is to produce a collection of error-free and longer sequences. In particular, GapFiller carefully detects reliable overlaps and operates by clustering similar reads in order to reconstruct the missing part between the two ends of the same insert. Our tool's output has been validated on 24 experiments using both simulated and real paired-read datasets. The output sequences are declared correct when the seed-mate is found. In the experiments performed, GapFiller was able to extend high percentages of the processed seeds and find their mates, with a false positive rate that turned out to be nearly negligible. Conclusions GapFiller, starting from a sufficiently high short-read coverage, is able to produce high coverages of accurate longer sequences (from 300 bp up to 3500 bp). The procedure to perform safe extensions, together with the mate-found check, turned out to be a powerful criterion to guarantee contigs' correctness. GapFiller has further potential, as it could be applied in a number of different scenarios, including the post-processing validation of insertion/deletion detection pipelines, pre-processing routines on datasets for de novo assembly pipelines, or any hierarchical approach designed to assemble, analyse or validate pools of sequences. PMID:23095524
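To make the seed-and-extend idea concrete, here is a toy Python sketch that greedily extends a seed using the longest exact suffix-prefix overlap and stops when the mate sequence is reached. GapFiller's actual heuristics (error tolerance, clustering of similar reads, additional safety checks) are considerably more sophisticated and are not reproduced here.

```python
# Toy seed-and-extend: extend a seed read with the read having the longest
# exact suffix-prefix overlap, stopping when the mate sequence appears.
def best_overlap(seed, reads, min_overlap=20):
    best = (0, None)
    for r in reads:
        # scan overlap lengths from longest to shortest for this read
        for k in range(min(len(seed), len(r)) - 1, min_overlap - 1, -1):
            if seed.endswith(r[:k]):
                if k > best[0]:
                    best = (k, r)
                break
    return best

def extend_seed(seed, reads, mate, max_length=3500, min_overlap=20):
    contig = seed
    while len(contig) < max_length and mate not in contig:
        k, r = best_overlap(contig, reads, min_overlap)
        if r is None or len(r) <= k:
            break                          # no safe extension left
        contig += r[k:]                    # append the non-overlapping tail
    return contig, mate in contig          # sequence and "mate-found" flag
```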
Logo Effects on Brand Extension Evaluations from the Electrophysiological Perspective.
Shang, Qian; Pei, Guanxiong; Dai, Shenyi; Wang, Xiaoyi
2017-01-01
Brand extension typically has two strategies: brand name extension (BN) and brand logo extension (BL). The current study explored which strategy (BN or BL) better enhanced the success of dissimilar brand extension and product promotion in enterprises. Event-related potentials (ERPs) were used to investigate electrophysiological processes when subjects evaluated their acceptance of the brand extension using a combined picture of S1 and S2. S1 was a famous brand presented by two identity signs (brand name and brand logo). S2 was a picture of an extension product that belonged to a dissimilar product category than S1. The behavior data showed that BL was more acceptable than BN in the dissimilar brand extension. The neurophysiology process was reflected by a less negative N2 component and a larger P300 component in the BL than in the BN. We suggested that N2 reflected a whole conflict between the brand-product combination and the long-term memory and that P300 could be regarded as the reflection of the categorization process in the working memory.
Logo Effects on Brand Extension Evaluations from the Electrophysiological Perspective
Shang, Qian; Pei, Guanxiong; Dai, Shenyi; Wang, Xiaoyi
2017-01-01
Brand extension typically has two strategies: brand name extension (BN) and brand logo extension (BL). The current study explored which strategy (BN or BL) better enhanced the success of dissimilar brand extension and product promotion in enterprises. Event-related potentials (ERPs) were used to investigate electrophysiological processes when subjects evaluated their acceptance of the brand extension using a combined picture of S1 and S2. S1 was a famous brand presented by two identity signs (brand name and brand logo). S2 was a picture of an extension product that belonged to a dissimilar product category than S1. The behavior data showed that BL was more acceptable than BN in the dissimilar brand extension. The neurophysiology process was reflected by a less negative N2 component and a larger P300 component in the BL than in the BN. We suggested that N2 reflected a whole conflict between the brand-product combination and the long-term memory and that P300 could be regarded as the reflection of the categorization process in the working memory. PMID:28337121
NASA Technical Reports Server (NTRS)
Liechty, Derek S.
2014-01-01
The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties are extended in the current work to include electronic energy level transitions and reactions involving charged particles. These extensions are shown to agree favorably with reported transition and reaction rates from the literature for near-equilibrium conditions. Also, the extensions are applied to the second flight of the Project FIRE flight experiment at 1634 seconds with a Knudsen number of 0.001 at an altitude of 76.4 km. In order to accomplish this, NASA's direct simulation Monte Carlo code DAC was rewritten to include the ability to simulate charge-neutral ionized flows, take advantage of the recently introduced chemistry model, and to include the extensions presented in this work. The 1634 second data point was chosen for comparisons to be made in order to include a CFD solution. The Knudsen number at this point in time is such that the DSMC simulations are still tractable and the CFD computations are at the edge of what is considered valid because, although near-transitional, the flow is still considered to be continuum. It is shown that the inclusion of electronic energy levels in the DSMC simulation is necessary for flows of this nature and is required for comparison to the CFD solution. The flow field solutions are also post-processed by the nonequilibrium radiation code HARA to compute the radiative portion.
NASA Astrophysics Data System (ADS)
Tuttle, L. F., II; Wernette, P. A.; Houser, C.
2016-12-01
Framework geology has been demonstrated to influence the geomorphology and affect the response of barrier islands to extreme storm events. Therefore, it is vital that we understand the framework geology before we can accurately assess the vulnerability and resiliency of the coast. Geophysical surveys consisting of ground-penetrating radar (GPR) and electromagnetic inductance (EMI) were collected along the length of Padre Island National Seashore (PAIS) to map subsurface infilled paleochannels identified in previous research. The most extensive published survey of PAIS framework geology was conducted in the 1950s as part of dredging the Intracoastal Waterway through Laguna Madre. Using cores and seismic surveys, the previous study identified a series of relict infilled paleochannels dissecting PAIS. The sediment cores presented in our poster were collected in Fall 2016 with a Geoprobe 6712DT. Cores were stored and processed using an X-ray fluorescence (XRF) scanner at the International Ocean Discovery Program repository in College Station, Texas. The XRF data were used to examine mineralogical differences that provide valuable insight into the evolutionary history of the island. This poster presents results from sediment cores collected to validate the geophysical survey data. The broader purpose of this research is to validate the subsurface framework geology features (i.e. infilled paleochannels) in order to more accurately predict future changes to the environmental and economic longevity of PAIS.
Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo
2016-04-01
Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the one best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.
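The kind of quantitative model-versus-experiment comparison described above can be illustrated with a simple compatibility statistic: interpolate a tabulated model at the experimental energies and count how many measurements agree within their quoted uncertainties. This is a simplified stand-in for the statistical analyses actually used in the validation study, not a reproduction of them.

```python
# Fraction of experimental cross-section points compatible with a tabulated
# model within k standard deviations. Assumes the model energy grid is sorted
# in ascending order so that np.interp is valid.
import numpy as np

def compatibility_fraction(energy_exp, xs_exp, xs_unc,
                           energy_model, xs_model, k=2.0):
    model_at_exp = np.interp(energy_exp, energy_model, xs_model)
    z = (np.asarray(xs_exp, dtype=float) - model_at_exp) / np.asarray(xs_unc, dtype=float)
    return float(np.mean(np.abs(z) <= k))
```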
SlicerRT: radiation therapy research toolkit for 3D Slicer.
Pinter, Csaba; Lasso, Andras; Wang, An; Jaffray, David; Fichtinger, Gabor
2012-10-01
Interest in adaptive radiation therapy research is constantly growing, but software tools available for researchers are mostly either expensive, closed proprietary applications, or free open-source packages with limited scope, extensibility, reliability, or user support. To address these limitations, we propose SlicerRT, a customizable, free, and open-source radiation therapy research toolkit. SlicerRT aspires to be an open-source toolkit for RT research, providing fast computations, convenient workflows for researchers, and a general image-guided therapy infrastructure to assist clinical translation of experimental therapeutic approaches. It is a medium into which RT researchers can integrate their methods and algorithms, and conduct comparative testing. SlicerRT was implemented as an extension for the widely used 3D Slicer medical image visualization and analysis application platform. SlicerRT provides functionality specifically designed for radiation therapy research, in addition to the powerful tools that 3D Slicer offers for visualization, registration, segmentation, and data management. The feature set of SlicerRT was defined through consensus discussions with a large pool of RT researchers, including both radiation oncologists and medical physicists. The development processes used were similar to those of 3D Slicer to ensure software quality. Standardized mechanisms of 3D Slicer were applied for documentation, distribution, and user support. The testing and validation environment was configured to automatically launch a regression test upon each software change and to perform comparison with ground truth results provided by other RT applications. Modules have been created for importing and loading DICOM-RT data, computing and displaying dose volume histograms, creating accumulated dose volumes, comparing dose volumes, and visualizing isodose lines and surfaces. The effectiveness of using 3D Slicer with the proposed SlicerRT extension for radiation therapy research was demonstrated on multiple use cases. A new open-source software toolkit has been developed for radiation therapy research. SlicerRT can import treatment plans from various sources into 3D Slicer for visualization, analysis, comparison, and processing. The provided algorithms are extensively tested and they are accessible through a convenient graphical user interface as well as a flexible application programming interface.
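One of the modules listed above computes dose volume histograms. As an illustration of the underlying computation (not SlicerRT's implementation), the sketch below derives a cumulative DVH from a dose grid and a binary structure mask.

```python
# Cumulative dose-volume histogram (DVH) sketch: fraction of structure volume
# receiving at least each dose level, from a dose grid and a binary mask.
import numpy as np

def cumulative_dvh(dose, mask, n_bins=200):
    doses_in_structure = dose[mask.astype(bool)]
    bin_edges = np.linspace(0.0, doses_in_structure.max(), n_bins)
    volume_fraction = np.array([(doses_in_structure >= d).mean() for d in bin_edges])
    return bin_edges, volume_fraction

# Example on a synthetic 3D dose grid with a spherical "structure":
dose = np.random.gamma(shape=5.0, scale=10.0, size=(64, 64, 64))
z, y, x = np.ogrid[:64, :64, :64]
mask = (x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 15 ** 2
dose_bins, vol_frac = cumulative_dvh(dose, mask)
```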
NASA Astrophysics Data System (ADS)
Worqlul, Abeyou W.; Ayana, Essayas K.; Maathuis, Ben H. P.; MacAlister, Charlotte; Philpot, William D.; Osorio Leyton, Javier M.; Steenhuis, Tammo S.
2018-01-01
In many developing countries and remote areas of important ecosystems, good quality precipitation data are neither available nor readily accessible. Satellite observations and processing algorithms are being extensively used to produce satellite rainfall products (SREs). Nevertheless, these products are prone to systematic errors and need extensive validation before they can be used for streamflow simulations. In this study, we investigated and corrected the bias of Multi-Sensor Precipitation Estimate-Geostationary (MPEG) data. The corrected MPEG dataset was used as input to a semi-distributed hydrological model, Hydrologiska Byråns Vattenbalansavdelning (HBV), for simulation of discharge of the Gilgel Abay and Gumara watersheds in the Upper Blue Nile basin, Ethiopia. The results indicated that the MPEG satellite rainfall captured 81% and 78% of the gauged rainfall variability, with a consistent bias of underestimating the gauged rainfall by 60%. Applying a linear bias correction significantly reduced the bias while maintaining the coefficient of correlation. Simulations driven by the bias-corrected MPEG SRE produced flows comparable to those obtained with gauged rainfall for both watersheds. The study indicated the potential of MPEG SRE in water budget studies after applying a linear bias correction.
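The bias-correction step described above can be sketched as fitting a linear relation between collocated satellite and gauge rainfall and then applying it to the satellite series before it is fed to the hydrological model; the coefficients below are fitted from whatever data are supplied, not the values reported in the study.

```python
# Linear bias correction of a satellite rainfall series against gauge data.
import numpy as np

def fit_linear_correction(satellite_mm, gauge_mm):
    # fit gauge ~ a * satellite + b on collocated pairs
    a, b = np.polyfit(satellite_mm, gauge_mm, deg=1)
    return a, b

def apply_correction(satellite_mm, a, b):
    # apply the fitted relation and clip negative rainfall to zero
    return np.clip(a * np.asarray(satellite_mm, dtype=float) + b, 0.0, None)
```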
Ocean acidification affects coral growth by reducing skeletal density.
Mollica, Nathaniel R; Guo, Weifu; Cohen, Anne L; Huang, Kuo-Fang; Foster, Gavin L; Donald, Hannah K; Solow, Andrew R
2018-02-20
Ocean acidification (OA) is considered an important threat to coral reef ecosystems, because it reduces the availability of carbonate ions that reef-building corals need to produce their skeletons. However, while theory predicts that coral calcification rates decline as carbonate ion concentrations decrease, this prediction is not consistently borne out in laboratory manipulation experiments or in studies of corals inhabiting naturally low-pH reefs today. The skeletal growth of corals consists of two distinct processes: extension (upward growth) and densification (lateral thickening). Here, we show that skeletal density is directly sensitive to changes in seawater carbonate ion concentration and thus, to OA, whereas extension is not. We present a numerical model of Porites skeletal growth that links skeletal density with the external seawater environment via its influence on the chemistry of coral calcifying fluid. We validate the model using existing coral skeletal datasets from six Porites species collected across five reef sites and use this framework to project the impact of 21st century OA on Porites skeletal density across the global tropics. Our model predicts that OA alone will drive up to 20.3 ± 5.4% decline in the skeletal density of reef-building Porites corals.
McKinney, Bill; Meyer, Peter A.; Crosas, Mercè; Sliz, Piotr
2016-01-01
Access to experimental X-ray diffraction image data is important for validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. In response to the evolving needs of the structural biology community, we recently established a diffraction data publication system, the Structural Biology Data Grid (SBDG, data.sbgrid.org), to preserve primary experimental datasets supporting scientific publications. All datasets published through the SBDG are freely available to the research community under a public domain dedication license, with metadata compliant with the DataCite Schema (schema.datacite.org). A proof-of-concept study demonstrated community interest and utility. Publication of large datasets is a challenge shared by several fields, and the SBDG has begun collaborating with the Institute for Quantitative Social Science at Harvard University to extend the Dataverse (dataverse.org) open-source data repository system to structural biology datasets. Several extensions are necessary to support the size and metadata requirements for structural biology datasets. In this paper, we describe one such extension—functionality supporting preservation of filesystem structure within Dataverse—which is essential for both in-place computation and supporting non-http data transfers. PMID:27862010
Anchoring bias in online voting
NASA Astrophysics Data System (ADS)
Yang, Zimo; Zhang, Zi-Ke; Zhou, Tao
2012-12-01
Voting online with explicit ratings could largely reflect people's preferences and objects' qualities, but ratings are always irrational, because they may be affected by many unpredictable factors like mood, weather and other people's votes. By analyzing two real systems, this paper reveals a systematic bias embedded in the individual decision-making processes, namely, people tend to give a low rating after a low rating, as well as a high rating following a high rating. This so-called anchoring bias is validated via extensive comparisons with null models, and numerically speaking, the extent of bias decays with voting interval in a logarithmic form. Our findings could be applied in the design of recommender systems and considered as important complementary materials to previous knowledge about anchoring effects on financial trades, performance judgments, auctions, and so on.
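The reported logarithmic decay of the bias with voting interval can be illustrated by fitting bias(dt) = alpha + beta·ln(dt) to interval-binned estimates; the data below are synthetic placeholders used only to show the fitting step, not the paper's measurements.

```python
# Fit a logarithmic decay of anchoring bias versus voting interval.
import numpy as np

intervals = np.arange(1, 101)                                     # voting interval (in votes)
bias = 0.30 - 0.05 * np.log(intervals) + np.random.normal(0, 0.01, 100)  # synthetic placeholder

beta, alpha = np.polyfit(np.log(intervals), bias, deg=1)          # slope, intercept
print(f"bias(dt) ≈ {alpha:.3f} {beta:+.3f}·ln(dt)")
```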
The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis
Rampp, Markus; Soddemann, Thomas; Lederer, Hermann
2006-01-01
We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980
Machine learning based cloud mask algorithm driven by radiative transfer modeling
NASA Astrophysics Data System (ADS)
Chen, N.; Li, W.; Tanikawa, T.; Hori, M.; Shimada, R.; Stamnes, K. H.
2017-12-01
Cloud detection is a critically important first step required to derive many satellite data products. Traditional threshold based cloud mask algorithms require a complicated design process and fine tuning for each sensor, and have difficulty over snow/ice covered areas. With the advance of computational power and machine learning techniques, we have developed a new algorithm based on a neural network classifier driven by extensive radiative transfer modeling. Statistical validation results obtained by using collocated CALIOP and MODIS data show that its performance is consistent over different ecosystems and significantly better than the MODIS Cloud Mask (MOD35 C6) during the winter seasons over mid-latitude snow covered areas. Simulations using a reduced number of satellite channels also show satisfactory results, indicating its flexibility to be configured for different sensors.
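A minimal sketch of the approach, assuming a generic channel set and network size (not the study's configuration): train a small neural-network classifier on top-of-atmosphere radiances simulated by a radiative transfer model for clear and cloudy cases, then apply it to observed pixels.

```python
# Train a small neural-network cloud classifier on simulated radiances.
# The placeholder "simulated" feature vectors stand in for radiative-transfer
# output; channel count and network size are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

X_clear = rng.normal(loc=0.2, scale=0.05, size=(5000, 6))   # clear-sky radiances
X_cloud = rng.normal(loc=0.6, scale=0.10, size=(5000, 6))   # cloudy radiances
X_train = np.vstack([X_clear, X_cloud])
y_train = np.array([0] * 5000 + [1] * 5000)                 # 0 = clear, 1 = cloudy

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

cloud_probability = clf.predict_proba(X_train[:10])[:, 1]   # apply to "observed" pixels
```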
A Study of Cloud Radiative Forcing and Feedback
NASA Technical Reports Server (NTRS)
Ramanathan, Veerabhadran
2000-01-01
The main objective of the grant proposal was to participate in the CERES (Cloud and Earth's Radiant Energy System) Satellite experiment and perform interdisciplinary investigation of NASA's Earth Observing System (EOS). During the grant period, massive amounts of scientific data from diverse platforms have been accessed, processed and archived for continuing use; several software packages have been developed for integration of different data streams for performing scientific evaluation; extensive validation studies planned have been completed culminating in the development of important algorithms that are being used presently in the operational production of data from the CERES. Contributions to the inter-disciplinary science investigations have been significantly more than originally envisioned. The results of these studies have appeared in several refereed journals and conference proceedings. They are listed at the end of this report.
Loutrari, Ariadne; Tselekidou, Freideriki; Proios, Hariklia
2018-02-27
Prosodic patterns of speech appear to make a critical contribution to memory-related processing. We considered the case of a previously unexplored prosodic feature of Greek storytelling and its effect on free recall in thirty typically developing children between the ages of 10 and 12 years, using short ecologically valid auditory stimuli. The combination of a falling pitch contour and, more notably, extensive final-syllable vowel lengthening, which gives rise to the prosodic feature in question, led to statistically significantly higher performance in comparison to neutral phrase-final prosody. Number of syllables in target words did not reveal substantial difference in performance. The current study presents a previously undocumented culturally-specific prosodic pattern and its effect on short-term memory.
Design of Flight Vehicle Management Systems
NASA Technical Reports Server (NTRS)
Meyer, George; Aiken, Edwin W. (Technical Monitor)
1994-01-01
As the operation of large systems becomes ever more dependent on extensive automation, the need for an effective solution to the problem of design and validation of the underlying software becomes more critical. Large systems possess much detailed structure, typically hierarchical, and they are hybrid. Information processing at the top of the hierarchy is by means of formal logic and sentences; on the bottom it is by means of simple scalar differential equations and functions of time; and in the middle it is by an interacting mix of nonlinear multi-axis differential equations and automata, and functions of time and discrete events. The lecture will address the overall problem as it relates to flight vehicle management, describe the middle level, and offer a design approach that is based on Differential Geometry and Discrete Event Dynamic Systems Theory.
Nonlinear Control and Discrete Event Systems
NASA Technical Reports Server (NTRS)
Meyer, George; Null, Cynthia H. (Technical Monitor)
1995-01-01
As the operation of large systems becomes ever more dependent on extensive automation, the need for an effective solution to the problem of design and validation of the underlying software becomes more critical. Large systems possess much detailed structure, typically hierarchical, and they are hybrid. Information processing at the top of the hierarchy is by means of formal logic and sentences; on the bottom it is by means of simple scalar differential equations and functions of time; and in the middle it is by an interacting mix of nonlinear multi-axis differential equations and automata, and functions of time and discrete events. The lecture will address the overall problem as it relates to flight vehicle management, describe the middle level, and offer a design approach that is based on Differential Geometry and Discrete Event Dynamic Systems Theory.
NASA/RAE collaboration on nonlinear control using the F-8C digital fly-by-wire aircraft
NASA Technical Reports Server (NTRS)
Butler, G. F.; Corbin, M. J.; Mepham, S.; Stewart, J. F.; Larson, R. R.
1983-01-01
Design procedures are reviewed for variable integral control to optimize response (VICTOR) algorithms and results of preliminary flight tests are presented. The F-8C aircraft is operated in the remotely augmented vehicle (RAV) mode, with the control laws implemented as FORTRAN programs on a ground-based computer. Pilot commands and sensor information are telemetered to the ground, where the data are processed to form surface commands which are then telemetered back to the aircraft. The RAV mode represents a single-string (simplex) system and is therefore vulnerable to a hardover since comparison monitoring is not possible. Hence, extensive error checking is conducted on both the ground and airborne computers to prevent the development of potentially hazardous situations. Experience with the RAV monitoring and validation procedures is described.
Applying an MVC Framework for The System Development Life Cycle with Waterfall Model Extended
NASA Astrophysics Data System (ADS)
Hardyanto, W.; Purwinarko, A.; Sujito, F.; Masturi; Alighiri, D.
2017-04-01
This paper describes the extension of the waterfall model using the MVC architectural pattern for software development. The waterfall model is the base model most widely used in software development, yet it still has many problems. A common issue is that changes to data cause delays in the process itself; the security of the resulting software is another major problem. This study uses the PHP programming language for the implementation, although the model can be implemented in several programming languages using the same concept. The proposed approach is based on the MVC architecture to improve the performance of both software development and maintenance, especially concerning security, validation, database access, and routing.
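A minimal illustration of the MVC separation the paper builds on is sketched below in Python (the study itself used PHP): the model owns data and validation, the view renders output, and the controller routes requests between them. This is a generic sketch, not the study's framework.

```python
# Minimal MVC sketch: model = data + validation, view = rendering,
# controller = request routing between the two.
class UserModel:
    def __init__(self):
        self._users = {}

    def add(self, name):
        if not name or not name.isalnum():       # validation lives in the model
            raise ValueError("invalid user name")
        self._users[name] = {"name": name}
        return self._users[name]

class UserView:
    @staticmethod
    def render(user):
        return f"<p>User: {user['name']}</p>"

class UserController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def create_user(self, name):                 # a "route" handling one request
        return self.view.render(self.model.add(name))

print(UserController(UserModel(), UserView()).create_user("alice"))
```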
NASA Technical Reports Server (NTRS)
Hartfield, Roy J.; Hollo, Steven D.; Mcdaniel, James C.
1990-01-01
Planar measurements of injectant mole fraction and temperature have been conducted in a nonreacting supersonic combustor configured with underexpanded injection in the base of a swept ramp. The temperature measurements were conducted with a Mach 2 test section inlet in streamwise planes perpendicular to the test section wall on which the ramp was mounted. Injectant concentration measurements, conducted in cross flow planes with both Mach 2 and Mach 2.9 free stream conditions, dramatically illustrate the domination of the mixing process by streamwise vorticity generated by the ramp. These measurements, conducted using a nonintrusive optical technique (laser-induced iodine fluorescence), provide an accurate and extensive experimental data base for the validation of computational fluid dynamics codes for the calculation of highly three-dimensional supersonic combustor flow fields.
Evaporating Spray in Supersonic Streams Including Turbulence Effects
NASA Technical Reports Server (NTRS)
Balasubramanyam, M. S.; Chen, C. P.
2006-01-01
Evaporating spray plays an important role in spray combustion processes. This paper describes the development of a new finite-conductivity evaporation model, based on the two-temperature film theory, for two-phase numerical simulation using Eulerian-Lagrangian method. The model is a natural extension of the T-blob/T-TAB atomization/spray model which supplies the turbulence characteristics for estimating effective thermal diffusivity within the droplet phase. Both one-way and two-way coupled calculations were performed to investigate the performance of this model. Validation results indicate the superiority of the finite-conductivity model in low speed parallel flow evaporating sprays. High speed cross flow spray results indicate the effectiveness of the T-blob/T-TAB model and point to the needed improvements in high speed evaporating spray modeling.
Measurement of tree canopy architecture
NASA Technical Reports Server (NTRS)
Martens, S. N.; Ustin, S. L.; Norman, J. M.
1991-01-01
The lack of accurate extensive geometric data on tree canopies has retarded development and validation of radiative transfer models. A stratified sampling method was devised to measure the three-dimensional geometry of 16 walnut trees which had received irrigation treatments of either 100 or 33 per cent of evapotranspirational (ET) demand for the previous two years. Graphic reconstructions of the three-dimensional geometry were verified by 58 independent measurements. The distributions of stem- and leaf-size classes, lengths, and angle classes were determined and used to calculate leaf area index (LAI), stem area, and biomass. Reduced irrigation trees have lower biomass of stems, leaves and fruit, lower LAI, steeper leaf angles and altered biomass allocation to large stems. These data can be used in ecological models that link canopy processes with remotely sensed measurements.
Lemeunier, Nadège; da Silva-Oolup, S; Chow, N; Southerst, D; Carroll, L; Wong, J J; Shearer, H; Mastragostino, P; Cox, J; Côté, E; Murnaghan, K; Sutton, D; Côté, P
2017-09-01
To determine the reliability and validity of clinical tests to assess the anatomical integrity of the cervical spine in adults with neck pain and its associated disorders. We updated the systematic review of the 2000-2010 Bone and Joint Decade Task Force on Neck Pain and its Associated Disorders. We also searched the literature to identify studies on the reliability and validity of Doppler velocimetry for the evaluation of cervical arteries. Two independent reviewers screened and critically appraised studies. We conducted a best evidence synthesis of low risk of bias studies and ranked the phases of investigations using the classification proposed by Sackett and Haynes. We screened 9022 articles and critically appraised 8 studies; all 8 studies had low risk of bias (three reliability and five validity Phase II-III studies). Preliminary evidence suggests that the extension-rotation test may be reliable and has adequate validity to rule out pain arising from facet joints. The evidence suggests variable reliability and preliminary validity for the evaluation of cervical radiculopathy including neurological examination (manual motor testing, dermatomal sensory testing, deep tendon reflexes, and pathological reflex testing), Spurling's and the upper limb neurodynamic tests. No evidence was found for doppler velocimetry. Little evidence exists to support the use of clinical tests to evaluate the anatomical integrity of the cervical spine in adults with neck pain and its associated disorders. We found preliminary evidence to support the use of the extension-rotation test, neurological examination, Spurling's and the upper limb neurodynamic tests.
Validity of a semantically cued recall procedure for the mini-mental state examination.
Yuspeh, R L; Vanderploeg, R D; Kershaw, D A
1998-10-01
The validity of supplementing the three-item recall portion of the Mini-Mental State Examination (MMSE) with a cued recall procedure to help specify the nature of patients' memory problems was examined. Subjects were 247 individuals representing three diagnostic groups: Alzheimer's disease (AD), subcortical vascular ischemic dementia (SVaD), and normal controls. Individuals were administered a battery of neuropsychological tests, including the MMSE, as part of a comprehensive evaluation for the presence of dementia or other neurologic disorder. MMSE performance differed among groups. The three-item free recall performance also differed among groups, with post hoc analyses revealing the AD and SVaD groups were more impaired than controls but did not differ significantly from each other. Following a cued recall procedure of the MMSE three-items, groups differed, with post hoc analyses showing that AD patients failed to benefit from cues, whereas SVaD patients performed significantly better and comparable to control subjects. Significant correlations between the MMSE three-item cued recall performance and other memory measures demonstrated concurrent validity. Consistent with previous research indicating that SVaD is associated with memory encoding and retrieval deficits, whereas AD is associated with consolidation and storage problems, the present study supported the validity of the cued recall procedure of the three items on the MMSE in helping to distinguish between patients with AD and those with a vascular dementia with primarily subcortical pathology; however, despite these findings, a more extensive battery of neuropsychological measures is still recommended to consistently assess subtle diagnostic differences in these memory processes.
Expert system verification and validation study
NASA Technical Reports Server (NTRS)
French, Scott W.; Hamilton, David
1992-01-01
Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state-of-the-practice of V&V of ES and (2) development of workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state-of-the-practice in V&V of ES. These questions related to the amount and type of V&V done and the successfulness of this V&V. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied on a sample problem. References were included in the workshop material, and cross-referenced to techniques, so that students would know where to go to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V. That is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises were to provide an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.
Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro
2016-12-15
MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. Therefore, miRNAs are involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts to provide biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure that copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% of sensitivity and could deliver a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold through the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need for biological validations is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
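The classification stage described above (SMOTE oversampling followed by a random forest) can be sketched as follows, assuming the imbalanced-learn and scikit-learn packages are available; the feature extraction from hairpin candidates is not shown, so X and y are placeholders rather than Mirnacle's actual features.

```python
# SMOTE oversampling of the minority (true pre-miRNA) class followed by a
# random forest classifier. Features are synthetic placeholders.
import numpy as np
from imblearn.over_sampling import SMOTE            # assumes imbalanced-learn is installed
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))                     # candidate hairpin feature vectors
y = (rng.random(2000) < 0.05).astype(int)           # heavily imbalanced labels

X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_res, y_res)
scores = clf.predict_proba(X[:5])[:, 1]             # probability of being a real pre-miRNA
```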
Patry, Marc W; Magaletta, Philip R
2015-02-01
Although numerous studies have examined the psychometric properties and clinical utility of the Personality Assessment Inventory in correctional contexts, only two studies to date have specifically focused on suicide ideation. This article examines the convergent validity of the Suicide Ideation Scale and the Suicide Potential Index on the Personality Assessment Inventory in a large, nontreatment sample of male and female federal inmates (N = 1,120). The data indicated robust validity support for both the Suicide Ideation Scale and Suicide Potential Index, which were each correlated with a broad group of validity indices representing multiple assessment modalities. Recommendations for future research to build upon these findings through replication and extension are made. © The Author(s) 2014.
DBS-LC-MS/MS assay for caffeine: validation and neonatal application.
Bruschettini, Matteo; Barco, Sebastiano; Romantsik, Olga; Risso, Francesco; Gennai, Iulian; Chinea, Benito; Ramenghi, Luca A; Tripodi, Gino; Cangemi, Giuliana
2016-09-01
DBS might be an appropriate microsampling technique for therapeutic drug monitoring of caffeine in infants. Nevertheless, its application presents several issues that still limit its use. This paper describes a validated DBS-LC-MS/MS method for caffeine. The results of the method validation showed an hematocrit dependence. In the analysis of 96 paired plasma and DBS clinical samples, caffeine levels measured in DBS were statistically significantly lower than in plasma but the observed differences were independent from hematocrit. These results clearly showed the need for extensive validation with real-life samples for DBS-based methods. DBS-LC-MS/MS can be considered to be a good alternative to traditional methods for therapeutic drug monitoring or PK studies in preterm infants.
Validation Test Results for Orthogonal Probe Eddy Current Thruster Inspection System
NASA Technical Reports Server (NTRS)
Wincheski, Russell A.
2007-01-01
Recent nondestructive evaluation efforts within NASA have focused on an inspection system for the detection of intergranular cracking originating in the relief radius of Primary Reaction Control System (PRCS) thrusters. Of particular concern is deep cracking in this area, which could lead to combustion leakage in the event of through-wall cracking from the relief radius into an acoustic cavity of the combustion chamber. In order to reliably detect such defects while ensuring minimal false positives during inspection, the Orthogonal Probe Eddy Current (OPEC) system has been developed and an extensive validation study performed. This report describes the validation procedure, sample set, and inspection results, as well as comparing validation flaws with the response from naturally occurring damage.
77 FR 77069 - Commission Information Collection Activities (FERC-730); Comment Request; Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-31
... reliability and to reduce the cost of delivered power by reducing transmission congestion. Order No. 679 also... information, including the validity of the methodology and assumptions used; (3) ways to enhance the quality...
Asteroids as Calibration Standards in the Thermal Infrared -- Applications and Results from ISO
NASA Astrophysics Data System (ADS)
Müller, T. G.; Lagerros, J. S. V.
Asteroids have been used extensively as calibration sources for ISO. We summarise the asteroid observational parameters in the thermal infrared and explain the important modelling aspects. Ten selected asteroids were extensively used for the absolute photometric calibration of ISOPHOT in the far-IR. Additionally, the point-like and bright asteroids turned out to be of great interest for many technical tests and calibration aspects. They have been used for testing the calibration for SWS and LWS, the validation of relative spectral response functions of different bands, for colour correction and filter leak tests. Currently, there is a strong emphasis on ISO cross-calibration, where the asteroids contribute in many fields. Well known asteroids have also been seen serendipitously in the CAM Parallel Mode and the PHT Serendipity Mode, allowing for validation and improvement of the photometric calibration of these special observing modes.
Development of a Three-Dimensional, Unstructured Material Response Design Tool
NASA Technical Reports Server (NTRS)
Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia
2017-01-01
A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.
Preliminary Report on Oak Ridge National Laboratory Testing of Drake/ACSS/MA2/E3X
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irminger, Philip; King, Daniel J.; Herron, Andrew N.
2016-01-01
A key to industry acceptance of a new technology is extensive validation in field trials. The Powerline Conductor Accelerated Test facility (PCAT) at Oak Ridge National Laboratory (ORNL) is specifically designed to evaluate the performance and reliability of a new conductor technology under real world conditions. The facility is set up to capture large amounts of data during testing. General Cable used the ORNL PCAT facility to validate the performance of TransPowr with E3X Technology, a standard overhead conductor with an inorganic high emissivity, low absorptivity surface coating. Extensive testing has demonstrated a significant improvement in conductor performance across a wide range of operating temperatures, indicating that E3X Technology can provide a reduction in temperature, a reduction in sag, and an increase in ampacity when applied to the surface of any overhead conductor. This report provides initial results of that testing.
NASA Astrophysics Data System (ADS)
Botti, Lorenzo; Di Pietro, Daniele A.
2018-10-01
We propose and validate a novel extension of Hybrid High-Order (HHO) methods to meshes featuring curved elements. HHO methods are based on discrete unknowns that are broken polynomials on the mesh and its skeleton. We propose here the use of physical frame polynomials over mesh elements and reference frame polynomials over mesh faces. With this choice, the degree of face unknowns must be suitably selected in order to recover on curved meshes the same convergence rates as on straight meshes. We provide an estimate of the optimal face polynomial degree depending on the element polynomial degree and on the so-called effective mapping order. The estimate is numerically validated through specifically crafted numerical tests. All test cases are conducted considering two- and three-dimensional pure diffusion problems, and include comparisons with discontinuous Galerkin discretizations. The extension to agglomerated meshes with curved boundaries is also considered.
An Agent Allocation System for the West Virginia University Extension Service
ERIC Educational Resources Information Center
Dougherty, Michael John; Eades, Daniel
2015-01-01
Extension recognizes the importance of data in guiding programming decisions at the local level. However, allocating personnel resources and specializations at the state level is a more complex process. The West Virginia University Extension Service has adopted a data-driven process to determine the number, location, and specializations of county…
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...
NASA Technical Reports Server (NTRS)
Fahrenthold, Eric P.; Shivarama, Ravishankar
2004-01-01
The hybrid particle-finite element method of Fahrenthold and Horban, developed for the simulation of hypervelocity impact problems, has been extended to include new formulations of the particle-element kinematics, additional constitutive models, and an improved numerical implementation. The extended formulation has been validated in three dimensional simulations of published impact experiments. The test cases demonstrate good agreement with experiment, good parallel speedup, and numerical convergence of the simulation results.
2015-03-01
domains. Major model functions include:
• Ground combat: Light and heavy forces.
• Air mobile forces.
• Future forces.
• Fixed-wing and rotary-wing...
Constraints:
• Study must be completed no later than 31 December 2014.
• Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System...and SQL backend, as well as any open application programming interface (API).
• Allows data transparency and data driven navigation through the model
Bioindicators of contaminant exposure and effect in aquatic and terrestrial monitoring
Melancon, Mark J.; Hoffman, David J.; Rattner, Barnett A.; Burton, G. Allen; Cairns, John
2003-01-01
Bioindicators of contaminant exposure presently used in environmental monitoring are discussed. Some have been extensively field-validated and are already in routine application. Included are (1) inhibition of brain or blood cholinesterase by anticholinesterase pesticides, (2) induction of hepatic microsomal cytochromes P450 by chemicals such as PAHs and PCBs, (3) reproductive problems such as terata and eggshell thinning, and (4) aberrations of hemoglobin synthesis, including the effects of lead and of certain chlorinated hydrocarbons. Many studies on DNA damage and of histopathological effects, particularly in the form of tumors, have already been completed. There are presently numerous other opportunities for field validation. Bile metabolites of contaminants in fish reveal exposure to contaminants that might otherwise be difficult to detect or quantify. Bile analysis is beginning to be extended to species other than fishes. Assessment of oxidative damage and immune competence appear to be valuable biomarkers, needing only additional field validation for wider use. The use of metallothioneins as biomarkers depends on the development of convenient, inexpensive methodology that provides information not available from measurements of metal ions. The use of stress proteins as biomarkers depends on development of convenient, inexpensive methodology and field validation. Gene arrays and proteomics hold promise as bioindicators for contaminant exposure or effect, particularly because of the large amount of data that could be generated, but they still need extensive development and testing.
NASA Astrophysics Data System (ADS)
Mollica, N. R.; Guo, W.; Cohen, A. L.; Huang, K. F.; Foster, G. L.; Donald, H.; Solow, A.
2017-12-01
Carbonate skeletons of scleractinian corals are important archives of ocean climate and environmental change. However, corals don't accrete their skeletons directly from ambient seawater, but from a calcifying fluid whose composition is strongly regulated. There is mounting evidence that the carbonate chemistry of this calcifying fluid significantly impacts the amount of carbonate the coral can precipitate, which in turn affects the geochemical composition of the skeleton produced. However the mechanistic link between calcifying fluid (cf) chemistry, particularly the up-regulation of pHcf and thereby aragonite saturation state (Ωcf), and coral calcification is not well understood. We explored this link by combining boron isotope measurements with in situ measurements of seawater temperature, salinity, and DIC to estimate Ωcf of nine Porites corals from four Pacific reefs. Associated calcification rates were quantified for each core via CT scanning. We do not observe a relationship between calcification rates and Ωcf or Ωsw. Instead, when we deconvolve calcification into linear extension and skeletal density, a significant correlation is observed between density and Ωcf, and also Ωsw while extension does not correlate with either. These observations are consistent with the two-step model of coral calcification, in which skeleton is secreted in two distinct phases: vertical extension creating new skeletal elements, followed by lateral thickening of existing elements that are covered by living tissue. We developed a numerical model of Porites skeletal growth that builds on this two-step model and links skeletal density with the external seawater environment via its influence on the chemistry of coral calcifying fluid. We validated the model using existing coral skeletal datasets from six Porites species collected across five reef sites, and quantified the effects of each seawater parameter (e.g. temperature, pH, DIC) on skeletal density. Our findings illustrate the sensitivity of the second phase of coral calcification to the carbonate chemistry of the calcifying fluid, and support previous coral proxy system modelling efforts by validating the two-step growth model on annual and seasonal scales.
Advances in Neutron Radiography: Application to Additive Manufacturing Inconel 718
Bilheux, Hassina Z; Song, Gian; An, Ke; ...
2016-01-01
Reactor-based neutron radiography is a non-destructive, non-invasive characterization technique that has been extensively used for engineering materials such as inspection of components, evaluation of porosity, and in-operando observations of engineering parts. Neutron radiography has flourished at reactor facilities for more than four decades and is relatively new to accelerator-based neutron sources. Recent advances in neutron source and detector technologies, such as the Spallation Neutron Source (SNS) at the Oak Ridge National Laboratory (ORNL) in Oak Ridge, TN, and the microchannel plate (MCP) detector, respectively, enable new contrast mechanisms using the neutron scattering Bragg features for crystalline information such as average lattice strain, crystalline plane orientation, and identification of phases in a neutron radiograph. Additive manufacturing (AM) processes or 3D printing have recently become very popular and have a significant potential to revolutionize the manufacturing of materials by enabling new designs with complex geometries that are not feasible using conventional manufacturing processes. However, the technique lacks standards for process optimization and control compared to conventional processes. Residual stresses are a common occurrence in materials that are machined, rolled, heat treated, welded, etc., and have a significant impact on a component's mechanical behavior and durability. They may also arise during the 3D printing process, and defects such as internal cracks can propagate over time as the component relaxes after being removed from its build plate (the base plate utilized to print materials on). Moreover, since access to the AM material is possible only after the component has been fully manufactured, it is difficult to characterize the material for defects a priori to minimize expensive re-runs. Currently, validation of the AM process and materials is mainly through expensive trial-and-error experiments at the component level, whereas in conventional processes the level of confidence in predictive computational modeling is high enough to allow process and materials optimization through computational approaches. Thus, there is a clear need for non-destructive characterization techniques and for the establishment of processing-microstructure databases that can be used for developing and validating predictive modeling tools for AM.
21 CFR 1271.230 - Process validation.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Process validation. 1271.230 Section 1271.230 Food..., AND CELLULAR AND TISSUE-BASED PRODUCTS Current Good Tissue Practice § 1271.230 Process validation. (a... validation activities and results must be documented, including the date and signature of the individual(s...
Verification and Validation Strategy for LWRS Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carl M. Stoots; Richard R. Schultz; Hans D. Gougar
2012-09-01
One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203, the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high fidelity models / codes, and scaling of aging effects.
True and fake information spreading over the Facebook
NASA Astrophysics Data System (ADS)
Yang, Dong; Chow, Tommy W. S.; Zhong, Lu; Tian, Zhaoyang; Zhang, Qingpeng; Chen, Guanrong
2018-09-01
Social networks involve more and more users who search for and share information extensively and frequently. Tremendous evidence from Facebook, Twitter, Flickr and Google+ alike shows that such social networks are major information sources as well as the most effective platforms for information transmission and exchange. The dynamic propagation of various information may gradually disseminate, drastically increase, strongly compete with other information, or slowly decrease. These observations have led to the present study of the spreading process of true and fake information over social networks, particularly Facebook. Specifically, in this paper the topological structure of two huge-scale Facebook network datasets is investigated with respect to its statistical properties. On this basis, a model for simulating the spreading of true and fake information over Facebook is established. By controlling the spreading parameters in extensive large-scale simulations, it is found that the final density of stiflers increases with the growth of the spreading rate, while it declines with the increase of the removal rate. Moreover, it is found that the spreading process of true and fake information is closely related to the node degrees in the network. Hub individuals with high degrees have large probabilities of learning hidden information and then spreading it. Interestingly, it is found that the spreading rate of the true information, but not of the fake information, has a great effect on the information spreading process, reflecting the human tendency to believe and spread truths in social activities. The new findings validate that the proposed model is capable of characterizing the dynamic evolution of true and fake information over Facebook, which is useful and informative for future social science studies.
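To make the spreading mechanism above concrete, the following is a minimal, hypothetical sketch of an ignorant-spreader-stifler simulation on a synthetic Barabasi-Albert graph standing in for the Facebook topology; the parameter names (spread_rate, removal_rate) and all values are illustrative assumptions, not the paper's actual model or data.

# Minimal ignorant-spreader-stifler (rumor-style) spreading sketch on a synthetic
# network; the graph, rates and update rule are illustrative assumptions only.
import random
import networkx as nx

def final_stifler_density(G, spread_rate=0.3, removal_rate=0.2, seed=0):
    rng = random.Random(seed)
    state = {n: "ignorant" for n in G}            # ignorant / spreader / stifler
    start = rng.choice(list(G.nodes))
    state[start] = "spreader"
    spreaders = {start}
    for _ in range(100000):                       # safety cap on rounds
        if not spreaders:
            break
        new_spreaders, stopped = set(), set()
        for s in spreaders:
            for nb in G.neighbors(s):
                if state[nb] == "ignorant" and rng.random() < spread_rate:
                    state[nb] = "spreader"
                    new_spreaders.add(nb)
            if rng.random() < removal_rate:       # spreader loses interest -> stifler
                state[s] = "stifler"
                stopped.add(s)
        spreaders = (spreaders - stopped) | new_spreaders
    return sum(1 for v in state.values() if v == "stifler") / G.number_of_nodes()

G = nx.barabasi_albert_graph(5000, 3, seed=1)     # scale-free stand-in topology
for lam in (0.1, 0.3, 0.5):                       # higher spreading rate ...
    print(lam, round(final_stifler_density(G, spread_rate=lam), 3))  # ... more stiflers

Sweeping removal_rate instead would show the opposite trend reported above: the final stifler density declines as the removal rate grows.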
The Kepler Mission: From Concept to Operations
NASA Astrophysics Data System (ADS)
Koch, David G.
2011-01-01
From concept to launch and operations, what became the Kepler mission took a quarter of a century to create. We will review some of the steps along the way, the challenges, opportunities, strategic decisions and choices that had to be made that resulted in a mission that has the capability to detect and determine the frequencies of Earth-size planets in or near the habitable zone of solar-like stars. The process of going from starlight focused onto individual pixels to declaration of a planet detection is long and complex. Data for each star are recorded on the spacecraft and telemetered to the ground once per month. The raw pixel data are processed to produce light curves for each star. The light curves are processed to search for sequences of transits. A team of scientists examines the output to decide which meet the many validation criteria and qualify as candidates. Next an extensive series of ground-based follow-up observations are performed on the candidates now numbering in excess of 700. The objective is to eliminate false positive cases, while simultaneously improving our knowledge of the parent stars. Extensive analysis and modeling is performed on both the original photometric data and the newly acquired ground-based data to ascertain the true nature of each candidate. On the order of one-quarter to one-half of the candidates are rejected, mostly as some form of eclipsing binary. Of the remaining, some meet all the criteria and are submitted by the science team for peer-reviewed publications. Others may just require more data or may be left as undecided candidates for future research. An extended mission beyond 3.5 years will significantly improve the results from the Kepler mission, especially by covering the outer portion of the habitable zone for solar-like stars.
Makris, Susan L.; Raffaele, Kathleen; Allen, Sandra; Bowers, Wayne J.; Hass, Ulla; Alleva, Enrico; Calamandrei, Gemma; Sheets, Larry; Amcoff, Patric; Delrue, Nathalie; Crofton, Kevin M.
2009-01-01
Objective: We conducted a review of the history and performance of developmental neurotoxicity (DNT) testing in support of the finalization and implementation of Organisation for Economic Co-operation and Development (OECD) DNT test guideline 426 (TG 426). Information sources and analysis: In this review we summarize extensive scientific efforts that form the foundation for this testing paradigm, including basic neurotoxicology research, interlaboratory collaborative studies, expert workshops, and validation studies, and we address the relevance, applicability, and use of the DNT study in risk assessment. Conclusions: The OECD DNT guideline represents the best available science for assessing the potential for DNT in human health risk assessment, and data generated with this protocol are relevant and reliable for the assessment of these end points. The test methods used have been subjected to an extensive history of international validation, peer review, and evaluation, which is contained in the public record. The reproducibility, reliability, and sensitivity of these methods have been demonstrated, using a wide variety of test substances, in accordance with OECD guidance on the validation and international acceptance of new or updated test methods for hazard characterization. Multiple independent, expert scientific peer reviews affirm these conclusions. PMID:19165382
Dhingra, Madhur S; Artois, Jean; Robinson, Timothy P; Linard, Catherine; Chaiban, Celia; Xenarios, Ioannis; Engler, Robin; Liechti, Robin; Kuznetsov, Dmitri; Xiao, Xiangming; Dobschuetz, Sophie Von; Claes, Filip; Newman, Scott H; Dauphin, Gwenaëlle; Gilbert, Marius
2016-01-01
Global disease suitability models are essential tools to inform surveillance systems and enable early detection. We present the first global suitability model of highly pathogenic avian influenza (HPAI) H5N1 and demonstrate that reliable predictions can be obtained at global scale. The best predictions are obtained using spatial predictor variables describing host distributions, rather than land-use or eco-climatic variables, with a strong association with domestic duck and extensively raised chicken densities. Our results also support a more systematic use of spatial cross-validation in large-scale disease suitability modelling compared to standard random cross-validation, which can lead to unreliable measures of extrapolation accuracy. A global suitability model of the H5 clade 2.3.4.4 viruses, a group of viruses that recently spread extensively in Asia and the US, shows in comparison a lower spatial extrapolation capacity than the HPAI H5N1 models, with a stronger association with intensively raised chicken densities and anthropogenic factors. DOI: http://dx.doi.org/10.7554/eLife.19571.001 PMID:27885988
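The difference between random and spatial cross-validation that the abstract highlights can be sketched as follows; this is a generic illustration on synthetic, spatially blocked data with hypothetical variable names, not the study's pipeline.

# Random K-fold CV vs. spatially blocked CV on synthetic, spatially autocorrelated
# data; predictors, block structure and model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n, n_blocks = 2000, 20
X = rng.normal(size=(n, 4))                       # e.g. host-density predictors
blocks = rng.integers(0, n_blocks, size=n)        # coarse spatial blocks (lat/lon tiles)
block_effect = rng.normal(size=n_blocks)[blocks]  # spatially structured component
y = (X[:, 0] + block_effect + rng.normal(size=n) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
auc_random = cross_val_score(model, X, y, scoring="roc_auc",
                             cv=KFold(5, shuffle=True, random_state=0))
auc_spatial = cross_val_score(model, X, y, scoring="roc_auc",
                              cv=GroupKFold(5), groups=blocks)
print("random  CV AUC:", auc_random.mean().round(3))   # typically optimistic
print("spatial CV AUC:", auc_spatial.mean().round(3))  # closer to extrapolation skill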
Lung Reference Set A Application: LaszloTakacs - Biosystems (2010) — EDRN Public Portal
We would like to access the NCI lung cancer Combined Pre-Validation Reference Set A in order to further validate a lung cancer diagnostic test candidate. Our test is based on a panel of antibodies which have been tested on 4 different cohorts (see below, paragraph “Preliminary Data and Methods”). This Reference Set A, whose clinical setting is “Diagnosis of lung cancer”, will be used to validate the panel of monoclonal antibodies which extensive data analysis has shown to provide the best discrimination between controls and lung cancer patient plasma samples; sensitivity and specificity values from ROC analyses exceed 85%.
Jeong, Chang Wook; Jeong, Seong Jin; Hong, Sung Kyu; Lee, Seung Bae; Ku, Ja Hyeon; Byun, Seok-Soo; Jeong, Hyeon; Kwak, Cheol; Kim, Hyeon Hoe; Lee, Eunsik; Lee, Sang Eun
2012-09-01
To develop and evaluate nomograms to predict the pathological stage of clinically localized prostate cancer after radical prostatectomy in Korean men. We reviewed the medical records of 2041 patients who had clinical stages T1c-T3a prostate cancer and were treated solely with radical prostatectomy at two hospitals. Logistic regressions were carried out to predict organ-confined disease, extraprostatic extension, seminal vesicle invasion, and lymph node metastasis from preoperative variables, and nomograms were constructed from the resulting models. Internal validation was assessed using the area under the receiver operating characteristic curve and calibration plots, and external validation was then carried out on 129 patients from another hospital. Head-to-head comparisons with the 2007 Partin tables and the Cancer of the Prostate Risk Assessment score were carried out using the area under the curve and decision curve analysis. The significant predictors of organ-confined disease and extraprostatic extension were clinical stage, prostate-specific antigen, Gleason score and the percent positive cores of biopsy. Significant predictors of seminal vesicle invasion were prostate-specific antigen, Gleason score and percent positive cores, and those of lymph node metastasis were prostate-specific antigen and percent positive cores. The areas under the curve of the established nomograms for organ-confined disease, extraprostatic extension, seminal vesicle invasion and lymph node metastasis were 0.809, 0.804, 0.889 and 0.838, respectively. The nomograms were well calibrated and externally validated. These nomograms showed significantly higher accuracy and net benefit than the two Western tools in Korean men. This is the first study to have developed and fully validated nomograms to predict the pathological stage of prostate cancer in an Asian population. These nomograms might be more accurate and useful for Korean men than other predictive models developed using Western populations. © 2012 The Japanese Urological Association.
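As a rough sketch of the modelling step described above (one endpoint only), a logistic regression can be fitted on preoperative variables and its discrimination summarized by the area under the ROC curve on a held-out set; the data below are synthetic and the coefficients are invented for illustration, not taken from the study.

# Sketch of one nomogram endpoint: logistic regression for (non-)organ-confined
# disease with holdout AUC; data and coefficients are synthetic illustrations.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2041
psa = rng.lognormal(mean=2.0, sigma=0.6, size=n)          # ng/ml
gleason = rng.integers(6, 10, size=n)
pct_pos_cores = rng.uniform(0, 100, size=n)
clin_stage = rng.integers(1, 4, size=n)                   # ordinal T1c..T3a proxy
X = np.column_stack([np.log(psa), gleason, pct_pos_cores, clin_stage])
logit = -4 + 0.8 * np.log(psa) + 0.5 * (gleason - 6) + 0.03 * pct_pos_cores
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=129, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"external-style validation AUC: {auc:.3f}")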
Mejlholm, Ole; Dalgaard, Paw
2013-10-15
A new and extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. was developed and validated for processed and unprocessed products of seafood and meat. The new model was developed by refitting and expanding an existing cardinal parameter model for growth and the growth boundary of lactic acid bacteria (LAB) in processed seafood (O. Mejlholm and P. Dalgaard, J. Food Prot. 70. 2485-2497, 2007). Initially, to estimate values for the maximum specific growth rate at the reference temperature of 25 °C (μref) and the theoretical minimum temperature that prevents growth of psychrotolerant LAB (T(min)), the existing LAB model was refitted to data from experiments with seafood and meat products reported not to include nitrite or any of the four organic acids evaluated in the present study. Next, dimensionless terms modelling the antimicrobial effect of nitrite, and acetic, benzoic, citric and sorbic acids on growth of Lactobacillus sakei were added to the refitted model, together with minimum inhibitory concentrations determined for the five environmental parameters. The new model including the effect of 12 environmental parameters, as well as their interactive effects, was successfully validated using 229 growth rates (μ(max) values) for psychrotolerant Lactobacillus spp. in seafood and meat products. Average bias and accuracy factor values of 1.08 and 1.27, respectively, were obtained when observed and predicted μ(max) values of psychrotolerant Lactobacillus spp. were compared. Thus, on average μ(max) values were only overestimated by 8%. The performance of the new model was equally good for seafood and meat products, and the importance of including the effect of acetic, benzoic, citric and sorbic acids and to a lesser extent nitrite in order to accurately predict growth of psychrotolerant Lactobacillus spp. was clearly demonstrated. The new model can be used to predict growth of psychrotolerant Lactobacillus spp. in seafood and meat products e.g. prediction of the time to a critical cell concentration of bacteria is considered useful for establishing the shelf life. In addition, the high number of environmental parameters included in the new model makes it flexible and suitable for product development as the effect of substituting one combination of preservatives with another can be predicted. In general, the performance of the new model was unacceptable for other types of LAB including Carnobacterium spp., Leuconostoc spp. and Weissella spp. © 2013.
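The bias and accuracy factors quoted above are simple summary statistics of predicted versus observed growth rates; a short sketch with synthetic μmax values, following the usual Ross-type definitions, is given below.

# Bias factor (Bf) and accuracy factor (Af) for predicted vs. observed maximum
# specific growth rates; the mu_max values below are synthetic examples.
import numpy as np

def bias_accuracy_factors(observed, predicted):
    log_ratio = np.log10(np.asarray(predicted, float) / np.asarray(observed, float))
    bf = 10 ** log_ratio.mean()            # > 1 means growth is over-predicted on average
    af = 10 ** np.abs(log_ratio).mean()    # average fold-difference from observations
    return bf, af

obs  = [0.020, 0.055, 0.110, 0.034, 0.078]   # example mu_max values (1/h)
pred = [0.022, 0.060, 0.105, 0.040, 0.081]
bf, af = bias_accuracy_factors(obs, pred)
print(f"Bf = {bf:.2f}, Af = {af:.2f}")       # compare with the reported 1.08 and 1.27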
The Role of Extensive Reading in the Development of Phonological Processing
ERIC Educational Resources Information Center
Nisanci, Sinan
2017-01-01
The present study aims to investigate the role of extensive reading in the acquisition of implicit phonological knowledge. Through extensive exposure to print, L2 learners can improve their phonological processing skills, and this could contribute to their word recognition fluency. On the basis of the Oxford Placement Test, 30 9th graders and 30…
Kim, Ki-Tack; Lee, Sang-Hun; Suk, Kyung-Soo; Lee, Jung-Hee; Jeong, Bi-O
2010-06-01
The purpose of this study was to analyze the biomechanical effects of three different constrained types of an artificial disc on the implanted and adjacent segments in the lumbar spine using a finite element model (FEM). The created intact model was validated by comparing the flexion-extension response without pre-load with the corresponding results obtained from the published experimental studies. The validated intact lumbar model was tested after implantation of three artificial discs at L4-5. Each implanted model was subjected to a combination of 400 N follower load and 5 Nm of flexion/extension moments. ABAQUS version 6.5 (ABAQUS Inc., Providence, RI, USA) and FEMAP version 8.20 (Electronic Data Systems Corp., Plano, TX, USA) were used for meshing and analysis of geometry of the intact and implanted models. Under the flexion load, the intersegmental rotation angles of all the implanted models were similar to that of the intact model, but under the extension load, the values were greater than that of the intact model. The facet contact loads of three implanted models were greater than the loads observed with the intact model. Under the flexion load, three types of the implanted model at the L4-5 level showed the intersegmental rotation angle similar to the one measured with the intact model. Under the extension load, all of the artificial disc implanted models demonstrated an increased extension rotational angle at the operated level (L4-5), resulting in an increase under the facet contact load when compared with the adjacent segments. The increased facet load may lead to facet degeneration.
Eye-Tracking as a Tool in Process-Oriented Reading Test Validation
ERIC Educational Resources Information Center
Solheim, Oddny Judith; Uppstad, Per Henning
2011-01-01
The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…
The Next Generation of HLA Image Products
NASA Astrophysics Data System (ADS)
Gaffney, N. I.; Casertano, S.; Ferguson, B.
2012-09-01
We present the re-engineered pipeline based on existing and improved algorithms with the aim of improving processing quality, cross-instrument portability, data flow management, and software maintenance. The Hubble Legacy Archive (HLA) is a project to add value to the Hubble Space Telescope data archive by producing and delivering science-ready drizzled data products and source lists derived from these products. Initially, ACS, NICMOS, and WFPC2 data were combined using instrument-specific pipelines based on scripts developed to process the ACS GOODS data and a separate set of scripts to generate Source Extractor and DAOPhot source lists. The new pipeline, initially designed for WFC3 data, isolates instrument-specific processing and is easily extendable to other instruments and to generating wide-area mosaics. Significant improvements have been made in image combination using improved alignment, source detection, and background equalization routines. It integrates improved alignment procedures, a better noise model, and source list generation within a single code base. Wherever practical, PyRAF-based routines have been replaced with non-IRAF python libraries (e.g., NumPy and PyFITS). The data formats have been modified to handle better and more consistent propagation of information from individual exposures to the combined products. A new exposure layer stores the effective exposure time for each pixel on the sky, which is key to properly interpreting combined images from diverse data that were not initially planned to be mosaicked. We worked to improve the validity of the metadata within our FITS headers for these products relative to standard IRAF/PyRAF processing. Any keywords that pertain to individual exposures have been removed from the primary and extension headers and placed in a table extension for more direct and efficient perusal. This mechanism also allows more detailed information on the processing of individual images to be stored and propagated, providing a more hierarchical metadata storage system than key-value pair FITS headers provide. In this poster we discuss the changes to the pipeline processing and source list generation and the lessons learned, which may be applicable to other archive projects, as well as our new metadata curation and preservation process.
Hierarchical atom type definitions and extensible all-atom force fields.
Jin, Zhao; Yang, Chunwei; Cao, Fenglei; Li, Feng; Jing, Zhifeng; Chen, Long; Shen, Zhe; Xin, Liang; Tong, Sijia; Sun, Huai
2016-03-15
The extensibility of a force field is key to solving the missing-parameter problem commonly encountered in force field applications. The extensibility of conventional force fields is traditionally managed in the parameterization procedure, which becomes impractical once the coverage of the force field increases above a threshold. A hierarchical atom-type definition (HAD) scheme is proposed to make atom type definitions extensible, which ensures that force fields developed from these definitions are extensible. To demonstrate how HAD works and to prepare a foundation for future developments, two general force fields based on the AMBER and DFF functional forms are parameterized for common organic molecules. The force field parameters are derived from the same set of quantum mechanical data and experimental liquid data using an automated parameterization tool, and validated by calculating molecular and liquid properties. The hydration free energies are calculated successfully by introducing a polarization scaling factor to the dispersion term between the solvent and solute molecules. © 2015 Wiley Periodicals, Inc.
Azamathulla, H. Md.; Jarrett, Robert D.
2013-01-01
Manning’s roughness coefficient (n) has been widely used in the estimation of flood discharges or depths of flow in natural channels. The selection of appropriate Manning’s n values is therefore of paramount importance for hydraulic engineers and hydrologists and requires considerable experience, even though extensive guidelines are available. Generally, the largest source of error in post-flood estimates (termed indirect measurements) comes from the estimates of Manning’s n values, particularly when there has been minimal field verification of flow resistance. This emphasizes the need to improve methods for estimating n values. The objective of this study was to develop a soft computing model for estimating Manning’s n values using 75 discharge measurements on 21 high-gradient streams in Colorado, USA. The data are from high-gradient (S > 0.002 m/m), cobble- and boulder-bed streams for within-bank flows. This study presents gene-expression programming (GEP), an extension of genetic programming (GP), as an improved approach to estimate Manning’s roughness coefficient for high-gradient streams, and uses the field data to assess its potential. GEP is a search technique that automatically simplifies genetic programs during an evolutionary process (or evolves) to obtain the most robust computer program (e.g., it simplifies mathematical expressions, decision trees, polynomial constructs, and logical expressions). Field measurements collected by Jarrett (J Hydraulic Eng ASCE 110: 1519–1539, 1984) were used to train the GEP network and evolve programs. The developed network and evolved programs were validated using observations that were not involved in training. The GEP and ANN-RBF (artificial neural network-radial basis function) models were found to be substantially more effective (e.g., R2 for testing/validation of GEP and RBF-ANN is 0.745 and 0.65, respectively) than Jarrett’s (J Hydraulic Eng ASCE 110: 1519–1539, 1984) equation (R2 for testing/validation equals 0.58) in predicting Manning’s n.
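Why the choice of n dominates indirect discharge estimates follows directly from Manning's equation, Q = (1/n) A R^(2/3) S^(1/2): the computed discharge is inversely proportional to n. The sketch below uses an invented channel geometry purely for illustration.

# Sensitivity of a Manning's-equation discharge estimate to the chosen n
# (SI units; the channel geometry below is an illustrative assumption).
import numpy as np

def manning_discharge(n, area_m2, hydraulic_radius_m, slope):
    # Q = (1/n) * A * R^(2/3) * S^(1/2)
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * np.sqrt(slope)

A, R, S = 12.0, 0.8, 0.01            # example high-gradient, cobble-bed section
for n in (0.035, 0.05, 0.07, 0.10):
    print(f"n = {n:.3f} -> Q = {manning_discharge(n, A, R, S):.1f} m^3/s")
# doubling n halves the estimated discharge, hence the emphasis on better n estimates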
NASA Astrophysics Data System (ADS)
Quinn, J. D.; Rosser, N. J.; Murphy, W.; Lawrence, J. A.
2010-08-01
Coastal monitoring is routinely undertaken to provide an archival record of cliff-line movement that can be used in the development and validation of predictive coast retreat and evolution models. However, coastal monitoring is often purely quantitative in nature, and financial necessity requires deployment over extensive coastal sections. As a result, for local site conditions in particular, only limited geomorphological data are available or included during the development of such predictive models. This has resulted in many current models incorporating a simplistic or generalised representation of cliff behaviour, an approach that progressively loses local credibility when deployed over extensive heterogeneous coastlines. This study addresses this situation at a site of extreme coastline retreat, Holderness, UK, through the application of intensive monitoring of six representative cliff sections nested within a general geomorphological appraisal of the wider coastline as a whole. The data from these surveys have been used to validate a finite difference-based geotechnical modelling assessment of clay cliff stability. Once validated, the geotechnical model was used to simulate a range of scenarios that were sufficient to represent the range of topographic, hydrogeological, geological, and littoral conditions exhibited throughout the region. Our assessment identified that the cliff retreat occurs through the combined influence of direct marine erosion of the cliff, with shallow, structurally controlled failures or substantial mass failures. Critically, the predisposition to any one of these failure mechanisms arises principally as a result of initial cliff height. The results of the numerical modelling have been combined into an empirical slope model that derives the rate of landslide-induced retreat that would arise from mass failures under various future scenarios. Results of this study can be used in the selection and development of retreat models at coastlines of similar physiographic setting to that found at Holderness. The results represent a key step in linking material deformation properties to the processes of cliff change and the subsequent range of landforms found on clay cliffs. As such, the results could also be used more generally to illustrate the likely cliff behaviour of other soft rock coastlines.
The Role of Structural Models in the Solar Sail Flight Validation Process
NASA Technical Reports Server (NTRS)
Johnston, John D.
2004-01-01
NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scaleable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.
Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto
2015-07-01
The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.
Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation
NASA Technical Reports Server (NTRS)
Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna
2000-01-01
This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.
Expanding the Interactome of TES by Exploiting TES Modules with Different Subcellular Localizations.
Sala, Stefano; Van Troys, Marleen; Medves, Sandrine; Catillon, Marie; Timmerman, Evy; Staes, An; Schaffner-Reckinger, Elisabeth; Gevaert, Kris; Ampe, Christophe
2017-05-05
The multimodular nature of many eukaryotic proteins underlies their temporal or spatial engagement in a range of protein cocomplexes. Using the multimodule protein testin (TES), we here report a proteomics approach to increase insight in cocomplex diversity. The LIM-domain containing and tumor suppressor protein TES is present at different actin cytoskeleton adhesion structures in cells and influences cell migration, adhesion and spreading. TES module accessibility has been proposed to vary due to conformational switching and variants of TES lacking specific domains target to different subcellular locations. By applying iMixPro AP-MS ("intelligent Mixing of Proteomes"-affinity purification-mass spectrometry) to a set of tagged-TES modular variants, we identified proteins residing in module-specific cocomplexes. The obtained distinct module-specific interactomes combine to a global TES interactome that becomes more extensive and richer in information. Applying pathway analysis to the module interactomes revealed expected actin-related canonical pathways and also less expected pathways. We validated two new TES cocomplex partners: TGFB1I1 and a short form of the glucocorticoid receptor. TES and TGFB1I1 are shown to oppositely affect cell spreading providing biological validity for their copresence in complexes since they act in similar processes.
Multi-criteria anomaly detection in urban noise sensor networks.
Dauwe, Samuel; Oldoni, Damiano; De Baets, Bernard; Van Renterghem, Timothy; Botteldooren, Dick; Dhoedt, Bart
2014-01-01
The growing concern of citizens about the quality of their living environment and the emergence of low-cost microphones and data acquisition systems triggered the deployment of numerous noise monitoring networks spread over large geographical areas. Due to the local character of noise pollution in an urban environment, a dense measurement network is needed in order to accurately assess the spatial and temporal variations. The use of consumer grade microphones in this context appears to be very cost-efficient compared to the use of measurement microphones. However, the lower reliability of these sensing units requires a strong quality control of the measured data. To automatically validate sensor (microphone) data, prior to their use in further processing, a multi-criteria measurement quality assessment model for detecting anomalies such as microphone breakdowns, drifts and critical outliers was developed. Each of the criteria results in a quality score between 0 and 1. An ordered weighted average (OWA) operator combines these individual scores into a global quality score. The model is validated on datasets acquired from a real-world, extensive noise monitoring network consisting of more than 50 microphones. Over a period of more than a year, the proposed approach successfully detected several microphone faults and anomalies.
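The aggregation step described above can be sketched in a few lines: each criterion yields a score in [0, 1], and an ordered weighted average applies its weights to the ranked scores rather than to fixed criteria. The weights below are illustrative assumptions, not the values used in the deployed network.

# Ordered weighted average (OWA) of per-criterion quality scores in [0, 1].
# In an OWA the weights attach to rank positions, not to specific criteria,
# so the scores are sorted before the weighted sum is taken.
def owa(scores, weights):
    assert len(scores) == len(weights) and abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(scores, reverse=True)        # best score first
    return sum(w * s for w, s in zip(weights, ordered))

# e.g. three criteria: drift check, outlier check, breakdown check
scores = [0.9, 0.4, 0.8]
# weighting the worst-ranked score most heavily gives a cautious global score
print(owa(scores, weights=[0.2, 0.3, 0.5]))       # 0.2*0.9 + 0.3*0.8 + 0.5*0.4 = 0.62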
Understanding socio-economic impacts of geohazards aided by cyber-enabled systems
NASA Astrophysics Data System (ADS)
Klose, C. D.; Webersik, C.
2008-12-01
Due to an increase in the number of geohazards worldwide, not only are impoverished regions in less developed countries, such as Haiti, vulnerable to risk, but so are low-income regions in industrialized countries such as the USA. This has been exemplified once again by Hurricanes Gustav, Hanna and Ike and their impact on the Caribbean countries during the summer of 2008. To date, extensive research has been conducted to improve the monitoring of human-nature coupled systems. However, there is little emphasis on improving and developing methodologies to a) interpret multi-dimensional and complex data and b) validate prediction and modeling results. This presentation tries to motivate more research initiatives to address the aforementioned issues, bringing together two academic disciplines, earth and social sciences, to research the relationship between natural and socio-economic processes. Results are presented where cyber-enabled methods based on artificial intelligence are applied to different geohazards and regions in the world. They include 1) modeling of public health risks associated with volcanic gas hazards, 2) prediction and validation of potential areas of mining-triggered earthquakes, and 3) modeling of socio-economic risks associated with tropical storms in Haiti and the Dominican Republic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paone, Jeffrey R; Bolme, David S; Ferrell, Regina Kay
Keeping a driver focused on the road is one of the most critical steps in ensuring the safe operation of a vehicle. The Strategic Highway Research Program 2 (SHRP2) has over 3,100 recorded videos of volunteer drivers during a period of 2 years. This extensive naturalistic driving study (NDS) contains over one million hours of video and associated data that could aid safety researchers in understanding where the driver's attention is focused. Manual analysis of this data is infeasible, therefore efforts are underway to develop automated feature extraction algorithms to process and characterize the data. The real-world nature, volume, and acquisition conditions are unmatched in the transportation community, but there are also challenges because the data has relatively low resolution, high compression rates, and differing illumination conditions. A smaller dataset, the head pose validation study, is available which used the same recording equipment as SHRP2 but is more easily accessible with fewer privacy constraints. In this work we report initial head pose accuracy using commercial and open source face pose estimation algorithms on the head pose validation data set.
3-D Characterization of Seismic Properties at the Smart Weapons Test Range, YPG
NASA Astrophysics Data System (ADS)
Miller, Richard D.; Anderson, Thomas S.; Davis, John C.; Steeples, Don W.; Moran, Mark L.
2001-10-01
The Smart Weapons Test Range (SWTR) lies within the Yuma Proving Ground (YPG), Arizona. SWTR is a new facility constructed specifically for the development and testing of futuristic intelligent battlefield sensor networks. In this paper, results are presented for an extensive high-resolution geophysical characterization study at the SWTR site along with validation using 3-D modeling. In this study, several shallow seismic methods and novel processing techniques were used to generate a 3-D grid of earth seismic properties, including compressional (P) and shear (S) body-wave speeds (Vp and Vs), and their associated body-wave attenuation parameters (Qp, and Qs). These experiments covered a volume of earth measuring 1500 m by 300 m by 25 m deep (11 million cubic meters), centered on the vehicle test track at the SWTR site. The study has resulted in detailed characterizations of key geophysical properties. To our knowledge, results of this kind have not been previously achieved, nor have the innovative methods developed for this effort been reported elsewhere. In addition to supporting materiel developers with important geophysical information at this test range, the data from this study will be used to validate sophisticated 3-D seismic signature models for moving vehicles.
Cheng, Zhongzhe; Zhou, Xing; Li, Wenyi; Hu, Bingying; Zhang, Yang; Xu, Yong; Zhang, Lin; Jiang, Hongliang
2016-11-30
Capilliposide B, a novel oleanane triterpenoid saponin isolated from Lysimachia capillipes Hemsl, showed significant anti-tumor activities in recent studies. To characterize the excretion of Capilliposide B, a reliable liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for simultaneous determination of Capilliposide B and its active metabolite, Capilliposide A, in rat urine and feces. Sample preparation using a solid-phase extraction procedure was optimized by acidification of samples to various degrees, providing extensive sample clean-up with a high extraction recovery. In addition, rat urinary samples were pretreated with CHAPS, an anti-adsorptive agent, to overcome nonspecific analyte adsorption during sample storage and processing. The method validation was conducted over the calibration range of 10.0-5000 ng/ml for both analytes. The intra- and inter-day precision and accuracy of the QC samples showed ≤11.0% RSD and -10.4% to 12.8% relative error. The method was successfully applied to an excretion study of Capilliposide B following intravenous administration. Copyright © 2016 Elsevier B.V. All rights reserved.
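The precision and accuracy figures cited above are conventionally computed from quality-control replicates of known nominal concentration; a short sketch with synthetic values follows.

# Intra-day precision (%RSD) and accuracy (% relative error) from QC replicates
# of known nominal concentration; the values below are synthetic (ng/ml).
import numpy as np

def precision_accuracy(measured, nominal):
    measured = np.asarray(measured, dtype=float)
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()      # precision
    rel_err = 100.0 * (measured.mean() - nominal) / nominal   # accuracy (bias)
    return rsd, rel_err

qc_low = [9.1, 10.4, 9.8, 10.9, 9.5]                          # nominal 10.0 ng/ml
rsd, rel_err = precision_accuracy(qc_low, nominal=10.0)
print(f"RSD = {rsd:.1f}%  RE = {rel_err:+.1f}%")              # typical acceptance: within 15%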
Discriminative Prediction of A-To-I RNA Editing Events from DNA Sequence
Sun, Jiangming; Singh, Pratibha; Bagge, Annika; Valtat, Bérengère; Vikman, Petter; Spégel, Peter; Mulder, Hindrik
2016-01-01
RNA editing is a post-transcriptional alteration of RNA sequences that, via insertions, deletions or base substitutions, can affect protein structure as well as RNA and protein expression. Recently, it has been suggested that RNA editing may be more frequent than previously thought. A great impediment, however, to a deeper understanding of this process is the paramount sequencing effort that needs to be undertaken to identify RNA editing events. Here, we describe an in silico approach, based on machine learning, that ameliorates this problem. Using 41 nucleotide long DNA sequences, we show that novel A-to-I RNA editing events can be predicted from known A-to-I RNA editing events intra- and interspecies. The validity of the proposed method was verified in an independent experimental dataset. Using our approach, 203 202 putative A-to-I RNA editing events were predicted in the whole human genome. Out of these, 9% were previously reported. The remaining sites require further validation, e.g., by targeted deep sequencing. In conclusion, the approach described here is a useful tool to identify potential A-to-I RNA editing events without the requirement of extensive RNA sequencing. PMID:27764195
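The general shape of such a sequence-based predictor can be sketched as follows: one-hot encode the 41-nucleotide window around each candidate adenosine and train a standard classifier. The toy sequences, labels and model below are illustrative assumptions, not the study's actual features or algorithm.

# One-hot encode 41-nt DNA windows and train a classifier to separate edited
# from non-edited sites; the data and the "rule" behind the labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot(seq):
    x = np.zeros((len(seq), 4))
    for i, b in enumerate(seq):
        x[i, BASES[b]] = 1.0
    return x.ravel()                               # 41 * 4 = 164 features

rng = np.random.default_rng(0)
seqs = ["".join(rng.choice(list("ACGT"), size=41)) for _ in range(2000)]
# toy rule: a G immediately 3' of the centre position makes editing likely
y = np.array([1 if s[21] == "G" and rng.random() < 0.8 else 0 for s in seqs])
X = np.array([one_hot(s) for s in seqs])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))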
Satija, Ambika; Rimm, Eric B.; Spiegelman, Donna; Sampson, Laura; Rosner, Bernard; Camargo, Carlos A.; Stampfer, Meir; Willett, Walter C.
2016-01-01
Objectives. To review the contribution of the Nurses’ Health Studies (NHSs) to diet assessment methods and evidence-based nutritional policies and guidelines. Methods. We performed a narrative review of the publications of the NHS and NHS II between 1976 and 2016. Results. Through periodic assessment of diet by validated dietary questionnaires over 40 years, the NHSs have identified dietary determinants of diseases such as breast and other cancers; obesity; type 2 diabetes; cardiovascular, respiratory, and eye diseases; and neurodegenerative and mental health disorders. Nutritional biomarkers were assessed using blood, urine, and toenail samples. Robust findings, from the NHSs, together with evidence from other large cohorts and randomized dietary intervention trials, have contributed to the evidence base for developing dietary guidelines and nutritional policies to reduce intakes of trans fat, saturated fat, sugar-sweetened beverages, red and processed meats, and refined carbohydrates while promoting higher intake of healthy fats and carbohydrates and overall healthful dietary patterns. Conclusions. The long-term, periodically collected dietary data in the NHSs, with documented reliability and validity, have contributed extensively to our understanding of the dietary determinants of various diseases, informing dietary guidelines and shaping nutritional policy. PMID:27459459
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2018-03-09
Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
Segregating photoelastic particles in free-surface granular flows
NASA Astrophysics Data System (ADS)
Thomas, Amalia; Vriend, Nathalie; Environmental; Industrial Fluid Dynamics Team
2017-11-01
We present results from a novel experimental set-up creating 2D avalanches of photoelastic discs. Two distinct hoppers supply either monodisperse or bidisperse particles at adjustable flow rates into a 2 meter long, narrow acrylic chute inclined at 20°. For 20-40 seconds the avalanche maintains a steady state that accelerates and thins downstream. The chute basal roughness is variable, allowing for different flow profiles. Using a set of polarizers and a high-speed camera, we visualize and quantify the forces due to dynamic interactions between the discs using photoelastic theory. Velocity and density profiles are derived from particle tracking at different distances from the discharge point and are coarse-grained to obtain continuous fields. With access to both force information and dynamical properties via particle tracking, we can experimentally validate existing μ(I) and non-local rheologies. As an extension, we probe the effect of granular segregation in bimodal mixtures by using the two separate inflow hoppers. We derive the state of segregation along the avalanche channel and measure the segregation velocities of each species. This provides insight into, and a unique validation of, the fundamental physical processes that drive segregation in avalanching geometries.
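For reference, the local μ(I) rheology that such measurements can be checked against takes the Jop-Forterre-Pouliquen form μ(I) = μs + (μ2 - μs)/(1 + I0/I), with inertial number I = γ̇ d / sqrt(P/ρ); the parameter values in the sketch below are generic literature-style numbers, not fitted to this experiment.

# Local mu(I) granular rheology sketch; parameter values are generic examples,
# not fits to the photoelastic-disc experiment described above.
import numpy as np

def inertial_number(shear_rate, particle_diameter, pressure, particle_density):
    return shear_rate * particle_diameter / np.sqrt(pressure / particle_density)

def mu_of_I(I, mu_s=0.38, mu_2=0.64, I0=0.279):
    return mu_s + (mu_2 - mu_s) / (1.0 + I0 / I)

gamma_dot = np.array([1.0, 5.0, 20.0])                    # shear rates, 1/s
I = inertial_number(gamma_dot, particle_diameter=0.01,    # 1 cm discs (assumed)
                    pressure=500.0, particle_density=1060.0)
print(np.round(I, 3), np.round(mu_of_I(I), 3))            # effective friction rises with I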
Forces directing germ-band extension in Drosophila embryos.
Kong, Deqing; Wolf, Fred; Großhans, Jörg
2017-04-01
Body axis elongation by convergent extension is a conserved developmental process found in all metazoans. Drosophila embryonic germ-band extension is an important morphogenetic process during embryogenesis, by which the length of the germ-band is more than doubled along the anterior-posterior axis. This lengthening is achieved by typical convergent extension, i.e. narrowing the lateral epidermis along the dorsal-ventral axis and simultaneous extension along the anterior-posterior axis. Germ-band extension is largely driven by cell intercalation, whose directionality is determined by the planar polarity of the tissue and ultimately by the anterior-posterior patterning system. In addition, extrinsic tensile forces originating from the invaginating endoderm induce cell shape changes, which transiently contribute to germ-band extension. Here, we review recent progress in understanding of the role of mechanical forces in germ-band extension. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Process. 905.37 Section 905.37 Energy DEPARTMENT OF ENERGY ENERGY PLANNING AND MANAGEMENT PROGRAM Power Marketing Initiative § 905.37 Process. Modified contractual language shall be required to place resource extensions under contract. Resource extensions and allocations...
Logistics: Implementation of Performance - Based Logistics for the Javelin Weapon System
2005-03-07
the context of each line within the Automated Cost Estimating-Integrated Tools (ACEIT) model, the Army's standard cost model, containing the EA was... fully validated the EA. The Javelin EA was validated through an extensive review of the EA cost documentation in the ACEIT file in coordination with... ACEIT file of the system cost estimate. This documentation was considered to be sufficient by the CEAC Director once the EA was determined to be valid
[Computerized system validation of clinical researches].
Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel
2015-11-01
Validation is a documented process that provides a high degree of assurance that a computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition, continues through application and maintenance until system retirement, and includes retention of the e-records based on regulatory rules. The objective is to specify clearly that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies according to the GCP standard, ensuring that the product meets its pre-determined attributes for specifications, quality, safety and traceability. This paper describes how to perform the validation process and determine the relevant stakeholders within an organization in the light of validation SOPs. Although specific accountabilities in the implementation of the validation process might be outsourced, the ultimate responsibility for CSV remains on the shoulders of the business process owner-sponsor. In order to show that compliance of the system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. The quality of the system validation should be controlled using both QC and QA means.
Control of Wheel/Rail Noise and Vibration
DOT National Transportation Integrated Search
1982-04-01
An analytical model of the generation of wheel/rail noise has been developed and validated through an extensive series of field tests carried out at the Transportation Test Center using the State of the Art Car. A sensitivity analysis has been perfor...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... currently valid OMB control number. No person shall be subject to any penalty for failing to comply with a... contains rules and regulations addressing the nation's Emergency Alert System (EAS). The EAS provides the...
NASA Astrophysics Data System (ADS)
Lund, M.; Zona, D.; Jackowicz-Korczynski, M.; Xu, X.
2017-12-01
The eddy covariance methodology is the primary tool for studying landscape-scale land-atmosphere exchange of greenhouse gases. Since the choice of instrumental setup and processing algorithms may influence the results, efforts within the international flux community have been made towards methodological harmonization and standardization. Performing eddy covariance measurements in high-latitude, Arctic tundra sites involves several challenges, related not only to remoteness and harsh climate conditions but also to the choice of processing algorithms. Partitioning of net ecosystem exchange (NEE) of CO2 into gross primary production (GPP) and ecosystem respiration (Reco) in the FLUXNET2015 dataset is made using either Nighttime or Daytime methods. These variables, GPP and Reco, are essential for calibration and validation of Earth system models. North of the Arctic Circle, sun remains visible at local midnight for a period of time, the number of days per year with midnight sun being dependent on latitude. The absence of nighttime conditions during Arctic summers renders the Nighttime method uncertain, however, no extensive assessment on the implications for flux partitioning has yet been made. In this study, we will assess the performance and validity of both partitioning methods along a latitudinal transect of northern sites included in the FLUXNET2015 dataset. We will evaluate the partitioned flux components against model simulations using the Community Land Model (CLM). Our results will be valuable for users interested in simulating Arctic and global carbon cycling.
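The Nighttime partitioning logic at issue can be sketched as follows: fit a temperature-response (Lloyd-Taylor) model to nighttime NEE, extrapolate Reco to all time steps, and take GPP as the residual. The data below are synthetic; under midnight-sun conditions the nighttime subset shrinks or vanishes, which is exactly the weakness discussed.

# Nighttime-method partitioning sketch: Reco fitted on nighttime NEE with a
# Lloyd-Taylor temperature response, then GPP = Reco - NEE (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def lloyd_taylor(T_c, R_ref, E0, T_ref=15.0, T0=-46.02):
    return R_ref * np.exp(E0 * (1.0 / (T_ref - T0) - 1.0 / (T_c - T0)))

rng = np.random.default_rng(1)
hours = np.arange(0, 24 * 30, 0.5)                           # a synthetic month
T_air = 10 + 8 * np.sin(2 * np.pi * hours / 24)              # deg C
sw_in = np.clip(600 * np.sin(2 * np.pi * (hours % 24 - 6) / 24), 0, None)  # W m-2
reco_true = lloyd_taylor(T_air, R_ref=2.0, E0=200.0)
gpp_true = 0.02 * sw_in
nee = reco_true - gpp_true + rng.normal(0, 0.3, hours.size)  # umol CO2 m-2 s-1

night = sw_in < 10.0                                         # nighttime filter
if not night.any():
    raise RuntimeError("no nighttime data: midnight sun breaks this method")
popt, _ = curve_fit(lloyd_taylor, T_air[night], nee[night], p0=(2.0, 100.0))
reco_hat = lloyd_taylor(T_air, *popt)                        # extrapolated respiration
gpp_hat = reco_hat - nee                                     # GPP as the residual
print("fitted R_ref, E0:", np.round(popt, 2))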
Baschung Pfister, Pierrette; Sterkele, Iris; Maurer, Britta; de Bie, Rob A.; Knols, Ruud H.
2018-01-01
Manual muscle testing (MMT) and hand-held dynamometry (HHD) are commonly used in people with inflammatory myopathy (IM), but their clinimetric properties have not yet been sufficiently studied. To evaluate the reliability and validity of MMT and HHD, maximum isometric strength was measured in eight muscle groups across three measurement events. To evaluate the reliability of HHD, intra-class correlation coefficients (ICC), the standard error of measurement (SEM) and smallest detectable changes (SDC) were calculated. To measure the reliability of MMT, linear Cohen's kappa was computed for single muscle groups and ICC for the total score. Additionally, correlations between MMT8 and HHD were evaluated with Spearman correlation coefficients. Fifty people with myositis (56±14 years, 76% female) were included in the study. Intra- and interrater reliability of HHD yielded excellent ICCs (0.75–0.97) for all muscle groups, except for interrater reliability of ankle extension (0.61). The corresponding SEMs% ranged from 8 to 28% and the SDCs% from 23 to 65%. The MMT8 total score revealed excellent intra- and interrater reliability (ICC>0.9). Intrarater reliability of single muscle groups was substantial for shoulder and hip abduction, elbow and neck flexion, and hip extension (0.64–0.69); moderate for wrist (0.53) and knee extension (0.49); and fair for ankle extension (0.35). Interrater reliability was moderate for neck flexion (0.54) and hip abduction (0.44); fair for shoulder abduction, elbow flexion, wrist and ankle extension (0.20–0.33); and slight for knee extension (0.08). Correlations between the two tests were low for wrist, knee, ankle, and hip extension; moderate for elbow flexion, neck flexion and hip abduction; and good for shoulder abduction. In conclusion, the MMT8 total score is a reliable assessment of general muscle weakness in people with myositis, but not of single muscle groups. In contrast, our results confirm that HHD can be recommended to evaluate the strength of single muscle groups. PMID:29596450
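As a small illustration of the agreement statistic used for the ordinal MMT grades, a linear weighted Cohen's kappa can be computed directly with scikit-learn; the two raters' grades below are invented.

# Linear weighted Cohen's kappa for ordinal MMT grades from two raters
# (synthetic ratings on a 0-5 scale, purely for illustration).
from sklearn.metrics import cohen_kappa_score

rater_a = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]
rater_b = [5, 4, 3, 3, 5, 3, 4, 4, 3, 4]
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"linear weighted kappa = {kappa:.2f}")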
An Extensible Information Grid for Risk Management
NASA Technical Reports Server (NTRS)
Maluf, David A.; Bell, David G.
2003-01-01
This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
Elliott, Paul; Peakman, Tim C
2008-04-01
UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS system fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants to the study demonstrates that the adopted methods and technologies are fit-for-purpose and robust.
Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de
2017-06-08
The aim was to present a guide with recommendations for the translation, adaptation, elaboration and validation of tests in Speech and Language Pathology. The recommendations were based on international guidelines focusing on the elaboration, translation, cross-cultural adaptation and validation of tests. They were grouped into two charts: one with procedures for translation and cross-cultural adaptation, and the other for obtaining evidence of validity, reliability and accuracy of the tests. The result is a guide with norms for organizing and systematizing the process of elaborating, translating, cross-culturally adapting and validating tests in Speech and Language Pathology.
Validation of gamma irradiator controls for quality and regulatory compliance
NASA Astrophysics Data System (ADS)
Harding, Rorry B.; Pinteric, Francis J. A.
1995-09-01
Since 1978 the U.S. Food and Drug Administration (FDA) has had both the legal authority and the Current Good Manufacturing Practice (CGMP) regulations in place to require irradiator owners who process medical devices to produce evidence of Irradiation Process Validation. One of the key components of Irradiation Process Validation is the validation of the irradiator controls. However, it is only recently that FDA audits have focused on this component of the process validation. What is Irradiator Control System Validation? What constitutes evidence of control? How do owners obtain evidence? What is the irradiator supplier's role in validation? How does the ISO 9000 Quality Standard relate to the FDA's CGMP requirement for evidence of Control System Validation? This paper presents answers to these questions based on the recent experiences of Nordion's engineering and product management staff who have worked with several US-based irradiator owners. This topic — Validation of Irradiator Controls — is a significant regulatory compliance and operations issue within the irradiator suppliers' and users' community.
Shpielberg, O; Akkermans, E
2016-06-17
A stability analysis is presented for boundary-driven and out-of-equilibrium systems in the framework of the hydrodynamic macroscopic fluctuation theory. A Hamiltonian description is proposed which allows us to thermodynamically interpret the additivity principle. A necessary and sufficient condition for the validity of the additivity principle is obtained as an extension of the Le Chatelier principle. These stability conditions result from a diagonal quadratic form obtained using the cumulant generating function. This approach allows us to provide a proof for the stability of the weakly asymmetric exclusion process and to reduce the search for stability to the solution of two coupled linear ordinary differential equations instead of nonlinear partial differential equations. Additional potential applications of these results are discussed in the realm of classical and quantum systems.
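For readers unfamiliar with the additivity principle, the display below gives its standard statement for a one-dimensional boundary-driven diffusive system with diffusivity D(rho) and mobility sigma(rho), following Bodineau and Derrida; the notation is generic and may differ in detail from the authors' formulation.

```latex
% Standard (Bodineau-Derrida) statement of the additivity principle for the
% large-deviation function of the time-averaged current j in a 1D diffusive
% system on x in [0,1], with densities rho_a, rho_b imposed at the boundaries.
\[
  \Phi(j) \;=\; \min_{\rho(x)}\;
  \int_{0}^{1}
  \frac{\bigl[\, j + D\bigl(\rho(x)\bigr)\,\rho'(x) \,\bigr]^{2}}
       {2\,\sigma\bigl(\rho(x)\bigr)}\;\mathrm{d}x ,
  \qquad \rho(0)=\rho_a,\;\; \rho(1)=\rho_b .
\]
% The optimal density profile is assumed time independent; the work above
% derives when this assumption, and hence the principle, remains valid.
```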
Aydin, Sevcan
2016-06-01
As a result of developments in molecular technologies and the use of sequencing technologies, analysis of anaerobic microbial communities in biological treatment processes has become increasingly prevalent. This review examines the ways in which microbial sequencing methods can be applied to achieve an extensive understanding of the phylogenetic and functional characteristics of microbial assemblages in anaerobic reactors when the substrate is contaminated by antibiotics, one of the most important classes of toxic compounds. It discusses some of the advantages and disadvantages of the more commonly employed microbial sequencing techniques and assesses how a combination of existing methods may be applied to develop a more comprehensive understanding of microbial communities and to improve the validity and depth of results for enhancing the stability of anaerobic reactors.
Chemical fractionation of siderophile elements in impactites from Australian meteorite craters
NASA Technical Reports Server (NTRS)
Attrep, A., Jr.; Orth, C. J.; Quintana, L. R.; Shoemaker, C. S.; Shoemaker, E. M.; Taylor, S. R.
1991-01-01
The abundance pattern of siderophile elements in terrestrial and lunar impact melt rocks has been used extensively to infer the nature of the impacting projectiles. An implicit assumption is that the siderophile abundance ratios of the projectiles are approximately preserved during mixing of the projectile constituents with the impact melts. Because this mixing occurs during flow of strongly shocked materials at high temperatures, however, there are grounds for suspecting that the underlying assumption is not always valid. In particular, fractionation of the melted and partly vaporized projectile material might be expected because of differences in volatility, solubility in silicate melts, and other characteristics of the constituent elements. Impactites from craters with associated meteorites offer special opportunities to test the assumptions on which projectile identifications are based and to study chemical fractionation that occurred during the impact process.
Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm.
Pickett, Stephen D; Green, Darren V S; Hunt, David L; Pardoe, David A; Hughes, Ian
2011-01-13
Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods.
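A minimal sketch of the kind of genetic algorithm described above is given below, assuming a two-substituent combinatorial library scored by a black-box assay. The handling of missing data (compounds whose "synthesis" fails simply return no score and rank last) mirrors the robustness property mentioned in the abstract, but the library, scoring function, and selection/crossover/mutation choices are generic illustrations, not the authors' implementation.

```python
# Generic GA sketch for algorithm-directed compound selection.
# The library, scoring function, and GA settings below are illustrative
# assumptions, not the published MMP-12 protocol.
import random

random.seed(0)

N_A, N_B = 20, 25                      # two monomer sets defining the virtual library
POP, GENERATIONS, MUT_RATE = 14, 10, 0.1

def assay(compound):
    """Stand-in for synthesis + screening of one (a, b) pair.
    Returns a potency score, or None when 'synthesis fails' (missing data)."""
    a, b = compound
    if random.random() < 0.1:          # ~10% synthesis failures
        return None
    return -((a - 5) ** 2 + (b - 17) ** 2) + random.gauss(0, 2)

def fitness(score):
    # Missing data is not fatal: unscored compounds simply rank last.
    return float("-inf") if score is None else score

def mutate(compound):
    a, b = compound
    if random.random() < MUT_RATE:
        a = random.randrange(N_A)
    if random.random() < MUT_RATE:
        b = random.randrange(N_B)
    return (a, b)

def crossover(p1, p2):
    return (p1[0], p2[1]) if random.random() < 0.5 else (p2[0], p1[1])

population = [(random.randrange(N_A), random.randrange(N_B)) for _ in range(POP)]
results = {}                            # cache: each compound is made at most once

for gen in range(GENERATIONS):
    for c in population:
        if c not in results:
            results[c] = assay(c)       # one "synthesis + test" per new compound
    ranked = sorted(set(population), key=lambda c: fitness(results[c]), reverse=True)
    parents = ranked[: max(2, POP // 3)]
    population = parents + [mutate(crossover(*random.sample(parents, 2)))
                            for _ in range(POP - len(parents))]

best = max(results, key=lambda c: fitness(results[c]))
print(f"{len(results)} compounds made; best {best} score {results[best]:.1f}")
```

In a closed-loop microfluidic setting, the assay call would be replaced by the automated synthesis-and-screening step, with the cache ensuring that only new compounds consume a cycle.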
Microstructure Modeling of 3rd Generation Disk Alloy
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2008-01-01
The objective of this initiative, funded by NASA's Aviation Safety Program, is to model, validate, and predict, with high fidelity, the microstructural evolution of third-generation high-refractory Ni-based disc superalloys during heat treatment and under service conditions. This initiative is a natural extension of the DARPA-AIM (Accelerated Insertion of Materials) initiative with GE/Pratt-Whitney and with other process simulation tools. Strong collaboration with the NASA Glenn Research Center (GRC) is a key component of this initiative, and the focus of this program is on industrially relevant disk alloys and heat treatment processes identified by GRC. Employing QuesTek's Computational Materials Dynamics technology and the PrecipiCalc precipitation simulator, physics-based models are being used to achieve high predictive accuracy and precision. Combining these models with experimental data and probabilistic analysis, "virtual alloy design" can be performed. The predicted microstructures can be optimized to promote desirable features and concurrently eliminate undesirable phases that can limit the reliability and durability of the alloys. The well-calibrated and well-integrated software tools being applied under the proposed program will help gas turbine disk alloy manufacturers, processing facilities, and NASA to efficiently and effectively improve the performance of current and future disk materials.
Crone, Damien L; Bode, Stefan; Murawski, Carsten; Laham, Simon M
2018-01-01
A major obstacle for the design of rigorous, reproducible studies in moral psychology is the lack of suitable stimulus sets. Here, we present the Socio-Moral Image Database (SMID), the largest standardized moral stimulus set assembled to date, containing 2,941 freely available photographic images, representing a wide range of morally (and affectively) positive, negative and neutral content. The SMID was validated with over 820,525 individual judgments from 2,716 participants, with normative ratings currently available for all images on affective valence and arousal, moral wrongness, and relevance to each of the five moral values posited by Moral Foundations Theory. We present a thorough analysis of the SMID regarding (1) inter-rater consensus, (2) rating precision, and (3) breadth and variability of moral content. Additionally, we provide recommendations for use aimed at efficient study design and reproducibility, and outline planned extensions to the database. We anticipate that the SMID will serve as a useful resource for psychological, neuroscientific and computational (e.g., natural language processing or computer vision) investigations of social, moral and affective processes. The SMID images, along with associated normative data and additional resources, are available at https://osf.io/2rqad/.
Unconventional Liquid Flow in Low-Permeability Media: Theory and Revisiting Darcy's Law
NASA Astrophysics Data System (ADS)
Liu, H. H.; Chen, J.
2017-12-01
About 80% of fracturing fluid remains in shale formations after hydraulic fracturing and the flowback process. It is critical to understand and accurately model the flow of fracturing fluids in a shale formation, because this flow has many practical implications for shale gas recovery. Owing to the strong solid-liquid interaction in low-permeability media, Darcy's law is not always adequate for describing the liquid flow process in a shale formation. This non-Darcy flow behavior (characterized by a nonlinear relationship between liquid flux and hydraulic gradient), however, has not been given enough attention in the shale gas community. The current study develops a systematic methodology to address this important issue. We developed a phenomenological model for liquid flow in shale (in which liquid flux is a power function of the pressure gradient), an extension of the conventional Darcy's law, and also a methodology to estimate parameters for the phenomenological model from spontaneous imbibition tests. The validity of our new developments is verified by satisfactory comparisons of theoretical results with observations from our and other research groups. The relative importance of this non-Darcy liquid flow for hydrocarbon production in unconventional reservoirs remains an issue that needs to be further investigated.
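As a sketch of what such a phenomenological extension can look like, the display below writes the liquid flux as a power function of the driving pressure gradient, reducing to Darcy's law for n = 1. This generic form is given for orientation only; the exact functional form and parameterization used by the authors may differ.

```latex
% Generic power-law extension of Darcy's law for liquid flow in
% low-permeability media (illustrative form; alpha and n would be fit
% from imbibition or flow experiments, and n = 1 recovers Darcy's law).
\[
  q \;=\; -\,\alpha \,\left| \nabla p \right|^{\,n-1} \nabla p ,
  \qquad n \ge 1 .
\]
% For n > 1 the flux depends nonlinearly on the driving gradient, so the
% apparent permeability is itself a function of the pressure gradient.
```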
Analysis of latency performance of bluetooth low energy (BLE) networks.
Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun
2014-12-23
Bluetooth Low Energy (BLE) is a short-range wireless communication technology aimed at low-cost and low-power communication. The device discovery performance of classical Bluetooth has been studied intensively using analytical modeling and simulation, but these techniques are not directly applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the use of three advertising channels. Several recent works have analyzed BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices considerable latitude to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model of the discovery probability and the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to examine quantitatively to what extent the parameters influence the performance of the discovery process.
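A minimal Monte-Carlo sketch of the discovery process analyzed above is given below: an advertiser broadcasts on the three advertising channels every advInterval (plus the pseudo-random advDelay required by the BLE specification), while a scanner listens on one channel at a time for scanWindow out of every scanInterval, and the discovery latency is the time until an advertising PDU lands inside the scan window on the channel currently being scanned. The timing constants and the simplified overlap test are assumptions for illustration, not the paper's analytical model.

```python
# Simplified Monte-Carlo model of BLE neighbor discovery latency.
# Parameter values and the overlap test are illustrative assumptions;
# the paper derives the corresponding quantities analytically.
import random

random.seed(1)

ADV_INTERVAL = 0.100      # s, fixed part of the advertising interval
ADV_DELAY_MAX = 0.010     # s, pseudo-random delay added to each advertising event
ADV_PDU = 0.000376        # s, approximate on-air time of one advertising PDU
SCAN_INTERVAL = 0.200     # s, period of the scanning cycle
SCAN_WINDOW = 0.050       # s, listening time per cycle (continuous if == interval)
CHANNELS = 3              # advertising channels 37, 38, 39

def discovery_latency(max_time=30.0):
    """Time until one advertising PDU falls inside the scan window on the
    channel the scanner is currently listening to."""
    t = random.uniform(0, ADV_INTERVAL)            # random initial phase
    scan_offset = random.uniform(0, SCAN_INTERVAL)
    while t < max_time:
        for ch in range(CHANNELS):                 # one PDU per channel per event
            pdu_start = t + ch * 0.0005            # small gap between channel PDUs
            # Which channel is the scanner on, and is it inside its window?
            cycle_pos = (pdu_start - scan_offset) % SCAN_INTERVAL
            scan_channel = int((pdu_start - scan_offset) // SCAN_INTERVAL) % CHANNELS
            if scan_channel == ch and cycle_pos + ADV_PDU <= SCAN_WINDOW:
                return pdu_start
        t += ADV_INTERVAL + random.uniform(0, ADV_DELAY_MAX)
    return max_time                                # not discovered within max_time

latencies = [discovery_latency() for _ in range(5000)]
print(f"mean latency   : {sum(latencies) / len(latencies):.3f} s")
print(f"95th percentile: {sorted(latencies)[int(0.95 * len(latencies))]:.3f} s")
```

Sweeping ADV_INTERVAL, SCAN_WINDOW and SCAN_INTERVAL in such a simulation is one way to reproduce the kind of parameter-sensitivity analysis the paper reports analytically.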
Modeling Molecular and Cellular Aspects of Human Disease using the Nematode Caenorhabditis elegans
Silverman, Gary A.; Luke, Cliff J.; Bhatia, Sangeeta R.; Long, Olivia S.; Vetica, Anne C.; Perlmutter, David H.; Pak, Stephen C.
2009-01-01
As an experimental system, Caenorhabditis elegans offers a unique opportunity to interrogate in vivo the genetic and molecular functions of human disease-related genes. For example, C. elegans has provided crucial insights into fundamental biological processes such as cell death and cell fate determination, as well as pathological processes such as neurodegeneration and microbial susceptibility. The C. elegans model has several distinct advantages, including a completely sequenced genome that shares extensive homology with that of mammals, ease of cultivation and storage, a relatively short lifespan and techniques for generating null and transgenic animals. However, the ability to conduct unbiased forward and reverse genetic screens in C. elegans remains one of the most powerful experimental paradigms for discovering the biochemical pathways underlying human disease phenotypes. The identification of these pathways leads to a better understanding of the molecular interactions that perturb cellular physiology, and forms the foundation for designing mechanism-based therapies. To this end, the ability to process large numbers of isogenic animals through automated work stations suggests that C. elegans, manifesting different aspects of human disease phenotypes, will become the platform of choice for in vivo drug discovery and target validation using high-throughput/content screening technologies. PMID:18852689
Xu, Hua; AbdelRahman, Samir; Lu, Yanxin; Denny, Joshua C.; Doan, Son
2011-01-01
Semantic-based sublanguage grammars have been shown to be an efficient method for medical language processing. However, given the complexity of the medical domain, parsers using such grammars inevitably encounter ambiguous sentences, which could be interpreted by different groups of production rules and consequently result in two or more parse trees. One possible solution, which has not been extensively explored previously, is to augment productions in medical sublanguage grammars with probabilities to resolve the ambiguity. In this study, we associated probabilities with production rules in a semantic-based grammar for medication findings and evaluated its performance on reducing parsing ambiguity. Using the existing data set from 2009 i2b2 NLP (Natural Language Processing) challenge for medication extraction, we developed a semantic-based CFG (Context Free Grammar) for parsing medication sentences and manually created a Treebank of 4,564 medication sentences from discharge summaries. Using the Treebank, we derived a semantic-based PCFG (probabilistic Context Free Grammar) for parsing medication sentences. Our evaluation using a 10-fold cross validation showed that the PCFG parser dramatically improved parsing performance when compared to the CFG parser. PMID:21856440
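The toy example below illustrates the general idea of attaching probabilities to a small semantic grammar and letting a Viterbi parser pick the most likely tree for a medication sentence. It uses NLTK rather than the authors' parser, and the nonterminals and probabilities are invented for illustration, not taken from the paper's grammar or Treebank.

```python
# Toy PCFG illustrating probability-weighted semantic productions;
# nonterminals and probabilities are invented, not the authors' grammar.
import nltk

grammar = nltk.PCFG.fromstring("""
    MED_STMT -> DRUG DOSE FREQ        [0.7]
    MED_STMT -> DRUG DOSE             [0.3]
    DRUG     -> 'lisinopril'          [0.5]
    DRUG     -> 'metformin'           [0.5]
    DOSE     -> NUM UNIT              [1.0]
    FREQ     -> 'daily'               [0.6]
    FREQ     -> 'twice' 'daily'       [0.4]
    NUM      -> '10'                  [0.5]
    NUM      -> '500'                 [0.5]
    UNIT     -> 'mg'                  [1.0]
""")

parser = nltk.ViterbiParser(grammar)
sentence = "lisinopril 10 mg daily".split()

# ViterbiParser returns the highest-probability parse under the PCFG,
# resolving ambiguity that a plain CFG would leave unresolved.
for tree in parser.parse(sentence):
    tree.pretty_print()
    print("parse probability:", tree.prob())
```

In the study itself, the production probabilities were estimated from the manually built Treebank of 4,564 medication sentences rather than assigned by hand as above.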
Show me the data: advances in multi-model benchmarking, assimilation, and forecasting
NASA Astrophysics Data System (ADS)
Dietze, M.; Raiho, A.; Fer, I.; Cowdery, E.; Kooper, R.; Kelly, R.; Shiklomanov, A. N.; Desai, A. R.; Simkins, J.; Gardella, A.; Serbin, S.
2016-12-01
Researchers want their data to inform carbon cycle predictions, but there are considerable bottlenecks between data collection and the use of data to calibrate and validate earth system models and inform predictions. This talk highlights recent advancements in the PEcAn project aimed at making it easier for individual researchers to confront models with their own data: (1) the development of an easily extensible site-scale benchmarking system aimed at ensuring that models capture process rather than just reproducing pattern; (2) efficient emulator-based Bayesian parameter data assimilation to constrain model parameters; (3) a novel, generalized approach to ensemble data assimilation to estimate carbon pools and fluxes and quantify process error; (4) automated processing and downscaling of CMIP climate scenarios to support forecasts that include driver uncertainty; (5) a large expansion in the number of models supported, with new tools for conducting multi-model and multi-site analyses; and (6) a network-based architecture that allows analyses to be shared with model developers and other collaborators. Application of these methods is illustrated with data across a wide range of time scales, from eddy-covariance to forest inventories to tree rings to paleoecological pollen proxies.
An object-based approach to weather analysis and its applications
NASA Astrophysics Data System (ADS)
Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew
2013-04-01
The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying convective events and following their development over the course of their lifetime. Prerequisites of the object-based analysis are a highly resolved observational database and a tracking algorithm. A near real-time, radar- and satellite-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for the validation of meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors that characterize the temporal development of the precipitation systems which constitute the objects. Such proxies of the precipitation process include, for example, the temporal change of the bright band, vertically extensive columns of enhanced differential reflectivity (ZDR), and the cloud-top temperature and height, identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro)physical differences among rain events and relate to the precipitation yield. Analyses of the informative content of ZDR columns as precursors of storm evolution, for example, will be presented to demonstrate the use of such system-oriented predictors for nowcasting. Columns of differential reflectivity ZDR measured by polarimetric weather radars are prominent signatures associated with thunderstorm updrafts. Since greater vertical velocities can loft larger drops and water-coated ice particles to higher altitudes above the environmental freezing level, the integrated ZDR column above the freezing level increases with increasing updraft intensity. Validation of atmospheric models with respect to precipitation representation or prediction is usually confined to comparisons of precipitation fields or their temporal and spatial statistics. A comparison of rain rates alone, however, does not immediately explain discrepancies between models and observations, because similar rain rates might be produced by different processes. Within the event-based approach to model validation, both observed and modeled rain events are analyzed by means of proxies of the precipitation process, and the two sets of descriptors form the basis for model validation, since different leading descriptors, in a statistical sense, hint at process formulations potentially responsible for model failures.
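To make the ZDR-column proxy concrete, the sketch below integrates differential reflectivity above the environmental freezing level for each column of a gridded (z, y, x) ZDR field, which is one simple way to turn the polarimetric signature described above into a scalar updraft-intensity descriptor. The threshold, grid spacing, and array layout are assumptions for illustration, not the OASE group's exact definition.

```python
# Illustrative ZDR-column descriptor: integrate differential reflectivity
# above the freezing level. Grid layout, spacing and the 1-dB threshold
# are assumptions, not the OASE group's operational definition.
import numpy as np

def zdr_column_depth(zdr, heights, freezing_level, threshold_db=1.0):
    """Per-column integrated ZDR (dB * m) above the freezing level.

    zdr            : array (nz, ny, nx) of differential reflectivity in dB
    heights        : array (nz,) of grid-level heights in m
    freezing_level : height of the 0 degC level in m
    """
    dz = np.gradient(heights)                        # layer thickness per level (m)
    above = heights[:, None, None] > freezing_level  # mask: levels above 0 degC
    enhanced = np.where((zdr > threshold_db) & above, zdr, 0.0)
    return np.sum(enhanced * dz[:, None, None], axis=0)   # (ny, nx) field

# Tiny synthetic example: a single column of enhanced ZDR between 4 and 7 km.
heights = np.arange(0, 12000, 500.0)
zdr = np.zeros((heights.size, 5, 5))
zdr[8:14, 2, 2] = 2.5                                # 2.5 dB column at grid point (2, 2)
print(zdr_column_depth(zdr, heights, freezing_level=4000.0)[2, 2], "dB*m")
```

Tracking this scalar over an object's lifetime gives a time series that can serve as a nowcasting predictor or as a process-level descriptor for model validation.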
Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M
2015-03-01
It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
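A schematic version of the three steps described above is sketched below, under the assumption that both samples are available as feature matrices: a "membership" classifier (distinguishing development from validation records) is one way to quantify case-mix differences, discrimination is then checked in the validation sample, and a simple logistic recalibration of slope and intercept illustrates the adjustment step. The modeling choices, data, and thresholds are illustrative, not the authors' exact procedures.

```python
# Schematic sketch of the three-step external-validation framework.
# Modeling choices (logistic membership model, c-statistic, logistic
# recalibration) are illustrative; see the paper for the full procedures.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Toy development and validation samples (X: predictors, y: outcome).
X_dev = rng.normal(0.0, 1.0, size=(500, 3))
y_dev = (X_dev @ np.array([1.0, 0.5, -0.8]) + rng.normal(0, 1, 500)) > 0
X_val = rng.normal(0.4, 1.2, size=(300, 3))        # shifted case mix
y_val = (X_val @ np.array([1.0, 0.5, -0.8]) + rng.normal(0, 1, 300)) > 0

# Step 1: relatedness of case mix via a membership model. A c-statistic near
# 0.5 suggests a reproducibility setting; values near 1 suggest transportability.
X_all = np.vstack([X_dev, X_val])
member = np.r_[np.zeros(len(X_dev)), np.ones(len(X_val))]
member_proba = LogisticRegression().fit(X_all, member).predict_proba(X_all)[:, 1]
c_membership = roc_auc_score(member, member_proba)

# Step 2: performance of the development model in the validation sample.
model = LogisticRegression().fit(X_dev, y_dev)
p_val = model.predict_proba(X_val)[:, 1]
c_validation = roc_auc_score(y_val, p_val)

# Step 3 (if needed): recalibrate slope and intercept to the validation setting.
lp = np.log(p_val / (1 - p_val))                   # linear predictor (logit)
recal = LogisticRegression().fit(lp.reshape(-1, 1), y_val)

print(f"membership c-statistic : {c_membership:.2f}")
print(f"validation c-statistic : {c_validation:.2f}")
print(f"calibration slope/int. : {recal.coef_[0][0]:.2f} / {recal.intercept_[0]:.2f}")
```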
Campus Energy Model for Control and Performance Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-09-19
The core of the modeling platform is an extensible block library for the MATLAB/Simulink software suite. The platform enables true co-simulation (interaction at each simulation time step) with NREL's state-of-the-art modeling tools and other energy modeling software.
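The notion of co-simulation mentioned above, two models advancing one shared time step at a time and exchanging outputs before the next step, can be illustrated language-agnostically with the toy loop below. It does not use the MATLAB/Simulink block library itself, and the building and controller models, coefficients, and time step are invented for illustration.

```python
# Toy illustration of co-simulation: two models advance one shared time
# step at a time and exchange their outputs before the next step. The
# building/controller models here are invented and unrelated to the NREL library.
DT = 60.0                      # shared time step (s)

class BuildingModel:
    def __init__(self):
        self.temp_c = 24.0
    def step(self, cooling_kw):
        # crude thermal balance: internal gains add heat, cooling removes it
        self.temp_c += DT * (0.0005 * 10.0 - 0.0004 * cooling_kw)
        return self.temp_c

class ChillerController:
    def step(self, zone_temp_c, setpoint_c=22.0):
        # simple proportional controller producing a cooling demand (kW)
        return max(0.0, 5.0 * (zone_temp_c - setpoint_c))

building, controller = BuildingModel(), ChillerController()
cooling = 0.0
for _ in range(60):                        # one simulated hour
    temp = building.step(cooling)          # model A advances one step
    cooling = controller.step(temp)        # model B reacts within the same step
print(f"zone temperature after 1 h: {temp:.2f} degC, cooling: {cooling:.2f} kW")
```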