Tan, Edwin T.; Martin, Sarah R.; Fortier, Michelle A.; Kain, Zeev N.
2012-01-01
Objective To develop and validate a behavioral coding measure, the Children's Behavior Coding System-PACU (CBCS-P), for children's distress and nondistress behaviors while in the postanesthesia recovery unit. Methods A multidisciplinary team examined videotapes of children in the PACU and developed a coding scheme that subsequently underwent a refinement process (CBCS-P). To examine the reliability and validity of the coding system, 121 children and their parents were videotaped during their stay in the PACU. Participants were healthy children undergoing elective, outpatient surgery and general anesthesia. The CBCS-P was utilized and objective data from medical charts (analgesic consumption and pain scores) were extracted to establish validity. Results Kappa values indicated good-to-excellent (κ's > .65) interrater reliability of the individual codes. The CBCS-P had good criterion validity when compared to children's analgesic consumption and pain scores. Conclusions The CBCS-P is a reliable, observational coding method that captures children's distress and nondistress postoperative behaviors. These findings highlight the importance of considering context in both the development and application of observational coding schemes. PMID:22167123
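For readers unfamiliar with the statistic reported above, the following is a minimal Python sketch of Cohen's kappa, the chance-corrected agreement index behind interrater reliability values such as the κ's > .65 cited here. The distress/nondistress codes below are hypothetical, not data from the study.

# Illustrative sketch (hypothetical codes, not study data): Cohen's kappa
# for two coders' categorical ratings of the same observation segments.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement for two equal-length lists of codes."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

a = ["distress", "calm", "calm", "distress", "calm", "calm"]
b = ["distress", "calm", "distress", "distress", "calm", "calm"]
print(round(cohens_kappa(a, b), 2))  # prints 0.67, i.e. "good" agreement

Values above roughly .60 are conventionally read as good agreement, which is the sense in which the CBCS-P codes are described as good-to-excellent.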
Preliminary Assessment of Turbomachinery Codes
NASA Technical Reports Server (NTRS)
Mazumder, Quamrul H.
2007-01-01
This report assesses CFD codes developed and currently used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following sections, along with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; the codes have since been further developed to extend their capabilities.
Seng, Elizabeth K; Lovejoy, Travis I
2013-12-01
This study psychometrically evaluates the Motivational Interviewing Treatment Integrity Code (MITI) to assess fidelity to motivational interviewing to reduce sexual risk behaviors in people living with HIV/AIDS. Seventy-four sessions from a pilot randomized controlled trial of motivational interviewing to reduce sexual risk behaviors in people living with HIV were coded with the MITI. Participants reported sexual behavior at baseline, 3 months, and 6 months. Regarding reliability, excellent inter-rater reliability was achieved for measures of behavior frequency across the 12 sessions coded by both coders; global scales demonstrated poor intraclass correlations but adequate percent agreement. Regarding validity, principal components analyses indicated that a two-factor model accounted for an adequate amount of variance in the data. These factors were associated with decreases in sexual risk behaviors after treatment. The MITI is a reliable and valid measure of treatment fidelity for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification, and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the needs of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC, or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for validation efforts in industry, DOE programs, and academia.
Validating the BISON fuel performance code to integral LWR experiments
Williamson, R. L.; Gamble, K. A.; Perez, D. M.; ...
2016-03-24
BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties, and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. The initial rod diameter comparisons were unsatisfactory and have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.
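To make the ±10% acceptance band above concrete, here is a minimal sketch of the prediction-versus-measurement check; the rod names and temperatures are hypothetical, not BISON predictions or experimental values.

# Hypothetical validation-metric sketch: flag predictions falling outside
# the +/-10% band used for fuel centerline temperature comparisons.
predicted = {"rod_A": 1180.0, "rod_B": 1495.0, "rod_C": 905.0}   # K, illustrative
measured  = {"rod_A": 1100.0, "rod_B": 1430.0, "rod_C": 1010.0}  # K, illustrative

for rod, t_meas in measured.items():
    rel_dev = (predicted[rod] - t_meas) / t_meas
    status = "within" if abs(rel_dev) <= 0.10 else "outside"
    print(f"{rod}: {rel_dev:+.1%} ({status} +/-10% band)")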
Verification and Validation of the BISON Fuel Performance Code for PCMI Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, Kyle Allan Lawrence; Novascone, Stephen Rhead; Gardner, Russell James
2016-06-01
BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described. Validation for application to light water reactor (LWR) PCMI problems is assessed by comparing predicted and measured rod diameter following base irradiation and power ramps. Results indicate a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. Initial rod diameter comparisons have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.
The Facial Expression Coding System (FACES): Development, Validation, and Utility
ERIC Educational Resources Information Center
Kring, Ann M.; Sloan, Denise M.
2007-01-01
This article presents information on the development and validation of the Facial Expression Coding System (FACES; A. M. Kring & D. Sloan, 1991). Grounded in a dimensional model of emotion, FACES provides information on the valence (positive, negative) of facial expressive behavior. In 5 studies, reliability and validity data from 13 diverse…
Application of the DART Code for the Assessment of Advanced Fuel Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Totev, T.
2007-07-01
The Dispersion Analysis Research Tool (DART) code is a dispersion fuel analysis code that contains mechanistically-based fuel and reaction-product swelling models, a one-dimensional heat transfer analysis, and mechanical deformation models. DART has been used to simulate the irradiation behavior of uranium oxide, uranium silicide, and uranium molybdenum aluminum dispersion fuels, as well as their monolithic counterparts. The thermal-mechanical DART code has been validated against RERTR tests performed in the ATR for irradiation data on interaction thickness, fuel, matrix, and reaction product volume fractions, and plate thickness changes. The DART fission gas behavior model has been validated against UO2 fission gas release data as well as measured fission gas-bubble size distributions. Here DART is utilized to analyze various aspects of the observed bubble growth in U-Mo/Al interaction product.
Psychometric Properties of the System for Coding Couples’ Interactions in Therapy - Alcohol
Owens, Mandy D.; McCrady, Barbara S.; Borders, Adrienne Z.; Brovko, Julie M.; Pearson, Matthew R.
2014-01-01
Few systems are available for coding in-session behaviors for couples in therapy. Alcohol Behavior Couples Therapy (ABCT) is an empirically supported treatment, but little is known about its mechanisms of behavior change. In the current study, an adapted version of the Motivational Interviewing for Significant Others coding system was developed into the System for Coding Couples' Interactions in Therapy - Alcohol (SCCIT-A), which was used to code couples' interactions and behaviors during ABCT. Results showed good inter-rater reliability of the SCCIT-A and provided evidence that the SCCIT-A may be a promising measure for understanding couples in therapy. A three-factor model of the SCCIT-A (Positive, Negative, and Change Talk/Counter-Change Talk) was examined using a confirmatory factor analysis, but model fit was poor. Due to poor model fit, ratios were computed for Positive/Negative ratings and for Change Talk/Counter-Change Talk codes based on previous research in the couples and Motivational Interviewing literature. Post-hoc analyses examined correlations between specific SCCIT-A codes and baseline characteristics and indicated some concurrent validity. Correlations were run between ratios and baseline characteristics; ratios may be an alternative to using the factors from the SCCIT-A. Reliability and validity analyses suggest that the SCCIT-A has the potential to be a useful measure for coding in-session behaviors of both partners in couples therapy and could be used to identify mechanisms of behavior change for ABCT. Additional research is needed to improve the reliability of some codes and to further develop the SCCIT-A and other measures of couples' interactions in therapy. PMID:25528049
Dynamic Forces in Spur Gears - Measurement, Prediction, and Code Validation
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Townsend, Dennis P.; Rebbechi, Brian; Lin, Hsiang Hsi
1996-01-01
Measured and computed values for dynamic loads in spur gears were compared to validate a new version of the NASA gear dynamics code DANST-PC. Strain gage data from six gear sets with different tooth profiles were processed to determine the dynamic forces acting between the gear teeth. Results demonstrate that the analysis code successfully simulates the dynamic behavior of the gears. Differences between analysis and experiment were less than 10 percent under most conditions.
NASA Astrophysics Data System (ADS)
Class, G.; Meyder, R.; Stratmanns, E.
1985-12-01
The large database for validation and development of computer codes for two-phase flow, generated at the COSIMA facility, is reviewed. The aim of COSIMA is to simulate the hydraulic, thermal, and mechanical conditions in the subchannel and the cladding of fuel rods in pressurized water reactors during the blowout phase of a loss-of-coolant accident. In terms of fuel rod behavior, it is found that during blowout under realistic conditions only small strains are reached. For cladding rupture, extremely high rod internal pressures are necessary. The behavior of fuel rod simulators and the effect of thermocouples attached to the cladding outer surface are clarified. Calculations performed with the codes RELAP and DRUFAN show satisfactory agreement with experiments; this can be improved by updating the phase separation models in the codes.
Experimental Validation of an Ion Beam Optics Code with a Visualized Ion Thruster
NASA Astrophysics Data System (ADS)
Nakayama, Yoshinori; Nakano, Masakatsu
For validation of an ion beam optics code, the behavior of ion beam optics was experimentally observed and evaluated with a two-dimensional visualized ion thruster (VIT). Since the observed beam focus positions, sheath positions, and measured ion beam currents were in good agreement with the numerical results, it was confirmed that the numerical model of this code was appropriate. In addition, it was also confirmed that the beam focus position moved along the center axis of the grid hole according to the applied grid potentials, which differs from the conventional understanding/assumption. VIT operations may be useful not only for the validation of ion beam optics codes but also for a fundamental and intuitive understanding of the Child law sheath theory.
Pesch, Megan H; Lumeng, Julie C
2017-12-15
Behavioral coding of videotaped eating and feeding interactions can provide researchers with rich observational data and unique insights into eating behaviors, food intake, and food selection, as well as the interpersonal and mealtime dynamics of children and their families. Unlike self-report measures of eating and feeding practices, the coding of videotaped eating and feeding behaviors allows for quantitative and qualitative examination of behaviors and practices that participants may not self-report. While this methodology is increasingly common, behavioral coding protocols and methodology are not widely shared in the literature. This has important implications for the validity and reliability of coding schemes across settings. Additional guidance on how to design, implement, code, and analyze videotaped eating and feeding behaviors could contribute to advancing the science of behavioral nutrition. The objectives of this narrative review are to review methodology for the design, operationalization, and coding of videotaped behavioral eating and feeding data in children and their families, and to highlight best practices. When capturing eating and feeding behaviors through analysis of videotapes, it is important for the study and coding to be hypothesis driven. Study design considerations include how best to capture the target behaviors through selection of a controlled experimental laboratory environment versus home mealtime, duration of video recording, and number of observations needed to achieve reliability across eating episodes, as well as technical issues in video recording and sound quality. Study design must also take into account plans for coding the target behaviors, which may include behavior frequency, duration, categorization, or qualitative descriptors. Coding scheme creation and refinement occur through an iterative process. Reliability between coders can be challenging to achieve but is paramount to the scientific rigor of the methodology. The analysis approach depends on how the data were coded and collapsed. Behavioral coding of videotaped eating and feeding behaviors can capture rich data "in vivo" that are otherwise unobtainable from self-report measures. While data collection and coding are time-intensive, the data yielded can be extremely valuable. Additional sharing of methodology and coding schemes around eating and feeding behaviors could advance the science and field.
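To illustrate the frequency and duration coding the review describes, here is a minimal sketch that collapses timestamped onset/offset records from a video-coding pass into per-code summaries; the behavior codes and timestamps are hypothetical.

# Minimal sketch (hypothetical codes and timestamps): turning timestamped
# onset/offset records into the frequency and duration summaries described
# above.
from collections import defaultdict

events = [  # (onset_s, offset_s, behavior_code)
    (12.0, 15.5, "bite"),
    (20.1, 24.0, "parent_prompt"),
    (31.4, 33.2, "bite"),
    (40.0, 55.0, "food_refusal"),
]

freq = defaultdict(int)
duration = defaultdict(float)
for onset, offset, code in events:
    freq[code] += 1
    duration[code] += offset - onset

for code in sorted(freq):
    print(f"{code}: n={freq[code]}, total={duration[code]:.1f}s")

The same per-code summaries can then feed reliability checks between coders or the collapsed analyses the review discusses.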
Validation Data and Model Development for Fuel Assembly Response to Seismic Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardet, Philippe; Ricciardi, Guillaume
2016-01-31
Vibrations are inherently present in nuclear reactors, especially in cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and wear and tear in the reactor and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics, multidisciplinary validation campaign was conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.
Dishion, Thomas J; Mun, Chung Jung; Tein, Jenn-Yun; Kim, Hanjoe; Shaw, Daniel S; Gardner, Frances; Wilson, Melvin N; Peterson, Jenene
2017-04-01
This study examined the validity of micro social observations and macro ratings of parent-child interaction in early to middle childhood. Seven hundred and thirty-one families representing multiple ethnic groups were recruited and screened as at risk in the context of Women, Infant, and Children (WIC) Nutritional Supplement service settings. Families were randomly assigned to the Family Checkup (FCU) intervention or the control condition at age 2 and videotaped in structured interactions in the home at ages 2, 3, 4, and 5. Parent-child interaction videotapes were micro-coded using the Relationship Affect Coding System (RACS) that captures the duration of two mutual dyadic states: positive engagement and coercion. Macro ratings of parenting skills were collected after coding the videotapes to assess parent use of positive behavior support and limit setting skills (or lack thereof). Confirmatory factor analyses revealed that the measurement model of macro ratings of limit setting and positive behavior support was not supported by the data, and thus, were excluded from further analyses. However, there was moderate stability in the families' micro social dynamics across early childhood and it showed significant improvements as a function of random assignment to the FCU. Moreover, parent-child dynamics were predictive of chronic behavior problems as rated by parents in middle childhood, but not emotional problems. We conclude with a discussion of the validity of the RACS and on methodological advantages of micro social coding over the statistical limitations of macro rating observations. Future directions are discussed for observation research in prevention science.
Dunn, Madeleine J; Rodriguez, Erin M; Miller, Kimberly S; Gerhardt, Cynthia A; Vannatta, Kathryn; Saylor, Megan; Scheule, C Melanie; Compas, Bruce E
2011-06-01
To examine the acceptability and feasibility of coding observed verbal and nonverbal behavioral and emotional components of mother-child communication among families of children with cancer. Mother-child dyads (N=33, children ages 5-17 years) were asked to engage in a videotaped 15-min conversation about the child's cancer. Coding was done using the Iowa Family Interaction Rating Scale (IFIRS). Acceptability and feasibility of direct observation in this population were partially supported: 58% consented and 81% of those (47% of all eligible dyads) completed the task; trained raters achieved 78% agreement in ratings across codes. The construct validity of the IFIRS was demonstrated by expected associations within and between positive and negative behavioral/emotional code ratings and between mothers' and children's corresponding code ratings. Direct observation of mother-child communication about childhood cancer has the potential to be an acceptable and feasible method of assessing verbal and nonverbal behavior and emotion in this population.
Measuring homework completion in behavioral activation.
Busch, Andrew M; Uebelacker, Lisa A; Kalibatseva, Zornitsa; Miller, Ivan W
2010-07-01
The aim of this study was to develop and validate an observer-based coding system for the characterization and completion of homework assignments during Behavioral Activation (BA). Existing measures of homework completion are generally unsophisticated, and there is no current measure of homework completion designed to capture the particularities of BA. The tested scale sought to capture the type of assignment, realm of functioning targeted, extent of completion, and assignment difficulty. Homework assignments were drawn from 12 clients (mean age = 48 years, 83% female) in two trials of a 10-session BA manual targeting treatment-resistant depression in primary care. The two coders demonstrated acceptable or better reliability on most codes, and unreliable codes were dropped from the proposed scale. In addition, correlations between homework completion and outcome were strong, providing some support for construct validity. Ultimately, this line of research aims to develop a user-friendly, reliable measure of BA homework completion that can be completed by a therapist during session.
Hamilton, Clayon B; Wong, Ming-Kin; Gignac, Monique A M; Davis, Aileen M; Chesworth, Bert M
2017-01-01
To identify validated measures that capture illness perception and behavior and have been used to assess people who have knee pain/osteoarthritis. A scoping review was performed. Nine electronic databases were searched for records from inception through April 19, 2015. Search terms included illness perception, illness behavior, knee, pain, osteoarthritis, and their related terms. This review included English language publications of primary data on people with knee pain/osteoarthritis who were assessed with validated measures capturing any of 4 components of illness perception and behavior: monitor body, define and interpret symptoms, take remedial action, and utilize sources of help. Seventy-one publications included relevant measures. Two reviewers independently coded and analyzed each relevant measure within the 4 components. Sixteen measures were identified that capture components of illness perception and behavior in the target population. These measures were originally developed to capture constructs that include coping strategies/skills/styles, illness belief, illness perception, self-efficacy, and pain behavior. Coding results indicated that 5, 11, 12, and 5 of these measures included the monitor body, define and interpret symptoms, take remedial action, and utilize sources of help components, respectively. Several validated measures were interpreted as capturing some components, and only 1 measure was interpreted as capturing all of the components of illness perception and behavior in the target population. A measure that comprehensively captures illness perception and behavior could be valuable for informing and evaluating therapy for patients along a continuum of symptomatic knee osteoarthritis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, Di; Mo, Kun; Ye, Bei
2015-09-30
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL). Two major accomplishments in FY15 are summarized in this report: (1) implementation of the FASTGRASS module in the BISON code; and (2) a Xe implantation experiment for large-grained UO2. Both the BISON and MARMOT codes have been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. To contribute to the development of the Moose-Bison-Marmot (MBM) code suite, we have implemented the FASTGRASS fission gas model as a module in the BISON code. Based on rate theory formulations, the coupled FASTGRASS module in BISON is capable of modeling LWR oxide fuel fission gas behavior and fission gas release. In addition, we conducted a Xe implantation experiment at the Argonne Tandem Linac Accelerator System (ATLAS) in order to produce the needed UO2 samples with the desired bubble morphology. With these samples, further experiments to study fission gas diffusivity are planned to provide validation data for the fission gas release model in the MARMOT code.
MAVRIC Flutter Model Transonic Limit Cycle Oscillation Test
NASA Technical Reports Server (NTRS)
Edwards, John W.; Schuster, David M.; Spain, Charles V.; Keller, Donald F.; Moses, Robert W.
2001-01-01
The Models for Aeroelastic Validation Research Involving Computation semi-span wind-tunnel model (MAVRIC-I), a business jet wing-fuselage flutter model, was tested in NASA Langley's Transonic Dynamics Tunnel with the goal of obtaining experimental data suitable for Computational Aeroelasticity code validation at transonic separation onset conditions. This research model is notable for its inexpensive construction and instrumentation installation procedures. Unsteady pressures and wing responses were obtained for three wingtip configurations of clean, tipstore, and winglet. Traditional flutter boundaries were measured over the range of M = 0.6 to 0.9 and maps of Limit Cycle Oscillation (LCO) behavior were made in the range of M = 0.85 to 0.95. Effects of dynamic pressure and angle-of-attack were measured. Testing in both R134a heavy gas and air provided unique data on Reynolds number, transition effects, and the effect of speed of sound on LCO behavior. The data set provides excellent code validation test cases for the important class of flow conditions involving shock-induced transonic flow separation onset at low wing angles, including LCO behavior.
Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.; Wilson, J.H.; Arwood, P.C.
The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.
Reliability and Validity of the Dyadic Observed Communication Scale (DOCS).
Hadley, Wendy; Stewart, Angela; Hunter, Heather L; Affleck, Katelyn; Donenberg, Geri; Diclemente, Ralph; Brown, Larry K
2013-02-01
We evaluated the reliability and validity of the Dyadic Observed Communication Scale (DOCS) coding scheme, which was developed to capture a range of communication components between parents and adolescents. Adolescents and their caregivers were recruited from mental health facilities for participation in a large, multi-site family-based HIV prevention intervention study. Seventy-one dyads were randomly selected from the larger study sample and coded using the DOCS at baseline. Preliminary validity and reliability of the DOCS was examined using various methods, such as comparing results to self-report measures and examining interrater reliability. Results suggest that the DOCS is a reliable and valid measure of observed communication among parent-adolescent dyads that captures both verbal and nonverbal communication behaviors that are typical intervention targets. The DOCS is a viable coding scheme for use by researchers and clinicians examining parent-adolescent communication. Coders can be trained to reliably capture individual and dyadic components of communication for parents and adolescents and this complex information can be obtained relatively quickly.
Evidence-Based Reading and Writing Assessment for Dyslexia in Adolescents and Young Adults
Nielsen, Kathleen; Abbott, Robert; Griffin, Whitney; Lott, Joe; Raskind, Wendy; Berninger, Virginia W.
2016-01-01
The same working memory and reading and writing achievement phenotypes (behavioral markers of genetic variants) validated in prior research with younger children and older adults in a multi-generational family genetics study of dyslexia were used to study 81 adolescents and young adults (ages 16 to 25) from that study. Dyslexia is characterized by word reading and spelling skills below the population mean despite the ability to use oral language to express thinking. These working memory predictor measures were given and used to predict reading and writing achievement: Coding (storing and processing) of heard and spoken words (phonological coding), read and written words (orthographic coding), base words and affixes (morphological coding), and accumulating words over time (syntax coding); Cross-Code Integration (phonological loop for linking phonological name and orthographic letter codes and orthographic loop for linking orthographic letter codes and finger sequencing codes); and Supervisory Attention (focused and switching attention and self-monitoring during written word finding). Multiple regressions showed that most predictors explained individual differences in at least one reading or writing outcome, but which predictors explained unique variance beyond shared variance depended on the outcome. ANOVAs confirmed that research-supported criteria for dyslexia validated for younger children and their parents could be used to diagnose which adolescents and young adults did (n=31) or did not (n=50) meet research criteria for dyslexia. Findings are discussed in reference to the heterogeneity of phenotypes (behavioral markers of genetic variables) and their application to assessment for accommodations and ongoing instruction for adolescents and young adults with dyslexia. PMID:26855554
Lyons-Ruth, Karlen; Bureau, Jean-François; Riley, Caitlin D; Atlas-Corbett, Alisha F
2009-01-01
Socially indiscriminate attachment behavior has been repeatedly observed among institutionally reared children. Socially indiscriminate behavior has also been associated with aggression and hyperactivity. However, available data rely heavily on caregiver report of indiscriminate behavior. In addition, few studies have been conducted with samples of home-reared infants exposed to inadequate care. The current study aimed to develop a reliable laboratory measure of socially indiscriminate forms of attachment behavior based on direct observation and to validate the measure against assessments of early care and later behavior problems among home-reared infants. Strange Situation episodes of 75 socially at-risk mother-infant dyads were coded for infant indiscriminate attachment behavior on the newly developed Rating for Infant-Stranger Engagement. After controlling for infant insecure-organized and disorganized behavior in all analyses, extent of infant-stranger engagement at 18 months was significantly related to serious caregiving risk (maltreatment or maternal psychiatric hospitalization), observed quality of disrupted maternal affective communication, and aggressive and hyperactive behavior problems at age 5. Results are discussed in relation to the convergent and discriminant validity of the new measure and to the potential utility of a standardized observational measure of indiscriminate attachment behavior. Further validation is needed in relation to caregiver report measures of indiscriminate behavior.
Virues-Ortega, Javier; Montaño-Fidalgo, Montserrat; Froján-Parga, María Xesús; Calero-Elvira, Ana
2011-12-01
This study analyzes the interobserver agreement and hypothesis-based known-group validity of the Therapist's Verbal Behavior Category System (SISC-INTER). The SISC-INTER is a behavioral observation protocol comprised of a set of verbal categories representing putative behavioral functions of the in-session verbal behavior of a therapist (e.g., discriminative, reinforcing, punishing, and motivational operations). The complete therapeutic process of a clinical case of an individual with marital problems was recorded (10 sessions, 8 hours), and data were arranged in a temporal sequence using 10-min periods. Hypotheses based on the expected performance of the putative behavioral functions portrayed by the SISC-INTER codes across prevalent clinical activities (i.e., assessing, explaining, Socratic method, providing clinical guidance) were tested using autoregressive integrated moving average (ARIMA) models. Known-group validity analyses provided support to all hypotheses. The SISC-INTER may be a useful tool to describe therapist-client interaction in operant terms. The utility of reliable and valid protocols for the descriptive analysis of clinical practice in terms of verbal behavior is discussed.
Performance and Architecture Lab Modeling Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-06-19
Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
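The record does not show Palm's actual annotation syntax, so the following is only a generic illustration of the underlying idea, a performance model expressed as an executable program whose parts mirror application structure. All function names and cost formulas are hypothetical; this is not Palm's language or output.

# Generic illustration (not Palm's annotation language): an analytical
# performance model as an executable program whose parts mirror the
# application's structure. Names and cost formulas are hypothetical.
def model_inner_loop(n, t_flop=1e-9):
    # compute-bound sub-model: n operations at an assumed per-op cost
    return n * t_flop

def model_halo_exchange(n, bw=5e9, latency=2e-6):
    # communication sub-model: latency plus 8-byte words over assumed bandwidth
    return latency + 8 * n / bw

def model_timestep(n):
    # hierarchy: the timestep model composes its children, analogous to a
    # generated model composing annotated blocks of the application
    return model_inner_loop(n) + model_halo_exchange(n)

print(f"predicted timestep: {model_timestep(10_000_000):.4f} s")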
Kaveh, Mohammad Hossein; Moradi, Leila; Hesampour, Maryam; Hasan Zadeh, Jafar
2015-01-01
Introduction Recognizing the determinants of behavior plays a major role in identification and application of effective strategies for encouraging individuals to follow the intended pattern of behavior. The present study aimed to analyze the university students’ behaviors regarding the amenability to dress code, using the theory of reasoned action (TRA). Methods In this cross sectional study, 472 students were selected through multi-stage random sampling. The data were collected using a researcher-made questionnaire whose validity was confirmed by specialists. Besides, its reliability was confirmed by conducting a pilot study revealing Cronbach’s alpha coefficients of 0.93 for attitude, 0.83 for subjective norms, 0.94 for behavioral intention and 0.77 for behavior. The data were entered into the SPSS statistical software and analyzed using descriptive and inferential statistics (Mann-Whitney, correlation and regression analysis). Results Based on the students’ self-reports, conformity of clothes to the university’s dress code was below the expected level in 28.87% of the female students and 28.55% of the male ones. The mean scores of attitude, subjective norms, and behavioral intention to comply with dress code policy were 28.78±10.08, 28.51±8.25 and 11.12±3.84, respectively. The students of different colleges were different from each other concerning TRA constructs. Yet, subjective norms played a more critical role in explaining the variance of dress code behavior among the students. Conclusion Theory of reasoned action explained the students’ dress code behaviors relatively well. The study results suggest paying attention to appropriate approaches in educational, cultural activities, including promotion of student-teacher communication. PMID:26269790
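As a sketch of the kind of analysis reported above, the following fits a linear regression of behavioral intention on attitude and subjective norms over synthetic data scaled roughly like the reported means; the data and coefficients are illustrative, not the study's.

# Hedged sketch with synthetic data: regression of behavioral intention on
# TRA constructs (attitude, subjective norms). All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
attitude = rng.normal(29, 10, n)    # scaled like the reported attitude scores
norms = rng.normal(28.5, 8, n)      # scaled like the reported subjective norms
intention = 0.1 * attitude + 0.2 * norms + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), attitude, norms])  # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
print("intercept, b_attitude, b_norms =", np.round(beta, 3))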
Development of PRIME for irradiation performance analysis of U-Mo/Al dispersion fuel
NASA Astrophysics Data System (ADS)
Jeong, Gwan Yoon; Kim, Yeon Soo; Jeong, Yong Jin; Park, Jong Man; Sohn, Dong-Seong
2018-04-01
A prediction code for the thermo-mechanical performance of research reactor fuel (PRIME) has been developed with the implementation of developed models to analyze the irradiation behavior of U-Mo dispersion fuel. The code is capable of predicting the two-dimensional thermal and mechanical performance of U-Mo dispersion fuel during irradiation. A finite element method was employed to solve the governing equations for thermal and mechanical equilibria. Temperature- and burnup-dependent material properties of the fuel meat constituents and cladding were used. The numerical solution schemes in PRIME were verified by benchmarking solutions obtained using a commercial finite element analysis program (ABAQUS). The code was validated using irradiation data from the RERTR, HAMP-1, and E-FUTURE tests. The measured irradiation data used in the validation were interaction layer (IL) thickness and volume fractions of fuel meat constituents for the thermal analysis, and profiles of plate thickness changes and fuel meat swelling for the mechanical analysis. The prediction results were in good agreement with the measurement data for both thermal and mechanical analyses, confirming the validity of the code.
Modeling and Analysis of Actinide Diffusion Behavior in Irradiated Metal Fuel
NASA Astrophysics Data System (ADS)
Edelmann, Paul G.
There have been numerous attempts to model fast reactor fuel behavior in the last 40 years. The US currently does not have a fully reliable tool to simulate the behavior of metal fuels in fast reactors. The experimental database necessary to validate the codes is also very limited. The DOE-sponsored Advanced Fuels Campaign (AFC) has performed various experiments that are ready for analysis. Current metal fuel performance codes are either not available to the AFC or have limitations and deficiencies in predicting AFC fuel performance. A modified version of a new fuel performance code, FEAST-Metal, was employed in this investigation with useful results. This work explores the modeling and analysis of AFC metallic fuels using FEAST-Metal, particularly in the area of constituent actinide diffusion behavior. The FEAST-Metal code calculations for this work were conducted at Los Alamos National Laboratory (LANL) in support of on-going activities related to sensitivity analysis of fuel performance codes. A sensitivity analysis of FEAST-Metal was completed to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. A modification was made to the FEAST-Metal constituent redistribution model to enable accommodation of newer AFC metal fuel compositions with verified results. Applicability of this modified model for sodium fast reactor metal fuel design is demonstrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dartevelle, Sebastian
2007-10-01
Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurement: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds—namely, momentum-driven supersonic jet and buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomena.
ERIC Educational Resources Information Center
Grzadzinski, Rebecca; Carr, Themba; Colombi, Costanza; McGuire, Kelly; Dufek, Sarah; Pickles, Andrew; Lord, Catherine
2016-01-01
Psychometric properties and initial validity of the Brief Observation of Social Communication Change (BOSCC), a measure of treatment-response for social-communication behaviors, are described. The BOSCC coding scheme is applied to 177 video observations of 56 young children with ASD and minimal language abilities. The BOSCC has high to excellent…
Siminoff, Laura A.; Step, Mary M.
2011-01-01
Many observational coding schemes have been offered to measure communication in health care settings. These schemes fall short of capturing multiple functions of communication among providers, patients, and other participants. After a brief review of observational communication coding, the authors present a comprehensive scheme for coding communication that is (a) grounded in communication theory, (b) accounts for instrumental and relational communication, and (c) captures important contextual features with tailored coding templates: the Siminoff Communication Content & Affect Program (SCCAP). To test SCCAP reliability and validity, the authors coded data from two communication studies. The SCCAP provided reliable measurement of communication variables including tailored content areas and observer ratings of speaker immediacy, affiliation, confirmation, and disconfirmation behaviors. PMID:21213170
Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael
This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
Patton, Susana R; Dolan, Lawrence M; Smith, Laura B; Brown, Morton B; Powers, Scott W
2013-12-01
This study examined mealtime behaviors in families of young children with type 1 diabetes (T1DM) on intensive insulin therapy. Behaviors were compared to published data for children on conventional therapy and examined for correlations with glycemic control. Thirty-nine families participated and had at least three home meals videotaped while children wore a continuous glucose monitor. Videotaped meals were coded for parent, child, and child eating behaviors using a valid coding system. A group difference was found only for child requests for food. There were also associations between children's glycemic control and the child behavior codes of play and away. However, no associations were found between parent and child behaviors within meals and children's corresponding post-prandial glycemic control. Results reinforce existing research indicating that mealtime behavior problems exist for families of young children even in the context of intensive therapy and that some child behaviors may relate to glycemic control.
Ladd, Benjamin O.; McCrady, Barbara S.
2016-01-01
The current study aimed to examine whether classification of couples in which one partner has an alcohol problem is similar to that reported in the general couples literature. Typologies of couples seeking Alcohol Behavioral Couple Therapy (ABCT) were developed via hierarchical cluster analysis using behavioral codes of couple interactions during their first ABCT session. Four couple types based on in-session behavior were established reliably, labeled Avoider, Validator, Hostile, and Ambivalent-Detached. These couple types resembled the couple types found in previous research. Couple type was associated with baseline relationship satisfaction, but not alcohol use. Results suggest heterogeneity in couples with alcohol problems presenting to treatment; further study is needed to investigate the function of alcohol within these different types. PMID:25808432
Optical observables in stars with non-stationary atmospheres. [fireballs and cepheid models
NASA Technical Reports Server (NTRS)
Hillendahl, R. W.
1980-01-01
Experience gained by use of Cepheid modeling codes to predict the dimensional and photometric behavior of nuclear fireballs is used as a means of validating various computational techniques used in the Cepheid codes. Predicted results from Cepheid models are compared with observations of the continuum and lines in an effort to demonstrate that the atmospheric phenomena in Cepheids are quite complex but that they can be quantitatively modeled.
Behavioral analysis of children's response to induction of anesthesia.
Chorney, Jill MacLaren; Kain, Zeev N
2009-11-01
It is documented that children experience distress at anesthesia induction, but little is known about the prevalence of specific behaviors exhibited by children. Digital audiovisual recordings of 293 children undergoing outpatient elective surgery were coded using Observer XT software and the validated Revised Perioperative Child-Adult Medical Procedure Interaction Scale. Multiple pass second-by-second data recording was used to capture children's behaviors across phases of anesthesia induction. More than 40% of children aged 2-10 yr displayed some distress behavior during induction with 17% of these children displaying significant distress and more than 30% of children resisting anesthesiologists during induction. Children's distress and nondistress behaviors displayed four profiles over the course of anesthesia induction: Acute Distress, Anticipatory Distress, Early Regulating Behaviors, and Engagement with Procedure. Older children had higher scores on early regulating and engagement profiles whereas younger children had higher scores on Acute Distress. There were no differences across age in children's Anticipatory Distress. Construct validity of behavior profiles was supported via correlations of profile score (overall and on the walk to the operating room) with a validated assessment of children's anxiety at induction. Children undergoing anesthesia display a range of distress and nondistress behaviors. A group of behaviors was identified that, when displayed on the walk to the operating room, is associated with less distress at anesthesia induction. These data provide the first examination of potentially regulating behaviors of children, but more detailed sequential analysis is required to validate specific functions of these behaviors.
A Clustering-Based Approach to Enriching Code Foraging Environment.
Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu
2016-09-01
Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation to best shape the code base to developers. This paper contributes a unified code navigation theory in light of optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developers' behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.
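One simple way to cluster software entities by lexical similarity, in the spirit of the clustering the study examines, is k-means over TF-IDF vectors of file contents; the file names and token contents below are hypothetical.

# Illustrative sketch (hypothetical files): grouping source files by lexical
# similarity with k-means over TF-IDF vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

files = {
    "parser.c":  "token lexer parse ast node grammar",
    "lexer.c":   "token lexer scan char buffer",
    "render.c":  "pixel frame buffer draw texture",
    "texture.c": "texture pixel mipmap draw",
}

X = TfidfVectorizer().fit_transform(files.values())
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for name, label in zip(files, labels):
    print(f"cluster {label}: {name}")

Files landing in the same cluster form the kind of enriched "patch" that, per the study's foraging framing, lets a developer find related code with less navigation effort.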
Experimental Validation of a Closed Brayton Cycle System Transient Simulation
NASA Technical Reports Server (NTRS)
Johnson, Paul K.; Hervol, David S.
2006-01-01
The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.
Reactivity Insertion Accident (RIA) Capability Status in the BISON Fuel Performance Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williamson, Richard L.; Folsom, Charles Pearson; Pastore, Giovanni
2016-05-01
One of the Challenge Problems being considered within CASL relates to modelling and simulation of Light Water Reactor (LWR) fuel under Reactivity Insertion Accident (RIA) conditions. BISON is the fuel performance code used within CASL for LWR fuel under both normal operating and accident conditions, and thus must be capable of addressing the RIA challenge problem. This report outlines required BISON capabilities for RIAs and describes the current status of the code. Information on recent accident capability enhancements, application of BISON to a RIA benchmark exercise, and plans for validation against RIA behavior are included.
Two-Fluid Extensions to the M3D CDX-U Validation Study
NASA Astrophysics Data System (ADS)
Breslau, J.; Strauss, H.; Sugiyama, L.
2005-10-01
As part of a cross-code verification and validation effort, both the M3D code [1] and the NIMROD code [2] have qualitatively reproduced the nonlinear behavior of a complete sawtooth cycle in the CDX-U tokamak, chosen for the study because its low temperature and small size put it in a parameter regime easily accessible to both codes. Initial M3D studies on this problem used a resistive MHD model with a large, empirical perpendicular heat transport value and with modest toroidal resolution (24 toroidal planes). The success of this study prompted the pursuit of more quantitatively accurate predictions by the application of more sophisticated physical models and higher numerical resolution. The results of two consequent follow-up studies are presented here. In the first, the toroidal resolution of the original run is doubled to 48 planes. The behavior of the sawtooth in this case is essentially the same as in the lower-resolution study. The sawtooth study has also been repeated using a two-fluid plasma model, with the effects of the ω̂*i term emphasized. The resulting mode rotation, as well as the effects on the reconnection rate (sawtooth crash time), sawtooth period, and overall stability are presented. [1] W. Park, et al., Phys. Plasmas 6, 1796 (1999). [2] C. Sovinec, et al., J. Comp. Phys. 195, 355 (2004).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miao, Yinbin; Mo, Kun; Jamison, Laura M.
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize the experimental efforts in FY16, including the following important experiments: (1) in-situ grain growth measurement of nano-grained UO2; (2) investigation of surface morphology in micro-grained UO2; (3) nano-indentation experiments on nano- and micro-grained UO2. The highlight of this year is that we successfully demonstrated our capability to measure grain size development in situ while maintaining the stoichiometry of nano-grained UO2 materials; the experiment is, for the first time, using synchrotron X-ray diffraction to measure the grain growth behavior of UO2 in situ.
A Comprehensive High Performance Predictive Tool for Fusion Liquid Metal Hydromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Peter; Chhabra, Rupanshi; Munipalli, Ramakanth
In the Phase I SBIR project, HyPerComp and Texcel initiated the development of two induction-based MHD codes as a predictive tool for fusion hydro-magnetics. The newly-developed codes overcome the deficiency of other MHD codes based on the quasi-static approximation by defining a more general mathematical model that utilizes the induced magnetic field rather than the electric potential as the main electromagnetic variable. The UCLA code is a finite-difference staggered-mesh code that serves as a supplementary tool to the massively-parallel finite-volume code developed by HyPerComp. As there is no suitable experimental data under blanket-relevant conditions for code validation, code-to-code comparisons and comparisons against analytical solutions were successfully performed for three selected test cases: (1) lid-driven MHD flow, (2) flow in a rectangular duct in a transverse magnetic field, and (3) unsteady finite magnetic Reynolds number flow in a rectangular enclosure. The performed tests suggest that the developed codes are accurate and robust. Further work will focus on enhancing the code capabilities towards higher flow parameters and faster computations. At the conclusion of the current Phase-II project, we have completed the preliminary validation efforts in performing unsteady mixed-convection MHD flows (against the limited data currently available in the literature), and demonstrated flow behavior in large 3D channels including important geometrical features. Code enhancements such as periodic boundary conditions and unmatched mesh structures are also ready. As proposed, we have built upon these strengths and explored a much increased range of Grashof numbers and Hartmann numbers under various flow conditions, ranging from flows in a rectangular duct to prototypic blanket modules and liquid metal PFC. Parametric studies, numerical and physical model improvements to expand the scope of simulations, code demonstration, and continued validation activities have also been completed.
Validation of fast-ion D-alpha spectrum measurements during EAST neutral-beam heated plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, J., E-mail: juan.huang@ipp.ac.cn; Wu, C. R.; Hou, Y. M.
2016-11-15
To investigate the fast ion behavior, a fast ion D-alpha (FIDA) diagnostic system has been installed on EAST. Fast ion features can be inferred from the Doppler-shifted spectrum of Balmer-alpha light from energetic hydrogenic atoms. This paper will focus on the validation of FIDA measurements performed using MHD-quiescent discharges in the 2015 campaign. Two codes have been applied to calculate the Dα spectrum: one is a Monte Carlo code, the Fortran 90 version of FIDASIM, and the other is an analytical code, Simulation of Spectra (SOS). The predicted SOS fast-ion spectrum agrees well with the measurement; however, the level of the fast-ion part predicted by FIDASIM is lower. The discrepancy is possibly due to the difference between the FIDASIM and SOS velocity distribution functions. The details will be presented in the paper to primarily address comparisons of predicted and observed spectrum shapes/amplitudes.
Ladd, Benjamin O; McCrady, Barbara S
2016-01-01
This study aimed to examine whether classification of couples in which one partner has an alcohol problem is similar to that reported in the general couples literature. Typologies of couples seeking alcohol behavioral couple therapy (ABCT) were developed via hierarchical cluster analysis using behavioral codes of couple interactions during their first ABCT session. Four couple types based on in-session behavior were established reliably, labeled avoider, validator, hostile, and ambivalent-detached. These couple types resembled the couple types found in previous research. Couple type was associated with baseline relationship satisfaction, but not alcohol use. Results suggest heterogeneity in couples with alcohol problems presenting to treatment; further study is needed to investigate the function of alcohol within these different types. © 2015 American Association for Marriage and Family Therapy.
STAR-CCM+ Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David
2016-09-30
The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.
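The closure-model V&V workflow described above ultimately reduces to quantitative comparisons between code predictions and measured data. As a minimal illustration (not a CASL procedure), the sketch below computes a relative L2 discrepancy between a predicted and a measured profile; the void-fraction numbers are synthetic placeholders:

    # Sketch of a basic quantitative validation metric of the kind a V&V
    # assessment report documents: relative L2 difference between a predicted
    # profile and measured data. Profiles are synthetic, not CASL results.
    import numpy as np

    def relative_l2_error(predicted, measured):
        predicted, measured = np.asarray(predicted), np.asarray(measured)
        return np.linalg.norm(predicted - measured) / np.linalg.norm(measured)

    void_fraction_pred = [0.02, 0.10, 0.21, 0.33, 0.41]  # synthetic
    void_fraction_meas = [0.03, 0.09, 0.22, 0.35, 0.40]  # synthetic
    err = relative_l2_error(void_fraction_pred, void_fraction_meas)
    print(f"relative L2 error: {err:.1%}")  # acceptance threshold set by the plan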
Validity of an Observation Method for Assessing Pain Behavior in Individuals With Multiple Sclerosis
Cook, Karon F.; Roddey, Toni S.; Bamer, Alyssa M.; Amtmann, Dagmar; Keefe, Francis J
2012-01-01
Context Pain is a common and complex experience for individuals who live with multiple sclerosis (MS) that interferes with physical, psychological and social function. A valid and reliable tool for quantifying observed pain behaviors in MS is critical to understanding how pain behaviors contribute to pain-related disability in this clinical population. Objectives To evaluate the reliability and validity of a pain behavioral observation protocol in individuals who have MS. Methods Community-dwelling volunteers with multiple sclerosis (N=30), back pain (N=5), or arthritis (N=8) were recruited based on clinician referrals, advertisements, fliers, web postings, and participation in previous research. Participants completed measures of pain severity, pain interference, and self-reported pain behaviors and were videotaped doing typical activities (e.g., walking, sitting). Two coders independently recorded frequencies of pain behaviors by category (e.g., guarding, bracing) and inter-rater reliability statistics were calculated. Naïve observers reviewed videotapes of individuals with MS and rated their pain. Spearman correlations were calculated between pain behavior frequencies and self-reported pain and pain ratings by naïve observers. Results Inter-rater reliability estimates indicated the reliability of pain codes in the MS sample. Kappa coefficients ranged from moderate agreement (sighing = 0.40) to substantial agreement (guarding = 0.83). These values were comparable to those obtained in the combined back pain and arthritis sample. Concurrent validity was supported by correlations with self-reported pain (0.46-0.53) and with self-reports of pain behaviors (0.58). Construct validity was supported by the finding of a 0.87 correlation between total pain behaviors observed by coders and mean pain ratings by naïve observers. Conclusion Results support use of the pain behavior observation protocol for assessing pain behaviors of individuals with MS. Valid assessments of pain behaviors of individuals with MS could lead to creative interventions in the management of chronic pain in this population. PMID:23159684
A Taxonomy for Mannerisms of Blind Children.
ERIC Educational Resources Information Center
Eichel, Valerie J.
1979-01-01
The investigation involving 24 blind children (2-11 years old) set out to develop and validate a coding procedure which employed a set of 34 descriptors with their corresponding definitions. The use of the taxonomy enabled a detailed, systematic study of manneristic behavior in blind children. (Author/SBH)
Assessment of MARMOT. A Mesoscale Fuel Performance Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonks, M. R.; Schwen, D.; Zhang, Y.
2015-04-01
MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplishes this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction comes directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation needs to be conducted. However, new mesoscale data needs to be obtained in order to complete this validation.
Grzadzinski, Rebecca; Carr, Themba; Colombi, Costanza; McGuire, Kelly; Dufek, Sarah; Pickles, Andrew; Lord, Catherine
2016-07-01
Psychometric properties and initial validity of the Brief Observation of Social Communication Change (BOSCC), a measure of treatment-response for social-communication behaviors, are described. The BOSCC coding scheme is applied to 177 video observations of 56 young children with ASD and minimal language abilities. The BOSCC has high to excellent inter-rater and test-retest reliability and shows convergent validity with measures of language and communication skills. The BOSCC Core total demonstrates statistically significant amounts of change over time compared to a no change alternative while the ADOS CSS over the same period of time did not. This work is a first step in the development of a novel outcome measure for social-communication behaviors with applications to clinical trials and longitudinal studies.
The kinetics of aerosol particle formation and removal in NPP severe accidents
NASA Astrophysics Data System (ADS)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.
2016-06-01
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
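As a rough illustration of the aerosol kinetics such a unit must capture (not the KIN/KUPOL-M model itself), the sketch below integrates a monodisperse number-density balance with coagulation and first-order removal; the coagulation kernel and removal rate are assumed values:

    # Minimal sketch of aerosol number-density kinetics under coagulation and
    # removal (deposition). The monodisperse kernel K and removal rate LAM
    # are illustrative values, not parameters of the KIN/KUPOL-M code.
    K = 1.0e-9    # coagulation kernel, cm^3/s (assumed)
    LAM = 1.0e-4  # first-order deposition/removal rate, 1/s (assumed)

    def dNdt(N):
        """dN/dt = -K*N^2 (coagulation) - LAM*N (removal to surfaces)."""
        return -K * N**2 - LAM * N

    # Explicit Euler integration of the number density over 1 hour.
    N, dt = 1.0e7, 0.1   # initial concentration (1/cm^3) and time step (s)
    for step in range(36000):
        N += dt * dNdt(N)
    print(f"number density after 1 h: {N:.3e} cm^-3")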
Validation of CFD Codes for Parawing Geometries in Subsonic to Supersonic Flows
NASA Technical Reports Server (NTRS)
Cruz-Ayoroa, Juan G.; Garcia, Joseph A.; Melton, John E.
2014-01-01
Computational Fluid Dynamics studies of a rigid parawing at Mach numbers from 0.8 to 4.65 were carried out using three established codes: an inviscid solver, a viscous solver, and an independent panel-method code. Pressure distributions along four chordwise sections of the wing were compared to experimental wind tunnel data gathered from NASA technical reports. Results show good prediction of the overall trends and magnitudes of the pressure distributions for the inviscid and viscous solvers. Pressure results for the panel-method code diverge from test data at large angles of attack due to shock interaction phenomena. Trends in the flow behavior and their effect on the integrated forces and moments on this type of wing are examined in detail using the inviscid CFD code results.
New higher-order Godunov code for modelling performance of two-stage light gas guns
NASA Technical Reports Server (NTRS)
Bogdanoff, D. W.; Miller, R. J.
1995-01-01
A new quasi-one-dimensional Godunov code for modeling two-stage light gas guns is described. The code is third-order accurate in space and second-order accurate in time. A very accurate Riemann solver is used. Friction and heat transfer to the tube wall for gases and dense media are modeled and a simple nonequilibrium turbulence model is used for gas flows. The code also models gunpowder burn in the first-stage breech. Realistic equations of state (EOS) are used for all media. The code was validated against exact solutions of Riemann's shock-tube problem, impact of dense media slabs at velocities up to 20 km/sec, flow through a supersonic convergent-divergent nozzle and burning of gunpowder in a closed bomb. Excellent validation results were obtained. The code was then used to predict the performance of two light gas guns (1.5 in. and 0.28 in.) in service at the Ames Research Center. The code predictions were compared with measured pressure histories in the powder chamber and pump tube and with measured piston and projectile velocities. Very good agreement between computational fluid dynamics (CFD) predictions and measurements was obtained. Actual powder-burn rates in the gun were found to be considerably higher (60-90 percent) than predicted by the manufacturer and the behavior of the piston upon yielding appears to differ greatly from that suggested by low-strain rate tests.
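For illustration of the finite-volume Godunov update such a code is built on, the sketch below advances Sod's shock-tube problem for the 1D Euler equations; unlike the gun code's very accurate Riemann solver, it substitutes the simple Rusanov (local Lax-Friedrichs) flux and is first-order only:

    # First-order finite-volume (Godunov-type) sketch for the 1D Euler
    # equations on Sod's shock tube, using the Rusanov flux for brevity.
    import numpy as np

    g = 1.4  # ratio of specific heats

    def flux(U):
        rho, mom, E = U
        u = mom / rho
        p = (g - 1.0) * (E - 0.5 * rho * u**2)
        return np.array([mom, mom * u + p, (E + p) * u])

    def rusanov(UL, UR):
        def speed(U):  # max wave speed estimate |u| + c
            rho, mom, E = U
            u = mom / rho
            p = (g - 1.0) * (E - 0.5 * rho * u**2)
            return abs(u) + np.sqrt(g * p / rho)
        s = max(speed(UL), speed(UR))
        return 0.5 * (flux(UL) + flux(UR)) - 0.5 * s * (UR - UL)

    # Sod initial data: (rho, u, p) = (1, 0, 1) left, (0.125, 0, 0.1) right.
    nx, dx, dt = 400, 1.0 / 400, 0.0002
    U = np.zeros((3, nx))
    U[0] = np.where(np.arange(nx) < nx // 2, 1.0, 0.125)         # density
    U[2] = np.where(np.arange(nx) < nx // 2, 1.0, 0.1) / (g - 1)  # energy (u = 0)

    for _ in range(1000):  # advance to t = 0.2
        F = np.array([rusanov(U[:, i], U[:, i + 1]) for i in range(nx - 1)]).T
        U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])
    # U now holds density, momentum, and energy at t = 0.2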
Active Inference and Learning in the Cerebellum.
Friston, Karl; Herreros, Ivan
2016-09-01
This letter offers a computational account of Pavlovian conditioning in the cerebellum based on active inference and predictive coding. Using eyeblink conditioning as a canonical paradigm, we formulate a minimal generative model that can account for spontaneous blinking, startle responses, and (delay or trace) conditioning. We then establish the face validity of the model using simulated responses to unconditioned and conditioned stimuli to reproduce the sorts of behavior that are observed empirically. The scheme's anatomical validity is then addressed by associating variables in the predictive coding scheme with nuclei and neuronal populations to match the (extrinsic and intrinsic) connectivity of the cerebellar (eyeblink conditioning) system. Finally, we try to establish predictive validity by reproducing selective failures of delay conditioning, trace conditioning, and extinction using (simulated and reversible) focal lesions. Although rather metaphorical, the ensuing scheme can account for a remarkable range of anatomical and neurophysiological aspects of cerebellar circuitry, and the specificity of lesion-deficit mappings that have been established experimentally. From a computational perspective, this work shows how conditioning or learning can be formulated in terms of minimizing variational free energy (or maximizing Bayesian model evidence) using exactly the same principles that underlie predictive coding in perception.
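As a minimal illustration of the underlying principle (not the paper's cerebellar generative model), the sketch below updates a single latent estimate by gradient descent on squared prediction errors, the Gaussian special case of free-energy minimization:

    # Minimal predictive-coding sketch: one latent estimate mu is updated by
    # gradient descent on squared prediction errors, a Gaussian special case
    # of free-energy minimization. An illustration of the principle only.
    import numpy as np

    def predictive_coding_step(mu, y, prior_mu, sig_y=1.0, sig_p=1.0, lr=0.1):
        eps_y = (y - mu) / sig_y         # sensory prediction error
        eps_p = (mu - prior_mu) / sig_p  # prior prediction error
        # Gradient of F = eps_y^2/2 + eps_p^2/2 with respect to mu.
        dF_dmu = -eps_y / sig_y + eps_p / sig_p
        return mu - lr * dF_dmu

    mu, prior = 0.0, 0.0
    for y in np.random.normal(1.0, 0.1, size=200):  # observations near 1.0
        mu = predictive_coding_step(mu, y, prior)
    # Settles near 0.5: the equal-precision compromise between prior (0)
    # and data (1).
    print(f"posterior estimate mu ~ {mu:.2f}")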
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare Department of Health and Human Services ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
45 CFR 162.1011 - Valid code sets.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Valid code sets. 162.1011 Section 162.1011 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates...
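The recurring clause in the records above reduces to a date-window test: a code set may be used only for the period within which it is valid. A minimal sketch of such a check follows; the code-set names and validity dates are illustrative, not taken from the regulation:

    # Sketch of the "valid within the dates" rule in 45 CFR 162.1011: a code
    # set applies only if the service date falls inside its validity window.
    # The table below is hypothetical, not quoted from the CFR.
    from datetime import date

    code_set_validity = {  # hypothetical validity windows
        "ICD-9-CM": (date(1979, 1, 1), date(2015, 9, 30)),
        "ICD-10-CM": (date(2015, 10, 1), date.max),
    }

    def is_valid_code_set(name: str, service_date: date) -> bool:
        start, end = code_set_validity[name]
        return start <= service_date <= end

    print(is_valid_code_set("ICD-9-CM", date(2016, 1, 1)))   # False
    print(is_valid_code_set("ICD-10-CM", date(2016, 1, 1)))  # True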
Cook, Karon F; Roddey, Toni S; Bamer, Alyssa M; Amtmann, Dagmar; Keefe, Francis J
2013-09-01
Pain is a common and complex experience for individuals who live with multiple sclerosis (MS) and it interferes with physical, psychological, and social function. A valid and reliable tool for quantifying observed pain behaviors in MS is critical to understand how pain behaviors contribute to pain-related disability in this clinical population. To evaluate the reliability and validity of a pain behavioral observation protocol in individuals who have MS. Community-dwelling volunteers with MS (N=30), back pain (N=5), or arthritis (N=8) were recruited based on clinician referrals, advertisements, fliers, web postings, and participation in previous research. Participants completed measures of pain severity, pain interference, and self-reported pain behaviors and were videotaped doing typical activities (e.g., walking and sitting). Two coders independently recorded frequencies of pain behaviors by category (e.g., guarding and bracing) and interrater reliability statistics were calculated. Naïve observers reviewed videotapes of individuals with MS and rated their pain. Spearman correlations were calculated between pain behavior frequencies and self-reported pain and pain ratings by naïve observers. Interrater reliability estimates indicated the reliability of pain codes in the MS sample. Kappa coefficients ranged from moderate agreement (sighing=0.40) to substantial agreement (guarding=0.83). These values were comparable with those obtained in the combined back pain and arthritis sample. Concurrent validity was supported by correlations with self-reported pain (0.46-0.53) and with self-reports of pain behaviors (0.58). Construct validity was supported by a finding of a 0.87 correlation between total pain behaviors observed by coders and mean pain ratings by naïve observers. Results support the use of the pain behavior observation protocol for assessing pain behaviors of individuals with MS. Valid assessments of pain behaviors of individuals with MS could lead to creative interventions in the management of chronic pain in this population. Copyright © 2013 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Nadeem, Erum; Romo, Laura F.; Sigman, Marian; Lefkowitz, Eva S.; Au, Terry K.
2007-01-01
This study examined the sensitivity of an observational coding system for assessing positive and negative maternal behaviors of Latino and European American mothers toward their adolescent children. Ninety Latino (54 Spanish speaking and 35 English speaking) and 20 European American mother-adolescent dyads participated in an observational study of…
Middle Childhood Attachment Strategies: validation of an observational measure.
Brumariu, Laura E; Giuseppone, Kathryn R; Kerns, Kathryn A; Van de Walle, Magali; Bureau, Jean-François; Bosmans, Guy; Lyons-Ruth, Karlen
2018-02-05
The purpose of this study was to assess behavioral manifestations of attachment in middle childhood, and to evaluate their relations with key theoretical correlates. The sample consisted of 87 children (aged 10-12 years) and their mothers. Dyads participated in an 8-min videotaped discussion of a conflict in their relationships, later scored with the Middle Childhood Attachment Strategies Coding System (MCAS) for key features of all child attachment patterns described in previous literature (secure, ambivalent, avoidant, disorganized-disoriented, caregiving/role-confused, hostile/punitive). To assess validity, relations among MCAS dimensions and other measures of attachment, parenting, and psychological adjustment were evaluated. Results provide preliminary evidence for the psychometric properties of the MCAS in that its behaviorally assessed patterns were associated with theoretically relevant constructs, including maternal warmth/acceptance and psychological control, and children's social competence, depression, and behavioral problems. The MCAS opens new grounds for expanding our understanding of attachment and its outcomes in middle childhood.
Development of PRIME for irradiation performance analysis of U-Mo/Al dispersion fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, Gwan Yoon; Kim, Yeon Soo; Jeong, Yong Jin
A prediction code for the thermo-mechanical performance of research reactor fuel (PRIME) has been developed with the implementation of models to analyze the irradiation behavior of U-Mo dispersion fuel. The code is capable of predicting the two-dimensional thermal and mechanical performance of U-Mo dispersion fuel during irradiation. A finite element method was employed to solve the governing equations for thermal and mechanical equilibria. Temperature- and burnup-dependent material properties of the fuel meat constituents and cladding were used. The numerical solution schemes in PRIME were verified by benchmarking solutions obtained using a commercial finite element analysis program (ABAQUS). The code was validated using irradiation data from RERTR, HAMP-1, and E-FUTURE tests. The measured irradiation data used in the validation were IL thickness and volume fractions of fuel meat constituents for the thermal analysis, and profiles of the plate thickness changes and fuel meat swelling for the mechanical analysis. The prediction results were in good agreement with the measurement data for both thermal and mechanical analyses, confirming the validity of the code. (c) 2018 Elsevier B.V. All rights reserved.
Evidence-Based School Behavior Assessment of Externalizing Behavior in Young Children.
Bagner, Daniel M; Boggs, Stephen R; Eyberg, Sheila M
2010-02-01
This study examined the psychometric properties of the Revised Edition of the School Observation Coding System (REDSOCS). Participants were 68 children ages 3 to 6 who completed parent-child interaction therapy for Oppositional Defiant Disorder as part of a larger efficacy trial. Interobserver reliability on REDSOCS categories was moderate to high, with percent agreement ranging from 47% to 90% (M = 67%) and Cohen's kappa coefficients ranging from .69 to .95 (M = .82). Convergent validity of the REDSOCS categories was supported by significant correlations with the Intensity Scale of the Sutter-Eyberg Student Behavior Inventory-Revised and related subscales of the Conners' Teacher Rating Scale-Revised: Long Version (CTRS-R: L). Divergent validity was indicated by nonsignificant correlations between REDSOCS categories and scales on the CTRS-R: L expected not to relate to disruptive classroom behavior. Treatment sensitivity was demonstrated for two of the three primary REDSOCS categories by significant pre to posttreatment changes. This study provides psychometric support for the designation of REDSOCS as an evidence-based assessment procedure for young children.
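Several of the studies above report interrater reliability as percent agreement and Cohen's kappa. A minimal sketch of both computations, on made-up coder data:

    # Sketch of the two interrater-reliability statistics reported above:
    # percent agreement and Cohen's kappa for two coders' categorical codes.
    from collections import Counter

    def percent_agreement(a, b):
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def cohens_kappa(a, b):
        po = percent_agreement(a, b)                 # observed agreement
        ca, cb, n = Counter(a), Counter(b), len(a)
        # Chance agreement from each coder's marginal category frequencies.
        pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2
        return (po - pe) / (1.0 - pe)

    coder1 = ["on-task", "off-task", "on-task", "on-task", "off-task"]
    coder2 = ["on-task", "off-task", "on-task", "off-task", "off-task"]
    print(percent_agreement(coder1, coder2))  # 0.8
    print(cohens_kappa(coder1, coder2))       # ~0.62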
Nonlinear dynamic simulation of single- and multi-spool core engines
NASA Technical Reports Server (NTRS)
Schobeiri, T.; Lippke, C.; Abouelkheir, M.
1993-01-01
In this paper a new computational method for accurate simulation of the nonlinear dynamic behavior of single- and multi-spool core engines, turbofan engines, and power generation gas turbine engines is presented. In order to perform the simulation, a modularly structured computer code has been developed which includes individual mathematical modules representing various engine components. The generic structure of the code enables the dynamic simulation of arbitrary engine configurations ranging from single-spool thrust generation to multi-spool thrust/power generation engines under adverse dynamic operating conditions. For precise simulation of turbine and compressor components, row-by-row calculation procedures were implemented that account for the specific turbine and compressor cascade and blade geometry and characteristics. The dynamic behavior of the subject engine is calculated by solving a number of systems of partial differential equations, which describe the unsteady behavior of the individual components. In order to ensure the capability, accuracy, robustness, and reliability of the code, comprehensive critical performance assessment and validation tests were performed. As representative cases, three different transients with single- and multi-spool thrust and power generation engines were simulated. The transient cases range from operating with a prescribed fuel schedule, to extreme load changes, to generator and turbine shutdown.
2013-09-01
to an XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a... C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13-C10P07R05-075, 2013. [21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS
STRING: A new drifter for HF radar validation.
NASA Astrophysics Data System (ADS)
Rammou, Anna-Maria; Zervakis, Vassilis; Bellomo, Lucio; Kokkini, Zoi; Quentin, Celine; Mantovani, Carlo; Kalampokis, Alkiviadis
2015-04-01
High-Frequency radars (HFR) are an effective means of remotely monitoring sea-surface currents, based on recording the Doppler shift of radio waves backscattered from the sea surface. Validation of HFR measurements takes place via comparisons either with in-situ Eulerian velocity data (usually obtained by surface current-meters attached to moorings) or with Lagrangian velocity fields (recorded by surface drifters). The most common surface drifter used for this purpose is the CODE-type drifter (Davis, 1985), an industry-standard design that records the vertically averaged velocity of the upper 1 m layer of the water column. In this work we claim that the observed differences between the HFR-derived velocities and Lagrangian measurements can be attributed not just to the different spatial scales recorded by the above instruments but also to the fact that, while the HFR-derived velocity corresponds to an exponentially weighted vertical average of the velocity field from the surface to 1 m depth (Stewart and Joy, 1974), the velocity estimated by CODE drifters corresponds to a boxcar-weighted vertical average due to the rectangular shape of the CODE drifters' sails. After analyzing the theoretical behavior of a drifter under the influence of wind and current, we proceed to propose a new design of exponentially shaped sails for the drogues of CODE-based drifters, so that the HFR-derived velocities and the drifter-based velocities will be directly comparable with regard to the way they vertically average the velocity field. The new drifter, codenamed STRING, exhibits behavior identical to the classical CODE design under relatively homogeneous conditions in the upper 1 m layer; however, it is expected to follow a significantly different track in conditions of high vertical shear and stratification. Thus, we suggest that the new design is the instrument of choice for validation of HFR installations, as it can be used in all conditions and behaves identically to CODEs when vertical shear is insignificant. Finally, we present results from three experiments using both drifter types in HFR-covered regions of the Eastern Mediterranean. More experiments are planned, incorporating design improvements dictated by the results of the preliminary field tests. This work was carried out in the framework of the project "Specifically Targeted for Radars INnovative Gauge (STRING)", funded by the Greek-French collaboration programme "Plato".
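The averaging argument is easy to make concrete. Assuming a linear shear profile and an illustrative e-folding depth for the HFR weighting (both placeholders, not STRING parameters), the sketch below contrasts the exponentially weighted average of Stewart and Joy (1974) with the boxcar 0-1 m average of a CODE-type drifter:

    # Sketch of the averaging argument above: for a sheared profile u(z),
    # compare the exponentially weighted average sensed by an HF radar with
    # the boxcar (uniform) 0-1 m average of a CODE drifter. The shear and
    # e-folding depth are illustrative assumptions.
    import numpy as np

    z = np.linspace(0.0, 1.0, 1001)  # depth (m), positive downward
    u = 0.30 - 0.10 * z              # assumed linear shear: 30 cm/s at surface

    d = 0.25                         # assumed e-folding depth of HFR weight (m)
    w_exp = np.exp(-z / d)           # exponential weighting
    u_hfr = np.trapz(w_exp * u, z) / np.trapz(w_exp, z)
    u_code = np.trapz(u, z) / (z[-1] - z[0])  # boxcar average over 0-1 m

    print(f"HFR-type average:  {u_hfr:.4f} m/s")
    print(f"CODE-type average: {u_code:.4f} m/s")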
Remarks on CFD validation: A Boeing Commercial Airplane Company perspective
NASA Technical Reports Server (NTRS)
Rubbert, Paul E.
1987-01-01
Requirements and meaning of validation of computational fluid dynamics codes are discussed. Topics covered include: validating a code, validating a user, and calibrating a code. All results are presented in viewgraph format.
Lucyshyn, Joseph M.; Irvin, Larry K.; Blumberg, E. Richard; Laverty, Robelyn; Horner, Robert H.; Sprague, Jeffrey R.
2015-01-01
We conducted an observational study of parent-child interaction in home activity settings (routines) of families raising young children with developmental disabilities and problem behavior. Our aim was to empirically investigate the construct validity of coercion in typical but unsuccessful family routines. The long-term goal was to develop an expanded ecological unit of analysis that may contribute to sustainable behavioral family intervention. Ten children with autism and/or mental retardation and their families participated. Videotaped observations were conducted in typical but unsuccessful home routines. Parent-child interaction in routines was coded in real time and sequential analyses were conducted to test hypotheses about coercive processes. Following observation, families were interviewed about the social validity of the construct. Results confirmed the presence of statistically significant, attention-driven coercive processes in routines in which parents were occupied with non-child centered tasks. Results partially confirmed the presence of escape-driven coercive processes in routines in which parent demands are common. Additional analysis revealed an alternative pattern with greater magnitude. Family perspectives suggested the social validity of the construct. Results are discussed in terms of preliminary, partial evidence for coercive processes in routines of families of children with developmental disabilities. Implications for behavioral assessment and intervention design are discussed. PMID:26321883
CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample
Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.
2012-01-01
Cognitive-Behavioral theory and empirical support suggest that optimal activation of fear is a critical component for successful exposure treatment. Using this theory, we developed coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of coding variables with 12 week and 3 month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanism of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609
NASA Technical Reports Server (NTRS)
Wells, Jason E.; Black, David L.; Taylor, Casey L.
2013-01-01
Exhaust plumes from large solid rocket motors fired at ATK's Promontory test site carry particulates to high altitudes and typically produce deposits that fall on regions downwind of the test area. As populations and communities near the test facility grow, ATK has become increasingly concerned about the impact of motor testing on those surrounding communities. To assess the potential impact of motor testing on the community and to identify feasible mitigation strategies, it is essential to have a tool capable of predicting plume behavior downrange of the test stand. A software package, called PlumeTracker, has been developed and validated at ATK for this purpose. The code is a point model that offers a time-dependent, physics-based description of plume transport and precipitation. The code can utilize either measured or forecasted weather data to generate plume predictions. Next-Generation Radar (NEXRAD) data and field observations from twenty-three historical motor test fires at Promontory were collected to test the predictive capability of PlumeTracker. Model predictions for plume trajectories and deposition fields were found to correlate well with the collected dataset.
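As a rough illustration of a point-model trajectory calculation of this kind (not PlumeTracker's physics), the sketch below advects a plume centroid through an hourly wind series with a constant settling rate; all inputs are invented:

    # Sketch of a point-model plume trajectory: the centroid is advected by
    # a time series of winds while particulates settle. Wind data and the
    # settling rate are illustrative, not PlumeTracker inputs.
    import numpy as np

    winds = [(5.0, 0.0), (6.0, 1.0), (4.0, 2.0)]  # hourly (u, v) in m/s, assumed
    dt = 3600.0                                   # one-hour steps (s)
    x = np.zeros(2)                               # centroid position (m)
    altitude, settling = 3000.0, 0.4              # plume height (m), fall speed (m/s)

    track = []
    for u, v in winds:
        x += np.array([u, v]) * dt   # horizontal advection by the wind
        altitude -= settling * dt    # particulate settling
        track.append((*x, max(altitude, 0.0)))
        if altitude <= 0.0:          # deposition: particles reach the ground
            break
    print(track)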
NASA Astrophysics Data System (ADS)
Wei, Ding; Cong-cong, Yu; Chen-hui, Wu; Zheng-yi, Shu
2018-03-01
To analyse the strain localization behavior of geomaterials, forward Euler schemes and the tangent modulus matrix are formulated based on the transversely isotropic yield criterion with the non-coaxial flow rule developed by Lade, and the program code is implemented as a user subroutine (UMAT) in ABAQUS. The influence of the material principal direction on strain localization and on the bearing capacity of the structure is investigated and analyzed. Numerical results show the validity and performance of the proposed model in simulating the strain localization behavior of geostructures.
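A minimal sketch of the explicit stress-integration step such a UMAT performs is given below; an isotropic elastic tangent stands in for Lade's transversely isotropic, non-coaxial elastoplastic tangent, and all material parameters are illustrative:

    # Forward-Euler (explicit) stress integration sketch: the strain
    # increment is pushed through the tangent modulus matrix D in substeps,
    # sigma_{n+1} = sigma_n + D : d_eps. An isotropic elastic D is used for
    # brevity; the real model would substitute Lade's elastoplastic tangent.
    import numpy as np

    def elastic_tangent(E=50.0e6, nu=0.3):
        """6x6 isotropic stiffness in Voigt notation (illustrative values)."""
        lam = E * nu / ((1.0 + nu) * (1.0 - 2.0 * nu))
        mu = E / (2.0 * (1.0 + nu))
        D = np.zeros((6, 6))
        D[:3, :3] = lam
        D[0, 0] = D[1, 1] = D[2, 2] = lam + 2.0 * mu
        D[3, 3] = D[4, 4] = D[5, 5] = mu
        return D

    def forward_euler_update(sigma, deps, nsub=10):
        """Explicit substepped update; D is state-dependent when plastic."""
        dsub = deps / nsub
        for _ in range(nsub):
            sigma = sigma + elastic_tangent() @ dsub
        return sigma

    sigma = forward_euler_update(np.zeros(6), np.array([1e-4, 0, 0, 0, 0, 0]))
    print(sigma)  # stress increment for a uniaxial strain step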
An RL10A-3-3A rocket engine model using the rocket engine transient simulator (ROCETS) software
NASA Technical Reports Server (NTRS)
Binder, Michael
1993-01-01
Steady-state and transient computer models of the RL10A-3-3A rocket engine have been created using the Rocket Engine Transient Simulation (ROCETS) code. These models were created for several purposes. The RL10 engine is a critical component of past, present, and future space missions; the model will give NASA an in-house capability to simulate the performance of the engine under various operating conditions and mission profiles. The RL10 simulation activity is also an opportunity to further validate the ROCETS program. The ROCETS code is an important tool for modeling rocket engine systems at NASA Lewis. ROCETS provides a modular and general framework for simulating the steady-state and transient behavior of any desired propulsion system. Although the ROCETS code is being used in a number of different analysis and design projects within NASA, it has not been extensively validated for any system using actual test data. The RL10A-3-3A has a ten-year history of test and flight applications; it should provide sufficient data to validate the ROCETS program capability. The ROCETS models of the RL10 system were created using design information provided by Pratt & Whitney, the engine manufacturer. These models are in the process of being validated using test-stand and flight data. This paper includes a brief description of the models and comparison of preliminary simulation output against flight and test-stand data.
Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali
2014-01-01
Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in case of cross-talk, the surrounding ones) until it reaches a configurable receptor, or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168
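A minimal sketch of the photon-history loop described above, reduced to one dimension with invented probabilities (the full model tracks 2D-array geometry and cross-talk, and is written in MATLAB rather than the Python used here):

    # Each photon is followed through a scintillator element until it reaches
    # the receptor (photodiode face) or is attenuated. Geometry is reduced to
    # 1D and all probabilities are illustrative, unlike the full array model.
    import random

    def follow_photon(length_mm=10.0, step_mm=0.5, p_absorb=0.01, p_reflect=0.95):
        z, direction = random.uniform(0.0, length_mm), random.choice([-1, 1])
        while True:
            if random.random() < p_absorb:  # bulk attenuation
                return False
            z += direction * step_mm
            if z >= length_mm:              # receptor at z = length
                return True
            if z <= 0.0:                    # top surface: reflect or escape
                if random.random() < p_reflect:
                    z, direction = 0.0, 1
                else:
                    return False

    n = 100_000
    collected = sum(follow_photon() for _ in range(n))
    print(f"light collection efficiency ~ {collected / n:.3f}")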
Kerr, Jacqueline; Sallis, James F; Bromby, Erica; Glanz, Karen
2012-01-01
To evaluate reliability and validity of a new tool for assessing the placement and promotional environment in grocery stores. Trained observers used the GroPromo instrument in 40 stores to code the placement of 7 products in 9 locations within a store, along with other promotional characteristics. To test construct validity, customers' receipts were coded for percentage of food purchases in each of the categories. Of the 22 categories tested, 21 demonstrated moderate to high interrater reliability (intraclass correlation ≥ 0.61). When more unhealthy items were placed in prominent locations, a higher percentage of money was spent on less-healthy items, and a lower percentage of food dollars were spent on fruits and vegetables. The prominence of locations was more important than the number of locations. The GroPromo tool can be used to assess promotional practices in stores. Data may help advocates campaign for more healthy food items in key promotional locations. Copyright © 2012 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Barnett, Miya L.; Niec, Larissa N.; Acevedo-Polakovich, I. David
2013-01-01
This paper describes the initial evaluation of the Therapist-Parent Interaction Coding System (TPICS), a measure of in vivo therapist coaching for the evidence-based behavioral parent training intervention, parent-child interaction therapy (PCIT). Sixty-one video-recorded treatment sessions were coded with the TPICS to investigate (1) the variety of coaching techniques PCIT therapists use in the early stage of treatment, (2) whether parent skill-level guides a therapist’s coaching style and frequency, and (3) whether coaching mediates changes in parents’ skill levels from one session to the next. Results found that the TPICS captured a range of coaching techniques, and that parent skill-level prior to coaching did relate to therapists’ use of in vivo feedback. Therapists’ responsive coaching (e.g., praise to parents) was a partial mediator of change in parenting behavior from one session to the next for specific child-centered parenting skills; whereas directive coaching (e.g., modeling) did not relate to change. The TPICS demonstrates promise as a measure of coaching during PCIT with good reliability scores and initial evidence of construct validity. PMID:24839350
Waller, Rebecca; Gardner, Frances; Shaw, Daniel S; Dishion, Thomas J; Wilson, Melvin N; Hyde, Luke W
2015-01-01
Youth with callous-unemotional (CU) behavior are at risk of developing more severe forms of aggressive and antisocial behavior. Previous cross-sectional studies suggest that associations between parenting and conduct problems are less strong when children or adolescents have high levels of CU behavior, implying lower malleability of behavior compared to low-CU children. The current study extends previous findings by examining the moderating role of CU behavior on associations between parenting and behavior problems in a very young sample, both concurrently and longitudinally, and using a variety of measurement methods. Data were collected from a multi-ethnic, high-risk sample at ages 2 to 4 (N = 364; 49% female). Parent-reported CU behavior was assessed at age 3 using a previously validated measure (Hyde et al., 2013). Parental harshness was coded from observations of parent-child interactions and parental warmth was coded from 5-min speech samples. In this large and young sample, CU behavior moderated cross-sectional correlations between parent-reported and observed warmth and child behavior problems. However, in cross-sectional and longitudinal models testing parental harshness, and longitudinal models testing warmth, there was no moderation by CU behavior. The findings are in line with recent literature suggesting parental warmth may be important to child behavior problems at high levels of CU behavior. In general, however, the results of this study contrast with much of the extant literature and suggest that in young children, affective aspects of parenting appear to be related to emerging behavior problems, regardless of the presence of early CU behavior.
Euler flow predictions for an oscillating cascade using a high resolution wave-split scheme
NASA Technical Reports Server (NTRS)
Huff, Dennis L.; Swafford, Timothy W.; Reddy, T. S. R.
1991-01-01
A compressible flow code that can predict the nonlinear unsteady aerodynamics associated with transonic flows over oscillating cascades is developed and validated. The code solves the two-dimensional, unsteady Euler equations using a time-marching, flux-difference splitting scheme. The unsteady pressures and forces can be determined for arbitrary input motions, although only harmonic pitching and plunging motions are addressed. The code solves the flow equations on an H-grid which is allowed to deform with the airfoil motion. Predictions are presented for both flat plate cascades and loaded airfoil cascades. Results are compared to flat plate theory and experimental data. Predictions are also presented for several oscillating cascades with strong normal shocks where the pitching amplitudes, cascade geometry, and interblade phase angles are varied to investigate nonlinear behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dumitrescu, Eugene; Humble, Travis S.
The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.
Establishing confidence in complex physics codes: Art or science?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trucano, T.
1997-12-31
The ALEGRA shock wave physics code, currently under development at Sandia National Laboratories and partially supported by the US Advanced Strategic Computing Initiative (ASCI), is generic to a certain class of physics codes: large, multi-application, intended to support a broad user community on the latest generation of massively parallel supercomputer, and in a continual state of formal development. To say that the author has "confidence" in the results of ALEGRA is to say something different than that he believes that ALEGRA is "predictive." It is the purpose of this talk to illustrate the distinction between these two concepts. The author elects to perform this task in a somewhat historical manner. He will summarize certain older approaches to code validation. He views these methods as aiming to establish the predictive behavior of the code. These methods are distinguished by their emphasis on local information. He will conclude that these approaches are more art than science.
Towards a Consolidated Approach for the Assessment of Evaluation Models of Nuclear Power Reactors
Epiney, A.; Canepa, S.; Zerkak, O.; ...
2016-11-02
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of specific analysis aspects, including e.g. code version, transient-specific simulation methodology and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: First, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology
1993-11-01
interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion software, which make up digital ... 1984, the Sperry Corporation developed a fault tolerant system which employed multiversion programming, voting, and monitoring for error detection and ... formulate all the significant behavior of a system. MULTIVERSION PROGRAMMING: N-version programming, the independent coding of a ...
Peng, Mingkai; Southern, Danielle A; Williamson, Tyler; Quan, Hude
2017-12-01
This study examined the coding validity of hypertension, diabetes, obesity and depression in relation to the presence of co-existing conditions, death status, and the number of diagnosis codes in the hospital discharge abstract database. We randomly selected 4007 discharge abstract database records from four teaching hospitals in Alberta, Canada and reviewed their charts to extract 31 conditions listed in the Charlson and Elixhauser comorbidity indices. Conditions associated with the four study conditions were identified through multivariable logistic regression. Coding validity (i.e., sensitivity, positive predictive value) of the four conditions was related to the presence of their associated conditions. Sensitivity increased with an increasing number of diagnosis codes. The impact of death status on coding validity was minimal. Coding validity of a condition is closely related to its clinical importance and the complexity of the patients' case mix. We recommend mandatory coding of certain secondary diagnoses to meet the needs of health research based on administrative health data.
Development of code evaluation criteria for assessing predictive capability and performance
NASA Technical Reports Server (NTRS)
Lin, Shyi-Jang; Barson, S. L.; Sindir, M. M.; Prueger, G. H.
1993-01-01
Computational Fluid Dynamics (CFD), because of its unique ability to predict complex three-dimensional flows, is being applied with increasing frequency in the aerospace industry. Currently, no consistent code validation procedure is applied within the industry. Such a procedure is needed to increase confidence in CFD and reduce risk in the use of these codes as a design and analysis tool. This final contract report defines classifications for three levels of code validation, directly relating the use of CFD codes to the engineering design cycle. Evaluation criteria by which codes are measured and classified are recommended and discussed. Criteria for selecting experimental data against which CFD results can be compared are outlined. A four phase CFD code validation procedure is described in detail. Finally, the code validation procedure is demonstrated through application of the REACT CFD code to a series of cases culminating in a code to data comparison on the Space Shuttle Main Engine High Pressure Fuel Turbopump Impeller.
Discrimination of correlated and entangling quantum channels with selective process tomography
Dumitrescu, Eugene; Humble, Travis S.
2016-10-10
The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show selective process tomography requires only a few measurement configurations to achieve a low false alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.
Code-modulated interferometric imaging system using phased arrays
NASA Astrophysics Data System (ADS)
Chauhan, Vikas; Greene, Kevin; Floyd, Brian
2016-05-01
Millimeter-wave (mm-wave) imaging provides compelling capabilities for security screening, navigation, and biomedical applications. Traditional scanned or focal-plane mm-wave imagers are bulky and costly. In contrast, phased-array hardware developed for mass-market wireless communications and automotive radar promise to be extremely low cost. In this work, we present techniques which can allow low-cost phased-array receivers to be reconfigured or re-purposed as interferometric imagers, removing the need for custom hardware and thereby reducing cost. Since traditional phased arrays power combine incoming signals prior to digitization, orthogonal code-modulation is applied to each incoming signal using phase shifters within each front-end and two-bit codes. These code-modulated signals can then be combined and processed coherently through a shared hardware path. Once digitized, visibility functions can be recovered through squaring and code-demultiplexing operations. Provided that codes are selected such that the product of two orthogonal codes is a third unique and orthogonal code, it is possible to demultiplex complex visibility functions directly. As such, the proposed system modulates incoming signals but demodulates desired correlations. In this work, we present the operation of the system, a validation of its operation using behavioral models of a traditional phased array, and a benchmarking of the code-modulated interferometer against traditional interferometer and focal-plane arrays.
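The squaring-and-demultiplexing step described above can be illustrated numerically. The sketch below is a minimal, idealized rendering of the idea, assuming Walsh-Hadamard codes, two antennas, and signals that are constant over one code period; none of the values come from the paper's hardware.

```python
import numpy as np

def walsh_codes(order):
    """Build a 2^order x 2^order Hadamard matrix; rows are +/-1 Walsh codes."""
    H = np.array([[1]])
    for _ in range(order):
        H = np.block([[H, H], [H, -H]])
    return H

H = walsh_codes(3)           # 8 codes of length 8
c1, c2 = H[1], H[2]
c3 = c1 * c2                 # product of two Walsh codes is another Walsh code
assert any(np.array_equal(c3, row) for row in H)

# Two antenna signals, assumed constant over one code period (narrowband sketch)
s1, s2 = 0.8, 1.3
combined = c1 * s1 + c2 * s2          # code modulation, then power combining
squared = combined ** 2               # square-law step after digitization

# Squaring turns c1*c2 into c3, so correlating against c3 isolates 2*s1*s2
# (the self terms ride on a zero-mean code and average out).
visibility = np.dot(c3, squared) / (2 * len(c3))
print(visibility, s1 * s2)            # both ~1.04
```

The key property being exploited is exactly the one the abstract names: the elementwise product of two orthogonal codes is a third orthogonal code, so the desired correlation can be picked out with a single demultiplexing correlation.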
Waller, Rebecca; Gardner, Frances; Shaw, Daniel S.; Dishion, Thomas J.; Wilson, Melvin N.; Hyde, Luke W.
2014-01-01
Objective Youth with callous unemotional (CU) behavior are at risk of developing more severe forms of aggressive and antisocial behavior. Previous cross-sectional studies suggest that associations between parenting and conduct problems are less strong when children or adolescents have high levels of CU behavior, implying lower malleability of behavior compared to low-CU children. The current study extends previous findings by examining the moderating role of CU behavior on associations between parenting and behavior problems in a very young sample, both concurrently and longitudinally, and using a variety of measurement methods. Methods Data were collected from a multi-ethnic, high-risk sample at ages 2–4 (N = 364; 49% female). Parent-reported CU behavior was assessed at age 3 using a previously validated measure (Hyde et al., 2013). Parental harshness was coded from observations of parent-child interactions and parental warmth was coded from five-minute speech samples. Results In this large and young sample, CU behavior moderated cross-sectional correlations between parent-reported and observed warmth and child behavior problems. However, in cross-sectional and longitudinal models testing parental harshness, and longitudinal models testing warmth, there was no moderation by CU behavior. Conclusions The findings are in line with recent literature suggesting parental warmth may be important to child behavior problems at high levels of CU behavior. In general, however, the results of this study contrast with much of the extant literature and suggest that in young children, affective aspects of parenting appear to be related to emerging behavior problems, regardless of the presence of early CU behavior. PMID:24661288
Validation of the NASCAP model using spaceflight data
NASA Technical Reports Server (NTRS)
Stannard, P. R.; Katz, I.; Gedeon, L.; Roche, J. C.; Rubin, A. G.; Tautz, M. F.
1982-01-01
The NASA Charging Analyzer Program (NASCAP) has been validated in a space environment. Data collected by the SCATHA (Spacecraft Charging at High Altitude) spacecraft have been used with NASCAP to simulate the charging response of the spacecraft ground conductor and dielectric surfaces with considerable success. Charging of the spacecraft ground observed in eclipse, during moderate and severe substorm environments, and in sunlight has been reproduced using the code. Close agreement between the currents and potentials measured by the SSPMs and the NASCAP-simulated response has been obtained for differential charging. It is concluded that NASCAP is able to predict spacecraft charging behavior in a space environment.
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model the performance of wave energy converters (WECs) in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time series, state space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
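As a rough illustration of the Cummins impulse-response formulation mentioned above, the following single-degree-of-freedom sketch integrates the radiation-memory convolution explicitly. All coefficients and the kernel are invented placeholders for a heave-only toy problem, not WEC-Sim data or its API.

```python
import numpy as np

# Illustrative 1-DOF Cummins equation:
#   (m + A_inf) * xdd = F_exc(t) - integral K(t - tau) * xd(tau) dtau - C_hst * x - B_pto * xd
dt, T = 0.01, 60.0
t = np.arange(0.0, T, dt)
m, A_inf, C_hst, B_pto = 1.0e5, 2.0e4, 8.0e5, 5.0e4
K = 4.0e4 * np.exp(-t / 2.0) * np.cos(2.0 * t)   # radiation impulse response (assumed)
F_exc = 2.0e5 * np.sin(0.8 * t)                  # regular-wave excitation (assumed)

x, v = np.zeros_like(t), np.zeros_like(t)
for i in range(len(t) - 1):
    # Radiation "memory" term: discrete convolution of the kernel with past velocity
    mem = np.sum(K[:i + 1][::-1] * v[:i + 1]) * dt
    a = (F_exc[i] - mem - C_hst * x[i] - B_pto * v[i]) / (m + A_inf)
    v[i + 1] = v[i] + a * dt                     # semi-implicit Euler step
    x[i + 1] = x[i] + v[i + 1] * dt

print("max heave amplitude [m]:", x.max())
```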
On the validation of a code and a turbulence model appropriate to circulation control airfoils
NASA Technical Reports Server (NTRS)
Viegas, J. R.; Rubesin, M. W.; MacCormack, R. W.
1988-01-01
A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.
Holden, Richard J; Rivera-Rodriguez, A Joy; Faye, Héléne; Scanlon, Matthew C; Karsh, Ben-Tzion
2013-08-01
The most common change facing nurses today is new technology, particularly bar coded medication administration technology (BCMA). However, there is a dearth of knowledge on how BCMA alters nursing work. This study investigated how BCMA technology affected nursing work, particularly nurses' operational problem-solving behavior. Cognitive systems engineering observations and interviews were conducted after the implementation of BCMA in three nursing units of a freestanding pediatric hospital. Problem-solving behavior, associated problems, and goals, were specifically defined and extracted from observed episodes of care. Three broad themes regarding BCMA's impact on problem solving were identified. First, BCMA allowed nurses to invent new problem-solving behavior to deal with pre-existing problems. Second, BCMA made it difficult or impossible to apply some problem-solving behaviors that were commonly used pre-BCMA, often requiring nurses to use potentially risky workarounds to achieve their goals. Third, BCMA created new problems that nurses were either able to solve using familiar or novel problem-solving behaviors, or unable to solve effectively. Results from this study shed light on hidden hazards and suggest three critical design needs: (1) ecologically valid design; (2) anticipatory control; and (3) basic usability. Principled studies of the actual nature of clinicians' work, including problem solving, are necessary to uncover hidden hazards and to inform health information technology design and redesign.
Effect of a Diffusion Zone on Fatigue Crack Propagation in Layered FGMs
NASA Astrophysics Data System (ADS)
Hauber, Brett; Brockman, Robert; Paulino, Glaucio
2008-02-01
Research into functionally graded materials (FGMs) has led to advances in our ability to analyze cracks. However, two prominent aspects remain relatively unexplored: 1) development and validation of modeling methods for fatigue crack propagation in FGMs, and 2) experimental validation of stress intensity models in engineered materials such as two phase monolithic and graded materials. This work addresses some of these problems for a limited set of conditions, material systems (e.g., Ti/TiB), and material gradients. Numerical analyses are conducted for single edge notch bend (SENB) specimens. Stress intensity factors are computed using the specialized finite element code I-Franc (Illinois Fracture Analysis Code), which is tailored for both homogeneous and graded materials, as well as Franc2DL and ABAQUS. Crack extension is considered by means of specified crack increments, together with fatigue evaluations to predict crack propagation life. Results will be used to determine linear material gradient parameters that are significant for prediction of fatigue crack growth behavior.
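The abstract's combination of specified crack increments with fatigue life evaluation can be sketched with a standard Paris-law cycle count. The paper does not state which growth law its codes use, so Paris-law behavior and all constants below are assumptions for illustration, not Ti/TiB data.

```python
import math

# Paris-law fatigue life with specified crack increments:
#   da/dN = C * (dK)^m,  dK = Y * dS * sqrt(pi * a)
C, m = 1.0e-11, 3.0          # Paris constants (units consistent with MPa*sqrt(m))
Y = 1.12                     # edge-crack geometry factor (assumed constant)
dS = 120.0                   # applied stress range [MPa]
a, a_final, da = 0.001, 0.01, 0.0001   # grow crack from 1 mm to 10 mm [m]

cycles = 0.0
while a < a_final:
    dK = Y * dS * math.sqrt(math.pi * a)
    dadN = C * dK ** m
    cycles += da / dadN      # cycles consumed by this crack increment
    a += da

print(f"estimated propagation life: {cycles:.3e} cycles")
```

In a graded material the geometry factor and growth constants would vary with crack position, which is the behavior the specialized codes named in the abstract are built to capture.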
Zebracki, Kathy; Kichler, Jessica C.; Fitzgerald, Christopher J.; Neff Greenley, Rachel; Alemzadeh, Ramin; Holmbeck, Grayson N.
2011-01-01
Objective To examine reliability and validity data for the Family Interaction Macro-coding System (FIMS) with adolescents with spina bifida (SB), adolescents with type 1 diabetes mellitus (T1DM), and healthy adolescents and their families. Methods Sixty-eight families of children with SB, 58 families of adolescents with T1DM, and 68 families in a healthy comparison group completed family interaction tasks and self-report questionnaires. Trained coders rated family interactions using the FIMS. Results Acceptable interrater and scale reliabilities were obtained for FIMS items and subscales. Observed FIMS parental acceptance, parental behavioral control, parental psychological control, family cohesion, and family conflict scores demonstrated convergent validity with conceptually similar self-report measures. Conclusions Preliminary evidence supports the use of the FIMS with families of youths with SB and T1DM and healthy youths. Future research on overall family functioning may be enhanced by use of the FIMS. PMID:21097956
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
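A minimal sketch of the impacted-coverage idea: restrict the coverage denominator to change-impacted elements so the metric measures how well the tests exercise what actually changed. The branch identifiers are hypothetical, and this is a sketch of the concept, not the iDiSE implementation.

```python
def impacted_coverage(executed, impacted):
    """Fraction of change-impacted elements (e.g., branches) exercised by the
    test suite; a sketch of the impacted-coverage idea described above."""
    if not impacted:
        return 1.0                      # nothing impacted: trivially covered
    return len(executed & impacted) / len(impacted)

# Hypothetical branch identifiers
executed = {"f:b1", "f:b2", "g:b1"}     # branches hit by the regression suite
impacted = {"f:b2", "g:b1", "g:b2"}     # branches impacted by the code change
print(impacted_coverage(executed, impacted))   # 0.667 -> g:b2 still untested
```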
Development and validation of a notational system to study the offensive process in football.
Sarmento, Hugo; Anguera, Teresa; Campaniço, Jorge; Leitão, José
2010-01-01
The most striking change within football development is the application of science to its problems, and in particular the use of increasingly sophisticated technology that, supported by scientific data, allows us to establish a "code of reading" for the reality of the game. Therefore, this study describes the process of development and validation of an ad hoc categorization system, which allows the different methods of offensive play in football, and their interactions, to be analyzed. Through an exploratory phase of the study, we identified 10 core criteria and the respective observable behaviors for each of these criteria. A panel of five experts was consulted for content validation. The resulting instrument is characterized by a combination of field formats and category systems. The reliability of the instrument was calculated by intraobserver agreement, and values above 0.95 were achieved for all criteria. Two FC Barcelona games were coded and analyzed, which allowed the detection of various T-patterns. The results show that the instrument serves the purpose for which it was developed and can provide important information for the understanding of game interaction in football.
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holm, Elizabeth A.; Battaile, Corbett C.; Buchheit, Thomas E.
2000-04-01
Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.
Development and Validation of the Masculine Attributes Questionnaire
Cho, Junhan; Kogan, Steven M.
2017-01-01
The present study describes the development and validation of the Masculine Attributes Questionnaire (MAQ). The purpose of this study was to develop a theoretically and empirically grounded measure of masculine attributes for sexual health research with African American young men. Consistent with Whitehead’s theory, the MAQ items were hypothesized to comprise two components representing reputation-based and respect-based attributes. The sample included 505 African American men aged 19 to 22 years (M = 20.29, SD = 1.10) living in resource-poor communities in the rural South. Convergent and discriminant validity of the MAQ were assessed by examining the associations of masculinity attributes with psychosocial factors. Criterion validity was assessed by examining the extent to which the MAQ subscales predicted sexual risk behavior outcomes. Consistent with study hypotheses, the MAQ was composed of (a) reputation-based attributes oriented toward sexual prowess, toughness, and authority-defying behavior and (b) respect-based attributes oriented toward economic independence, socially approved levels of hard work and education, and committed romantic relationships. Reputation-based attributes were associated positively with street code and negatively related to academic orientation, vocational engagement, and self-regulation, whereas respect-based attributes were associated positively with academic and vocational orientations and self-regulation. Finally, reputation-based attributes predicted sexual risk behaviors including concurrent sexual partnerships, multiple sexual partners, marijuana use, and incarceration, net of the influence of respect-based attributes. The development of the MAQ provides a new measure that permits systematic quantitative investigation of the associations between African American men’s masculinity ideology and sexual risk behavior. PMID:28413906
[An explanatory model of behavior toward mental illness].
García-Sílberman, Sarah
2002-01-01
To evaluate a theoretical model designed to explain behaviors toward mental illness, considering some variables related to the construct, a survey was conducted in 1996 on mental disorder beliefs, attitudes, and behavioral intentions. The sample population was stratified by socioeconomic status, age, and gender. Study subjects were 800 individuals from Mexico City's general population. A data collection instrument was constructed and validated, consisting of 120 Likert-type items with five options each. Data were coded and analyzed with the software package SPSS. Internal consistency of the scales was assessed using Cronbach's alpha, and construct validity with factorial analysis. Student's t test and ANOVA were used to compare the groups in the different strata. The model made it possible to confirm the predictive capacity of the causal chain connecting beliefs, attitudes, and intentions; nevertheless, other study variables did not contribute to explaining it, and behavior was scarcely influenced by intentions, depending mainly on experienced need. The study findings provide a basis for understanding the attitudes of shame and fear usually associated with mental illness, for planning efficient actions aimed at modifying them, and for designing programs to promote mental health. The English version of this paper is available at: http://www.insp.mx/salud/index.html.
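The internal-consistency check named in this abstract can be reproduced in a few lines. The sketch below computes Cronbach's alpha on simulated Likert-style data; the data and sample sizes are illustrative, not the survey's.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                    # shared attitude factor
scores = latent + 0.8 * rng.normal(size=(200, 5))     # five Likert-like items
print(round(cronbach_alpha(scores), 2))               # high alpha: items correlate
```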
Holt, James B.; Zhang, Xingyou; Lu, Hua; Shah, Snehal N.; Dooley, Daniel P.; Matthews, Kevin A.; Croft, Janet B.
2017-01-01
Introduction Local health authorities need small-area estimates for prevalence of chronic diseases and health behaviors for multiple purposes. We generated city-level and census-tract–level prevalence estimates of 27 measures for the 500 largest US cities. Methods To validate the methodology, we constructed multilevel logistic regressions to predict 10 selected health indicators among adults aged 18 years or older by using 2013 Behavioral Risk Factor Surveillance System (BRFSS) data; we applied their predicted probabilities to census population data to generate city-level, neighborhood-level, and zip-code–level estimates for the city of Boston, Massachusetts. Results By comparing the predicted estimates with their corresponding direct estimates from a locally administered survey (Boston BRFSS 2010 and 2013), we found that our model-based estimates for most of the selected health indicators at the city level were close to the direct estimates from the local survey. We also found strong correlation between the model-based estimates and direct survey estimates at neighborhood and zip code levels for most indicators. Conclusion Findings suggest that our model-based estimates are reliable and valid at the city level for certain health outcomes. Local health authorities can use the neighborhood-level estimates if high quality local health survey data are not otherwise available. PMID:29049020
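The core estimation step described here, applying model-predicted probabilities to census population data, amounts to poststratification. A minimal sketch under assumed logistic coefficients and strata (not the study's BRFSS estimates) follows.

```python
import numpy as np

def predict_prob(age_group, sex, beta):
    """Predicted outcome probability from a pre-fitted logistic model (assumed)."""
    x = np.array([1.0, age_group, sex])     # intercept, age band, sex indicator
    return 1.0 / (1.0 + np.exp(-x @ beta))

beta = np.array([-2.0, 0.4, 0.3])           # hypothetical logistic coefficients

# Census-tract population counts by stratum: (age_group, sex, count)
strata = [(0, 0, 800), (0, 1, 850), (1, 0, 600),
          (1, 1, 640), (2, 0, 300), (2, 1, 310)]

# Population-weighted average of the predicted probabilities = tract estimate
num = sum(count * predict_prob(a, s, beta) for a, s, count in strata)
den = sum(count for _, _, count in strata)
print(f"tract-level prevalence estimate: {num / den:.1%}")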
Interactive specification acquisition via scenarios: A proposal
NASA Technical Reports Server (NTRS)
Hall, Robert J.
1992-01-01
Some reactive systems are most naturally specified by giving large collections of behavior scenarios. These collections not only specify the behavior of the system, but also provide good test suites for validating the implemented system. Due to the complexity of the systems and the number of scenarios, however, it appears that automated assistance is necessary to make this software development process workable. Interactive Specification Acquisition Tool (ISAT) is a proposed interactive system for supporting the acquisition and maintenance of a formal system specification from scenarios, as well as automatic synthesis of control code and automated test generation. This paper discusses the background, motivation, proposed functions, and implementation status of ISAT.
2014-03-27
Verification and Validation of Monte Carlo N-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box. Thesis, presented to the faculty, Department of Engineering ... (AFIT-ENP-14-M-05). Distribution Statement A: approved for public release; distribution unlimited.
A test of the validity of the motivational interviewing treatment integrity code.
Forsberg, Lars; Berman, Anne H; Kallmén, Håkan; Hermansson, Ulric; Helgason, Asgeir R
2008-01-01
To evaluate the Swedish version of the Motivational Interviewing Treatment Code (MITI), MITI coding was applied to tape-recorded counseling sessions. Construct validity was assessed using factor analysis on 120 MITI-coded sessions. Discriminant validity was assessed by comparing MITI coding of motivational interviewing (MI) sessions with information- and advice-giving sessions as well as by comparing MI-trained practitioners with untrained practitioners. A principal-axis factoring analysis yielded some evidence for MITI construct validity. MITI differentiated between practitioners with different levels of MI training as well as between MI practitioners and advice-giving counselors, thus supporting discriminant validity. MITI may be used as a training tool together with supervision to confirm and enhance MI practice in clinical settings. MITI can also serve as a tool for evaluating MI integrity in clinical research.
Lausberg, Hedda; Sloetjes, Han
2016-09-01
As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.
Zhong, Qiu-Yue; Karlson, Elizabeth W; Gelaye, Bizu; Finan, Sean; Avillach, Paul; Smoller, Jordan W; Cai, Tianxi; Williams, Michelle A
2018-05-29
We examined the comparative performance of structured, diagnostic codes vs. natural language processing (NLP) of unstructured text for screening suicidal behavior among pregnant women in electronic medical records (EMRs). Women aged 10-64 years with at least one diagnostic code related to pregnancy or delivery (N = 275,843) from Partners HealthCare were included as our "datamart." Diagnostic codes related to suicidal behavior were applied to the datamart to screen women for suicidal behavior. Among women without any diagnostic codes related to suicidal behavior (n = 273,410), 5880 women were randomly sampled, of whom 1120 had at least one mention of terms related to suicidal behavior in clinical notes. NLP was then used to process clinical notes for the 1120 women. Chart reviews were performed for subsamples of women. Using diagnostic codes, 196 pregnant women were screened positive for suicidal behavior, among whom 149 (76%) had confirmed suicidal behavior by chart review. Using NLP among those without diagnostic codes, 486 pregnant women were screened positive for suicidal behavior, among whom 146 (30%) had confirmed suicidal behavior by chart review. The use of NLP substantially improves the sensitivity of screening suicidal behavior in EMRs. However, the prevalence of confirmed suicidal behavior was lower among women who did not have diagnostic codes for suicidal behavior but screened positive by NLP. NLP should be used together with diagnostic codes for future EMR-based phenotyping studies for suicidal behavior.
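The reported counts let the stage-wise positive predictive values be recomputed, and the two-stage screen itself can be sketched. The code set and keywords below are illustrative stand-ins, not the study's actual lexicon or NLP pipeline.

```python
# PPV of each screening stage, using the counts reported in the abstract
def ppv(confirmed, screened):
    return confirmed / screened

print(f"codes PPV: {ppv(149, 196):.0%}, NLP PPV: {ppv(146, 486):.0%}")

# Toy two-stage screen: diagnostic codes first, then a keyword pass over notes.
SUICIDE_CODES = {"E950", "E958", "T14.91"}        # illustrative code set
KEYWORDS = ("suicidal", "suicide attempt", "self-harm")

def screen(record):
    if SUICIDE_CODES & set(record["codes"]):
        return "positive (diagnostic code)"
    note = record["note"].lower()
    if any(k in note for k in KEYWORDS):
        return "positive (NLP) -- route to chart review"
    return "negative"

print(screen({"codes": ["O80"], "note": "Denies suicidal ideation."}))
```

The last example also hints at why the NLP stage has a lower PPV: naive text matching flags negated mentions such as "denies suicidal ideation," which only chart review, or negation-aware NLP, can resolve.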
NASA Astrophysics Data System (ADS)
Insulander Björk, Klara; Kekkonen, Laura
2015-12-01
Thorium-plutonium Mixed OXide (Th-MOX) fuel is considered for use in light water reactors due to some inherent benefits over conventional fuel types in terms of neutronic properties. The good material properties of ThO2 also suggest benefits in terms of thermal-mechanical fuel performance, but the use of Th-MOX fuel for commercial power production demands that its thermal-mechanical behavior can be accurately predicted using a well-validated fuel performance code. Given the scant operational experience with Th-MOX fuel, no such code is available today. This article describes the first phase of the development of such a code, based on the well-established code FRAPCON 3.4, and in particular the correlations reviewed and chosen for the fuel material properties. The results of fuel temperature calculations with the code in its current state of development are shown and compared with data from a Th-MOX test irradiation campaign which is underway in the Halden research reactor. The results are good for fresh fuel, whereas experimental complications make it difficult to judge the adequacy of the code for simulations of irradiated fuel.
Influence of Shock Wave on the Flutter Behavior of Fan Blades Investigated
NASA Technical Reports Server (NTRS)
Srivastava, Rakesh; Bakhle, Milind A.; Stefko, George L.
2003-01-01
Modern fan designs have blades with forward sweep; a lean, thin cross section; and a wide chord to improve performance and reduce noise. These geometric features coupled with the presence of a shock wave can lead to flutter instability. Flutter is a self-excited dynamic instability arising because of fluid-structure interaction, which causes the energy from the surrounding fluid to be extracted by the vibrating structure. An in-flight occurrence of flutter could be catastrophic and is a significant design issue for rotor blades in gas turbines. Understanding the flutter behavior and the influence of flow features on flutter will lead to a better and safer design. An aeroelastic analysis code, TURBO, has been developed and validated for flutter calculations at the NASA Glenn Research Center. The code has been used to understand the occurrence of flutter in a forward-swept fan design. The forward-swept fan, which consists of 22 inserted blades, encountered flutter during wind tunnel tests at part speed conditions.
Formal Validation of Fault Management Design Solutions
NASA Technical Reports Server (NTRS)
Gibson, Corrina; Karban, Robert; Andolfato, Luigi; Day, John
2013-01-01
The work presented in this paper describes an approach used to develop SysML modeling patterns to express the behavior of fault protection, test the model's logic by performing fault injection simulations, and verify the fault protection system's logical design via model checking. A representative example, using a subset of the fault protection design for the Soil Moisture Active-Passive (SMAP) system, was modeled with SysML State Machines and JavaScript as Action Language. The SysML model captures interactions between relevant system components and system behavior abstractions (mode managers, error monitors, fault protection engine, and devices/switches). Development of a method to implement verifiable and lightweight executable fault protection models enables future missions to have access to larger fault test domains and verifiable design patterns. A tool-chain to transform the SysML model to jpf-Statechart compliant Java code and then verify the generated code via model checking was established. Conclusions and lessons learned from this work are also described, as well as potential avenues for further research and development.
Statistical inference of static analysis rules
NASA Technical Reports Server (NTRS)
Engler, Dawson Richards (Inventor)
2009-01-01
Various apparatus and methods are disclosed for identifying errors in program code. Respective numbers of observances of at least one correctness rule by different code instances that relate to the at least one correctness rule are counted in the program code. Each code instance has an associated counted number of observances of the correctness rule by the code instance. Also counted are respective numbers of violations of the correctness rule by different code instances that relate to the correctness rule. Each code instance has an associated counted number of violations of the correctness rule by the code instance. A respective likelihood of validity is determined for each code instance as a function of the counted number of observances and the counted number of violations. The likelihood of validity indicates a relative likelihood that a related code instance is required to observe the correctness rule. The violations may be output in order of the likelihood of validity of the violated correctness rule.
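In the spirit of the disclosed method, a small sketch: count observances and violations per candidate rule, then rank violations by the likelihood that the rule is real. The z-score used here is one common choice and an assumption on our part; the patent text above does not fix the ranking function, and the rules and counts are hypothetical.

```python
import math

def rank_violations(counts):
    """Rank candidate rule violations: rules observed often and violated rarely
    are most likely real rules, so their violations rank highest."""
    ranked = []
    for rule, (observed, violated) in counts.items():
        n = observed + violated
        p = observed / n                       # fraction of sites obeying the rule
        z = (p - 0.5) * math.sqrt(n) / 0.5     # z-test against a coin-flip baseline
        ranked.append((z, rule, violated))
    return sorted(ranked, reverse=True)

# (observances, violations) per candidate correctness rule, e.g.
# "lock(L) must be followed by unlock(L)"
counts = {
    "lock/unlock pairing": (98, 2),
    "check return of malloc": (40, 10),
    "free before return": (3, 3),
}
for z, rule, v in rank_violations(counts):
    print(f"z={z:5.2f}  {rule}: {v} likely bug(s)")
```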
Niolon, Phyllis Holditch; Kuperminc, Gabriel P; Allen, Joseph P
2015-04-01
This multi-method, longitudinal study examines the negotiation of autonomy and relatedness between teens and their mothers as etiologic predictors of perpetration and victimization of dating aggression two years later. Observations of 88 mid-adolescents and their mothers discussing a topic of disagreement were coded for each individual's demonstrations of autonomy and relatedness using a validated coding system. Adolescents self-reported on perpetration and victimization of physical and psychological dating aggression two years later. We hypothesized that mother's and adolescents' behaviors supporting autonomy and relatedness would longitudinally predict lower reporting of dating aggression, and that their behaviors inhibiting autonomy and relatedness would predict higher reporting of dating aggression. Hypotheses were not supported; main findings were characterized by interactions of sex and risk status with autonomy. Maternal behaviors supporting autonomy predicted higher reports of perpetration and victimization of physical dating aggression for girls, but not for boys. Adolescent behaviors supporting autonomy predicted higher reports of perpetration of physical dating aggression for high-risk adolescents, but not for low-risk adolescents. Results indicate that autonomy is a dynamic developmental process, operating differently as a function of social contexts in predicting dating aggression. Examination of these and other developmental processes within parent-child relationships is important in predicting dating aggression, but may depend on social context.
Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images
2009-12-01
Validation of the electromagnetic code FACETS for numerical simulation of radar target images. S. Wong, DRDC Ottawa. ... for simulating radar images of a target is obtained through direct simulation-to-measurement comparisons. A 3-dimensional computer-aided design ...
Springate, David A.; Kontopantelis, Evangelos; Ashcroft, Darren M.; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David
2014-01-01
Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects. PMID:24941260
NASA Astrophysics Data System (ADS)
Sanchez, M. J.; Santamarina, C.; Gai, X., Sr.; Teymouri, M., Sr.
2017-12-01
Stability and behavior of Hydrate Bearing Sediments (HBS) are characterized by the metastable character of the gas hydrate structure, which strongly depends on thermo-hydro-chemo-mechanical (THCM) actions. Hydrate formation, dissociation and methane production from hydrate bearing sediments are coupled THCM processes that involve, among others, exothermic formation and endothermic dissociation of hydrate and ice phases, mixed fluid flow and large changes in fluid pressure. The analysis of available data from past field and laboratory experiments, and the optimization of future field production studies, require a formal and robust numerical framework able to capture the very complex behavior of this type of soil. A comprehensive fully coupled THCM formulation has been developed and implemented into a finite element code to tackle problems involving gas hydrate sediments. Special attention is paid to the geomechanical behavior of HBS, and particularly to their response upon hydrate dissociation under loading. The numerical framework has been validated against recent experiments conducted under controlled conditions in the laboratory that challenge the proposed approach and highlight the complex interaction among THCM processes in HBS. The performance of the models in these case studies is highly satisfactory. Finally, the numerical code is applied to analyze the behavior of gas hydrate soils under field-scale conditions, exploring different features of material behavior under possible reservoir conditions.
Rapid Prototyping: A Survey and Evaluation of Methodologies and Models
1990-03-01
possibility of program coding errors or design differences from the actual prototype the user validated. The methodology should result in a production ... behavior within the problem domain to be defined. "Each method has a different approach towards developing the set of symbols with which to define the ... investigate prototyping as a viable alternative to the conventional method of software development. By the mid 1980's, it was evident that the traditional
The DoE method as an efficient tool for modeling the behavior of monocrystalline Si-PV module
NASA Astrophysics Data System (ADS)
Kessaissia, Fatma Zohra; Zegaoui, Abdallah; Boutoubat, Mohamed; Allouache, Hadj; Aillerie, Michel; Charles, Jean-Pierre
2018-05-01
The objective of this paper is to apply the Design of Experiments (DoE) method to study and to obtain a predictive model of any marketed monocrystalline photovoltaic (mc-PV) module. This technique provides a mathematical model that represents the predicted responses as a function of the input factors and experimental data. Therefore, the DoE model for characterization and modeling of mc-PV module behavior can be obtained by performing just a set of experimental trials. The DoE model of the mc-PV panel evaluates the predictive maximum power as a function of irradiation and temperature in a bounded domain of study for the inputs. For the mc-PV panel, predictive models at both one level and two levels were developed, taking into account the influences of both the main effects and the interaction effects of the considered factors. The DoE method is then implemented by developing a code under Matlab software. The code allows us to simulate, characterize, and validate the predictive model of the mc-PV panel. The calculated results were compared to the experimental data, errors were estimated, and an accurate validation of the predictive models was evaluated by the response surface. Finally, we conclude that the predictive models reproduce the experimental trials with good accuracy.
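A minimal sketch of a two-level DoE model with main effects and an interaction term, written in Python rather than the paper's Matlab; the four trial responses and factor domains are hypothetical, not measured mc-PV data.

```python
import numpy as np

# Two-level full-factorial DoE model with interaction, in coded units [-1, +1]:
#   P = b0 + b1*x1 + b2*x2 + b12*x1*x2
X = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1]])   # (irradiance, temperature)
P = np.array([38.0, 72.0, 34.0, 63.0])                   # max power [W] per trial

design = np.column_stack([np.ones(4), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
b = np.linalg.solve(design, P)          # exact fit: 4 runs, 4 coefficients
print("b0, b1, b2, b12 =", b)

def coded(value, lo, hi):
    """Map a physical factor value into the coded [-1, +1] domain."""
    return 2 * (value - lo) / (hi - lo) - 1

x1 = coded(800.0, 200.0, 1000.0)        # irradiance in W/m^2 (assumed domain)
x2 = coded(40.0, 25.0, 60.0)            # cell temperature in C (assumed domain)
print("predicted P:", b @ np.array([1, x1, x2, x1 * x2]))
```

With four runs and four coefficients the fit is exact; replicated trials would instead be fitted by least squares, leaving degrees of freedom for error estimation and the response-surface validation the abstract mentions.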
VICTORIA: A mechanistic model for radionuclide behavior in the reactor coolant system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaperow, J.H.; Bixler, N.E.
1996-12-31
VICTORIA is the U.S. Nuclear Regulatory Commission's (NRC's) mechanistic, best-estimate code for analysis of fission product release from the core and subsequent transport in the reactor vessel and reactor coolant system. VICTORIA requires thermal-hydraulic data (i.e., temperatures, pressures, and velocities) as input. In the past, these data have been taken from the results of calculations from thermal-hydraulic codes such as SCDAP/RELAP5, MELCOR, and MAAP. Validation and assessment of VICTORIA 1.0 have been completed. An independent peer review of VICTORIA, directed by Brookhaven National Laboratory and supported by experts in the areas of fuel release, fission product chemistry, and aerosol physics, has been undertaken. This peer review, which will independently assess the code's capabilities, is nearing completion, with the peer review committee's final report expected in Dec 1996. A limited amount of additional development is expected as a result of the peer review. Following this additional development, the NRC plans to release VICTORIA 1.1 and an updated and improved code manual. Future plans mainly involve use of the code for plant calculations to investigate specific safety issues as they arise. Also, the code will continue to be used in support of the Phebus experiments.
Validation of CESAR Thermal-hydraulic Module of ASTEC V1.2 Code on BETHSY Experiments
NASA Astrophysics Data System (ADS)
Tregoures, Nicolas; Bandini, Giacomino; Foucher, Laurent; Fleurot, Joëlle; Meloni, Paride
The ASTEC V1 system code is being jointly developed by the French Institut de Radioprotection et Sûreté Nucléaire (IRSN) and the German Gesellschaft für Anlagen und ReaktorSicherheit (GRS) to address severe accident sequences in a nuclear power plant. Thermal-hydraulics in primary and secondary system is addressed by the CESAR module. The aim of this paper is to present the validation of the CESAR module, from the ASTEC V1.2 version, on the basis of well instrumented and qualified integral experiments carried out in the BETHSY facility (CEA, France), which simulates a French 900 MWe PWR reactor. Three tests have been thoroughly investigated with CESAR: the loss of coolant 9.1b test (OECD ISP N° 27), the loss of feedwater 5.2e test, and the multiple steam generator tube rupture 4.3b test. In the present paper, the results of the code for the three analyzed tests are presented in comparison with the experimental data. The thermal-hydraulic behavior of the BETHSY facility during the transient phase is well reproduced by CESAR: the occurrence of major events and the time evolution of main thermal-hydraulic parameters of both primary and secondary circuits are well predicted.
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
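The conformance-testing half of this approach can be sketched as back-to-back execution of the model and the generated code on shared input sequences, flagging any behavioral divergence. Both step functions below are toy stand-ins with a deliberately seeded fault, not artifacts of any real code generator.

```python
import random

def model_step(state, input_on):
    """Executable model: a saturating counter that resets when input is off."""
    return min(state + 1, 3) if input_on else 0

def generated_step(state, input_on):
    """'Generated' implementation with a seeded fault at the saturation bound."""
    return min(state + 1, 2) if input_on else 0    # bug: saturates one step early

random.seed(1)
m_state = g_state = 0
for step in range(1000):
    u = random.random() < 0.7                      # random input vector
    m_state, g_state = model_step(m_state, u), generated_step(g_state, u)
    if m_state != g_state:
        print(f"divergence at step {step}: model={m_state}, code={g_state}")
        break
else:
    print("no divergence found in 1000 steps")
```

In practice the input sequences would be chosen to satisfy a structural coverage metric on the model rather than drawn at random, which is the automation question the report examines.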
Henrickson Parker, Sarah; Flin, Rhona; McKinley, Aileen; Yule, Steven
2013-06-01
Surgeons must demonstrate leadership to optimize performance and maximize patient safety in the operating room, but no behavior rating tool is available to measure leadership. Ten focus groups with members of the operating room team discussed surgeons' intraoperative leadership. Surgeons' leadership behaviors were extracted and used to finalize the Surgeons' Leadership Inventory (SLI), which was checked by surgeons (n = 6) for accuracy and face validity. The SLI was used to code video recordings (n = 5) of operations to test reliability. Eight elements of surgeons' leadership were included in the SLI: (1) maintaining standards, (2) managing resources, (3) making decisions, (4) directing, (5) training, (6) supporting others, (7) communicating, and (8) coping with pressure. Interrater reliability to code videos of surgeons' behaviors while operating using this tool was acceptable (κ = .70). The SLI is empirically grounded in focus group data and both the leadership and surgical literature. The interrater reliability of the system was acceptable. The inventory could be used for rating surgeons' leadership in the operating room for research or as a basis for postoperative feedback on performance. Copyright © 2013 Elsevier Inc. All rights reserved.
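For readers unfamiliar with the κ statistic reported here, the following minimal sketch shows how Cohen's kappa is computed from two raters' codes of the same events; the example labels loosely echo the SLI elements but are illustrative only:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' nominal codes of the same events."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of events coded identically.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected overlap from each rater's marginal rates.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(counts_a) | set(counts_b)
    p_exp = sum(counts_a[l] * counts_b[l] for l in labels) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

a = ["directing", "supporting", "deciding", "directing", "training"]
b = ["directing", "supporting", "deciding", "communicating", "training"]
print(round(cohens_kappa(a, b), 2))  # 0.75 for this toy example
```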
Neuronal Reward and Decision Signals: From Theories to Data
Schultz, Wolfram
2015-01-01
Rewards are crucial objects that induce learning, approach behavior, choices, and emotions. Whereas emotions are difficult to investigate in animals, the learning function is mediated by neuronal reward prediction error signals which implement basic constructs of reinforcement learning theory. These signals are found in dopamine neurons, which emit a global reward signal to striatum and frontal cortex, and in specific neurons in striatum, amygdala, and frontal cortex projecting to select neuronal populations. The approach and choice functions involve subjective value, which is objectively assessed by behavioral choices eliciting internal, subjective reward preferences. Utility is the formal mathematical characterization of subjective value and a prime decision variable in economic choice theory. It is coded as utility prediction error by phasic dopamine responses. Utility can incorporate various influences, including risk, delay, effort, and social interaction. Appropriate for formal decision mechanisms, rewards are coded as object value, action value, difference value, and chosen value by specific neurons. Although all reward, reinforcement, and decision variables are theoretical constructs, their neuronal signals constitute measurable physical implementations and as such confirm the validity of these concepts. The neuronal reward signals provide guidance for behavior while constraining the free will to act. PMID:26109341
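The reward prediction error construct at the center of this account corresponds to the error term of reinforcement learning theory; a minimal sketch, with an arbitrary reward probability, might look like this:

```python
import random

def td_learning(n_trials=1000, alpha=0.1, reward_prob=0.75, seed=1):
    """Rescorla-Wagner/TD-style value learning for one predictive cue:
    delta = r - V(cue) plays the role of the (dopamine-like) reward
    prediction error."""
    rng = random.Random(seed)
    v = 0.0
    for _ in range(n_trials):
        r = 1.0 if rng.random() < reward_prob else 0.0
        delta = r - v          # prediction error: positive when surprised
        v += alpha * delta     # value estimate tracks expected reward
    return v

print(round(td_learning(), 2))  # converges near the true reward probability
```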
Toward a CFD nose-to-tail capability - Hypersonic unsteady Navier-Stokes code validation
NASA Technical Reports Server (NTRS)
Edwards, Thomas A.; Flores, Jolen
1989-01-01
Computational fluid dynamics (CFD) research for hypersonic flows presents new problems in code validation because of the added complexity of the physical models. This paper surveys code validation procedures applicable to hypersonic flow models that include real gas effects. The current status of hypersonic CFD flow analysis is assessed with the Compressible Navier-Stokes (CNS) code as a case study. The methods of code validation discussed go beyond comparison with experimental data to include comparisons with other codes and formulations, component analyses, and estimation of numerical errors. Current results indicate that predicting hypersonic flows of perfect gases and equilibrium air is well in hand. Pressure, shock location, and integrated quantities are relatively easy to predict accurately, while surface quantities such as heat transfer are more sensitive to the solution procedure. Modeling transition to turbulence needs refinement, though preliminary results are promising.
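One standard way to estimate numerical errors of the kind mentioned above is a grid-convergence study; the sketch below computes the observed order of accuracy and a Richardson-extrapolated value from three grid solutions, using illustrative numbers rather than CNS results:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p from solutions on three grids with a
    constant refinement ratio r (Richardson-style grid convergence)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Illustrative integrated-quantity values from coarse/medium/fine grids:
f1, f2, f3 = 1.120, 1.040, 1.010
p = observed_order(f1, f2, f3)
print(round(p, 2), round(richardson_extrapolate(f2, f3, p), 4))
```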
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1994-01-01
A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
Artificial neural network prediction of aircraft aeroelastic behavior
NASA Astrophysics Data System (ADS)
Pesonen, Urpo Juhani
An Artificial Neural Network that predicts aeroelastic behavior of aircraft is presented. The neural net was designed to predict the shape of a flexible wing in static flight conditions using results from a structural analysis and an aerodynamic analysis performed with traditional computational tools. To generate reliable training and testing data for the network, an aeroelastic analysis code using these tools as components was designed and validated. To demonstrate the advantages and reliability of Artificial Neural Networks, a network was also designed and trained to predict airfoil maximum lift at low Reynolds numbers where wind tunnel data was used for the training. Finally, a neural net was designed and trained to predict the static aeroelastic behavior of a wing without the need to iterate between the structural and aerodynamic solvers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, George J.; Kowalsky, Michael B.; Pruess, Karsten
TOUGH+HYDRATE v1.2 is a code for the simulation of the behavior of hydrate-bearing geologic systems, and represents the second update of the code since its first release [Moridis et al., 2008]. By solving the coupled equations of mass and heat balance, TOUGH+HYDRATE can model the non-isothermal gas release, phase behavior and flow of fluids and heat under conditions typical of common natural CH4-hydrate deposits (i.e., in the permafrost and in deep ocean sediments) in complex geological media at any scale (from laboratory to reservoir) at which Darcy's law is valid. TOUGH+HYDRATE v1.2 includes both an equilibrium and a kinetic model of hydrate formation and dissociation. The model accounts for heat and up to four mass components, i.e., water, CH4, hydrate, and water-soluble inhibitors such as salts or alcohols. These are partitioned among four possible phases (gas phase, liquid phase, ice phase and hydrate phase). Hydrate dissociation or formation, phase changes and the corresponding thermal effects are fully described, as are the effects of inhibitors. The model can describe all possible hydrate dissociation mechanisms, i.e., depressurization, thermal stimulation, salting-out effects and inhibitor-induced effects. TOUGH+HYDRATE is a member of TOUGH+, the successor to the TOUGH2 [Pruess et al., 1991] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstation, PC, Macintosh) for which such compilers are available.
Developing and Modifying Behavioral Coding Schemes in Pediatric Psychology: A Practical Guide
McMurtry, C. Meghan; Chambers, Christine T.; Bakeman, Roger
2015-01-01
Objectives To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. Methods This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. Results A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Conclusions Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. PMID:25416837
Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August
2018-07-01
Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
Lattice Truss Structural Response Using Energy Methods
NASA Technical Reports Server (NTRS)
Kenner, Winfred Scottson
1996-01-01
A deterministic methodology is presented for developing closed-form deflection equations for two-dimensional and three-dimensional lattice structures. Four types of lattice structures are studied: beams, plates, shells and soft lattices. Castigliano's second theorem, which entails the total strain energy of a structure, is utilized to generate highly accurate results. Derived deflection equations provide new insight into the bending and shear behavior of the four types of lattices, in contrast to classic solutions of similar structures. Lattice derivations utilizing kinetic energy are also presented, and used to examine the free vibration response of simple lattice structures. Derivations utilizing finite element theory for unique lattice behavior are also presented and validated using the finite element analysis code EAL.
NASA Astrophysics Data System (ADS)
Velioglu Sogut, Deniz; Yalciner, Ahmet Cevdet
2018-06-01
Field observations provide valuable data regarding nearshore tsunami impact, yet only in inundation areas where tsunami waves have already flooded. Therefore, tsunami modeling is essential to understand tsunami behavior and prepare for tsunami inundation. It is necessary that all numerical models used in tsunami emergency planning be subject to benchmark tests for validation and verification. This study focuses on two numerical codes, NAMI DANCE and FLOW-3D®, for validation and performance comparison. NAMI DANCE is an in-house tsunami numerical model developed by the Ocean Engineering Research Center of Middle East Technical University, Turkey and the Laboratory of Special Research Bureau for Automation of Marine Research, Russia. FLOW-3D® is a general purpose computational fluid dynamics software, which was developed by scientists who pioneered the design of the Volume-of-Fluid technique. The codes are validated and their performances are compared via analytical, experimental and field benchmark problems, which are documented in the "Proceedings and Results of the 2011 National Tsunami Hazard Mitigation Program (NTHMP) Model Benchmarking Workshop" and the "Proceedings and Results of the NTHMP 2015 Tsunami Current Modeling Workshop". The variations between the numerical solutions of these two models are evaluated through statistical error analysis.
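A typical ingredient of such statistical error analysis is a root-mean-square error normalized by the range of the benchmark data; a minimal sketch with illustrative (not benchmark) values:

```python
import math

def nrmse(predicted, observed):
    """Root-mean-square error normalized by the observed data range, a
    common skill metric in model-vs-benchmark comparisons."""
    n = len(observed)
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
    return rmse / (max(observed) - min(observed))

gauge_obs = [0.00, 0.12, 0.35, 0.50, 0.41, 0.22]   # benchmark wave heights (m)
code_a = [0.01, 0.10, 0.33, 0.52, 0.44, 0.20]      # illustrative model output
code_b = [0.02, 0.15, 0.30, 0.47, 0.38, 0.25]      # illustrative model output

for name, pred in [("code A", code_a), ("code B", code_b)]:
    print(name, round(nrmse(pred, gauge_obs), 3))
```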
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solis, D.
1998-10-16
The DART code is based upon a thermomechanical model that can predict swelling, recrystallization, fuel-meat interdiffusion and other issues related to MTR dispersed FE behavior under irradiation. As a part of a common effort to develop an optimized version of DART, a comparison between DART predictions and CNEA miniplate irradiation experimental data was made. The irradiation took place during 1981-82 for U3O8 miniplates and 1985-86 for U3Six at the Oak Ridge Research Reactor (ORR). The microphotographs were studied by means of the IMAWIN 3.0 Image Analysis Code and different fission gas bubble distributions were obtained. Also it was possible to find and identify different morphologic zones. In both kinds of fuels, different phases were recognized, like particle peripheral zones with evidence of Al-U reaction, internal recrystallized zones and bubbles. A very good agreement between code prediction and irradiation results was found. The few discrepancies are due to local, fabrication and irradiation uncertainties, as the presence of U3Si phase in U3Si2 particles and effective burnup.
Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.
Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger
2015-01-01
To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Seeing the Invisible: Embedding Tests in Code That Cannot be Modified
NASA Technical Reports Server (NTRS)
O'Malley, Owen; Mansouri-Samani, Masoud; Mehlitz, Peter; Penix, John
2005-01-01
Characterizing and observing valid software behavior during testing can be very difficult in flight systems. To address this issue, we evaluated several approaches to increasing test observability on the Shuttle Abort Flight Management (SAFM) system. To increase test observability, we added probes into the running system to evaluate the internal state and analyze test data. To minimize the impact of the instrumentation and reduce manual effort, we used Aspect-Oriented Programming (AOP) tools to instrument the source code. We developed and elicited a spectrum of properties, from generic to application specific properties, to be monitored via the instrumentation. To evaluate additional approaches, SAFM was ported to Linux, enabling the use of gcov for measuring test coverage, Valgrind for looking for memory usage errors, and libraries for finding non-normal floating point values. An in-house C++ source code scanning tool was also used to identify violations of SAFM coding standards, and other potentially problematic C++ constructs. Using these approaches with the existing test data sets, we were able to verify several important properties, confirm several problems and identify some previously unidentified issues.
Woon, Yuan-Liang; Lee, Keng-Yee; Mohd Anuar, Siti Fatimah Zahra; Goh, Pik-Pin; Lim, Teck-Onn
2018-04-20
Hospitalization due to dengue illness is an important measure of dengue morbidity. However, limited studies are based on administrative database because the validity of the diagnosis codes is unknown. We validated the International Classification of Diseases, 10th revision (ICD) diagnosis coding for dengue infections in the Malaysian Ministry of Health's (MOH) hospital discharge database. This validation study involves retrospective review of available hospital discharge records and hand-search medical records for years 2010 and 2013. We randomly selected 3219 hospital discharge records coded with dengue and non-dengue infections as their discharge diagnoses from the national hospital discharge database. We then randomly sampled 216 and 144 records for patients with and without codes for dengue respectively, in keeping with their relative frequency in the MOH database, for chart review. The ICD codes for dengue were validated against lab-based diagnostic standard (NS1 or IgM). The ICD-10-CM codes for dengue had a sensitivity of 94%, modest specificity of 83%, positive predictive value of 87% and negative predictive value 92%. These results were stable between 2010 and 2013. However, its specificity decreased substantially when patients manifested with bleeding or low platelet count. The diagnostic performance of the ICD codes for dengue in the MOH's hospital discharge database is adequate for use in health services research on dengue.
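The reported operating characteristics follow directly from a 2x2 table of ICD coding against the lab-based standard; the sketch below reproduces the arithmetic with hypothetical counts chosen only to match the published percentages, not the study's actual table:

```python
def diagnostic_validity(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table
    (ICD code classification vs. lab-based diagnostic standard)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical chart-review counts (not the study's data); they yield
# roughly the reported 94% / 83% / 87% / 92% figures:
print(diagnostic_validity(tp=188, fp=28, fn=12, tn=132))
```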
Development and evaluation of an instrument for assessing brief behavioral change interventions.
Strayer, Scott M; Martindale, James R; Pelletier, Sandra L; Rais, Salehin; Powell, Jon; Schorling, John B
2011-04-01
To develop an observational coding instrument for evaluating the fidelity and quality of brief behavioral change interventions based on the behavioral theories of the 5 A's, Stages of Change, and Motivational Interviewing. Content and face validity were assessed prior to an intervention where psychometric properties were evaluated with a prospective cohort of 116 medical students. Properties assessed included the inter-rater reliability of the instrument, internal consistency of the full scale and sub-scales, and descriptive statistics of the instrument. Construct validity was assessed based on students' scores. Inter-rater reliability for the instrument was 0.82 (intraclass correlation). Internal consistency for the full scale was 0.70 (KR20). Internal consistencies for the sub-scales were as follows: MI intervention component (KR20=.7); stage-appropriate MI-based intervention (KR20=.55); MI spirit (KR20=.5); appropriate assessment (KR20=.45); and appropriate assisting (KR20=.56). The instrument demonstrated good inter-rater reliability and moderate overall internal consistency when used to assess brief behavioral change interventions performed by medical students. This practical instrument can be used with minimal training and demonstrates promising psychometric properties when evaluated with medical students counseling standardized patients. Further testing is required to evaluate its usefulness in clinical settings. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
van Dijk, Marijn; Bruinsma, Eke; Hauser, M Paulina
2016-04-01
Because feeding problems have clear negative consequences for both child and caretakers, early diagnosis and intervention are important. Parent-report questionnaires can contribute to early identification, because they are efficient and typically offer a 'holistic' perspective of the child's eating in different contexts. In this pilot study, we aim to explore the concurrent validity of a short screening instrument (the SEP, which is the Dutch MCH-FS) in one of its target populations (a group of premature children) by comparing the total score with the observed behavior of the child and caretaker during a regular home meal. 28 toddlers (aged 9-18 months) and their caretakers participated in the study. Video-observations of the meals were coded for categories of eating behavior and parent-child interaction. The results show that the total SEP-score correlates with food refusal, feeding efficiency, and self-feeding, but not with negative affect and parental instructions. This confirms that the SEP has a certain degree of concurrent validity in the sense that its total score is associated with specific 'benchmark' feeding behaviors: food refusal, feeding efficiency and autonomy. Future studies with larger samples are needed to generalize the findings from this pilot to a broader context. Copyright © 2016 Elsevier Ltd. All rights reserved.
Jones, B E; South, B R; Shao, Y; Lu, C C; Leng, J; Sauer, B C; Gundlapalli, A V; Samore, M H; Zeng, Q
2018-01-01
Identifying pneumonia using diagnosis codes alone may be insufficient for research on clinical decision making. Natural language processing (NLP) may enable the inclusion of cases missed by diagnosis codes. This article (1) develops a NLP tool that identifies the clinical assertion of pneumonia from physician emergency department (ED) notes, and (2) compares classification methods using diagnosis codes versus NLP against a gold standard of manual chart review to identify patients initially treated for pneumonia. Among a national population of ED visits occurring between 2006 and 2012 across the Veterans Affairs health system, we extracted 811 physician documents containing search terms for pneumonia for training, and 100 random documents for validation. Two reviewers annotated span- and document-level classifications of the clinical assertion of pneumonia. An NLP tool using a support vector machine was trained on the enriched documents. We extracted diagnosis codes assigned in the ED and upon hospital discharge and calculated performance characteristics for diagnosis codes, NLP, and NLP plus diagnosis codes against manual review in training and validation sets. Among the training documents, 51% contained clinical assertions of pneumonia; in the validation set, 9% were classified with pneumonia, of which 100% contained pneumonia search terms. After enriching with search terms, the NLP system alone demonstrated a recall/sensitivity of 0.72 (training) and 0.55 (validation), and a precision/positive predictive value (PPV) of 0.89 (training) and 0.71 (validation). ED-assigned diagnostic codes demonstrated lower recall/sensitivity (0.48 and 0.44) but higher precision/PPV (0.95 in training, 1.0 in validation); the NLP system identified more "possible-treated" cases than diagnostic coding. An approach combining NLP and ED-assigned diagnostic coding classification achieved the best performance (sensitivity 0.89 and PPV 0.80). System-wide application of NLP to clinical text can increase capture of initial diagnostic hypotheses, an important inclusion when studying diagnosis and clinical decision-making under uncertainty. Schattauer GmbH Stuttgart.
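The benefit of combining the two classifiers can be sketched as a simple OR-rule over document-level labels; the toy data below are illustrative, not drawn from the study:

```python
def precision_recall(pred, gold):
    """Precision (PPV) and recall (sensitivity) against a gold standard."""
    tp = sum(p and g for p, g in zip(pred, gold))
    fp = sum(p and not g for p, g in zip(pred, gold))
    fn = sum(not p and g for p, g in zip(pred, gold))
    return tp / (tp + fp), tp / (tp + fn)

# Toy document-level classifications (True = pneumonia asserted):
gold = [True, True, True, False, False, True, False, True]  # chart review
nlp = [True, False, True, False, False, True, False, True]  # NLP assertion
icd = [True, True, False, False, True, False, False, True]  # diagnosis code

combined = [n or c for n, c in zip(nlp, icd)]  # union of the two classifiers
for name, pred in [("NLP", nlp), ("ICD", icd), ("NLP+ICD", combined)]:
    p, r = precision_recall(pred, gold)
    print(f"{name}: precision={p:.2f} recall={r:.2f}")
```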
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center. LEWICE/Thermal (electrothermal deicing & anti-icing), and ANTICE (hot-gas & electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel(IRT). The purpose of the April 96 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal, and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions, for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2- The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3-The Validation of ANTICE'
NASA Astrophysics Data System (ADS)
Zhou, Abel; White, Graeme L.; Davidson, Rob
2018-02-01
Anti-scatter grids are commonly used in x-ray imaging systems to reduce scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system have strong agreement with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
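The grid figures of merit discussed here reduce to ratios of transmitted fluence; a minimal sketch with illustrative Monte Carlo tallies:

```python
def grid_metrics(primary_in, scatter_in, primary_out, scatter_out):
    """Transmission of primary (Tp), scatter (Ts), and total (Tt)
    radiation through an anti-scatter grid, plus the image-plane SPR."""
    tp = primary_out / primary_in
    ts = scatter_out / scatter_in
    tt = (primary_out + scatter_out) / (primary_in + scatter_in)
    spr = scatter_out / primary_out
    return tp, ts, tt, spr

# Illustrative photon fluences tallied in an MC run (not measured data):
tp, ts, tt, spr = grid_metrics(primary_in=1.0e6, scatter_in=1.5e6,
                               primary_out=7.0e5, scatter_out=1.5e5)
print(f"Tp={tp:.2f} Ts={ts:.2f} Tt={tt:.2f} SPR={spr:.2f}")
```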
Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code
NASA Technical Reports Server (NTRS)
Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William
2006-01-01
The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued by correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1993-01-01
Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions is sparse and reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.
Niolon, Phyllis Holditch; Kuperminc, Gabriel P.; Allen, Joseph P.
2015-01-01
Objective This multi-method, longitudinal study examines the negotiation of autonomy and relatedness between teens and their mothers as etiologic predictors of perpetration and victimization of dating aggression two years later. Method Observations of 88 mid-adolescents and their mothers discussing a topic of disagreement were coded for each individual’s demonstrations of autonomy and relatedness using a validated coding system. Adolescents self-reported on perpetration and victimization of physical and psychological dating aggression two years later. We hypothesized that mother’s and adolescents’ behaviors supporting autonomy and relatedness would longitudinally predict lower reporting of dating aggression, and that their behaviors inhibiting autonomy and relatedness would predict higher reporting of dating aggression. Results Hypotheses were not supported; main findings were characterized by interactions of sex and risk status with autonomy. Maternal behaviors supporting autonomy predicted higher reports of perpetration and victimization of physical dating aggression for girls, but not for boys. Adolescent behaviors supporting autonomy predicted higher reports of perpetration of physical dating aggression for high-risk adolescents, but not for low-risk adolescents. Conclusions Results indicate that autonomy is a dynamic developmental process, operating differently as a function of social contexts in predicting dating aggression. Examination of these and other developmental processes within parent-child relationships is important in predicting dating aggression, but may depend on social context. PMID:25914852
Dishion, Thomas J; Forgatch, Marion; Van Ryzin, Mark; Winter, Charlotte
2012-07-01
In this study we examined the videotaped family interactions of a community sample of adolescents and their parents. Youths were assessed in early to late adolescence on their levels of antisocial behavior. At age 16-17, youths and their parents were videotaped interacting while completing a variety of tasks, including family problem solving. The interactions were coded and compared for three developmental patterns of antisocial behavior: early onset, persistent; adolescence onset; and typically developing. The mean duration of conflict bouts was the only interaction pattern that discriminated the 3 groups. In the prediction of future antisocial behavior, parent and youth reports of transition entropy and conflict resolution interacted to account for antisocial behavior at age 18-19. Families with low entropy and peaceful resolutions predicted low levels of youth antisocial behavior at age 18-19. These findings suggest the need to study both attractors and repellers to understand family dynamics associated with health and social and emotional development.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2017-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.
CFD validation needs for advanced concepts at Northrop Corporation
NASA Technical Reports Server (NTRS)
George, Michael W.
1987-01-01
Information is given in viewgraph form on the Computational Fluid Dynamics (CFD) Workshop held July 14 - 16, 1987. Topics covered include the philosophy of CFD validation, current validation efforts, the wing-body-tail Euler code, F-20 Euler simulated oil flow, and Euler Navier-Stokes code validation for 2D and 3D nozzle afterbody applications.
Radiation from advanced solid rocket motor plumes
NASA Technical Reports Server (NTRS)
Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.
1994-01-01
The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent data base, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.
Li, Tingwen; Rogers, William A.; Syamlal, Madhava; ...
2016-07-29
Here, the MFiX suite of multiphase computational fluid dynamics (CFD) codes is being developed at the U.S. Department of Energy's National Energy Technology Laboratory (NETL). It includes several different approaches to multiphase simulation: MFiX-TFM, a two-fluid (Eulerian–Eulerian) model; MFiX-DEM, an Eulerian fluid model with a Lagrangian Discrete Element Model for the solids phase; and MFiX-PIC, an Eulerian fluid model with Lagrangian particle 'parcels' representing particle groups. These models are undergoing continuous development and application, with verification, validation, and uncertainty quantification (VV&UQ) as integrated activities. After a brief summary of recent progress in VV&UQ, this article highlights two recent accomplishments in the application of MFiX-TFM to fossil energy technology development. First, a recent application of MFiX to the pilot-scale KBR TRIG™ Transport Gasifier located at DOE's National Carbon Capture Center (NCCC) is described. Gasifier performance over a range of operating conditions was modeled and compared to NCCC operational data to validate the ability of the model to predict parametric behavior. Second, comparison of code predictions at a detailed fundamental scale is presented, studying solid sorbents for the post-combustion capture of CO2 from flue gas. Specifically designed NETL experiments are being used to validate hydrodynamics and chemical kinetics for the sorbent-based carbon capture process.
The Modified Cognitive Constructions Coding System: Reliability and Validity Assessments
ERIC Educational Resources Information Center
Moran, Galia S.; Diamond, Gary M.
2006-01-01
The cognitive constructions coding system (CCCS) was designed for coding client's expressed problem constructions on four dimensions: intrapersonal-interpersonal, internal-external, responsible-not responsible, and linear-circular. This study introduces, and examines the reliability and validity of, a modified version of the CCCS--a version that…
Cognitive/emotional models for human behavior representation in 3D avatar simulations
NASA Astrophysics Data System (ADS)
Peterson, James K.
2004-08-01
Simplified models of human cognition and emotional response are presented which are based on models of auditory/visual polymodal fusion. At the core of these models is a computational model of Area 37 of the temporal cortex which is based on new isocortex models presented recently by Grossberg. These models are trained using carefully chosen auditory (musical sequences), visual (paintings) and higher level abstract (meta level) data obtained from studies of how optimization strategies are chosen in response to outside managerial inputs. The software modules developed are then used as inputs to character generation codes in standard 3D virtual world simulations. The auditory and visual training data also enable the development of simple music and painting composition generators which significantly enhance one's ability to validate the cognitive model. The cognitive models are handled as interacting software agents implemented as CORBA objects to allow the use of multiple language coding choices (C++, Java, Python, etc.) and efficient use of legacy code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marelle, V.; Dubois, S.; Ripert, M.
2008-07-15
MAIA is a thermo-mechanical code dedicated to the modeling of MTR fuel plates. The main physical phenomena modeled in the code are the cladding oxidation, the interaction between fuel and Al-matrix, the swelling due to fission products, and the Al/fuel particles interaction. The creeping of the plate can be modeled in the mechanical calculation. MAIA has been validated on U-Mo dispersion fuel experiments such as IRIS 1 and 2 and FUTURE. The results are in rather good agreement with post-irradiation examinations. MAIA can also be used to calculate the in-pile behavior of U3Si2 plates, as in the SHARE experiment irradiated in the SCK/Mol BR2 reactor. The main outputs given by MAIA throughout the irradiation are temperatures, cladding oxidation thickness, interaction thickness, volume fraction of meat constituents, swelling, displacements, strains and stresses. MAIA is originally a two-dimensional code but a three-dimensional version is currently under development. (author)
HBOI Underwater Imaging and Communication Research - Phase 1
2012-04-19
Validation of one-way pulse stretching radiative transfer code. The objective was to develop and validate time-resolved radiative transfer models that ... The models were subjected to a series of validation experiments over 12.5 meter ... about the theoretical basis of the model together with validation results can be found in Dalgleish et al. (2010). Forward scattering Mueller ...
Myers, Beth M; Wells, Nancy M
2015-04-01
Gardens are a promising intervention to promote physical activity (PA) and foster health. However, because of the unique characteristics of gardening, no extant tool can capture PA, postures, and motions that take place in a garden. The Physical Activity Research and Assessment tool for Garden Observation (PARAGON) was developed to assess children's PA levels, tasks, postures, and motions, associations, and interactions while gardening. PARAGON uses momentary time sampling in which a trained observer watches a focal child for 15 seconds and then records behavior for 15 seconds. Sixty-five children (38 girls, 27 boys) at 4 elementary schools in New York State were observed over 8 days. During the observation, children simultaneously wore Actigraph GT3X+ accelerometers. The overall interrater reliability was 88% agreement, and Ebel was .97. Percent agreement values for activity level (93%), garden tasks (93%), motions (80%), associations (95%), and interactions (91%) also met acceptable criteria. Validity was established by previously validated PA codes and by expected convergent validity with accelerometry. PARAGON is a valid and reliable observation tool for assessing children's PA in the context of gardening.
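A minimal sketch of PARAGON's momentary time-sampling cycle (watch 15 s, then record 15 s) and of interval-by-interval percent agreement follows; the behavior codes are illustrative, not the instrument's actual categories:

```python
def momentary_time_sampling_schedule(total_minutes, observe_s=15, record_s=15):
    """Start times (s) of each observation window in a momentary
    time-sampling protocol: observe for 15 s, then record for 15 s."""
    cycle = observe_s + record_s
    return list(range(0, total_minutes * 60, cycle))

def percent_agreement(coder_a, coder_b):
    """Interval-by-interval percent agreement between two observers."""
    hits = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * hits / len(coder_a)

print(momentary_time_sampling_schedule(2))   # [0, 30, 60, 90]
a = ["dig", "walk", "carry", "stand", "dig", "water"]  # illustrative codes
b = ["dig", "walk", "carry", "squat", "dig", "water"]
print(round(percent_agreement(a, b), 1))     # 83.3
```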
Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN
NASA Technical Reports Server (NTRS)
Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.
1996-01-01
A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, description of the code, user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.
HYDRATE v1.5 OPTION OF TOUGH+ v1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, George
HYDRATE v1.5 is a numerical code for the simulation of the behavior of hydrate-bearing geologic systems, and represents the third update of the code since its first release [Moridis et al., 2008]. It is an option of TOUGH+ v1.5 [Moridis and Pruess, 2014], a successor to the TOUGH2 [Pruess et al., 1999, 2012] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. HYDRATE v1.5 needs the TOUGH+ v1.5 core code in order to compile and execute. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstation, PC, Macintosh) for which such compilers are available. By solving the coupled equations of mass and heat balance, the fully operational TOUGH+HYDRATE code can model the non-isothermal gas release, phase behavior and flow of fluids and heat under conditions typical of common natural CH4-hydrate deposits (i.e., in the permafrost and in deep ocean sediments) in complex geological media at any scale (from laboratory to reservoir) at which Darcy's law is valid. TOUGH+HYDRATE v1.5 includes both an equilibrium and a kinetic model of hydrate formation and dissociation. The model accounts for heat and up to four mass components, i.e., water, CH4, hydrate, and water-soluble inhibitors such as salts or alcohols. These are partitioned among four possible phases (gas phase, liquid phase, ice phase and hydrate phase). Hydrate dissociation or formation, phase changes and the corresponding thermal effects are fully described, as are the effects of inhibitors. The model can describe all possible hydrate dissociation mechanisms, i.e., depressurization, thermal stimulation, salting-out effects and inhibitor-induced effects.
Monte Carlo-based validation of neutronic methodology for EBR-II analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liaw, J.R.; Finck, P.J.
1993-01-01
The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC2-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.
Anderson, Jamie
2015-01-01
The extent to which novel land-efficient neighborhood design can promote key health behaviors is examined, concentrating on communal outdoor space provision (COSP). To test whether a neighborhood (Accordia) with a higher ratio of communal to private outdoor space is associated with higher levels of residents' (a) self-reported local health behaviors and (b) observed engagement in local health behaviors, compared to a matched neighborhood with a lower proportion of COSP. Health behaviors were examined via direct observation and postal survey. Bespoke observation codes and survey items represented key well-being behaviors including "connecting," "keeping active," "taking notice," "keep learning," and "giving." The questionnaire was validated using psychometric analyses and observed behaviors were mapped in real time. General pursuit of health behaviors was very similar in both areas but Accordia residents reported substantially greater levels of local activity. Validated testing of the survey dataset (n = 256) showed support for a stronger Attitude to Neighborhood Life (connecting and giving locally) in Accordia and partial support of greater physical activity. Analyses of the behavior observation dataset (n = 7,298) support the self-reported findings. Mapped observations revealed a proliferation of activity within Accordia's innovative outdoor hard spaces. Representation is limited to upper-middle class UK groups. However, Accordia was found to promote health behaviors compared to a traditional neighborhood that demands considerably more land area. The positive role of home zone streets, hard-standing and semi-civic space highlights the principle of quality as well as quantity. The findings should be considered as part of three forthcoming locally led UK garden cities, to be built before 2020.
PCC Framework for Program-Generators
NASA Technical Reports Server (NTRS)
Kong, Soonho; Choi, Wontae; Yi, Kwangkeun
2009-01-01
In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached with the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
Horowitz, Laura; Westlund, Karolina; Ljungberg, Tomas
2007-10-01
This study examined conflict behavior in naturalistic preschool settings to better understand the role of non-affiliative behavior and language in conflict management. Free-play at preschool was filmed among 20 boys with typically developing language (TL) and among 11 boys with Language Impairment (LI); the boys were 4-7 years old. Conflict behavior was coded and analyzed with a validated system. Post-conflict non-affiliative behavior (aggression and withdrawal) displays, and the links between the displays and reconciliation (i.e., former opponents exchange friendly behavior shortly after conflict termination), were examined. Group comparisons revealed boys with LI displayed aggression in a smaller share of conflicts, but exhibited 'active' withdrawal (left the room) in a larger conflict share. Boys with TL overcame aggression (the more common TL behavior) and afterwards reconciled to a greater extent than the boys with LI did after active withdrawal (the more common LI behavior). Also, after reciprocal or only verbal aggression, boys with LI reconciled to a lesser extent than boys with TL. The boys with LI demonstrated difficulties confronting conflict management, as well as concluding emotionally heightened and aggressive behavioral turns.
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
NASA Astrophysics Data System (ADS)
Atmani, O.; Abbès, B.; Abbès, F.; Li, Y. M.; Batkam, S.
2018-05-01
Thermoforming of high impact polystyrene (HIPS) sheets requires technical knowledge of material behavior, mold type, mold material, and process variables. Accurate thermoforming simulations are needed in the optimization process. Determining the behavior of the material under thermoforming conditions is one of the key parameters for an accurate simulation. The aim of this work is to identify the thermomechanical behavior of HIPS under thermoforming conditions. HIPS behavior is highly dependent on temperature and strain rate. In order to reproduce the behavior of such a material, a thermo-elasto-viscoplastic constitutive law was implemented in the finite element code ABAQUS. The proposed model parameters are considered to be temperature-dependent. The strain-rate dependence effect is introduced using a Prony series. Tensile tests were carried out at different temperatures and strain rates. The material parameters were then identified using an NSGA-II algorithm. To validate the rheological model, experimental blowing tests were carried out on a thermoforming pilot machine. To compare the numerical results with the experimental ones, the thickness distribution and the bubble shape were investigated.
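The Prony-series component mentioned above has the standard relaxation form E(t) = E_inf + sum_i E_i exp(-t/tau_i); the sketch below evaluates it with illustrative coefficients, not the parameters identified in the paper:

```python
import math

def prony_relaxation_modulus(t, e_inf, terms):
    """Relaxation modulus E(t) = E_inf + sum_i E_i * exp(-t / tau_i),
    the Prony-series form commonly used for time/rate-dependent stiffness."""
    return e_inf + sum(e_i * math.exp(-t / tau_i) for e_i, tau_i in terms)

# Illustrative HIPS-like coefficients (MPa, s) -- not identified values:
terms = [(400.0, 0.1), (250.0, 1.0), (150.0, 10.0)]
for t in (0.01, 0.1, 1.0, 10.0):
    print(t, round(prony_relaxation_modulus(t, e_inf=200.0, terms=terms), 1))
```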
Prediction of plant lncRNA by ensemble machine learning classifiers.
Simopoulos, Caitlin M A; Weretilnyk, Elizabeth A; Golding, G Brian
2018-05-02
In plants, long non-protein coding RNAs are believed to have essential roles in development and stress responses. However, relative to advances in discerning biological roles for long non-protein coding RNAs in animal systems, this RNA class in plants is largely understudied. With comparatively few validated plant long non-coding RNAs, research on this potentially critical class of RNA is hindered by a lack of appropriate prediction tools and databases. Supervised learning models trained on data sets of mostly non-validated, non-coding transcripts have been previously used to identify this enigmatic RNA class, with applications largely focused on animal systems. Our approach uses a training set comprised only of empirically validated long non-protein coding RNAs from plant, animal, and viral sources to predict and rank candidate long non-protein coding gene products for future functional validation. Individual stochastic gradient boosting and random forest classifiers trained on only empirically validated long non-protein coding RNAs were constructed. In order to use the strengths of multiple classifiers, we combined multiple models into a single stacking meta-learner. This ensemble approach benefits from the diversity of several learners to effectively identify putative plant long non-coding RNAs from transcript sequence features. When the predicted genes identified by the ensemble classifier were compared to those listed in GreeNC, an established plant long non-coding RNA database, overlap for predicted genes from Arabidopsis thaliana, Oryza sativa and Eutrema salsugineum ranged from 51 to 83%, with the highest agreement in Eutrema salsugineum. Most of the highest ranking predictions from Arabidopsis thaliana were annotated as potential natural antisense genes, pseudogenes, transposable elements, or simply computationally predicted hypothetical proteins. Due to the nature of this tool, the model can be updated as new long non-protein coding transcripts are identified and functionally verified. This ensemble classifier is an accurate tool that can be used to rank long non-protein coding RNA predictions for use in conjunction with gene expression studies. Selection of plant transcripts with a high potential for regulatory roles as long non-protein coding RNAs will advance research in the elucidation of long non-protein coding RNA function.
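A stacking meta-learner of the kind described, combining stochastic gradient boosting and random forest base classifiers under a meta-model, can be sketched with scikit-learn. The feature matrix below is synthetic stand-in data, not the sequence features used in the paper:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for transcript sequence features (e.g. length, GC
# content, ORF coverage); the real study derives these from validated sets.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("gbm", GradientBoostingClassifier(subsample=0.8, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    final_estimator=LogisticRegression(),  # meta-learner over base predictions
)
stack.fit(X_tr, y_tr)
scores = stack.predict_proba(X_te)[:, 1]  # probabilities can rank candidates
```

Setting `subsample` below 1 is what makes the gradient boosting stochastic; the predicted probabilities provide the ranking of candidates that the abstract describes.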
Palm: Easing the Burden of Analytical Performance Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Hoisie, Adolfy
2014-06-01
Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are 'first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.
A Dynamic/Anisotropic Low Earth Orbit (LEO) Ionizing Radiation Model
NASA Technical Reports Server (NTRS)
Badavi, Francis F.; West, Katie J.; Nealy, John E.; Wilson, John W.; Abrahms, Briana L.; Luetke, Nathan J.
2006-01-01
The International Space Station (ISS) provides the proving ground for future long duration human activities in space. Ionizing radiation measurements on the ISS form the ideal tool for the experimental validation of ionizing radiation environmental models, nuclear transport code algorithms, and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) have provided vital information impacting both the environmental models and the nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations with Thermo Luminescent Detector (TLD) area monitors demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate 6 degree of freedom (DOF) description of ISS trajectory and orientation.
Elasto-Plastic Analysis of Tee Joints Using HOT-SMAC
NASA Technical Reports Server (NTRS)
Arnold, Steve M. (Technical Monitor); Bednarcyk, Brett A.; Yarrington, Phillip W.
2004-01-01
The Higher Order Theory - Structural/Micro Analysis Code (HOT-SMAC) software package is applied to analyze the linearly elastic and elasto-plastic response of adhesively bonded tee joints. Joints of this type are finding an increasing number of applications with the increased use of composite materials within advanced aerospace vehicles, and improved tools for the design and analysis of these joints are needed. The linearly elastic results of the code are validated against finite element analysis results from the literature under different loading and boundary conditions, and new results are generated to investigate the inelastic behavior of the tee joint. The comparison with the finite element results indicates that HOT-SMAC is an efficient and accurate alternative to the finite element method and has a great deal of potential as an analysis tool for a wide range of bonded joints.
An overview of aeroelasticity studies for the National Aero-Space Plane
NASA Technical Reports Server (NTRS)
Ricketts, Rodney H.; Noll, Thomas E.; Whitlow, Woodrow, Jr.; Huttsell, Lawrence J.
1993-01-01
The National Aero-Space Plane (NASP), or X-30, is a single-stage-to-orbit vehicle designed to take off and land on conventional runways. Research in aeroelasticity was conducted by NASA and the Wright Laboratory to support the design of a flight vehicle by the national contractor team. This research includes the development of new computational codes for predicting unsteady aerodynamic pressures. In addition, studies were conducted to determine the aerodynamic heating effects on vehicle aeroelasticity and to determine the effects of fuselage flexibility on the stability of the control systems. It also includes the testing of scale models to better understand the aeroelastic behavior of the X-30 and to obtain data for code validation and correlation. This paper presents an overview of the aeroelastic research that has been conducted to support the airframe design.
Reliability, Validity, and Usability of Data Extraction Programs for Single-Case Research Designs.
Moeyaert, Mariola; Maggin, Daniel; Verkuilen, Jay
2016-11-01
Single-case experimental designs (SCEDs) have been increasingly used in recent years to inform the development and validation of effective interventions in the behavioral sciences. An important aspect of this work has been the extension of meta-analytic and other statistical innovations to SCED data. Standard practice within SCED methods is to display data graphically, which requires subsequent users to extract the data, either manually or using data extraction programs. Previous research has examined the reliability and validity of data extraction programs, but typically at an aggregate level. Little is known, however, about the coding of individual data points. We focused on four different software programs that can be used for this purpose (i.e., Ungraph, DataThief, WebPlotDigitizer, and XYit), and examined the reliability of numeric coding, the validity compared with real data, and overall program usability. This study indicates that the reliability and validity of the retrieved data are independent of the specific software program, but are dependent on the individual single-case study graphs. Differences were found in program usability in terms of user friendliness, data retrieval time, and license costs. Ungraph and WebPlotDigitizer received the highest usability scores. DataThief was perceived as unacceptable, and the time needed to retrieve the data was double that of the other three programs. WebPlotDigitizer was the only program free to use. As a consequence, WebPlotDigitizer turned out to be the best option in terms of usability, time to retrieve the data, and costs, although the usability scores of Ungraph were also strong. © The Author(s) 2016.
The effects of expressivity and flight task on cockpit communication and resource management
NASA Technical Reports Server (NTRS)
Jensen, R. S.
1986-01-01
The results of an investigation to develop a methodology for evaluating crew communication behavior on the flight deck and a flight simulator experiment to test the effects of crew member expressivity, as measured by the Personal Attributes Questionnaire, and flight task on crew communication and flight performance are discussed. A methodology for coding and assessing flight crew communication behavior, as well as a model for predicting that behavior, is advanced. Although not enough crews were found to provide valid statistical tests, the results of the study tend to indicate that crews in which the captain has high expressivity perform better than those whose captain is low in expressivity. There appears to be a strong interaction between captains and first officers along the level-of-command dimension of communication. The PAQ appears to identify those pilots who offer disagreements and initiate new subjects for discussion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sez; Unal, Cetin; Hemez, Francois
The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a reactor core cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this framework, the project team has focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we have developed a code prioritization index (CPI) for coupled numerical models. The CPI is implemented to effectively improve the predictive capability of the coupled model by increasing the sophistication of constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of 'information gain' across the design domain has been investigated in order to identify the experiment settings where maximum information gain occurs and thus guide experimenters in the selection of experiment settings. This idea was extended to show that the information gain from each experiment can be improved by intelligently selecting the experiments, leading to the development of the Batch Sequential Design (BSD) technique. Additionally, we evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage. This metric has also been incorporated into the design of new experiments. Finally, we have proposed a data-aware calibration approach for the calibration of numerical models.
This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component in the project team's work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters. This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We have introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has supported the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field and the other is a post-doctoral fellow at the Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz) who are working towards graduation have been supported by this project.
Lee, Jin Hee; Hong, Ki Jeong; Kim, Do Kyun; Kwak, Young Ho; Jang, Hye Young; Kim, Hahn Bom; Noh, Hyun; Park, Jungho; Song, Bongkyu; Jung, Jae Yun
2013-12-01
A clinically sensible diagnosis grouping system (DGS) is needed for describing pediatric emergency diagnoses for research, medical resource preparedness, and making national policy for pediatric emergency medical care. The Pediatric Emergency Care Applied Research Network (PECARN) developed such a DGS successfully. We developed a modified PECARN DGS based on the different pediatric population of South Korea and validated the system to obtain accurate and comparable epidemiologic data on the pediatric emergent conditions of the selected population. The data source used to develop and validate the modified PECARN DGS was the National Emergency Department Information System of South Korea, which is coded using the International Classification of Diseases, 10th Revision (ICD-10) system. To develop the modified DGS based on ICD-10 codes, we matched the selected ICD-10 codes with those of the PECARN DGS by the General Equivalence Mappings (GEMs). After converting ICD-10 codes to ICD-9 codes by GEMs, we matched the ICD-9 codes to PECARN DGS categories using the matrix developed by the PECARN group. Lastly, we conducted an expert panel survey using the Delphi method for the remaining diagnosis codes that were not matched. A total of 1879 ICD-10 codes were used in the development of the modified DGS. After 1078 (57.4%) of the 1879 ICD-10 codes were assigned to the modified DGS by the GEM and PECARN conversion tools, investigators assigned each of the remaining 801 codes (42.6%) to DGS subgroups through 2 rounds of electronic Delphi surveys, and the remaining 29 codes (4%) were assigned to the modified DGS at a second expert consensus meeting. The modified DGS accounts for 98.7% and 95.2% of the diagnoses in the 2008 and 2009 National Emergency Department Information System data sets, respectively. The modified DGS also exhibited strong construct validity with respect to age, sex, site of care, and season, and reflected the 2009 outbreak of H1N1 influenza in Korea. We developed and validated a clinically feasible and sensible DGS for describing pediatric emergent conditions in Korea. The modified PECARN DGS showed good comprehensiveness and demonstrated reliable construct validity. This modified DGS, based on the PECARN DGS framework, may be effectively implemented for research, reporting, and resource planning in the pediatric emergency system of South Korea.
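The mapping pipeline described (ICD-10 to ICD-9 via GEMs, then ICD-9 to a DGS category, with unmapped codes routed to Delphi review) reduces to a pair of table lookups. A toy sketch with placeholder mappings, not the actual GEM or PECARN tables:

```python
# Toy GEM-style crosswalk: ICD-10 -> ICD-9 -> DGS category. All mappings and
# category names are placeholders, not the actual GEM or PECARN tables.
gem_icd10_to_icd9 = {"J45.909": "493.90", "S52.501A": "813.81"}
icd9_to_dgs = {"493.90": "asthma", "813.81": "fracture, upper extremity"}

def map_to_dgs(icd10_code):
    """Return the DGS category, or None for codes left to Delphi review."""
    icd9 = gem_icd10_to_icd9.get(icd10_code)
    return icd9_to_dgs.get(icd9)

unmapped = [c for c in ["J45.909", "A00.0"] if map_to_dgs(c) is None]
print(unmapped)  # codes that would be routed to the expert panel
```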
History of one family of atmospheric radiative transfer codes
NASA Astrophysics Data System (ADS)
Anderson, Gail P.; Wang, Jinxue; Hoke, Michael L.; Kneizys, F. X.; Chetwynd, James H., Jr.; Rothman, Laurence S.; Kimball, L. M.; McClatchey, Robert A.; Shettle, Eric P.; Clough, Shepard; Gallery, William O.; Abreu, Leonard W.; Selby, John E. A.
1994-12-01
Beginning in the early 1970's, the then Air Force Cambridge Research Laboratory initiated a program to develop computer-based atmospheric radiative transfer algorithms. The first attempts were translations of graphical procedures described in a 1970 report on The Optical Properties of the Atmosphere, based on empirical transmission functions and effective absorption coefficients derived primarily from controlled laboratory transmittance measurements. The fact that spectrally-averaged atmospheric transmittance T does not obey the Beer-Lambert law (T = exp(-σ·η), where σ is a species absorption cross section, independent of η, the species column amount along the path) at any but the finest spectral resolution was already well known. Band models to describe this gross behavior were developed in the 1950's and 60's. Thus began LOWTRAN, the Low Resolution Transmittance Code, first released in 1972. This limited initial effort has now progressed to a set of codes and related algorithms (including line-of-sight spectral geometry, direct and scattered radiance and irradiance, non-local thermodynamic equilibrium, etc.) that contain thousands of lines of code and hundreds of subroutines, with improved accuracy, efficiency, and, ultimately, accessibility. This review will include LOWTRAN, HITRAN (atlas of high-resolution molecular spectroscopic data), FASCODE (Fast Atmospheric Signature Code), and MODTRAN (Moderate Resolution Transmittance Code), their permutations, validations, and applications, particularly as related to passive remote sensing and energy deposition.
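The failure of Beer-Lambert behavior under spectral averaging can be shown numerically: averaging the transmittances of a strong and a weak line within one band yields a curve that no single effective cross section reproduces. A sketch with illustrative values:

```python
import numpy as np

# Two spectral lines inside one band: a strong and a weak absorber
# (illustrative cross sections, arbitrary units).
sigma = np.array([10.0, 0.1])
eta = np.linspace(0.0, 1.0, 100)  # species column amount along the path

# Band-averaged transmittance: the mean of the two line transmittances.
T_band = 0.5 * (np.exp(-sigma[0] * eta) + np.exp(-sigma[1] * eta))

# Beer-Lambert with a single "effective" cross section cannot reproduce it:
# the strong line saturates first, so the band average decays non-exponentially.
sigma_eff = sigma.mean()
T_beer = np.exp(-sigma_eff * eta)
print(np.max(np.abs(T_band - T_beer)))  # discrepancy grows with column amount
```

This saturation effect is precisely what the band models of the 1950's and 60's, and later LOWTRAN, were built to capture.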
NASA Technical Reports Server (NTRS)
Winter, Michael
2012-01-01
The characterization of ablation and recession of heat shield materials during arc jet testing is an important step towards understanding the governing processes during these tests and therefore for a successful extrapolation of ground test data to flight. The behavior of ablative heat shield materials in a ground-based arc jet facility is usually monitored through measurement of temperature distributions (across the surface and in-depth) and through measurement of the final surface recession. These measurements are then used to calibrate/validate materials thermal response codes, which have mathematical models with reasonably good fidelity to the physics and chemistry of ablation, and codes thus calibrated are used for predicting material behavior in flight environments. However, these thermal measurements only indirectly characterize the pyrolysis processes within an ablative material, and pyrolysis is the main effect during ablation. Quantification of pyrolysis chemistry would therefore provide more definitive and useful data for validation of the material response codes. Information on the chemical products of ablation, to various levels of detail, can be obtained using optical methods. Suitable optical methods to measure the shape and composition of these layers (with emphasis on the blowing layer) during arc jet testing are: 1) optical emission spectroscopy (OES), 2) filtered imaging, 3) laser-induced fluorescence (LIF), and 4) absorption spectroscopy. Several attempts have been made to optically measure the material response of ablative materials during arc-jet testing. Most recently, NH and OH have been identified in the boundary layer of a PICA ablator. These species are suitable candidates for detection through planar laser-induced fluorescence (PLIF), which would enable a spatially-resolved characterization of the blowing layer in terms of both its shape and composition. The recent emission spectroscopy data will be presented, and future experiments for a qualitative and quantitative characterization of the material response of ablative materials during arc-jet testing will be discussed.
Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns
NASA Technical Reports Server (NTRS)
Bogdanoff, David W.
2017-01-01
Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00" and 1.50" guns are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied; the maximum difference was 0.5 km/s for all but 4 of the 50 cases, which differed by 0.5-0.7 km/s.
NASA Astrophysics Data System (ADS)
Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.
2014-06-01
The present paper aims at providing experimental validation for the use of the MORET 5 code for advanced reactor concepts involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water and also to test the recent implementation of probability tables in the MORET 5 code.
Criticality Calculations with MCNP6 - Practical Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2016-11-29
These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The lecture topics are: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I-III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile material vault, and criticality accident alarm systems. After completion of this course, you should be able to: develop an input model for MCNP; describe how cross section data impact Monte Carlo and deterministic codes; describe the importance of validation of computer codes and how it is accomplished; describe the methodology supporting Monte Carlo codes and deterministic codes; describe pitfalls of Monte Carlo calculations; and discuss the strengths and weaknesses of Monte Carlo and discrete ordinates codes. The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present; in the context of these limitations, identify a fissile system for which a diffusion theory solution would be adequate.
Social Workers and the NASW "Code of Ethics": Belief, Behavior, Disjuncture
ERIC Educational Resources Information Center
DiFranks, Nikki Nelson
2008-01-01
A quantitative descriptive survey of a national sample of social workers (N = 206) examined discrepancies between belief in the NASW "Code of Ethics" and behavior in implementing the code and social workers' disjunctive distress (disjuncture) when belief and behavior are discordant. Relationships between setting and disjuncture and ethics…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthikeyan, R.; Tellier, R. L.; Hebert, A.
2006-07-01
The Coolant Void Reactivity (CVR) is an important safety parameter that needs to be estimated at the design stage of a nuclear reactor. It helps to have a priori knowledge of the behavior of the system during a transient initiated by the loss of coolant. In the present paper, we have attempted to estimate the CVR for a CANDU New Generation (CANDU-NG) lattice, as proposed at an early stage of the Advanced CANDU Reactor (ACR) development. We have attempted to estimate the CVR with a development version of the code DRAGON, using the method of characteristics. DRAGON has several advanced self-shielding models incorporated in it, each of them compatible with the method of characteristics. This study will bring into focus the performance of these self-shielding models, especially when there is voiding of such a tight lattice. We have also performed assembly calculations in a 2 x 2 pattern for the CANDU-NG fuel, with special emphasis on checkerboard voiding. The results obtained have been validated against the Monte Carlo codes MCNP5 and TRIPOLI-4.3. (authors)
NASA Astrophysics Data System (ADS)
Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.
2016-07-01
This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.
Autism-like behavioral phenotypes in BTBR T+tf/J mice.
McFarlane, H G; Kusek, G K; Yang, M; Phoenix, J L; Bolivar, V J; Crawley, J N
2008-03-01
Autism is a behaviorally defined neurodevelopmental disorder of unknown etiology. Mouse models with face validity to the core symptoms offer an experimental approach to test hypotheses about the causes of autism and translational tools to evaluate potential treatments. We discovered that the inbred mouse strain BTBR T+tf/J (BTBR) incorporates multiple behavioral phenotypes relevant to all three diagnostic symptoms of autism. BTBR displayed selectively reduced social approach, low reciprocal social interactions and impaired juvenile play, as compared with C57BL/6J (B6) controls. Impaired social transmission of food preference in BTBR suggests communication deficits. Repetitive behaviors appeared as high levels of self-grooming by juvenile and adult BTBR mice. Comprehensive analyses of procedural abilities confirmed that social recognition and olfactory abilities were normal in BTBR, with no evidence for high anxiety-like traits or motor impairments, supporting an interpretation of highly specific social deficits. Database comparisons between BTBR and B6 on 124 putative autism candidate genes showed several interesting single nucleotide polymorphisms (SNPs) in the BTBR genetic background, including a nonsynonymous coding region polymorphism in Kmo. The Kmo gene encodes kynurenine 3-hydroxylase, an enzyme-regulating metabolism of kynurenic acid, a glutamate antagonist with neuroprotective actions. Sequencing confirmed this coding SNP in Kmo, supporting further investigation into the contribution of this polymorphism to autism-like behavioral phenotypes. Robust and selective social deficits, repetitive self-grooming, genetic stability and commercial availability of the BTBR inbred strain encourage its use as a research tool to search for background genes relevant to the etiology of autism, and to explore therapeutics to treat the core symptoms.
Organizational Effectiveness Information System (OEIS) User’s Manual
1986-09-01
B. SUBJECT CODES (B-1). C. LISTING OF VALID RESOURCE SYSTEM CODES (C-1). ...the valid codes used in the Implementation and Design System: MACOM 01 COE 02 DARCOM 03 EUSA 04 FORSCOM 05 HSC 06 HQDA 07 INSCOM 08 MDW 09
Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor
NASA Technical Reports Server (NTRS)
Ajmani, Kumud; Liu, Nan-Suey
2005-01-01
The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, with comparisons against experimental data and solutions from the FPVortex code. NCC was then used to perform computations to study fuel-air mixing for the combustor of a candidate rocket-based combined cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.
Sukanya, Chongthawonsatid
2017-10-01
This study examined the validity of the principal diagnoses on discharge summaries and coding assessments. Data were collected from the National Health Security Office (NHSO) of Thailand in 2015. In total, 118,971 medical records were audited. The sample was drawn from government hospitals and private hospitals covered by the Universal Coverage Scheme in Thailand. Hospitals and cases were selected using NHSO criteria. The validity of the principal diagnoses listed in the "Summary and Coding Assessment" forms was established by comparing data from the discharge summaries with data obtained from medical record reviews, and additionally, by comparing data from the coding assessments with data in the computerized ICD (the database used for reimbursement purposes). The summary assessments had low sensitivities (7.3%-37.9%), high specificities (97.2%-99.8%), low positive predictive values (9.2%-60.7%), and high negative predictive values (95.9%-99.3%). The coding assessments had low sensitivities (31.1%-69.4%), high specificities (99.0%-99.9%), moderate positive predictive values (43.8%-89.0%), and high negative predictive values (97.3%-99.5%). The discharge summaries and codings often contained mistakes, particularly in the categories "Endocrine, nutritional, and metabolic diseases", "Symptoms, signs, and abnormal clinical and laboratory findings not elsewhere classified", "Factors influencing health status and contact with health services", and "Injury, poisoning, and certain other consequences of external causes". The validity of the principal diagnoses on the summary and coding assessment forms was found to be low. The training of physicians and coders must be strengthened to improve the validity of discharge summaries and coding.
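The four validity statistics reported here follow directly from a 2x2 agreement table. A minimal sketch with invented counts chosen to mimic the low-sensitivity / high-specificity pattern in the abstract:

```python
def validity_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 agreement table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Invented counts chosen to mimic the low-sensitivity / high-specificity
# pattern reported above (discharge summary vs. medical record review).
print(validity_metrics(tp=30, fp=20, fn=370, tn=9580))
```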
Extension, validation and application of the NASCAP code
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.
1979-01-01
Numerous extensions were made to the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
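For context, the ED50 has closed forms for common discounting models — 1/k for Mazur's hyperbolic model V = A/(1 + kD), and ln(2)/k for the exponential model V = A·exp(-kD) — though the software described selects among several candidate models before computing it. A sketch of these two closed forms:

```python
import numpy as np

def ed50_hyperbolic(k):
    """ED50 for Mazur's hyperbolic model V = A / (1 + k*D): value halves at D = 1/k."""
    return 1.0 / k

def ed50_exponential(k):
    """ED50 for the exponential model V = A * exp(-k*D): D = ln(2) / k."""
    return np.log(2.0) / k

# Example: a fitted discount rate of k = 0.05 per day.
print(ed50_hyperbolic(0.05), ed50_exponential(0.05))  # delays in days
```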
On the Nonlinear Behavior of a Glass-Ceramic Seal and its Application in Planar SOFC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Koeppel, Brian J.; Vetrano, John S.
2006-06-01
This paper studies the nonlinear behavior of a glass-ceramic seal used in planar solid oxide fuel cells (SOFCs). To this end, a viscoelastic damage model has been developed that can capture the nonlinear material response due to both progressive damage in the glass-ceramic material and viscous flow of the residual glass in this material. The model has been implemented in the MSC MARC finite element code, and its validation has been carried out using the experimental relaxation test data obtained for this material at 700°C, 750°C, and 800°C. Finally, it has been applied to the simulation of a SOFC stack under thermal cycling conditions. The areas of potential damage have been predicted.
Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N
2016-04-01
The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72, P < .01). Child Facial Coding System scores were also significantly higher during the passive joint stretch than the baseline and recovery segments (P < .001). Facial activity was not significantly correlated with the developmental measures. These findings suggest that the Child Facial Coding System is a valid method of identifying pain in children with cerebral palsy. © The Author(s) 2015.
Challenges in using medicaid claims to ascertain child maltreatment.
Raghavan, Ramesh; Brown, Derek S; Allaire, Benjamin T; Garfield, Lauren D; Ross, Raven E; Hedeker, Donald
2015-05-01
Medicaid data contain International Classification of Diseases, Clinical Modification (ICD-9-CM) codes indicating maltreatment, yet there is little information on how valid these codes are for the purposes of identifying maltreatment from health, as opposed to child welfare, data. This study assessed the validity of Medicaid codes in identifying maltreatment. Participants (n = 2,136) in the first National Survey of Child and Adolescent Well-Being were linked to their Medicaid claims obtained from 36 states. Caseworker determinations of maltreatment were compared with eight sets of ICD-9-CM codes. Of the 1,921 children identified by caseworkers as being maltreated, 15.2% had any relevant ICD-9-CM code in any of their Medicaid files across 4 years of observation. Maltreated boys and those of African American race had lower odds of displaying a maltreatment code. Using only Medicaid claims to identify maltreated children creates validity problems. Medicaid data linkage with other types of administrative data is required to better identify maltreated children. © The Author(s) 2014.
A Comprehensive Validation Approach Using The RAVEN Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J
2015-06-01
The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor the comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these gaps. In the following sections, the employed methodology, and its application to the newly developed thermal-hydraulic code RELAP-7, is reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.; Lee, Chi-Miag (Technical Monitor)
2001-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this paper, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery for space launch vehicle propulsion systems.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.
2002-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this presentation, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery.
Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.
2005-01-01
In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures; development of a new Green's function code is also discussed. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation, using marching procedures and Green's function techniques, are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's Initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.
Adams, Derk; Schreuder, Astrid B; Salottolo, Kristin; Settell, April; Goss, J Richard
2011-07-01
There are significant changes in the abbreviated injury scale (AIS) 2005 system, which make it impractical to compare patients coded in AIS version 98 with patients coded in AIS version 2005. Harborview Medical Center created a computer algorithm "Harborview AIS Mapping Program (HAMP)" to automatically convert AIS 2005 to AIS 98 injury codes. The mapping was validated using 6 months of double-coded patient injury records from a Level I Trauma Center. HAMP was used to determine how closely individual AIS and injury severity scores (ISS) were converted from AIS 2005 to AIS 98 versions. The kappa statistic was used to measure the agreement between manually determined codes and HAMP-derived codes. Seven hundred forty-nine patient records were used for validation. For the conversion of AIS codes, the measure of agreement between HAMP and manually determined codes was κ = 0.84 (95% confidence interval, 0.82-0.86). The algorithm errors were smaller in magnitude than the manually determined coding errors. For the conversion of ISS, the agreement between HAMP versus manually determined ISS was κ = 0.81 (95% confidence interval, 0.78-0.84). The HAMP algorithm successfully converted injuries coded in AIS 2005 to AIS 98. This algorithm will be useful when comparing trauma patient clinical data across populations coded in different versions, especially for longitudinal studies.
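For readers unfamiliar with ISS: it is the sum of squares of the three highest region-maximum AIS severities (set to 75 if any injury is AIS 6), which is why version mismatches in AIS codes propagate into ISS. A sketch of the standard ISS computation (not the HAMP algorithm itself):

```python
from collections import defaultdict

def iss(injuries):
    """Injury Severity Score from (body_region, AIS_severity) pairs.

    ISS is the sum of squares of the three highest region-maximum AIS
    severities; any AIS of 6 sets ISS to 75 by convention.
    """
    worst = defaultdict(int)
    for region, ais in injuries:
        worst[region] = max(worst[region], ais)
    if 6 in worst.values():
        return 75
    top3 = sorted(worst.values(), reverse=True)[:3]
    return sum(a * a for a in top3)

print(iss([("head", 4), ("chest", 3), ("chest", 5), ("abdomen", 2)]))  # 45
```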
Song, Lunar; Park, Byeonghwa; Oh, Kyeung Mi
2015-04-01
Serious medication errors continue to exist in hospitals, even though there is technology that could potentially eliminate them, such as bar code medication administration. Little is known about the degree to which the culture of patient safety is associated with behavioral intention to use bar code medication administration. Based on the Technology Acceptance Model, this study evaluated the relationships among patient safety culture, perceived usefulness, perceived ease of use, and behavioral intention to use bar code medication administration technology among nurses in hospitals. Cross-sectional surveys with a convenience sample of 163 nurses using bar code medication administration were conducted. Feedback and communication about errors had a positive impact in predicting perceived usefulness (β=.26, P<.01) and perceived ease of use (β=.22, P<.05). In a multiple regression model predicting behavioral intention, age had a negative impact (β=-.17, P<.05); however, teamwork within hospital units (β=.20, P<.05) and perceived usefulness (β=.35, P<.01) both had a positive impact on behavioral intention. The overall bar code medication administration behavioral intention model explained 24% (P<.001) of the variance. Identified factors influencing bar code medication administration behavioral intention can help inform hospitals to develop tailored interventions for RNs to reduce medication administration errors and increase patient safety by using this technology.
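A regression of the form reported, behavioral intention on perceived usefulness, teamwork, and age, could be set up as below; the data frame and column names are placeholders, not the study's instrument or data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey responses; the column names are placeholders for the
# study's scales, not its actual instrument or data.
df = pd.DataFrame({
    "intention":  [4, 5, 3, 4, 2, 5, 4, 3],
    "usefulness": [4, 5, 3, 4, 2, 5, 5, 3],
    "teamwork":   [3, 4, 4, 4, 2, 5, 4, 3],
    "age":        [28, 45, 52, 33, 60, 25, 41, 38],
})
model = smf.ols("intention ~ usefulness + teamwork + age", data=df).fit()
print(model.params)  # standardized betas would require z-scored inputs
```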
Jones, Natalie; Schneider, Gary; Kachroo, Sumesh; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's (FDA) Mini-Sentinel pilot program initially aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest (HOIs) from administrative and claims data. This paper summarizes the process and findings of the algorithm review of acute respiratory failure (ARF). PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the ARF HOI. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify ARF, including validation estimates of the coding algorithms. Our search revealed a deficiency of literature focusing on ARF algorithms and validation estimates. Only two studies provided codes for ARF, each using related yet different ICD-9 codes (i.e., ICD-9 codes 518.8, "other diseases of lung," and 518.81, "acute respiratory failure"). Neither study provided validation estimates. Research needs to be conducted on designing validation studies to test ARF algorithms and on estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
MATCHED-INDEX-OF-REFRACTION FLOW FACILITY FOR FUNDAMENTAL AND APPLIED RESEARCH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piyush Sabharwall; Carl Stoots; Donald M. McEligot
2014-11-01
Significant challenges face reactor designers with regard to thermal hydraulic design and associated modeling for advanced reactor concepts. Computational thermal hydraulic codes solve only a piece of the core; there is a need for a whole core dynamics system code with local resolution to investigate and understand flow behavior with all the relevant physics and thermo-mechanics. The matched index of refraction (MIR) flow facility at Idaho National Laboratory (INL) has a unique capability to contribute to the development of validated computational fluid dynamics (CFD) codes through the use of state-of-the-art optical measurement techniques, such as Laser Doppler Velocimetry (LDV) and Particle Image Velocimetry (PIV). PIV is a non-intrusive velocity measurement technique that tracks flow by imaging the movement of small tracer particles within a fluid. At the heart of a PIV calculation is the cross correlation algorithm, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. Generally, the displacement is indicated by the location of the largest peak. To quantify these measurements accurately, sophisticated processing algorithms correlate the locations of particles within the image to estimate the velocity (Ref. 1). Prior to use with reactor design, the CFD codes have to be experimentally validated, which requires rigorous experimental measurements to produce high quality, multi-dimensional flow field data with error quantification methodologies. Computational techniques with supporting test data may be needed to address the heat transfer from the fuel to the coolant during the transition from turbulent to laminar flow, including the possibility of an early laminarization of the flow (Refs. 2 and 3) (laminarization occurs when the coolant velocity is theoretically in the turbulent regime, but the heat transfer properties are indicative of the coolant velocity being in the laminar regime). Such studies are complicated enough that CFD models may not converge to the same conclusion. Thus, experimentally scaled thermal hydraulic data with uncertainties should be developed to support modeling and simulation for verification and validation activities. The fluid/solid index of refraction matching technique allows optical access in and around geometries that would otherwise be impossible, while the large test section of the INL system provides better spatial and temporal resolution than comparable facilities. Benchmark data for assessing computational fluid dynamics can be acquired for external flows, internal flows, and coupled internal/external flows for better understanding of physical phenomena of interest. The core objective of this study is to describe the MIR facility and its capabilities, and to mention current development areas for uncertainty quantification, mainly the uncertainty surface method and the cross-correlation method. Using these methods, it is anticipated that a suitable approach to quantify PIV uncertainty will be established for experiments performed in the MIR.
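The cross-correlation step described above, estimating displacement from the location of the correlation peak between two interrogation windows, can be sketched in a few lines; the function name and window handling are illustrative, not the MIR facility's actual processing chain:

```python
import numpy as np
from scipy.signal import fftconvolve

def piv_displacement(win_a, win_b):
    """Estimate particle displacement between two interrogation windows.

    Cross-correlates the zero-mean windows (FFT convolution with one input
    flipped) and returns the (row, col) offset of the correlation peak.
    """
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="full")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return peak[0] - (a.shape[0] - 1), peak[1] - (a.shape[1] - 1)

# Synthetic check: shift a random "particle image" by (3, -2) pixels.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
shifted = np.roll(img, shift=(3, -2), axis=(0, 1))
print(piv_displacement(img, shifted))  # expect (3, -2)
```

Production PIV codes refine this integer-pixel estimate with sub-pixel peak fitting, which is where the uncertainty quantification work mentioned above comes in.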
Some Colleges Extend Their Codes of Conduct to Off-Campus Behavior.
ERIC Educational Resources Information Center
Gose, Ben
1998-01-01
Protesters at the University of California at Berkeley are calling for the expulsion of a sophomore who failed to report a murder. However, the student's behavior does not violate the university's code of conduct, which is restricted to campus behavior. Some institutions are expanding the geographic reach of their codes to discipline students for…
Coding Manual for Continuous Observation of Interactions by Single Subjects in an Academic Setting.
ERIC Educational Resources Information Center
Cobb, Joseph A.; Hops, Hyman
The manual, designed particularly for work with acting-out or behavior problem students, describes coding procedures used in the observation of continuous classroom interactions between the student and his peers and teacher. Peer and/or teacher behaviors antecedent and consequent to the subject's behavior are identified in the coding process,…
Benchmark studies of thermal jet mixing in SFRs using a two-jet model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omotowa, O. A.; Skifton, R.; Tokuhiro, A.
To guide the modeling, simulations, and design of Sodium Fast Reactors (SFRs), we explore and compare the predictive capabilities of two numerical solvers, COMSOL and OpenFOAM, in the thermal jet mixing of two buoyant jets typical of the outlet flow from an SFR tube bundle. This process will help optimize on-going experimental efforts at obtaining high-resolution data for V&V of CFD codes as anticipated in next-generation nuclear systems. Using the k-ε turbulence models of both codes as reference, their ability to simulate the turbulence behavior in similar environments was first validated against single-jet experimental data reported in the literature. This study investigates the thermal mixing of two parallel jets having a hot-to-cold temperature difference ΔT_hc = 5°C and 10°C and velocity ratios U_c/U_h = 0.5 and 1. Results of the computed turbulent quantities due to convective mixing and the variations in the flow field along the axial position are presented. In addition, this study evaluates the effect of the spacing ratio between jets in predicting the flow field and jet behavior in the near and far fields. (authors)
Noh, Hyun Ji; Tang, Ruqi; Flannick, Jason; O'Dushlaine, Colm; Swofford, Ross; Howrigan, Daniel; Genereux, Diane P; Johnson, Jeremy; van Grootheest, Gerard; Grünblatt, Edna; Andersson, Erik; Djurfeldt, Diana R; Patel, Paresh D; Koltookian, Michele; M Hultman, Christina; Pato, Michele T; Pato, Carlos N; Rasmussen, Steven A; Jenike, Michael A; Hanna, Gregory L; Stewart, S Evelyn; Knowles, James A; Ruhrmann, Stephan; Grabe, Hans-Jörgen; Wagner, Michael; Rück, Christian; Mathews, Carol A; Walitza, Susanne; Cath, Daniëlle C; Feng, Guoping; Karlsson, Elinor K; Lindblad-Toh, Kerstin
2017-10-17
Obsessive-compulsive disorder is a severe psychiatric disorder linked to abnormalities in glutamate signaling and the cortico-striatal circuit. We sequenced coding and regulatory elements for 608 genes potentially involved in obsessive-compulsive disorder in human, dog, and mouse. Using a new method that prioritizes likely functional variants, we compared 592 cases to 560 controls and found four strongly associated genes, validated in a larger cohort. NRXN1 and HTR2A are enriched for coding variants altering postsynaptic protein-binding domains. CTTNBP2 (synapse maintenance) and REEP3 (vesicle trafficking) are enriched for regulatory variants, of which at least six (35%) alter transcription factor-DNA binding in neuroblastoma cells. NRXN1 achieves genome-wide significance (p = 6.37 × 10⁻¹¹) when we include 33,370 population-matched controls. Our findings suggest synaptic adhesion as a key component in compulsive behaviors, and show that targeted sequencing plus functional annotation can identify potentially causative variants, even when genomic data are limited. Obsessive-compulsive disorder (OCD) is a neuropsychiatric disorder with symptoms including intrusive thoughts and time-consuming repetitive behaviors. Here Noh and colleagues identify genes enriched for functional variants associated with increased risk of OCD.
Verification and validation of RADMODL Version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimball, K.D.
1993-03-01
RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.
WEC-SIM Validation Testing Plan FY14 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley Michelle
2016-02-01
The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will continue through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and developed working ideas on testing details such as scaling, device design, and test conditions.
Madigan, Sheri; Wade, Mark; Plamondon, Andre; Jenkins, Jennifer
2015-01-01
The current study examined a temporal cascade linking mothers' history of abuse with their children's internalizing difficulties through proximal processes such as maternal postnatal depressive symptoms and responsive parenting. Participants consisted of 490 mother-child dyads assessed at three time points when children were, on average, 2 months old at Time 1 (T1), 18 months at Time 2 (T2), and 36 months at Time 3 (T3). Maternal abuse history and depressive symptoms were assessed via questionnaires at T1. Observations of responsive parenting were collected at T2 and were coded using a validated coding scheme. Children's internalizing difficulties were assessed in the preschool period using averaged parental reports. Path analysis revealed that maternal physical abuse was associated with depressive symptoms postnatally, which were in turn associated with children's internalizing behavior at 36 months of age. We also found that the association between physical abuse history and responsive parenting operated indirectly through maternal depressive symptoms. These findings remained after controlling for covariates including socioeconomic status, child gender, and age. After accounting for physical abuse history, sexual abuse history was not associated with child internalizing problems either directly or indirectly through maternal depressive symptoms and/or parenting behavior. Thus, mothers' physical abuse history is a risk factor for relatively poor mental health, which is itself predictive of both later parenting behavior and children's internalizing problems. © 2015 Michigan Association for Infant Mental Health.
Final Technical Report for GO17004 Regulatory Logic: Codes and Standards for the Hydrogen Economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakarado, Gary L.
The objectives of this project are to: develop a robust supporting research and development program to provide critical hydrogen behavior data and a detailed understanding of hydrogen combustion and safety across a range of scenarios, needed to establish setback distances in building codes and minimize the overall data gaps in code development; support and facilitate the completion of technical specifications by the International Organization for Standardization (ISO) for gaseous hydrogen refueling (TS 20012) and standards for on-board liquid (ISO 13985) and gaseous or gaseous blend (ISO 15869) hydrogen storage by 2007; support and facilitate the effort, led by the NFPA, to complete the draft Hydrogen Technologies Code (NFPA 2) by 2008; with experimental data and input from Technology Validation Program element activities, support and facilitate the completion of standards for bulk hydrogen storage (e.g., NFPA 55) by 2008; facilitate the adoption of the most recently available model codes (e.g., from the International Code Council [ICC]) in key regions; complete preliminary research and development on hydrogen release scenarios to support the establishment of setback distances in building codes and provide a sound basis for model code development and adoption; support and facilitate the development of Global Technical Regulations (GTRs) by 2010 for hydrogen vehicle systems under the United Nations Economic Commission for Europe, World Forum for Harmonization of Vehicle Regulations and Working Party on Pollution and Energy Program (ECE-WP29/GRPE); and support and facilitate the completion by 2012 of necessary codes and standards needed for the early commercialization and market entry of hydrogen energy technologies.
Rudmik, Luke; Xu, Yuan; Kukec, Edward; Liu, Mingfu; Dean, Stafford; Quan, Hude
2016-11-01
Pharmacoepidemiological research using administrative databases has become increasingly popular for chronic rhinosinusitis (CRS); however, without a validated case definition the cohort evaluated may be inaccurate resulting in biased and incorrect outcomes. The objective of this study was to develop and validate a generalizable administrative database case definition for CRS using International Classification of Diseases, 9th edition (ICD-9)-coded claims. A random sample of 100 patients with a guideline-based diagnosis of CRS and 100 control patients were selected and then linked to a Canadian physician claims database from March 31, 2010, to March 31, 2015. The proportion of CRS ICD-9-coded claims (473.x and 471.x) for each of these 200 patients were reviewed and the validity of 7 different ICD-9-based coding algorithms was evaluated. The CRS case definition of ≥2 claims with a CRS ICD-9 code (471.x or 473.x) within 2 years of the reference case provides a balanced validity with a sensitivity of 77% and specificity of 79%. Applying this CRS case definition to the claims database produced a CRS cohort of 51,000 patients with characteristics that were consistent with published demographics and rates of comorbid asthma, allergic rhinitis, and depression. This study has validated several coding algorithms; based on the results a case definition of ≥2 physician claims of CRS (ICD-9 of 471.x or 473.x) within 2 years provides an optimal level of validity. Future studies will need to validate this administrative case definition from different health system perspectives and using larger retrospective chart reviews from multiple providers. © 2016 ARS-AAOA, LLC.
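The case definition above is concrete enough to sketch in code. Below is a minimal Python illustration (not the authors' implementation) of the rule "≥2 claims with a CRS ICD-9 code (471.x or 473.x) within 2 years"; the claim record layout (patient_id, icd9, service_date) is a hypothetical schema.

from collections import defaultdict
from datetime import timedelta

def is_crs_code(icd9):
    # CRS-related ICD-9 prefixes from the case definition above.
    return icd9.startswith("471") or icd9.startswith("473")

def crs_cases(claims, window_days=730):
    """claims: iterable of (patient_id, icd9, service_date) tuples,
    where service_date is a datetime.date. Returns the set of patients
    with >=2 CRS-coded claims inside the 2-year window."""
    dates_by_patient = defaultdict(list)
    for patient_id, icd9, service_date in claims:
        if is_crs_code(icd9):
            dates_by_patient[patient_id].append(service_date)
    cases = set()
    for patient_id, dates in dates_by_patient.items():
        dates.sort()
        # Two consecutive CRS claims within the window are sufficient.
        if any(d2 - d1 <= timedelta(days=window_days)
               for d1, d2 in zip(dates, dates[1:])):
            cases.add(patient_id)
    return cases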
Caregiver Cognition and Behavior in Day-Care Classrooms.
ERIC Educational Resources Information Center
Holloway, Susan D.
A study examined the relationship between change in daycare children's classroom behavior and the teacher's socialization behavior. Various behaviors of 69 children in 24 classrooms were observed and coded in the fall and spring of the school year. Observers coded teacher behavior according to the Caregiver Interaction Scale, which assesses…
Validation of Hydrodynamic Load Models Using CFD for the OC4-DeepCwind Semisubmersible: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benitz, M. A.; Schmidt, D. P.; Lackner, M. A.
Computational fluid dynamics (CFD) simulations were carried out on the OC4-DeepCwind semi-submersible to obtain a better understanding of how to set hydrodynamic coefficients for the structure when using an engineering tool such as FAST to model the system. The focus here was on the drag behavior and the effects of the free surface, free ends, and multi-member arrangement of the semi-submersible structure. These effects are investigated through code-to-code comparisons and flow visualizations. The implications for mean load predictions from engineering tools are addressed. The work presented here suggests that selection of drag coefficients should take into consideration a variety of geometric factors. Furthermore, CFD simulations demonstrate large time-varying loads due to vortex shedding, which FAST's hydrodynamic module, HydroDyn, does not model. The implications of these oscillatory loads on the fatigue life need to be addressed.
Gilmore-Bykovskyi, Andrea L
2015-01-01
Mealtime behavioral symptoms are distressing and frequently interrupt eating for the individual experiencing them and others in the environment. A computer-assisted coding scheme was developed to measure caregiver person-centeredness and behavioral symptoms for nursing home residents with dementia during mealtime interactions. The purpose of this pilot study was to determine the feasibility, ease of use, and inter-observer reliability of the coding scheme, and to explore the clinical utility of the coding scheme. Trained observers coded 22 observations. Data collection procedures were acceptable to participants. Overall, the coding scheme proved to be feasible, easy to execute and yielded good to very good inter-observer agreement following observer re-training. The coding scheme captured clinically relevant, modifiable antecedents to mealtime behavioral symptoms, but would be enhanced by the inclusion of measures for resident engagement and consolidation of items for measuring caregiver person-centeredness that co-occurred and were difficult for observers to distinguish. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Pierazzo, E.; Artemieva, N.; Asphaug, E.; Baldwin, E. C.; Cazamias, J.; Coker, R.; Collins, G. S.; Crawford, D. A.; Davison, T.; Elbeshausen, D.; Holsapple, K. A.; Housen, K. R.; Korycansky, D. G.; Wünnemann, K.
2008-12-01
Over the last few decades, rapid improvement of computer capabilities has allowed impact cratering to be modeled with increasing complexity and realism, and has paved the way for a new era of numerical modeling of the impact process, including full, three-dimensional (3D) simulations. When properly benchmarked and validated against observation, computer models offer a powerful tool for understanding the mechanics of impact crater formation. This work presents results from the first phase of a project to benchmark and validate shock codes. A variety of 2D and 3D codes were used in this study, from commercial products like AUTODYN, to codes developed within the scientific community like SOVA, SPH, ZEUS-MP, iSALE, and codes developed at U.S. National Laboratories like CTH, SAGE/RAGE, and ALE3D. Benchmark calculations of shock wave propagation in aluminum-on-aluminum impacts were performed to examine the agreement between codes for simple idealized problems. The benchmark simulations show that variability in code results is to be expected due to differences in the underlying solution algorithm of each code, artificial stability parameters, spatial and temporal resolution, and material models. Overall, the inter-code variability in peak shock pressure as a function of distance is around 10 to 20%. In general, if the impactor is resolved by at least 20 cells across its radius, the underestimation of peak shock pressure due to spatial resolution is less than 10%. In addition to the benchmark tests, three validation tests were performed to examine the ability of the codes to reproduce the time evolution of crater radius and depth observed in vertical laboratory impacts in water and two well-characterized aluminum alloys. Results from these calculations are in good agreement with experiments. There appears to be a general tendency of shock physics codes to underestimate the radius of the forming crater. Overall, the discrepancy between the model and experiment results is between 10 and 20%, similar to the inter-code variability.
A Peer Helpers Code of Behavior.
ERIC Educational Resources Information Center
de Rosenroll, David A.
This document presents a guide for developing a peer helpers code of behavior. The first section discusses issues relevant to the trainers. These issues include whether to give a model directly to the group or whether to engender "ownership" of the code by the group; timing of introduction of the code; and addressing the issue of…
The Proteus Navier-Stokes code
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Bui, Trong T.; Cavicchi, Richard H.; Conley, Julianne M.; Molls, Frank B.; Schwab, John R.
1992-01-01
An effort is currently underway at NASA Lewis to develop two- and three-dimensional Navier-Stokes codes, called Proteus, for aerospace propulsion applications. The emphasis in the development of Proteus is not algorithm development or research on numerical methods, but rather the development of the code itself. The objective is to develop codes that are user-oriented, easily-modified, and well-documented. Well-proven, state-of-the-art solution algorithms are being used. Code readability, documentation (both internal and external), and validation are being emphasized. This paper is a status report on the Proteus development effort. The analysis and solution procedure are described briefly, and the various features in the code are summarized. The results from some of the validation cases that have been run are presented for both the two- and three-dimensional codes.
2015-11-01
Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes, by Charles R. Fisher. Technical report covering December 2013 – July 2015.
NASA Technical Reports Server (NTRS)
Edighoffer, H.
1981-01-01
The studies examined imposed sinusoidal and random motions of the shuttle skin and/or applied tile pressure. Studies are performed using the computer code DYNOTA, which takes into account the highly nonlinear stiffening hysteresis and viscous behavior of the pad joining the tile to the shuttle skin. Where available, experimental data are used to confirm the validity of the analysis. Both analytical and experimental studies reveal that the system resonant frequency is very high for low-amplitude oscillations but decreases rapidly to a minimum value with increasing amplitude.
Color Coded Cards for Student Behavior Management in Higher Education Environments
ERIC Educational Resources Information Center
Alhalabi, Wadee; Alhalabi, Mobeen
2017-01-01
The Color Coded Cards system as a possibly effective class management tool is the focus of this research. The Color Coded Cards system involves each student being given a card with a specific color based on his or her behavior. The main objective of the research is to find out whether this system effectively improves students' behavior, thus…
Gilmore-Bykovskyi, Andrea L.
2015-01-01
Mealtime behavioral symptoms are distressing and frequently interrupt eating for the individual experiencing them and others in the environment. In order to enable identification of potential antecedents to mealtime behavioral symptoms, a computer-assisted coding scheme was developed to measure caregiver person-centeredness and behavioral symptoms for nursing home residents with dementia during mealtime interactions. The purpose of this pilot study was to determine the acceptability and feasibility of procedures for video-capturing naturally-occurring mealtime interactions between caregivers and residents with dementia, to assess the feasibility, ease of use, and inter-observer reliability of the coding scheme, and to explore the clinical utility of the coding scheme. Trained observers coded 22 observations. Data collection procedures were feasible and acceptable to caregivers, residents and their legally authorized representatives. Overall, the coding scheme proved to be feasible, easy to execute and yielded good to very good inter-observer agreement following observer re-training. The coding scheme captured clinically relevant, modifiable antecedents to mealtime behavioral symptoms, but would be enhanced by the inclusion of measures for resident engagement and consolidation of items for measuring caregiver person-centeredness that co-occurred and were difficult for observers to distinguish. PMID:25784080
Hagemeier, Nicholas E; Tudiver, Fred; Brewster, Scott; Hagy, Elizabeth J; Hagaman, Angela; Pack, Robert P
Interpersonal communication is inherent in a majority of strategies seeking to engage prescriber and pharmacist health care professionals (HCPs) in the reduction and prevention of prescription drug abuse (PDA). However, research on HCP PDA communication behavioral engagement and factors that influence it is limited. This study quantitatively examined communication behaviors and trait-level communication metrics, and qualitatively described prescription drug abuse-related communication perceptions and behaviors among primary care prescribers and community pharmacists. Five focus groups (N = 35) were conducted within the Appalachian Research Network (AppNET), a rural primary care practice-based research network (PBRN) in South Central Appalachia between February and October, 2014. Focus groups were structured around the administration of three previously validated trait-level communication survey instruments, and one instrument developed by the investigators to gauge HCP prescription drug abuse communication engagement and perceived communication importance. Using a grounded theory approach, focus group themes were inductively derived and coded independently by study investigators. Member-checking interviews were conducted to validate derived themes. Respondents' trait-level communication self-perceptions indicated low communication apprehension, high self-perceived communication competence, and average willingness to communicate as compared to instrument specific criteria and norms. Significant variation in HCP communication behavior engagement was noted specific to PDA. Two overarching themes were noted for HCP-patient communication: 1) influencers of HCP communication and prescribing/dispensing behaviors, and 2) communication behaviors. Multiple sub-themes were identified within each theme. Similarities were noted in perceptions and behaviors across both prescribers and pharmacists. Despite the perceived importance of engaging in PDA communication, HCPs reported that prescription drug abuse communication is uncomfortable, variable, multifactorial, and often avoided. The themes that emerged from this analysis support the utility of communication science and health behavior theories to better understand and improve PDA communication behaviors of both prescribers and pharmacists, and thereby improve engagement in PDA prevention and treatment. Copyright © 2015 Elsevier Inc. All rights reserved.
Hagemeier, Nicholas E.; Tudiver, Fred; Brewster, Scott; Hagy, Elizabeth J.; Hagaman, Angela; Pack, Robert P.
2016-01-01
Background Interpersonal communication is inherent in a majority of strategies seeking to engage prescriber and pharmacist health care professionals (HCPs) in the reduction and prevention of prescription drug abuse (PDA). However, research on HCP PDA communication behavioral engagement and factors that influence it is limited. Objectives This study quantitatively examined communication behaviors and trait-level communication metrics, and qualitatively described prescription drug abuse-related communication perceptions and behaviors among primary care prescribers and community pharmacists. Methods Five focus groups (N=35) were conducted within the Appalachian Research Network (AppNET), a rural primary care practice-based research network (PBRN) in South Central Appalachia between February and October, 2014. Focus groups were structured around the administration of three previously validated trait-level communication survey instruments, and one instrument developed by the investigators to gauge HCP prescription drug abuse communication engagement and perceived communication importance. Using a grounded theory approach, focus group themes were inductively derived and coded independently by study investigators. Member-checking interviews were conducted to validate derived themes. Results Respondents’ trait-level communication self-perceptions indicated low communication apprehension, high self-perceived communication competence, and average willingness to communicate as compared to instrument specific criteria and norms. Significant variation in HCP communication behavior engagement was noted specific to PDA. Two overarching themes were noted for HCP-patient communication: 1) influencers of HCP communication and prescribing/dispensing behaviors, and 2) communication behaviors. Multiple sub-themes were identified within each theme. Similarities were noted in perceptions and behaviors across both prescribers and pharmacists. Conclusions Despite the perceived importance of engaging in PDA communication, HCPs reported that prescription drug abuse communication is uncomfortable, variable, multifactorial, and often avoided. The themes that emerged from this analysis support the utility of communication science and health behavior theories to better understand and improve PDA communication behaviors of both prescribers and pharmacists, and thereby improve engagement in PDA prevention and treatment. PMID:26806859
Becoming Inclusive: A Code of Conduct for Inclusion and Diversity.
Schmidt, Bonnie J; MacWilliams, Brent R; Neal-Boylan, Leslie
There are increasing concerns about exclusionary behaviors and lack of diversity in the nursing profession. Exclusionary behaviors, which may include incivility, bullying, and workplace violence, discriminate and isolate individuals and groups who are different, whereas inclusive behaviors encourage diversity. To address inclusion and diversity in nursing, this article offers a code of conduct. This code of conduct builds on existing nursing codes of ethics and applies to nursing students and nurses in both educational and practice settings. Inclusive behaviors that are demonstrated in nurses' relationships with patients, colleagues, the profession, and society are described. This code of conduct provides a basis for measurable change, empowerment, and unification of the profession. Recommendations, implications, and a pledge to action are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
Huo, Jinhai; Yang, Ming; Tina Shih, Ya-Chen
2018-03-01
The "meaningful use of certified electronic health record" policy requires eligible professionals to record smoking status for more than 50% of all individuals aged 13 years or older in 2011 to 2012. To explore whether the coding to document smoking behavior has increased over time and to assess the accuracy of smoking-related diagnosis and procedure codes in identifying previous and current smokers. We conducted an observational study with 5,423,880 enrollees from the year 2009 to 2014 in the Truven Health Analytics database. Temporal trends of smoking coding, sensitivity, specificity, positive predictive value, and negative predictive value were measured. The rate of coding of smoking behavior improved significantly by the end of the study period. The proportion of patients in the claims data recorded as current smokers increased 2.3-fold and the proportion of patients recorded as previous smokers increased 4-fold during the 6-year period. The sensitivity of each International Classification of Diseases, Ninth Revision, Clinical Modification code was generally less than 10%. The diagnosis code of tobacco use disorder (305.1X) was the most sensitive code (9.3%) for identifying smokers. The specificities of these codes and the Current Procedural Terminology codes were all more than 98%. A large improvement in the coding of current and previous smoking behavior has occurred since the inception of the meaningful use policy. Nevertheless, the use of diagnosis and procedure codes to identify smoking behavior in administrative data is still unreliable. This suggests that quality improvements toward medical coding on smoking behavior are needed to enhance the capability of claims data for smoking-related outcomes research. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Error floor behavior study of LDPC codes for concatenated codes design
NASA Astrophysics Data System (ADS)
Chen, Weigang; Yin, Liuguo; Lu, Jianhua
2007-11-01
Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.
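The design argument above — decoding failures leave only a few residual bit errors, so a high-rate outer code can remove them — can be made concrete with a small sketch. The Python fragment below is an assumed illustration, not the authors' evaluation platform: it takes an empirical distribution of residual bit errors per failed codeword, assumes one outer codeword per inner codeword and an outer code correcting up to t bit errors, and estimates the post-concatenation word error rate.

def outer_code_failure_rate(residual_error_counts, inner_failure_rate, t=8):
    """residual_error_counts: residual bit errors per codeword observed
    after inner LDPC decoding failures (empirical sample).
    inner_failure_rate: word error rate of the inner LDPC decoder.
    t: assumed bit-error-correcting capability of the outer code."""
    if not residual_error_counts:
        return 0.0
    uncorrectable = sum(1 for n in residual_error_counts if n > t)
    p_uncorrectable = uncorrectable / len(residual_error_counts)
    # The error floor is lowered by the fraction of inner failures the
    # outer code cannot repair.
    return inner_failure_rate * p_uncorrectable

# Hypothetical data: most failures leave very few residual bit errors.
print(outer_code_failure_rate([1, 2, 1, 3, 40, 2], 1e-6, t=8))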
Carnahan, Ryan M; Kee, Vicki R
2012-01-01
This paper aimed to systematically review algorithms to identify transfusion-related ABO incompatibility reactions in administrative data, with a focus on studies that have examined the validity of the algorithms. A literature search was conducted using PubMed, Iowa Drug Information Service database, and Embase. A Google Scholar search was also conducted because of the difficulty identifying relevant studies. Reviews were conducted by two investigators to identify studies using data sources from the USA or Canada because these data sources were most likely to reflect the coding practices of Mini-Sentinel data sources. One study was found that validated International Classification of Diseases (ICD-9-CM) codes representing transfusion reactions. None of these cases were ABO incompatibility reactions. Several studies consistently used ICD-9-CM code 999.6, which represents ABO incompatibility reactions, and a technical report identified the ICD-10 code for these reactions. One study included the E-code E8760 for mismatched blood in transfusion in the algorithm. Another study reported finding no ABO incompatibility reaction codes in the Healthcare Cost and Utilization Project Nationwide Inpatient Sample database, which contains data of 2.23 million patients who received transfusions, raising questions about the sensitivity of administrative data for identifying such reactions. Two studies reported perfect specificity, with sensitivity ranging from 21% to 83%, for the code identifying allogeneic red blood cell transfusions in hospitalized patients. There is no information to assess the validity of algorithms to identify transfusion-related ABO incompatibility reactions. Further information on the validity of algorithms to identify transfusions would also be useful. Copyright © 2012 John Wiley & Sons, Ltd.
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation, and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to both the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including the NOx post-processing, assumed PDF model development, and chemical kinetic development. It is expected that this work will continue under the new grant.
Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry.
Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen; Andersen, Nanna Skaarup; Nørgaard, Mette; Mogensen, Trine Hyrup
2016-01-01
Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR). The DNPR was used to identify all hospitalized patients, aged ≥15 years, with a first-time diagnosis of HSE according to the International Classification of Diseases, tenth revision (ICD-10), from 2004 to 2014. To validate the coding of HSE, we collected data from the Danish Microbiology Database, from departments of clinical microbiology, and from patient medical records. Cases were classified as confirmed, probable, or no evidence of HSE. We estimated the positive predictive value (PPV) of the HSE diagnosis coding stratified by diagnosis type, study period, and department type. Furthermore, we estimated the proportion of HSE cases coded with nonspecific ICD-10 codes of viral encephalitis and also the sensitivity of the HSE diagnosis coding. We were able to validate 398 (94.3%) of the 422 HSE diagnoses identified via the DNPR. Hereof, 202 (50.8%) were classified as confirmed cases and 29 (7.3%) as probable cases providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0-62.9). For "Encephalitis due to herpes simplex virus" (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1-62.0). Similarly, the PPV for "Meningoencephalitis due to herpes simplex virus" (ICD-10 code B00.4A) was 56.8% (95% CI: 39.5-72.9). "Herpes viral encephalitis" (ICD-10 code G05.1E) had a PPV of 75.9% (95% CI: 56.5-89.7), thereby representing the highest PPV. The estimated sensitivity was 95.5%. The PPVs of the ICD-10 diagnosis coding for adult HSE in the DNPR were relatively low. Hence, the DNPR should be used with caution when studying patients with encephalitis caused by herpes simplex virus.
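For reference, confidence intervals like those quoted above can be reproduced from raw counts. A minimal Python sketch using the Wilson score interval follows (the paper does not state its exact CI method, so this is an assumed choice); the example counts are taken from the abstract (202 confirmed + 29 probable = 231 of 398 validated diagnoses).

from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion (e.g., a PPV)."""
    if n == 0:
        return (float("nan"), float("nan"))
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - half, center + half)

# 231 confirmed-or-probable HSE among 398 validated diagnoses:
print(wilson_interval(231, 398))  # roughly (0.53, 0.63)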
ERIC Educational Resources Information Center
Arffman, Inga
2016-01-01
Open-ended (OE) items are widely used to gather data on student performance in international achievement studies. However, several factors may threaten validity when using such items. This study examined Finnish coders' opinions about threats to validity when coding responses to OE items in the PISA 2012 problem-solving test. A total of 6…
Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing
NASA Astrophysics Data System (ADS)
Salamone, Joseph A., III
Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion caused by molecular relaxation of oxygen and nitrogen. The newly derived model equation was numerically implemented in a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.
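For orientation, sonic boom focusing at a caustic is conventionally governed by a nonlinear Tricomi equation; a representative dimensionless form of that backbone (sign conventions vary, and the dissertation's augmented equation adds mean-flow and molecular-relaxation terms to it) is

\[
\frac{\partial^2 \bar{p}}{\partial \bar{z}^2} \;-\; \bar{z}\,\frac{\partial^2 \bar{p}}{\partial \bar{\tau}^2} \;=\; -\,\mu\,\frac{\partial^2 \bar{p}^{\,2}}{\partial \bar{\tau}^2},
\]

where \(\bar{p}\) is the scaled acoustic pressure, \(\bar{z}\) the scaled distance from the caustic, \(\bar{\tau}\) the retarded time, and \(\mu\) a nonlinearity parameter.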
Validation of NASA Thermal Ice Protection Computer Codes Part 2 - LEWICE/Thermal
DOT National Transportation Integrated Search
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal 1 (electrothermal de-icing and anti-icing), and ANTICE 2 (hot gas and el...
Capturing the spectrum of household food and beverage purchasing behavior: a review.
French, Simone A; Shimotsu, Scott T; Wall, Melanie; Gerlach, Anne Faricy
2008-12-01
The household setting may be the most important level at which to understand the food choices of individuals and how healthful food choices can be promoted. However, there are few available measures of the food purchase behaviors of households and little consensus on the best way to measure it. This review explores the currently available measures of household food purchasing behavior. Three main measures are described, evaluated, and compared: home food inventories, food and beverage purchase records and receipts, and Universal Product Code bar code scanning. The development of coding, aggregation, and analytical methods for these measures of household food purchasing behavior is described. Currently, annotated receipts and records are the most comprehensive, detailed measure of household food purchasing behavior, and are feasible for population-based samples. Universal Product Code scanning is not recommended due to its cost and complexity. Research directions to improve household food purchasing behavior measures are discussed.
Kolehmainen, Christine; Brennan, Meghan; Filut, Amarette; Isaac, Carol; Carnes, Molly
2014-09-01
Ineffective leadership during cardiopulmonary resuscitation ("code") can negatively affect a patient's likelihood of survival. In most teaching hospitals, internal medicine residents lead codes. In this study, the authors explored internal medicine residents' experiences leading codes, with a particular focus on how gender influences the code leadership experience. The authors conducted individual, semistructured telephone or in-person interviews with 25 residents (May 2012 to February 2013) from 9 U.S. internal medicine residency programs. They audio recorded and transcribed the interviews and then thematically analyzed the transcribed text. Participants viewed a successful code as one with effective leadership. They agreed that the ideal code leader was an authoritative presence; spoke with a deep, loud voice; used clear, direct communication; and appeared calm. Although equally able to lead codes as their male colleagues, female participants described feeling stress from having to violate gender behavioral norms in the role of code leader. In response, some female participants adopted rituals to signal the suspension of gender norms while leading a code. Others apologized afterwards for their counternormative behavior. Ideal code leadership embodies highly agentic, stereotypical male behaviors. Female residents employed strategies to better integrate the competing identities of code leader and female gender. In the future, residency training should acknowledge how female gender stereotypes may conflict with the behaviors required to enact code leadership and offer some strategies, such as those used by the female residents in this study, to help women integrate these dual identities.
Analysis of Compositional Data in Communication Disorders Research
ERIC Educational Resources Information Center
Pennington, Lindsay; James, Peter; McNally, Richard; Pay, Helen; McConachie, Helen
2009-01-01
The study of communication and its disorders often involves coding several behaviors and examining the proportions with which individual behaviors are produced within data sets. Problems are encountered when studying multiple behaviors between data sets, because of the interdependence of the proportions: as one coded behavior increases, at least…
Follow the Code: Rules or Guidelines for Academic Deans' Behavior?
ERIC Educational Resources Information Center
Bray, Nathaniel J.
2012-01-01
In the popular movie series "Pirates of the Caribbean," there is a pirate code that influences how pirates behave in unclear situations, with a running joke about whether the code is either a set of rules or guidelines for behavior. Codes of conduct in any social group or organization can have much the same feel; they can provide clarity and…
WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley; Michelen, Carlos; Bosma, Bret
2016-08-01
The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.
Validation of Ray Tracing Code Refraction Effects
NASA Technical Reports Server (NTRS)
Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.
2008-01-01
NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.
Validation of a Communication Process Measure for Coding Control in Counseling.
ERIC Educational Resources Information Center
Heatherington, Laurie
The increasingly popular view of the counseling process from an interactional perspective necessitates the development of new measurement instruments which are suitable to the study of the reciprocal interaction between people. The validity of the Relational Communication Coding System, an instrument which operationalizes the constructs of…
Comparing thin slices of verbal communication behavior of varying number and duration.
Carcone, April Idalski; Naar, Sylvie; Eggly, Susan; Foster, Tanina; Albrecht, Terrance L; Brogan, Kathryn E
2015-02-01
The aim of this study was to assess the accuracy of thin slices to characterize the verbal communication behavior of counselors and patients engaged in Motivational Interviewing sessions relative to fully coded sessions. Four thin slice samples that varied in number (four versus six slices) and duration (one versus two minutes) were extracted from a previously coded dataset. In the parent study, an observational code scheme was used to characterize specific counselor and patient verbal communication behaviors. For the current study, we compared the frequency of communication codes and the correlations among the full dataset and each thin slice sample. Both the proportion of communication codes and the strength of the correlation demonstrated the highest degree of accuracy when a greater number (i.e., six versus four) and duration (i.e., two versus one minute) of slices were extracted. These results suggest that thin slice sampling may be a useful and accurate strategy to reduce coding burden when coding specific verbal communication behaviors within clinical encounters. We suggest researchers interested in using thin slice sampling in their own work conduct preliminary research to determine the number and duration of thin slices required to accurately characterize the behaviors of interest. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
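The slicing procedure itself is simple to illustrate. The Python sketch below (an assumed illustration, not the study's code) extracts n evenly spaced slices of fixed duration from a fully coded session and compares code proportions between the sliced and full data; the event representation of (time_sec, code) tuples is hypothetical.

import random
from collections import Counter

def thin_slice(events, session_len, n_slices=6, slice_len=120):
    """events: list of (time_sec, code) tuples from a coded session.
    Keep only events inside n evenly spaced slices of slice_len seconds."""
    starts = [i * session_len / n_slices for i in range(n_slices)]
    return [(t, code) for t, code in events
            if any(s <= t < s + slice_len for s in starts)]

def code_proportions(events):
    counts = Counter(code for _, code in events)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()} if total else {}

# Mock 30-minute session: compare full vs. six two-minute slices.
session = [(random.uniform(0, 1800), random.choice("ABC")) for _ in range(300)]
print(code_proportions(session))
print(code_proportions(thin_slice(session, 1800)))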
NASA Astrophysics Data System (ADS)
Huang, Yan-Hua; Yang, Sheng-Qi; Zhao, Jian
2016-12-01
A three-dimensional particle flow code (PFC3D) was used for a systematic numerical simulation of the strength failure and cracking behavior of rock-like material specimens containing two unparallel fissures under conventional triaxial compression. The micro-parameters of the parallel bond model were first calibrated using the laboratory results of intact specimens and then validated against the experimental results of pre-fissured specimens under triaxial compression. Numerically simulated stress-strain curves, strength and deformation parameters, and macro-failure modes of pre-fissured specimens were all in good agreement with the experimental results. The relationship between stress and the micro-crack numbers was summarized. The crack initiation, propagation, and coalescence process of pre-fissured specimens was analyzed in detail. Finally, horizontal and vertical cross sections of numerical specimens were derived from PFC3D, and a detailed analysis was carried out to reveal the internal damage behavior of rock under triaxial compression. The experimental and simulated results are expected to improve the understanding of the strength failure and cracking behavior of fractured rock under triaxial compression.
Axial Crushing Behaviors of Thin-Walled Corrugated and Circular Tubes - A Comparative Study
NASA Astrophysics Data System (ADS)
Reyaz-Ur-Rahim, Mohd.; Bharti, P. K.; Umer, Afaque
2017-10-01
With the help of finite element analysis, this research paper deals with the energy absorption and collapse behavior of hollow tubes made of aluminum alloy 6060-T4 with different corrugated section geometries. Experimental data available in the literature were used to validate the numerical models of the structures investigated. Based on the results available for symmetric crushing of circular tubes, models were developed to investigate the behavior of corrugated thin-walled structures. To study the collapse mechanism and energy-absorbing ability in axial compression, the simulation was carried out in the ABAQUS/Explicit code. In the simulation, specimens were prepared and axially crushed to one-fourth of the tube length, and the energy diagram of crushing force versus axial displacement is shown. The effect of various parameters such as pitch, mean diameter, corrugation, amplitude, and thickness is demonstrated with the help of diagrams. The overall result shows that the corrugated section geometry could be a good alternative to conventional tubes.
Rapid Trust Establishment for Transient Use of Unmanaged Hardware
2006-12-01
[Report documentation page (Standard Form 298) and figure residue removed. Recoverable figure content: (a) boot with trust initiator; (b) boot trusted host OS from disk, validating the OS; (c) launch of applications, validating each and distinguishing trusted from untrusted code; plus an example trust-alerter notification: "Execution of process with Id 3535 has been blocked to minimize security risks."]
NASA Astrophysics Data System (ADS)
Setiawan, Jody; Nakazawa, Shoji
2017-10-01
This paper compares the seismic response behavior, seismic performance, and seismic loss functions of a conventional special moment frame steel structure (SMF) and a special moment frame steel structure with base isolation (BI-SMF). The validity of the proposed simplified method for estimating the maximum deformation of the base isolation system using the equivalent linearization method, and the validity of the design shear force of the superstructure, are investigated from the results of nonlinear dynamic response analysis. In recent years, construction of steel office buildings with seismic isolation systems has been proceeding even in Indonesia, where the risk of earthquakes is high. Although a design code for seismically isolated structures has been proposed, there is no actual construction example of a special moment frame steel structure with base isolation. Therefore, in this research, the SMF and BI-SMF buildings are designed according to the Indonesian building code and assumed to be built in Padang City, Indonesia. The base isolation system uses high-damping rubber bearings. Dynamic eigenvalue analysis and nonlinear dynamic response analysis are carried out to determine the dynamic characteristics and seismic performance. In addition, the seismic loss function is obtained from the damage state probability and repair cost. For the response analysis, simulated ground accelerations, which have the phases of recorded seismic waves (El Centro NS, El Centro EW, Kobe NS, and Kobe EW) and are adapted to the response spectrum prescribed by the Indonesian design code, are used.
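The equivalent linearization method referenced above is, in outline, a fixed-point iteration: guess a bearing displacement, compute the secant (effective) stiffness and hysteretic damping of the bearing at that displacement, convert to an effective period, read the spectral displacement from the design spectrum, and repeat until the displacement converges. The Python sketch below assumes a bilinear bearing model, 5% inherent damping, and a user-supplied spectral-displacement function; none of the parameters or model choices come from the paper itself.

from math import pi, sqrt

def equivalent_linearization(mass, k1, k2, dy, sd_spectrum, tol=1e-4):
    """mass: superstructure mass (kg); k1, k2: pre-/post-yield bearing
    stiffness (N/m); dy: yield displacement (m); sd_spectrum(T, xi):
    spectral displacement from the design spectrum (assumed callable)."""
    d = 2 * dy  # initial displacement guess
    for _ in range(100):
        fy = k1 * dy
        f = fy + k2 * (d - dy)              # bearing force at d
        k_eff = f / d                        # secant stiffness
        # Energy dissipated per cycle of a bilinear hysteresis loop:
        e_loop = 4 * dy * (d - dy) * (k1 - k2)
        xi_eff = 0.05 + e_loop / (2 * pi * k_eff * d**2)
        t_eff = 2 * pi * sqrt(mass / k_eff)  # effective period
        d_new = sd_spectrum(t_eff, xi_eff)
        if abs(d_new - d) < tol:
            break
        d = max(d_new, 1.001 * dy)           # stay past yield
    return d, t_eff, xi_eff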
Method of assessing parent-child grocery store purchasing interactions using a micro-camcorder.
Calloway, Eric E; Roberts-Gray, Cindy; Ranjit, Nalini; Sweitzer, Sara J; McInnis, Katie A; Romo-Palafox, Maria J; Briley, Margaret E
2014-12-01
The purpose of this study was to assess the validity of using participant worn micro-camcorders (PWMC) to collect data on parent-child food and beverage purchasing interactions in the grocery store. Parent-child dyads (n = 32) were met at their usual grocery store and shopping time. Parents were mostly Caucasian (n = 27, 84.4%), mothers (n = 30, 93.8%). Children were 2-6 years old with 15 girls and 17 boys. A micro-camcorder was affixed to a baseball style hat worn by the child. The dyad proceeded to shop while being shadowed by an in-person observer. Video/audio data were coded for behavioral and environmental variables. The PWMC method was compared to in-person observation to assess sensitivity and relative validity for measuring parent-child interactions, and compared to receipt data to assess criterion validity for evaluating purchasing decisions. Inter-rater reliability for coding video/audio data collected using the PWMC method was also assessed. The PWMC method proved to be more sensitive than in-person observation revealing on average 1.4 (p < 0.01) more parent-child food and beverage purchasing interactions per shopping trip. Inter-rater reliability for coding PWMC data showed moderate to almost perfect agreement (Cohen's kappa = 0.461-0.937). The PWMC method was significantly correlated with in-person observation for measuring occurrences of parent-child food purchasing interactions (rho = 0.911, p < 0.01) and characteristics of those interactions (rho = 0.345-0.850, p < 0.01). Additionally, there was substantial agreement between the PWMC method and receipt data for measuring purchasing decisions (Cohen's kappa = 0.787). The PWMC method proved to be well suited to assess parent-child food and beverage purchasing interactions in the grocery store. Copyright © 2014 Elsevier Ltd. All rights reserved.
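Cohen's kappa, used above to summarize inter-rater agreement, corrects the observed agreement for agreement expected by chance. A minimal, generic Python sketch follows (not the study's analysis code); the two parallel lists of categorical codes are an assumed input format.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """coder_a, coder_b: parallel lists of categorical codes."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items coded identically.
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal code frequencies.
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    p_exp = sum(counts_a[c] * counts_b[c]
                for c in set(counts_a) | set(counts_b)) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

print(cohens_kappa(list("AABBC"), list("AABCC")))  # ≈ 0.71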
CFD Modeling of Free-Piston Stirling Engines
NASA Technical Reports Server (NTRS)
Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.
2001-01-01
NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.
Implementation of a kappa-epsilon turbulence model to RPLUS3D code
NASA Technical Reports Server (NTRS)
Chitsomboon, Tawit
1992-01-01
The RPLUS3D code has been developed at the NASA Lewis Research Center to support the National Aerospace Plane (NASP) project. The code has the ability to solve three-dimensional flowfields with finite rate combustion of hydrogen and air. The combustion process of the hydrogen-air system is simulated by an 18-reaction-path, 8-species chemical kinetic mechanism. The code uses a Lower-Upper (LU) decomposition numerical algorithm as its basis, making it a very efficient and robust code. Except for the Jacobian matrix for the implicit chemistry source terms, there is no inversion of a matrix even though a fully implicit numerical algorithm is used. A k-epsilon turbulence model has recently been incorporated into the code. Initial validations have been conducted for a flow over a flat plate. Results of the validation studies are shown. Some difficulties in implementing the k-epsilon equations in the code are also discussed.
Evaluation in industry of a draft code of practice for manual handling.
Ashby, Liz; Tappin, David; Bentley, Tim
2004-05-01
This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.
Validation of Living Donor Nephrectomy Codes
Lam, Ngan N.; Lentine, Krista L.; Klarenbach, Scott; Sood, Manish M.; Kuwornu, Paul J.; Naylor, Kyla L.; Knoll, Gregory A.; Kim, S. Joseph; Young, Ann; Garg, Amit X.
2018-01-01
Background: Use of administrative data for outcomes assessment in living kidney donors is increasing given the rarity of complications and challenges with loss to follow-up. Objective: To assess the validity of living donor nephrectomy in health care administrative databases compared with the reference standard of manual chart review. Design: Retrospective cohort study. Setting: 5 major transplant centers in Ontario, Canada. Patients: Living kidney donors between 2003 and 2010. Measurements: Sensitivity and positive predictive value (PPV). Methods: Using administrative databases, we conducted a retrospective study to determine the validity of diagnostic and procedural codes for living donor nephrectomies. The reference standard was living donor nephrectomies identified through the province’s tissue and organ procurement agency, with verification by manual chart review. Operating characteristics (sensitivity and PPV) of various algorithms using diagnostic, procedural, and physician billing codes were calculated. Results: During the study period, there were a total of 1199 living donor nephrectomies. Overall, the best algorithm for identifying living kidney donors was the presence of 1 diagnostic code for kidney donor (ICD-10 Z52.4) and 1 procedural code for kidney procurement/excision (1PC58, 1PC89, 1PC91). Compared with the reference standard, this algorithm had a sensitivity of 97% and a PPV of 90%. The diagnostic and procedural codes performed better than the physician billing codes (sensitivity 60%, PPV 78%). Limitations: The donor chart review and validation study was performed in Ontario and may not be generalizable to other regions. Conclusions: An algorithm consisting of 1 diagnostic and 1 procedural code can be reliably used to conduct health services research that requires the accurate determination of living kidney donors at the population level. PMID:29662679
Billing code algorithms to identify cases of peripheral artery disease from administrative data
Fan, Jin; Arruda-Olson, Adelaide M; Leibson, Cynthia L; Smith, Carin; Liu, Guanghui; Bailey, Kent R; Kullo, Iftikhar J
2013-01-01
Objective To construct and validate billing code algorithms for identifying patients with peripheral arterial disease (PAD). Methods We extracted all encounters and line item details including PAD-related billing codes at Mayo Clinic Rochester, Minnesota, between July 1, 1997 and June 30, 2008; 22 712 patients evaluated in the vascular laboratory were divided into training and validation sets. Multiple logistic regression analysis was used to create an integer code score from the training dataset, and this was tested in the validation set. We applied a model-based code algorithm to patients evaluated in the vascular laboratory and compared this with a simpler algorithm (presence of at least one of the ICD-9 PAD codes 440.20–440.29). We also applied both algorithms to a community-based sample (n=4420), followed by a manual review. Results The logistic regression model performed well in both training and validation datasets (c statistic=0.91). In patients evaluated in the vascular laboratory, the model-based code algorithm provided better negative predictive value. The simpler algorithm was reasonably accurate for identification of PAD status, with lesser sensitivity and greater specificity. In the community-based sample, the sensitivity (38.7% vs 68.0%) of the simpler algorithm was much lower, whereas the specificity (92.0% vs 87.6%) was higher than the model-based algorithm. Conclusions A model-based billing code algorithm had reasonable accuracy in identifying PAD cases from the community, and in patients referred to the non-invasive vascular laboratory. The simpler algorithm had reasonable accuracy for identification of PAD in patients referred to the vascular laboratory but was significantly less sensitive in a community-based sample. PMID:24166724
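Turning a fitted logistic regression into an "integer code score," as described above, typically means scaling and rounding the coefficients and then checking that the integer score preserves discrimination (the c statistic, i.e., the AUC). The following Python sketch uses scikit-learn on synthetic data; the feature columns, weights, and data are hypothetical stand-ins, not the study's variables.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# X: 0/1 indicators for PAD-related billing codes (columns hypothetical,
# e.g., ICD-9 440.2x diagnoses or revascularization procedure codes).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 5))
y = (X @ np.array([2.0, 1.5, 1.0, 0.5, 0.2]) + rng.normal(0, 1, 1000)) > 2

model = LogisticRegression().fit(X, y)

# Integer score: scale coefficients by the smallest and round, a common
# way to convert a regression model into a simple additive code score.
weights = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
score = X @ weights

print("integer weights:", weights)
print("c statistic:", roc_auc_score(y, score))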
Automated Measurement of Facial Expression in Infant-Mother Interaction: A Pilot Study
Messinger, Daniel S.; Mahoor, Mohammad H.; Chow, Sy-Miin; Cohn, Jeffrey F.
2009-01-01
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two six-month-old/mother dyads who each engaged in a face-to-face interaction. Automated measurements showed high associations with anatomically based manual coding (concurrent validity); measurements of smiling showed high associations with mean ratings of positive emotion made by naive observers (construct validity). For both infants and mothers, smile strength and eye constriction (the Duchenne marker) were correlated over time, creating a continuous index of smile intensity. Infant and mother smile activity exhibited changing (nonstationary) local patterns of association, suggesting the dyadic repair and dissolution of states of affective synchrony. The study provides insights into the potential and limitations of automated measurement of facial action. PMID:19885384
Experimental program for real gas flow code validation at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Deiwert, George S.; Strawa, Anthony W.; Sharma, Surendra P.; Park, Chul
1989-01-01
The experimental program for validating real gas hypersonic flow codes at NASA Ames Research Center is described. Ground-based test facilities used include ballistic ranges, shock tubes and shock tunnels, arc jet facilities and heated-air hypersonic wind tunnels. Also included are large-scale computer systems for kinetic theory simulations and benchmark code solutions. Flight tests consist of the Aeroassist Flight Experiment, the Space Shuttle, Project Fire 2, and planetary probes such as Galileo, Pioneer Venus, and PAET.
Validation: Codes to compare simulation data to various observations
NASA Astrophysics Data System (ADS)
Cohn, J. D.
2017-02-01
Validation provides codes to compare simulated data (stellar masses and star formation rates) to several observations: the simulated stellar mass function is compared with observed stellar mass functions from PRIMUS or SDSS-GALEX in several redshift bins from 0.01 to 1.0, and the simulated B band luminosity function with its observed counterpart. The codes also create plots for various attributes, including stellar mass functions and stellar mass to halo mass. These codes can model predictions (in some cases alongside observational data) to test other mock catalogs.
Barnado, April; Casey, Carolyn; Carroll, Robert J; Wheless, Lee; Denny, Joshua C; Crofford, Leslie J
2017-05-01
To study systemic lupus erythematosus (SLE) in the electronic health record (EHR), we must accurately identify patients with SLE. Our objective was to develop and validate novel EHR algorithms that use International Classification of Diseases, Ninth Revision (ICD-9), Clinical Modification codes, laboratory testing, and medications to identify SLE patients. We used Vanderbilt's Synthetic Derivative, a de-identified version of the EHR, with 2.5 million subjects. We selected all individuals with at least 1 SLE ICD-9 code (710.0), yielding 5,959 individuals. To create a training set, 200 subjects were randomly selected for chart review. A subject was defined as a case if diagnosed with SLE by a rheumatologist, nephrologist, or dermatologist. Positive predictive values (PPVs) and sensitivity were calculated for combinations of code counts of the SLE ICD-9 code, a positive antinuclear antibody (ANA), ever use of medications, and a keyword of "lupus" in the problem list. The algorithms with the highest PPV were each internally validated using a random set of 100 individuals from the remaining 5,759 subjects. The algorithm with the highest PPV at 95% in the training set and 91% in the validation set was 3 or more counts of the SLE ICD-9 code, ANA positive (≥1:40), and ever use of both disease-modifying antirheumatic drugs and steroids, while excluding individuals with systemic sclerosis and dermatomyositis ICD-9 codes. We developed and validated the first EHR algorithm that incorporates laboratory values and medications with the SLE ICD-9 code to identify patients with SLE accurately. © 2016, American College of Rheumatology.
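A minimal sketch of that highest-PPV algorithm, assuming a hypothetical patient-record layout; the thresholds follow the abstract, and 710.1 (systemic sclerosis) and 710.3 (dermatomyositis) are the standard ICD-9 exclusion codes implied by it.

# Sketch of the best-performing SLE algorithm; record fields are illustrative.
def meets_sle_algorithm(rec):
    has_counts = rec["icd9_codes"].count("710.0") >= 3   # >=3 SLE code counts
    ana_positive = rec.get("ana_titer", 0) >= 40         # ANA positive (>=1:40)
    on_meds = rec["ever_dmard"] and rec["ever_steroid"]  # ever use of both classes
    excluded = any(c in ("710.1", "710.3") for c in rec["icd9_codes"])
    return has_counts and ana_positive and on_meds and not excluded

patient = {"icd9_codes": ["710.0", "710.0", "710.0"],
           "ana_titer": 160, "ever_dmard": True, "ever_steroid": True}
print(meets_sle_algorithm(patient))  # True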
VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.
2015-12-01
A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
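For readers unfamiliar with RE and the GCI, the sketch below applies the standard three-grid formulas (after Roache): estimate the observed order of accuracy, extrapolate toward the exact solution, and convert the fine-grid error into an uncertainty band. It illustrates the calculations such a package automates, not the VAVUQ interface itself.

import math

# Observed order p, Richardson-extrapolated value, and fine-grid GCI from
# three solutions on grids with constant refinement ratio r.
def richardson_gci(f_coarse, f_medium, f_fine, r=2.0, safety=1.25):
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)  # RE estimate
    rel_err = abs((f_fine - f_medium) / f_fine)
    gci_fine = safety * rel_err / (r**p - 1.0)             # fractional error band
    return p, f_exact, gci_fine

p, f_star, gci = richardson_gci(0.9714, 0.9921, 0.9980)
print(f"observed order p={p:.2f}, RE estimate={f_star:.4f}, GCI={gci:.2%}")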
NASA Rotor 37 CFD Code Validation: Glenn-HT Code
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2010-01-01
In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.
Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.
2016-01-01
Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window in to relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory’s principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Rui; Sumner, Tyler S.
2016-04-17
An advanced system analysis tool SAM is being developed for fast-running, improved-fidelity, and whole-plant transient analyses at Argonne National Laboratory under DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. As an important part of code development, companion validation activities are being conducted to ensure the performance and validity of the SAM code. This paper presents the benchmark simulations of two EBR-II tests, SHRT-45R and BOP-302R, whose data are available through the support of DOE-NE's Advanced Reactor Technology (ART) program. The code predictions of major primary coolant system parameters are compared with the test results. Additionally, the SAS4A/SASSYS-1 code simulation results are also included for a code-to-code comparison.
Kahler, Christopher W; Caswell, Amy J; Laws, M Barton; Walthers, Justin; Magill, Molly; Mastroleo, Nadine R; Howe, Chanelle J; Souza, Timothy; Wilson, Ira; Bryant, Kendall; Monti, Peter M
2016-10-01
To elucidate patient language that supports changing a health behavior (change talk) or sustaining the behavior (sustain talk). We developed a novel coding system to characterize topics of patient speech in a motivational intervention targeting alcohol and HIV/sexual risk in 90 Emergency Department patients. We further coded patient language as change or sustain talk. For both alcohol and sex, discussions focusing on benefits of behavior change or change planning were most likely to involve change talk, and these topics comprised a large portion of all change talk. Greater discussion of barriers and facilitators of change also was associated with more change talk. For alcohol use, benefits of drinking behavior was the most common topic of sustain talk. For sex risk, benefits of sexual behavior were rarely discussed, and sustain talk centered more on patterns and contexts, negations of drawbacks, and drawbacks of sexual risk behavior change. Topic coding provided unique insights into the content of patient change and sustain talk. Patients are most likely to voice change talk when conversation focuses on behavior change rather than ongoing behavior. Interventions addressing multiple health behaviors should address the unique motivations for maintaining specific risky behaviors. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Sam, Ann; Reszka, Stephanie; Odom, Samuel; Hume, Kara; Boyd, Brian
2015-01-01
Momentary time sampling, partial-interval recording, and event coding are observational coding methods commonly used to examine the social and challenging behaviors of children at risk for or with developmental delays or disabilities. Yet there is limited research comparing the accuracy of and relationship between these three coding methods. By…
Sada, Yvonne; Hou, Jason; Richardson, Peter; El-Serag, Hashem; Davila, Jessica
2013-01-01
Background Accurate identification of hepatocellular cancer (HCC) cases from automated data is needed for efficient and valid quality improvement initiatives and research. We validated HCC ICD-9 codes, and evaluated whether natural language processing (NLP) by the Automated Retrieval Console (ARC) for document classification improves HCC identification. Methods We identified a cohort of patients with ICD-9 codes for HCC during 2005–2010 from Veterans Affairs administrative data. Pathology and radiology reports were reviewed to confirm HCC. The positive predictive value (PPV), sensitivity, and specificity of ICD-9 codes were calculated. A split validation study of pathology and radiology reports was performed to develop and validate ARC algorithms. Reports were manually classified as diagnostic of HCC or not. ARC generated document classification algorithms using the Clinical Text Analysis and Knowledge Extraction System. ARC performance was compared to manual classification. PPV, sensitivity, and specificity of ARC were calculated. Results 1138 patients with HCC were identified by ICD-9 codes. Based on manual review, 773 had HCC. The HCC ICD-9 code algorithm had a PPV of 0.67, sensitivity of 0.95, and specificity of 0.93. For a random subset of 619 patients, we identified 471 pathology reports for 323 patients and 943 radiology reports for 557 patients. The pathology ARC algorithm had PPV of 0.96, sensitivity of 0.96, and specificity of 0.97. The radiology ARC algorithm had PPV of 0.75, sensitivity of 0.94, and specificity of 0.68. Conclusion A combined approach of ICD-9 codes and NLP of pathology and radiology reports improves HCC case identification in automated data. PMID:23929403
Gupta, Sumit; Nathan, Paul C; Baxter, Nancy N; Lau, Cindy; Daly, Corinne; Pole, Jason D
2018-06-01
Despite the importance of estimating population level cancer outcomes, most registries do not collect critical events such as relapse. Attempts to use health administrative data to identify these events have focused on older adults and have been mostly unsuccessful. We developed and tested administrative data-based algorithms in a population-based cohort of adolescents and young adults with cancer. We identified all Ontario adolescents and young adults 15-21 years old diagnosed with leukemia, lymphoma, sarcoma, or testicular cancer between 1992 and 2012. Chart abstraction determined the end of initial treatment (EOIT) date and subsequent cancer-related events (progression, relapse, second cancer). Linkage to population-based administrative databases identified fee and procedure codes indicating cancer treatment or palliative care. Algorithms determining EOIT based on a time interval free of treatment-associated codes, and new cancer-related events based on billing codes, were compared with chart-abstracted data. The cohort comprised 1404 patients. Time periods free of treatment-associated codes did not validly identify EOIT dates; using subsequent codes to identify new cancer events was thus associated with low sensitivity (56.2%). However, using administrative data codes that occurred after the EOIT date based on chart abstraction, the first cancer-related event was identified with excellent validity (sensitivity, 87.0%; specificity, 93.3%; positive predictive value, 81.5%; negative predictive value, 95.5%). Although administrative data alone did not validly identify cancer-related events, administrative data in combination with chart-collected EOIT dates was associated with excellent validity. The collection of EOIT dates by cancer registries would significantly expand the potential of administrative data linkage to assess cancer outcomes.
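In pseudocode terms, the validated approach anchors event detection at the chart-abstracted EOIT date and treats the first subsequent cancer-directed billing code as the event signal. The sketch below assumes hypothetical claim tuples and an illustrative set of event-signalling codes.

from datetime import date

# Sketch: first cancer-related event after a chart-abstracted EOIT date.
def first_event_after_eoit(claims, eoit, event_codes):
    """claims: iterable of (service_date, billing_code); returns a date or None."""
    hits = sorted(c for c in claims if c[0] > eoit and c[1] in event_codes)
    return hits[0][0] if hits else None

claims = [(date(2005, 3, 1), "CHEMO"), (date(2007, 9, 12), "CHEMO")]
print(first_event_after_eoit(claims, date(2005, 6, 30),
                             {"CHEMO", "RADIATION", "PALLIATIVE"}))
# -> 2007-09-12, flagged as a relapse/progression signal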
A Large Scale Code Resolution Service Network in the Internet of Things
Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan
2012-01-01
In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207
Estimation of equivalence ratio distribution in a diesel spray using computational fluid dynamics
NASA Astrophysics Data System (ADS)
Suzuki, Yasumasa; Tsujimura, Taku; Kusaka, Jin
2014-08-01
It is important to understand the mechanisms of mixing and atomization in a diesel spray. In addition, computational prediction of the mixing behavior and internal structure of a diesel spray is expected to promote further understanding of diesel sprays and the development of the diesel engine, including its fuel injection devices. In this study, we predicted the formation of a diesel fuel spray with a 3D-CFD code and validated the application by comparing against experimental results for spray behavior and equivalence ratio, the latter visualized by Rayleigh-scatter imaging, under several ambient, injection, and fuel conditions. Using applicable constants in the KH-RT breakup model, the code predicts the liquid length of the spray on a quantitative level under various fuel injection, ambient, and fuel conditions. On the other hand, it does not fully capture, on a quantitative level, the changes in vapor penetration and in the fuel mass fraction and equivalence ratio distributions that accompany changes in injection and ambient conditions. The code also predicts the spray cone angle and the entrainment of ambient gas excessively; prediction accuracy could therefore be improved by refining the droplet breakup and evaporation models and by predicting the spray cone angle quantitatively.
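For reference, the quantity being mapped, the local equivalence ratio, is the local fuel-air mass ratio normalized by its stoichiometric value. The sketch below assumes a stoichiometric fuel-air mass ratio of roughly 1/14.5 for a diesel-like fuel, purely for illustration.

# phi = (Y_fuel / Y_air) / (F/A)_stoichiometric; the stoichiometric ratio
# here is an assumed value for a diesel-like fuel.
def equivalence_ratio(y_fuel, y_air, fa_stoich=1.0 / 14.5):
    return (y_fuel / y_air) / fa_stoich

print(round(equivalence_ratio(0.10, 0.90), 2))  # ~1.61: a fuel-rich cell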
Wyker, Brett A; Davison, Kirsten K
2010-01-01
Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated salient beliefs. Cross-sectional. Large public university in the northeastern United States. 204 college students. TPB and TTM constructs were assessed using validated scales. Outcome, normative, and control beliefs were measured using open-ended questions. The overlap between stages of change for FV consumption and adopting a PBD was assessed using Spearman rank correlation analysis and cross-tab comparisons. The proposed model predicting adoption of a PBD was tested using structural equation modeling (SEM). Salient beliefs were coded using automatic response coding software. No association was found between stages of change for FV consumption and following a PBD. Results from SEM analyses provided support for the proposed model predicting intention to follow a PBD. Gender differences in salient beliefs for following a PBD were found. Results demonstrate the potential for effective theory-driven and stage-tailored public health interventions to promote PBDs. Copyright 2010 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.
Application of Probabilistic Analysis to Aircraft Impact Dynamics
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
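As an illustration of a response surface method like the one selected here, the sketch below fits a quadratic surrogate to a handful of synthetic "code runs" and then propagates input uncertainty by cheap Monte Carlo sampling of the surrogate; all numbers are invented.

import numpy as np

rng = np.random.default_rng(0)
x_runs = np.linspace(-1.0, 1.0, 7)             # design points (scaled input)
y_runs = 3.0 + 1.5 * x_runs + 0.8 * x_runs**2  # stand-in for expensive code output

surrogate = np.poly1d(np.polyfit(x_runs, y_runs, deg=2))  # response surface

x_mc = rng.normal(0.0, 0.3, size=100_000)      # uncertain input
y_mc = surrogate(x_mc)                         # cheap surrogate evaluations
print(f"mean response={y_mc.mean():.2f}, P(response > 4.5)={np.mean(y_mc > 4.5):.3%}")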
CELES: CUDA-accelerated simulation of electromagnetic scattering by large ensembles of spheres
NASA Astrophysics Data System (ADS)
Egel, Amos; Pattelli, Lorenzo; Mazzamuto, Giacomo; Wiersma, Diederik S.; Lemmer, Uli
2017-09-01
CELES is a freely available MATLAB toolbox to simulate light scattering by many spherical particles. Aiming at high computational performance, CELES leverages block-diagonal preconditioning, a lookup-table approach to evaluate costly functions and massively parallel execution on NVIDIA graphics processing units using the CUDA computing platform. The combination of these techniques allows large electrodynamic problems (>10^4 scatterers) to be addressed efficiently on inexpensive consumer hardware. In this paper, we validate near- and far-field distributions against the well-established multi-sphere T-matrix (MSTM) code and discuss the convergence behavior for ensembles of different sizes, including an exemplary system comprising 10^5 particles.
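The lookup-table idea is easy to illustrate: tabulate a costly function once, then answer many queries by interpolation. The sketch below uses a spherical Bessel function as a stand-in; the function choice and grid are assumptions, not the CELES internals.

import numpy as np
from scipy.special import spherical_jn

x_grid = np.linspace(0.01, 50.0, 5001)  # tabulated once
table = spherical_jn(5, x_grid)         # the costly evaluations

def j5_fast(x):
    return np.interp(x, x_grid, table)  # cheap linear interpolation per query

queries = np.random.default_rng(1).uniform(0.1, 49.0, 1_000_000)
print(j5_fast(queries)[:3])             # one vectorized lookup pass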
SENR/NRPy+: Numerical relativity in singular curvilinear coordinate systems
NASA Astrophysics Data System (ADS)
Ruchlin, Ian; Etienne, Zachariah B.; Baumgarte, Thomas W.
2018-03-01
We report on a new open-source, user-friendly numerical relativity code package called SENR/NRPy+. Our code extends previous implementations of the BSSN reference-metric formulation to a much broader class of curvilinear coordinate systems, making it ideally suited to modeling physical configurations with approximate or exact symmetries. In the context of modeling black hole dynamics, it is orders of magnitude more efficient than other widely used open-source numerical relativity codes. NRPy+ provides a Python-based interface in which equations are written in natural tensorial form and output at arbitrary finite difference order as highly efficient C code, putting complex tensorial equations at the scientist's fingertips without the need for an expensive software license. SENR provides the algorithmic framework that combines the C codes generated by NRPy+ into a functioning numerical relativity code. We validate against two other established, state-of-the-art codes, and achieve excellent agreement. For the first time, in the context of moving puncture black hole evolutions, we demonstrate nearly exponential convergence of constraint violation and gravitational waveform errors to zero as the order of spatial finite difference derivatives is increased, while fixing the numerical grids at moderate resolution in a singular coordinate system. Such behavior outside the horizons is remarkable, as numerical errors do not converge to zero near punctures, and all points along the polar axis are coordinate singularities. The formulation addresses such coordinate singularities via cell-centered grids and a simple change of basis that analytically regularizes tensor components with respect to the coordinates. Future plans include extending this formulation to allow dynamical coordinate grids and bispherical-like distribution of points to efficiently capture orbiting compact binary dynamics.
Theta phase precession and phase selectivity: a cognitive device description of neural coding
NASA Astrophysics Data System (ADS)
Zalay, Osbert C.; Bardakjian, Berj L.
2009-06-01
Information in neural systems is carried by way of phase and rate codes. Neuronal signals are processed through transformative biophysical mechanisms at the cellular and network levels. Neural coding transformations can be represented mathematically in a device called the cognitive rhythm generator (CRG). Incoming signals to the CRG are parsed through a bank of neuronal modes that orchestrate proportional, integrative and derivative transformations associated with neural coding. Mode outputs are then mixed through static nonlinearities to encode (spatio)temporal phase relationships. The static nonlinear outputs feed and modulate a ring device (limit cycle) encoding output dynamics. Small coupled CRG networks were created to investigate coding functionality associated with neuronal phase preference and theta precession in the hippocampus. Phase selectivity was found to be dependent on mode shape and polarity, while phase precession was a product of modal mixing (i.e. changes in the relative contribution or amplitude of mode outputs resulted in shifting phase preference). Nonlinear system identification was implemented to help validate the model and explain response characteristics associated with modal mixing; in particular, principal dynamic modes experimentally derived from a hippocampal neuron were inserted into a CRG and the neuron's dynamic response was successfully cloned. From our results, small CRG networks possessing disynaptic feedforward inhibition in combination with feedforward excitation exhibited frequency-dependent inhibitory-to-excitatory and excitatory-to-inhibitory transitions that were similar to transitions seen in a single CRG with quadratic modal mixing. This suggests nonlinear modal mixing to be a coding manifestation of the effect of network connectivity in shaping system dynamic behavior. We hypothesize that circuits containing disynaptic feedforward inhibition in the nervous system may be candidates for interpreting upstream rate codes to guide downstream processes such as phase precession, because of their demonstrated frequency-selective properties.
Howard, Matt C; Jayne, Bradley S
2015-03-01
Cyberpsychology is a recently emergent field that examines the impact of technology upon human cognition and behavior. Given its infancy, authors have rapidly created new measures to gauge their constructs of interest. Unfortunately, few of these authors have had the opportunity to test their scales' psychometric properties and validity. This is concerning, as many theoretical assumptions may be founded upon scales with inadequate attributes. If this were found to be true, then previous findings in cyberpsychology studies would need to be retested, and future research would need to shift its focus to creating psychometrically sound and valid measures. To provide inferences on this concern, the current study examines the article reporting, scale creation, and scale reliabilities of every article published in Cyberpsychology, Behavior, and Social Networking from its inception to July 2014. The final data set encompassed the coding of 1,478 individual articles, including 921 scales, and spanning 17 years. The results demonstrate that the simple survey methodology has become more popular over time. Authors are gradually applying empirically tested scales. However, self-created measures are still the most popular, leading to concerns about the measures' validity. Also, the use of multi-item measures has increased over time, but many articles still fail to report adequate information to assess the reliability of the applied scales. Lastly, the average scale reliability is 0.81, which barely meets standard cutoffs. Overall, these results are not overly concerning, but suggestions are given on methods to improve the reporting of measures, the creation of scales, and the state of cyberpsychology.
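The scale reliabilities surveyed here are commonly computed as Cronbach's alpha; a minimal implementation on synthetic ratings is sketched below (rows are respondents, columns are scale items).

import numpy as np

def cronbach_alpha(items):
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(2)
trait = rng.normal(size=(200, 1))                         # latent construct
responses = trait + rng.normal(scale=0.8, size=(200, 4))  # 4 correlated items
print(round(cronbach_alpha(responses), 2))                # around 0.85 here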
Hjerpe, Per; Merlo, Juan; Ohlsson, Henrik; Bengtsson Boström, Kristina; Lindblad, Ulf
2010-04-23
In recent years, several primary care databases recording information from computerized medical records have been established and used for quality assessment of medical care and research. However, to be useful for research purposes, the data generated routinely from everyday practice require registration of high quality. In this study we aimed to investigate (i) the frequency and validity of ICD code and drug prescription registration in the new Skaraborg primary care database (SPCD) and (ii) to investigate the sources of variation in this registration. SPCD contains anonymous electronic medical records (ProfDoc III) automatically retrieved from all 24 public health care centres (HCC) in Skaraborg, Sweden. The frequencies of ICD code registration for the selected diagnoses diabetes mellitus, hypertension and chronic cardiovascular disease and the relevant drug prescriptions in the time period between May 2002 and October 2003 were analysed. The validity of data registration in the SPCD was assessed in a random sample of 50 medical records from each HCC (n = 1200 records) using the medical record text as gold standard. The variance of ICD code registration was studied with multi-level logistic regression analysis and expressed as median odds ratio (MOR). For diabetes mellitus and hypertension ICD codes were registered in 80-90% of cases, while for congestive heart failure and ischemic heart disease ICD codes were registered less often (60-70%). Drug prescription registration was overall high (88%). A correlation between the frequency of ICD coded visits and the sensitivity of the ICD code registration was found for hypertension and congestive heart failure but not for diabetes or ischemic heart disease. The frequency of ICD code registration varied from 42 to 90% between HCCs, and the greatest variation was found at the physician level (MOR for physicians = 4.2; MOR for HCCs = 2.3). Since the frequency of ICD code registration varies between different diagnoses, each diagnosis must be separately validated. Improved frequency and quality of ICD code registration might be achieved by interventions directed towards the physicians where the greatest amount of variation was found.
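The median odds ratio reported above translates the cluster-level variance of a multilevel logistic model onto the odds-ratio scale via MOR = exp(sqrt(2 x variance) x z0.75). The variances in the sketch below are back-calculated to reproduce the reported values and are illustrative only.

import math
from statistics import NormalDist

def median_odds_ratio(cluster_variance):
    z75 = NormalDist().inv_cdf(0.75)  # ~0.6745
    return math.exp(math.sqrt(2.0 * cluster_variance) * z75)

print(round(median_odds_ratio(2.26), 1))  # ~4.2, physician level
print(round(median_odds_ratio(0.77), 1))  # ~2.3, health care centre level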
Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry
Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen; Andersen, Nanna Skaarup; Nørgaard, Mette; Mogensen, Trine Hyrup
2016-01-01
Background Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR). Methods The DNPR was used to identify all hospitalized patients, aged ≥15 years, with a first-time diagnosis of HSE according to the International Classification of Diseases, tenth revision (ICD-10), from 2004 to 2014. To validate the coding of HSE, we collected data from the Danish Microbiology Database, from departments of clinical microbiology, and from patient medical records. Cases were classified as confirmed, probable, or no evidence of HSE. We estimated the positive predictive value (PPV) of the HSE diagnosis coding stratified by diagnosis type, study period, and department type. Furthermore, we estimated the proportion of HSE cases coded with nonspecific ICD-10 codes of viral encephalitis and also the sensitivity of the HSE diagnosis coding. Results We were able to validate 398 (94.3%) of the 422 HSE diagnoses identified via the DNPR. Of these, 202 (50.8%) were classified as confirmed cases and 29 (7.3%) as probable cases, providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0–62.9). For “Encephalitis due to herpes simplex virus” (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1–62.0). Similarly, the PPV for “Meningoencephalitis due to herpes simplex virus” (ICD-10 code B00.4A) was 56.8% (95% CI: 39.5–72.9). “Herpes viral encephalitis” (ICD-10 code G05.1E) had a PPV of 75.9% (95% CI: 56.5–89.7), thereby representing the highest PPV. The estimated sensitivity was 95.5%. Conclusion The PPVs of the ICD-10 diagnosis coding for adult HSE in the DNPR were relatively low. Hence, the DNPR should be used with caution when studying patients with encephalitis caused by herpes simplex virus. PMID:27330328
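A PPV with its 95% CI is simply a proportion with an interval estimate. The sketch below applies the Wilson score interval to the counts given in the abstract (202 confirmed + 29 probable of 398 validated diagnoses); the authors' exact interval method may differ slightly.

import math

def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1.0 + z**2 / n
    center = p + z**2 / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - half) / denom, (center + half) / denom

lo, hi = wilson_ci(202 + 29, 398)
print(f"PPV = {(202 + 29) / 398:.1%}, 95% CI = ({lo:.1%}, {hi:.1%})")  # ~53.1%-62.8%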
Cadieux, Geneviève; Tamblyn, Robyn; Buckeridge, David L; Dendukuri, Nandini
2017-08-01
Valid measurement of outcomes such as disease prevalence using health care utilization data is fundamental to the implementation of a "learning health system." Definitions of such outcomes can be complex, based on multiple diagnostic codes. The literature on validating such data demonstrates a lack of awareness of the need for a stratified sampling design and corresponding statistical methods. We propose a method for validating the measurement of diagnostic groups that have: (1) different prevalences of diagnostic codes within the group; and (2) low prevalence. We describe an estimation method whereby: (1) low-prevalence diagnostic codes are oversampled, and the positive predictive value (PPV) of the diagnostic group is estimated as a weighted average of the PPV of each diagnostic code; and (2) claims that fall within a low-prevalence diagnostic group are oversampled relative to claims that are not, and bias-adjusted estimators of sensitivity and specificity are generated. We illustrate our proposed method using an example from population health surveillance in which diagnostic groups are applied to physician claims to identify cases of acute respiratory illness. Failure to account for the prevalence of each diagnostic code within a diagnostic group leads to the underestimation of the PPV, because low-prevalence diagnostic codes are more likely to be false positives. Failure to adjust for oversampling of claims that fall within the low-prevalence diagnostic group relative to those that do not leads to the overestimation of sensitivity and underestimation of specificity.
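The proposed estimator can be sketched compactly: the PPV of a diagnostic group is the average of per-code PPVs weighted by each code's share of the group's claims, so that rare, oversampled codes do not dominate. Codes, shares, and PPVs below are illustrative.

# per_code maps ICD code -> (share of the group's claims, validated PPV)
def group_ppv(per_code):
    total = sum(share for share, _ in per_code.values())
    return sum(share * ppv for share, ppv in per_code.values()) / total

acute_resp = {
    "466.0": (0.70, 0.90),  # common code, high PPV
    "480.9": (0.25, 0.80),
    "487.1": (0.05, 0.40),  # rare code, oversampled for validation
}
print(round(group_ppv(acute_resp), 2))  # 0.85 vs a naive unweighted mean of 0.70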
Abraham, N S; Cohen, D C; Rivers, B; Richardson, P
2006-07-15
To validate Veterans Affairs (VA) administrative data for the diagnosis of nonsteroidal anti-inflammatory drug (NSAID)-related upper gastrointestinal events (UGIE) and to develop a diagnostic algorithm. A retrospective study of veterans prescribed an NSAID as identified from the national pharmacy database merged with in-patient and out-patient data, followed by primary chart abstraction. Contingency tables were constructed to allow comparison with a random sample of patients prescribed an NSAID, but without UGIE. Multivariable logistic regression analysis was used to derive a predictive algorithm. Once derived, the algorithm was validated in a separate cohort of veterans. Of 906 patients, 606 had a diagnostic code for UGIE; 300 were a random subsample of 11 744 patients (control). Only 161 had a confirmed UGIE. The positive predictive value (PPV) of diagnostic codes was poor, but improved from 27% to 51% with the addition of endoscopic procedural codes. The strongest predictors of UGIE were an in-patient ICD-9 code for gastric ulcer, duodenal ulcer and haemorrhage combined with upper endoscopy. This algorithm had a PPV of 73% when limited to patients ≥65 years (c-statistic 0.79). Validation of the algorithm revealed a PPV of 80% among patients with an overlapping NSAID prescription. NSAID-related UGIE can be assessed using VA administrative data. The optimal algorithm includes an in-patient ICD-9 code for gastric or duodenal ulcer and gastrointestinal bleeding combined with a procedural code for upper endoscopy.
Kolehmainen, Christine; Brennan, Meghan; Filut, Amarette; Isaac, Carol; Carnes, Molly
2014-01-01
Purpose Ineffective leadership during cardiopulmonary resuscitation (“code”) can negatively affect a patient’s likelihood of survival. In most teaching hospitals, internal medicine residents lead codes. In this study, the authors explored internal medicine residents’ experiences leading codes, with a particular focus on how gender influences the code leadership experience. Method The authors conducted individual, semi-structured telephone or in-person interviews with 25 residents (May 2012 to February 2013) from 9 U.S. internal medicine residency programs. They audio recorded and transcribed the interviews then thematically analyzed the transcribed text. Results Participants viewed a successful code as one with effective leadership. They agreed that the ideal code leader was an authoritative presence; spoke with a deep, loud voice; used clear, direct communication; and appeared calm. Although equally able to lead codes as their male colleagues, female participants described feeling stress from having to violate gender behavioral norms in the role of code leader. In response, some female participants adopted rituals to signal the suspension of gender norms while leading a code. Others apologized afterwards for their counter normative behavior. Conclusions Ideal code leadership embodies highly agentic, stereotypical male behaviors. Female residents employed strategies to better integrate the competing identities of code leader and female gender. In the future, residency training should acknowledge how female gender stereotypes may conflict with the behaviors required to enact code leadership and offer some strategies, such as those used by the female residents in this study, to help women integrate these dual identities. PMID:24979289
Overview of hypersonic CFD code calibration studies
NASA Technical Reports Server (NTRS)
Miller, Charles G.
1987-01-01
The topics are presented in viewgraph form and include the following: definitions of computational fluid dynamics (CFD) code validation; the climate in hypersonics and at LaRC when the first 'designed' CFD code calibration study was initiated; methodology from the experimentalist's perspective; hypersonic facilities; measurement techniques; and CFD code calibration studies.
Baiao, R; Baptista, J; Carneiro, A; Pinto, R; Toscano, C; Fearon, P; Soares, I; Mesquita, A R
2018-07-01
The preschool years are a period of great developmental achievements, which impact critically on a child's interactive skills. Having valid and reliable measures to assess interactive behaviour at this stage is therefore crucial. The aim of this study was to describe the adaptation and validation of the child coding of the Coding System for Mother-Child Interactions and discuss its applications and implications in future research and practice. Two hundred twenty Portuguese preschoolers and their mothers were videotaped during a structured task. Child and mother interactive behaviours were coded based on the task. Maternal reports on the child's temperament and emotional and behaviour problems were also collected, along with family psychosocial information. Interrater agreement was confirmed. The use of child Cooperation, Enthusiasm, and Negativity as subscales was supported by their correlations across tasks. Moreover, these subscales were correlated with each other, which supports the use of a global child interactive behaviour score. Convergent validity with a measure of emotional and behavioural problems (Child Behaviour Checklist 1½-5) was established, as well as divergent validity with a measure of temperament (Children's Behaviour Questionnaire-Short Form). Regarding associations with family variables, child interactive behaviour was only associated with maternal behaviour. Findings suggest that this coding system is a valid and reliable measure for assessing child interactive behaviour in preschool age children. It therefore represents an important alternative to this area of research and practice, with reduced costs and with more flexible training requirements. Attention should be given in future research to expanding this work to clinical populations and different age groups. © 2018 John Wiley & Sons Ltd.
Mertz, Marcel; Sofaer, Neema; Strech, Daniel
2014-09-27
The systematic review of reasons is a new way to obtain comprehensive information about specific ethical topics. One such review was carried out for the question of why post-trial access to trial drugs should or need not be provided. The objective of this study was to empirically validate this review using an author check method. The article also reports on methodological challenges faced by our study. We emailed a questionnaire to the 64 corresponding authors of those papers that were assessed in the review of reasons on post-trial access. The questionnaire consisted of all quotations ("reason mentions") that were identified by the review to represent a reason in a given author's publication, together with a set of codings for the quotations. The authors were asked to rate the correctness of the codings. We received 19 responses, from which only 13 were completed questionnaires. In total, 98 quotations and their related codes in the 13 questionnaires were checked by the addressees. For 77 quotations (79%), all codings were deemed correct, for 21 quotations (21%), some codings were deemed to need correction. Most corrections were minor and did not imply a complete misunderstanding of the citation. This first attempt to validate a review of reasons leads to four crucial methodological questions relevant to the future conduct of such validation studies: 1) How can a description of a reason be deemed incorrect? 2) Do the limited findings of this author check study enable us to determine whether the core results of the analysed SRR are valid? 3) Why did the majority of surveyed authors refrain from commenting on our understanding of their reasoning? 4) How can the method for validating reviews of reasons be improved?
NASA Technical Reports Server (NTRS)
Thompson, David E.
2005-01-01
Procedures and methods for verification of coding algebra and for validations of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.
CASL Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mousseau, Vincent Andrew; Dinh, Nam
2016-06-30
This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.
Assessing parental empathy: a role for empathy in child attachment.
Stern, Jessica A; Borelli, Jessica L; Smiley, Patricia A
2015-01-01
Although empathy has been associated with helping behavior and relationship quality, little research has evaluated the role of parental empathy in the development of parent-child relationships. The current study (1) establishes preliminary validity of the Parental Affective and Cognitive Empathy Scale (PACES), a method for coding empathy from parents' narrative responses to the Parent Development Interview - Revised for School-Aged Children (PDI-R-SC), and (2) tests a theoretical model of empathy and attachment. Sixty caregivers and their children completed a battery of questionnaire and interview measures, including the PDI-R-SC and the Child Attachment Interview (CAI). Caregivers' interview narratives were scored for empathy using PACES. PACES showed good interrater reliability and good convergent validity with a self-report empathy measure. Parent empathy was positively related to child attachment security (using a continuous score for narrative coherence) and emotional openness on the CAI, as well as to child perceptions of parental warmth. Moreover, parent empathy mediated the relation between parents' self-reported attachment style and their children's attachment security. Implications for attachment theory and future directions for establishing scale validity are discussed.
Benchmarking the ATLAS software through the Kit Validation engine
NASA Astrophysics Data System (ADS)
De Salvo, Alessandro; Brasolin, Franco
2010-04-01
The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of the multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help defining the performance metrics for the High Energy Physics applications, based on the real experiment software.
Validation and Continued Development of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2016-10-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.
Perugia, Giulia; van Berkel, Roos; Díaz-Boladeras, Marta; Català-Mallofré, Andreu; Rauterberg, Matthias; Barakova, Emilia
2018-01-01
Engagement in activities is of crucial importance for people with dementia. State-of-the-art assessment techniques rely exclusively on behavior observation to measure engagement in dementia. These techniques are either too general to grasp how engagement is naturally expressed through behavior or too complex to be traced back to an overall engagement state. We carried out a longitudinal study to develop a coding system of engagement-related behavior that could tackle these issues and to create an evidence-based model of engagement to make meaning of such a coding system. Fourteen elderly adults with mild to moderate dementia took part in the study. They were involved in two activities: a game-based cognitive stimulation and a robot-based free play. The coding system was developed with a mixed approach: ethographic and Laban-inspired. First, we developed two ethograms to describe the behavior of participants in the two activities in detail. Then, we used Laban Movement Analysis (LMA) to identify a common structure to the behaviors in the two ethograms and unify them in a unique coding system. The inter-rater reliability (IRR) of the coding system proved to be excellent for cognitive games (kappa = 0.78) and very good for robot play (kappa = 0.74). From the scoring of the videos, we developed an evidence-based model of engagement. This was based on the most frequent patterns of body part organization (i.e., the way body parts are connected in movement) observed during activities. Each pattern was given a meaning in terms of engagement by making reference to the literature. The model was tested using structural equation modeling (SEM). It achieved an excellent goodness of fit and all the hypothesized relations between variables were significant. We called the coding system that we developed the Ethographic and Laban-Inspired Coding System of Engagement (ELICSE) and the model the Evidence-based Model of Engagement-related Behavior (EMODEB). To the best of our knowledge, the ELICSE and the EMODEB constitute the first formalization of engagement-related behavior for dementia that describes how behavior unfolds over time and what it means in terms of engagement. PMID:29881360
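The inter-rater reliabilities above are reported as kappa; a minimal two-rater Cohen's kappa on synthetic segment codes is sketched below.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1.0 - expected)

a = list("EEENNEPEEN")  # e.g., Engaged / Neutral / Passive segment codes
b = list("EEENPEPEEN")
print(round(cohens_kappa(a, b), 2))  # 0.82 for these synthetic codes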
CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, III, F. G.
2016-07-29
One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface using the GoldSim software to the STADIUM® code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results, judged against the validation test results the code developers provided as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.
Validation Results for LEWICE 2.0 [Supplement]
NASA Technical Reports Server (NTRS)
Wright, William B.; Rutkowski, Adam
1999-01-01
Two CD-ROMs contain experimental ice shapes and code predictions used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes for both experiment and for LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.
Using Prospect Theory to Investigate Decision-Making Bias Within an Information Security Context
2005-12-01
Risk-averse choices (A) were coded as 0 and risk-seeking choices (B) as 1. A lower-tail test of the population proportion was used: Ho (indifferent risk behavior): p = .5; Ha (risk averse, thus significantly below .5): p < .5.
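A minimal sketch of the lower-tail test of a population proportion reconstructed above, using an exact binomial test; the counts are hypothetical, since the report's sample sizes are not recoverable from this excerpt:

```python
# Exact one-sided binomial test: Ho p = .5 (indifference) vs Ha p < .5 (risk averse).
from scipy.stats import binomtest

n_participants = 25
k_risk_seeking = 7   # hypothetical count of responses coded 1 (risk seeking)
result = binomtest(k_risk_seeking, n_participants, p=0.5, alternative="less")
print(f"p-value = {result.pvalue:.4f}")  # a small p-value supports risk aversion
```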
Colour cyclic code for Brillouin distributed sensors
NASA Astrophysics Data System (ADS)
Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne
2015-09-01
For the first time, a colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code presents an additional gain of √2 while keeping the same number of sequences as for a colour coding. A comparison with a standard BOTDA sensor is realized and validates the theoretical coding gain.
Validation of NASA Thermal Ice Protection Computer Codes. Part 3; The Validation of Antice
NASA Technical Reports Server (NTRS)
Al-Khalil, Kamel M.; Horvath, Charles; Miller, Dean R.; Wright, William B.
2001-01-01
An experimental program was generated by the Icing Technology Branch at NASA Glenn Research Center to validate two ice protection simulation codes: (1) LEWICE/Thermal for transient electrothermal de-icing and anti-icing simulations, and (2) ANTICE for steady state hot gas and electrothermal anti-icing simulations. An electrothermal ice protection system was designed and constructed integral to a 36 inch chord NACA0012 airfoil. The model was fully instrumented with thermocouples, RTDs, and heat flux gages. Tests were conducted at several icing environmental conditions during a two week period at the NASA Glenn Icing Research Tunnel. Experimental results of running-wet and evaporative cases were compared to the ANTICE computer code predictions and are presented in this paper.
Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van
2018-04-01
In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code in order to calculate the coincidence summing correction factor (CSF). The CSF results were validated by a deterministic method using the ETNA code for both p-type HPGe detectors, and showed good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" statuses in the proficiency test.
Fuel-injector/air-swirl characterization
NASA Technical Reports Server (NTRS)
Mcvey, J. B.; Kennedy, J. B.; Bennett, J. C.
1985-01-01
The objectives of this program are to establish an experimental data base documenting the behavior of gas turbine engine fuel injector sprays as the spray interacts with the swirling gas flow existing in the combustor dome, and to conduct an assessment of the validity of current analytical techniques for predicting fuel spray behavior. Emphasis is placed on the acquisition of data using injector/swirler components which closely resemble components currently in use in advanced aircraft gas turbine engines, conducting tests under conditions that closely simulate or approximate those developed in actual combustors, and conducting a well-controlled experimental effort combining low-risk experiments with experiments requiring state-of-the-art diagnostic instrumentation. Analysis of the data is to be conducted using an existing, TEACH-type code which employs a stochastic analysis of the motion of the dispersed phase in the turbulent continuum flow field.
Mehl, Matthias R.
2016-01-01
This article reviews the Electronically Activated Recorder or EAR as an ambulatory ecological momentary assessment tool for the real-world observation of daily behavior. Technically, the EAR is an audio recorder that intermittently records snippets of ambient sounds while participants go about their lives. Conceptually, it is a naturalistic observation method that yields an acoustic log of a person’s day as it unfolds. The power of the EAR lies in unobtrusively collecting authentic real-life observational data. In preserving a high degree of naturalism at the level of the raw recordings, it resembles ethnographic methods; through its sampling and coding, it enables larger empirical studies. The article provides an overview of the EAR method, reviews its validity, utility, and limitations, and discusses it in the context of current developments in ambulatory assessment, specifically the emerging field of mobile sensing. PMID:28529411
Guay, Stéphane; Nachar, Nadim; Lavoie, Marc E; Marchand, André; O'Connor, Kieron P
2017-01-01
Social support is one of the three strongest predictors of posttraumatic stress disorder (PTSD). In the present study, we aimed to assess the buffering power of overt socially supportive and unsupportive behaviors from the significant other, in a group with PTSD and a comparison group. A total of 46 individuals with PTSD and 42 individuals with obsessive-compulsive disorder (OCD) or panic disorder (PD) completed diagnostic interviews and an anxiety-oriented social interaction with a significant other. Heart rate of participants was continuously measured during this interaction, and overt social behaviors from the significant other were recorded on videotape and coded using a validated system. Changes in heart rate in PTSD participants correlated negatively with changes in overt socially supportive behaviors from their significant other (r from -.36 to -.50, p < .05), while changes in overt unsupportive social behaviors from their significant other did not yield any significant correlation (r from -.01 to .05, p > .05). No such statistically significant association emerged in the group with OCD or PD (r from .01 to -.27, p > .05). This study supports the buffering effect of overt supportive behaviors from the significant other on heart rate changes in PTSD.
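A minimal sketch of the correlational analysis reported above, assuming hypothetical per-participant change scores; a negative r mirrors the reported buffering pattern:

```python
# Pearson correlation between heart-rate change and change in overt support.
from scipy.stats import pearsonr

hr_change = [4.2, -1.5, 3.8, 0.9, -2.0, 5.1, -0.4, 2.2]        # beats/min
support_change = [-1.0, 2.5, -2.2, 0.1, 3.0, -2.8, 0.7, -1.1]  # coded behavior units

r, p = pearsonr(hr_change, support_change)
print(f"r = {r:.2f}, p = {p:.3f}")  # more support, smaller heart-rate increase
```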
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Obando, Rodrigo A.
1993-01-01
The modeling and design of a fault-tolerant multiprocessor system is addressed. In particular, the behavior of the system during recovery and restoration after a fault has occurred is investigated. Given that a multicomputer system is designed using the Algorithm to Architecture Mapping Model (ATAMM), and that a fault (death of a computing resource) occurs during its normal steady-state operation, a model is presented as a viable research tool for predicting the performance bounds of the system during its recovery and restoration phases. Furthermore, the bounds of the performance behavior of the system during this transient mode can be assessed. These bounds include: time to recover from the fault (t(sub rec)), time to restore the system (t(sub res)), and whether there is a permanent delay in the system's Time Between Input and Output (TBIO) after the system has reached a steady state. An implementation of an ATAMM based computer was developed with the Generic VHSIC Spaceborne Computer (GVSC) as the target system. A simulation of the GVSC was also written based on the code used in the ATAMM Multicomputer Operating System (AMOS). The simulation is in turn used to validate the new model's usefulness and accuracy in tracking the propagation of the delay through the system and predicting its behavior in the transient state of recovery and restoration. The model is validated as an accurate method to predict the transient behavior of an ATAMM based multicomputer during recovery and restoration.
Summary of papers on current and anticipated uses of thermal-hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines or structure the code so that the "user effect" is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; and build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).
Initial verification and validation of RAZORBACK - A research reactor transient analysis code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2015-09-01
This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adamek, Julian; Daverio, David; Durrer, Ruth
We present a new N-body code, gevolution, for the evolution of large scale structure in the Universe. Our code is based on a weak field expansion of General Relativity and calculates all six metric degrees of freedom in Poisson gauge. N-body particles are evolved by solving the geodesic equation, which we write in terms of a canonical momentum such that it remains valid also for relativistic particles. We validate the code by considering the Schwarzschild solution and, in the Newtonian limit, by comparing with the Newtonian N-body codes Gadget-2 and RAMSES. We then proceed with a simulation of large scale structure in a Universe with massive neutrinos where we study the gravitational slip induced by the neutrino shear stress. The code can be extended to include different kinds of dark energy or modified gravity models, going beyond the usually adopted quasi-static approximation. Our code is publicly available.
Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Garg, Vijay; Ameri, Ali
2005-01-01
The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validation of the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.
Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems
NASA Astrophysics Data System (ADS)
Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.
2008-08-01
This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners, active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.
SMART (Sandia's Modular Architecture for Robotics and Teleoperation) Ver. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Robert
"SMART Ver. 0.8 Beta" provides a system developer with software tools to create a telerobotic control system, i.e., a system whereby an end-user can interact with mechatronic equipment. It consists of three main components: the SMART Editor (tsmed), the SMART Real-time kernel (rtos), and the SMART Supervisor (gui). The SMART Editor is a graphical icon-based code generation tool for creating end-user systems, given descriptions of SMART modules. The SMART real-time kernel implements behaviors that combine modules representing input devices, sensors, constraints, filters, and robotic devices. Included with this software release is a number of core modules, which can be combinedmore » with additional project and device specific modules to create a telerobotic controller. The SMART Supervisor is a graphical front-end for running a SMART system. It is an optional component of the SMART Environment and utilizes the TeVTk windowing and scripting environment. Although the code contained within this release is complete, and can be utilized for defining, running, and interfacing to a sample end-user SMART system, most systems will include additional project and hardware specific modules developed either by the system developer or obtained independently from a SMART module developer. SMART is a software system designed to integrate the different robots, input devices, sensors and dynamic elements required for advanced modes of telerobotic control. "SMART Ver. 0.8 Beta" defines and implements a telerobotic controller. A telerobotic system consists of combinations of modules that implement behaviors. Each real-time module represents an input device, robot device, sensor, constraint, connection or filter. The underlying theory utilizes non-linear discretized multidimensional network elements to model each individual module, and guarantees that upon a valid connection, the resulting system will perform in a stable fashion. Different combinations of modules implement different behaviors. Each module must have at a minimum an initialization routine, a parameter adjustment routine, and an update routine. The SMART runtime kernel runs continuously within a real-time embedded system. Each module is first set-up by the kernel, initialized, and then updated at a fixed rate whenever it is in context. The kernel responds to operator directed commands by changing the state of the system, changing parameters on individual modules, and switching behavioral modes. The SMART Editor is a tool used to define, verify, configure and generate source code for a SMART control system. It uses icon representations of the modules, code patches from valid configurations of the modules, and configuration files describing how a module can be connected into a system to lead the end-user in through the steps needed to create a final system. The SMART Supervisor serves as an interface to a SMART run-time system. It provides an interface on a host computer that connects to the embedded system via TCPIIP ASCII commands. It utilizes a scripting language (Tel) and a graphics windowing environment (Tk). This system can either be customized to fit an end-user's needs or completely replaced as needed.« less
Lorencatto, Fabiana; West, Robert; Seymour, Natalie; Michie, Susan
2013-06-01
There is a difference between interventions as planned and as delivered in practice. Unless we know what was actually delivered, we cannot understand "what worked" in effective interventions. This study aimed to (a) assess whether an established taxonomy of 53 smoking cessation behavior change techniques (BCTs) may be applied or adapted as a method for reliably specifying the content of smoking cessation behavioral support consultations and (b) develop an effective method for training researchers and practitioners in the reliable application of the taxonomy. Fifteen transcripts of audio-recorded consultations delivered by England's Stop Smoking Services were coded into component BCTs using the taxonomy. Interrater reliability and potential adaptations to the taxonomy to improve coding were discussed following 3 coding waves. A coding training manual was developed through expert consensus and piloted on 10 trainees, assessing coding reliability and self-perceived competence before and after training. An average of 33 BCTs from the taxonomy were identified at least once across sessions and coding waves. Consultations contained on average 12 BCTs (range = 8-31). Average interrater reliability was high (88% agreement). The taxonomy was adapted to simplify coding by merging co-occurring BCTs and refining BCT definitions. Coding reliability and self-perceived competence significantly improved posttraining for all trainees. It is possible to apply a taxonomy to reliably identify and classify BCTs in smoking cessation behavioral support delivered in practice, and train inexperienced coders to do so reliably. This method can be used to investigate variability in provision of behavioral support across services, monitor fidelity of delivery, and identify training needs.
Greenberg, Jacob K; Ladner, Travis R; Olsen, Margaret A; Shannon, Chevis N; Liu, Jingxia; Yarbrough, Chester K; Piccirillo, Jay F; Wellons, John C; Smyth, Matthew D; Park, Tae Sung; Limbrick, David D
2015-08-01
The use of administrative billing data may enable large-scale assessments of treatment outcomes for Chiari Malformation type I (CM-1). However, to utilize such data sets, validated International Classification of Diseases, Ninth Revision (ICD-9-CM) code algorithms for identifying CM-1 surgery are needed. To validate 2 ICD-9-CM code algorithms identifying patients undergoing CM-1 decompression surgery. We retrospectively analyzed the validity of 2 ICD-9-CM code algorithms for identifying adult CM-1 decompression surgery performed at 2 academic medical centers between 2001 and 2013. Algorithm 1 included any discharge diagnosis code of 348.4 (CM-1), as well as a procedure code of 01.24 (cranial decompression) or 03.09 (spinal decompression, or laminectomy). Algorithm 2 restricted this group to patients with a primary diagnosis of 348.4. The positive predictive value (PPV) and sensitivity of each algorithm were calculated. Among 340 first-time admissions identified by Algorithm 1, the overall PPV for CM-1 decompression was 65%. Among the 214 admissions identified by Algorithm 2, the overall PPV was 99.5%. The PPV for Algorithm 1 was lower in the Vanderbilt (59%) cohort, males (40%), and patients treated between 2009 and 2013 (57%), whereas the PPV of Algorithm 2 remained high (≥99%) across subgroups. The sensitivity of Algorithms 1 (86%) and 2 (83%) were above 75% in all subgroups. ICD-9-CM code Algorithm 2 has excellent PPV and good sensitivity to identify adult CM-1 decompression surgery. These results lay the foundation for studying CM-1 treatment outcomes by using large administrative databases.
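The validation arithmetic behind such algorithms is straightforward to reproduce. A minimal sketch using the figures quoted above; the true-positive counts are rounded assumptions derived from the reported rates:

```python
# PPV and sensitivity for an administrative-code case-finding algorithm.
def ppv(true_pos: int, flagged: int) -> float:
    """Positive predictive value: confirmed cases among flagged admissions."""
    return true_pos / flagged

def sensitivity(true_pos: int, gold_standard_cases: int) -> float:
    """Share of all gold-standard cases that the algorithm captures."""
    return true_pos / gold_standard_cases

print(f"Algorithm 1 PPV: {ppv(221, 340):.0%}")  # ~65% of 340 flagged admissions
print(f"Algorithm 2 PPV: {ppv(213, 214):.1%}")  # ~99.5% of 214 flagged admissions
```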
ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes
NASA Astrophysics Data System (ADS)
Yuan, Gary; Gygi, Francois
2011-03-01
ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT, and the Exciting code, with support planned for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of V&V ESTEST servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.
Supersonic Coaxial Jet Experiment for CFD Code Validation
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Carty, A. A.; Doerner, S. E.; Diskin, G. S.; Drummond, J. P.
1999-01-01
A supersonic coaxial jet facility has been designed to provide experimental data suitable for the validation of CFD codes used to analyze high-speed propulsion flows. The center jet is of a light gas and the coflow jet is of air, and the mixing layer between them is compressible. Various methods have been employed in characterizing the jet flow field, including schlieren visualization, pitot, total temperature and gas sampling probe surveying, and RELIEF velocimetry. A Navier-Stokes code has been used to calculate the nozzle flow field and the results compared to the experiment.
Effects of Inlet Distortion on Aeromechanical Stability of a Forward-Swept High-Speed Fan
NASA Technical Reports Server (NTRS)
Herrick, Gregory P.
2011-01-01
Concerns regarding noise, propulsive efficiency, and fuel burn are inspiring aircraft designs wherein the propulsive turbomachines are partially (or fully) embedded within the airframe; such designs present serious concerns with regard to aerodynamic and aeromechanic performance of the compression system in response to inlet distortion. Separately, a forward-swept high-speed fan was developed to address noise concerns of modern podded turbofans; however this fan encounters aeroelastic instability (flutter) as it approaches stall. A three-dimensional, unsteady, Navier-Stokes computational fluid dynamics code is applied to analyze and corroborate fan performance with clean inlet flow. This code, already validated in its application to assess aerodynamic damping of vibrating blades at various flow conditions, is modified and then applied in a computational study to preliminarily assess the effects of inlet distortion on aeroelastic stability of the fan. Computational engineering application and implementation issues are discussed, followed by an investigation into the aeroelastic behavior of the fan with clean and distorted inlets.
Modeling Vortex Generators in the Wind-US Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2010-01-01
A source term model which simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force which would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, supersonic flow in a rectangular duct with a counterrotating vortex generator pair, and subsonic flow in an S-duct with 22 co-rotating vortex generators. The validation results indicate that the source term vortex generator model provides a useful tool for screening vortex generator configurations and gives comparable results to solutions computed using a gridded vane.
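A minimal sketch of a vane-type lift source term of the kind described above, assuming thin-airfoil behavior; the function, lift-curve slope, and inputs are illustrative assumptions, not the Wind-US implementation:

```python
# Lift force from local dynamic pressure, vane planform area, and incidence.
import numpy as np

def vg_lift_force(rho, u_local, planform_area, incidence_rad, c_l_alpha=2 * np.pi):
    """Estimate the lift force a vane vortex generator adds to tagged cells."""
    q = 0.5 * rho * np.dot(u_local, u_local)  # local dynamic pressure
    return q * planform_area * c_l_alpha * incidence_rad

# e.g., sea-level air at 60 m/s over a small vane at 16 degrees incidence
print(vg_lift_force(1.2, np.array([60.0, 0.0, 0.0]), 4e-4, np.radians(16.0)))
```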
Analysis of xRAGE and flag high explosive burn models with PBX 9404 cylinder tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrier, Danielle; Andersen, Kyle Richard
High explosives are energetic materials that release their chemical energy in a short interval of time. They are able to generate extreme heat and pressure by a shock-driven chemical decomposition reaction, which makes them valuable tools that must be understood. This study investigated the accuracy and performance of two Los Alamos National Laboratory hydrodynamic codes, which are used to determine the behavior of explosives within a variety of systems: xRAGE, which utilizes an Eulerian mesh, and FLAG, which utilizes a Lagrangian mesh. Various programmed and reactive burn models within both codes were tested using a copper cylinder expansion test. The test was based on a recent experimental setup which contained the plastic bonded explosive PBX 9404. Detonation velocity versus time curves for this explosive were obtained using Photon Doppler Velocimetry (PDV). The modeled results from each of the burn models tested were then compared to one another and to the experimental results. This study validated the burn models against the measured cylinder expansion data.
Scaled experiments of explosions in cavities
Grun, J.; Cranch, G. A.; Lunsford, R.; ...
2016-05-11
Consequences of an explosion inside an air-filled cavity under the earth's surface are partly duplicated in a laboratory experiment on spatial scales 1000 times smaller. The experiment measures shock pressures coupled into a block of material by an explosion inside a gas-filled cavity therein. The explosion is generated by suddenly heating a thin foil located near the cavity center with a short laser pulse, which turns the foil into expanding plasma, most of whose energy drives a blast wave in the cavity gas. Variables in the experiment are the cavity radius and explosion energy. Measurements and GEODYN code simulations show that shock pressures measured in the block exhibit a weak dependence on scaled cavity radius up to ~25 m/kt^(1/3), above which they decrease rapidly. Possible mechanisms giving rise to this behavior are described. Finally, the applicability of this work to validating codes used to simulate full-scale cavity explosions is discussed.
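The "scaled cavity radius" above follows cube-root (yield) scaling. A minimal sketch with assumed inputs:

```python
# Cube-root scaling: cavity radius normalized by the cube root of yield.
def scaled_radius(radius_m: float, yield_kt: float) -> float:
    """Return the scaled cavity radius in m/kt^(1/3)."""
    return radius_m / yield_kt ** (1.0 / 3.0)

# a 25 m cavity with a 1 kt explosion sits at the ~25 m/kt^(1/3) threshold
# above which the measured shock pressures begin to fall off rapidly
print(scaled_radius(25.0, 1.0))
```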
Civil Behavior, Safe-School Planning, and Dress Codes
ERIC Educational Resources Information Center
Studak, Cathryn M.; Workman, Jane E.
2007-01-01
This research examined news reports in order to identify incidents that precipitated dress code revisions. News reports were examined within the framework of rules for civil behavior. Using key words "school dress codes" and "violence," LEXIS/NEXIS was used to access 104 articles from 44 U.S. newspapers from December 3, 2004 to December 2, 2005.…
A Correlational Study: Code of Ethics in Testing and EFL Instructors' Professional Behavior
ERIC Educational Resources Information Center
Ashraf, Hamid; Kafi, Zahra; Saeedan, Azaam
2018-01-01
The present study aimed at examining the code of ethics in testing in English language institutions to see how far adherence to these ethical codes results in EFL teachers' professional behavior. Therefore, 300 EFL instructors teaching at English language schools in Khorasan Razavi Province, Zabansara Language School, as well as Khorasan…
Methodology, status and plans for development and assessment of Cathare code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bestion, D.; Barre, F.; Faydide, B.
1997-07-01
This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, along with the general strategy used for the development and the assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and the validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for the future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the modeling for the 3-D model.
Accuracy of external cause-of-injury coding in VA polytrauma patient discharge records.
Carlson, Kathleen F; Nugent, Sean M; Grill, Joseph; Sayer, Nina A
2010-01-01
Valid and efficient methods of identifying the etiology of treated injuries are critical for characterizing patient populations and developing prevention and rehabilitation strategies. We examined the accuracy of external cause-of-injury codes (E-codes) in Veterans Health Administration (VHA) administrative data for a population of injured patients. Chart notes and E-codes were extracted for 566 patients treated at any one of four VHA Polytrauma Rehabilitation Center sites between 2001 and 2006. Two expert coders, blinded to VHA E-codes, used chart notes to assign "gold standard" E-codes to injured patients. The accuracy of VHA E-coding was examined based on these gold standard E-codes. Only 382 of 517 (74%) injured patients were assigned E-codes in VHA records. Sensitivity of VHA E-codes varied significantly by site (range: 59%-91%, p < 0.001). Sensitivity was highest for combat-related injuries (81%) and lowest for fall-related injuries (60%). Overall specificity of E-codes was high (92%). E-coding accuracy was markedly higher when we restricted analyses to records that had been assigned VHA E-codes. E-codes may not be valid for ascertaining source-of-injury data for all injuries among VHA rehabilitation inpatients at this time. Enhanced training and policies may ensure more widespread, standardized use and accuracy of E-codes for injured veterans treated in the VHA.
TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, G.J.; Pruess
1992-11-01
The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications. Good and efficient numerical performance, as well as accurate results, were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.
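A minimal sketch of the 'verification' step described above: comparing code output against an analytical solution via a relative L2 error norm (the stand-in solution is illustrative):

```python
# Relative L2 discrepancy between a numerical and an analytical solution.
import numpy as np

def rel_l2_error(numerical: np.ndarray, analytical: np.ndarray) -> float:
    return np.linalg.norm(numerical - analytical) / np.linalg.norm(analytical)

t = np.linspace(0.1, 1.0, 10)
analytic = 1.0 / np.sqrt(t)                        # stand-in similarity solution
numeric = analytic * (1.0 + 0.01 * np.sin(5 * t))  # code output with ~1% deviation
print(f"relative L2 error = {rel_l2_error(numeric, analytic):.4f}")
```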
NASA Radiation Protection Research for Exploration Missions
NASA Technical Reports Server (NTRS)
Wilson, John W.; Cucinotta, Francis A.; Tripathi, Ram K.; Heinbockel, John H.; Tweed, John; Mertens, Christopher J.; Walker, Steve A.; Blattnig, Steven R.; Zeitlin, Cary J.
2006-01-01
The HZETRN code was used in recent trade studies for renewed lunar exploration and is currently used in engineering development of the next generation of space vehicles, habitats, and EVA equipment. A new version of the HZETRN code capable of simulating high charge and energy (HZE) ions, light ions and neutrons with either laboratory or space boundary conditions with enhanced neutron and light-ion propagation is under development. Atomic and nuclear model requirements to support that development will be discussed. Such engineering design codes require establishing validation processes using laboratory ion beams and space flight measurements in realistic geometries. We discuss limitations of code validation due to the currently available data and recommend priorities for new data sets.
Validation of Multitemperature Nozzle Flow Code
NASA Technical Reports Server (NTRS)
Park, Chul; Lee, Seung-Ho
1994-01-01
A computer code, NOZNT (NOZzle in N-Temperatures), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against three existing sets of experimental data taken in arcjet wind tunnels. The code accounts for the differences among various temperatures, i.e., translational-rotational temperature, vibrational temperatures of individual molecular species, and electron-electronic temperature, and the effects of impurities. The experimental data considered are (1) the spectroscopic emission data; (2) electron beam data on vibrational temperature; and (3) mass-spectrometric species concentration data. It is shown that the impurities are inconsequential for the arcjet flows, and the NOZNT code is validated by numerically reproducing the experimental data.
Mahajan, Anubha; Wessel, Jennifer; Willems, Sara M; Zhao, Wei; Robertson, Neil R; Chu, Audrey Y; Gan, Wei; Kitajima, Hidetoshi; Taliun, Daniel; Rayner, N William; Guo, Xiuqing; Lu, Yingchang; Li, Man; Jensen, Richard A; Hu, Yao; Huo, Shaofeng; Lohman, Kurt K; Zhang, Weihua; Cook, James P; Prins, Bram Peter; Flannick, Jason; Grarup, Niels; Trubetskoy, Vassily Vladimirovich; Kravic, Jasmina; Kim, Young Jin; Rybin, Denis V; Yaghootkar, Hanieh; Müller-Nurasyid, Martina; Meidtner, Karina; Li-Gao, Ruifang; Varga, Tibor V; Marten, Jonathan; Li, Jin; Smith, Albert Vernon; An, Ping; Ligthart, Symen; Gustafsson, Stefan; Malerba, Giovanni; Demirkan, Ayse; Tajes, Juan Fernandez; Steinthorsdottir, Valgerdur; Wuttke, Matthias; Lecoeur, Cécile; Preuss, Michael; Bielak, Lawrence F; Graff, Marielisa; Highland, Heather M; Justice, Anne E; Liu, Dajiang J; Marouli, Eirini; Peloso, Gina Marie; Warren, Helen R; Afaq, Saima; Afzal, Shoaib; Ahlqvist, Emma; Almgren, Peter; Amin, Najaf; Bang, Lia B; Bertoni, Alain G; Bombieri, Cristina; Bork-Jensen, Jette; Brandslund, Ivan; Brody, Jennifer A; Burtt, Noël P; Canouil, Mickaël; Chen, Yii-Der Ida; Cho, Yoon Shin; Christensen, Cramer; Eastwood, Sophie V; Eckardt, Kai-Uwe; Fischer, Krista; Gambaro, Giovanni; Giedraitis, Vilmantas; Grove, Megan L; de Haan, Hugoline G; Hackinger, Sophie; Hai, Yang; Han, Sohee; Tybjærg-Hansen, Anne; Hivert, Marie-France; Isomaa, Bo; Jäger, Susanne; Jørgensen, Marit E; Jørgensen, Torben; Käräjämäki, Annemari; Kim, Bong-Jo; Kim, Sung Soo; Koistinen, Heikki A; Kovacs, Peter; Kriebel, Jennifer; Kronenberg, Florian; Läll, Kristi; Lange, Leslie A; Lee, Jung-Jin; Lehne, Benjamin; Li, Huaixing; Lin, Keng-Hung; Linneberg, Allan; Liu, Ching-Ti; Liu, Jun; Loh, Marie; Mägi, Reedik; Mamakou, Vasiliki; McKean-Cowdin, Roberta; Nadkarni, Girish; Neville, Matt; Nielsen, Sune F; Ntalla, Ioanna; Peyser, Patricia A; Rathmann, Wolfgang; Rice, Kenneth; Rich, Stephen S; Rode, Line; Rolandsson, Olov; Schönherr, Sebastian; Selvin, Elizabeth; Small, Kerrin S; Stančáková, Alena; Surendran, Praveen; Taylor, Kent D; Teslovich, Tanya M; Thorand, Barbara; Thorleifsson, Gudmar; Tin, Adrienne; Tönjes, Anke; Varbo, Anette; Witte, Daniel R; Wood, Andrew R; Yajnik, Pranav; Yao, Jie; Yengo, Loïc; Young, Robin; Amouyel, Philippe; Boeing, Heiner; Boerwinkle, Eric; Bottinger, Erwin P; Chowdhury, Rajiv; Collins, Francis S; Dedoussis, George; Dehghan, Abbas; Deloukas, Panos; Ferrario, Marco M; Ferrières, Jean; Florez, Jose C; Frossard, Philippe; Gudnason, Vilmundur; Harris, Tamara B; Heckbert, Susan R; Howson, Joanna M M; Ingelsson, Martin; Kathiresan, Sekar; Kee, Frank; Kuusisto, Johanna; Langenberg, Claudia; Launer, Lenore J; Lindgren, Cecilia M; Männistö, Satu; Meitinger, Thomas; Melander, Olle; Mohlke, Karen L; Moitry, Marie; Morris, Andrew D; Murray, Alison D; de Mutsert, Renée; Orho-Melander, Marju; Owen, Katharine R; Perola, Markus; Peters, Annette; Province, Michael A; Rasheed, Asif; Ridker, Paul M; Rivadineira, Fernando; Rosendaal, Frits R; Rosengren, Anders H; Salomaa, Veikko; Sheu, Wayne H-H; Sladek, Rob; Smith, Blair H; Strauch, Konstantin; Uitterlinden, André G; Varma, Rohit; Willer, Cristen J; Blüher, Matthias; Butterworth, Adam S; Chambers, John Campbell; Chasman, Daniel I; Danesh, John; van Duijn, Cornelia; Dupuis, Josée; Franco, Oscar H; Franks, Paul W; Froguel, Philippe; Grallert, Harald; Groop, Leif; Han, Bok-Ghee; Hansen, Torben; Hattersley, Andrew T; Hayward, Caroline; Ingelsson, Erik; Kardia, Sharon L R; Karpe, Fredrik; Kooner, Jaspal Singh; Köttgen, Anna; 
Kuulasmaa, Kari; Laakso, Markku; Lin, Xu; Lind, Lars; Liu, Yongmei; Loos, Ruth J F; Marchini, Jonathan; Metspalu, Andres; Mook-Kanamori, Dennis; Nordestgaard, Børge G; Palmer, Colin N A; Pankow, James S; Pedersen, Oluf; Psaty, Bruce M; Rauramaa, Rainer; Sattar, Naveed; Schulze, Matthias B; Soranzo, Nicole; Spector, Timothy D; Stefansson, Kari; Stumvoll, Michael; Thorsteinsdottir, Unnur; Tuomi, Tiinamaija; Tuomilehto, Jaakko; Wareham, Nicholas J; Wilson, James G; Zeggini, Eleftheria; Scott, Robert A; Barroso, Inês; Frayling, Timothy M; Goodarzi, Mark O; Meigs, James B; Boehnke, Michael; Saleheen, Danish; Morris, Andrew P; Rotter, Jerome I; McCarthy, Mark I
2018-04-01
We aggregated coding variant data for 81,412 type 2 diabetes cases and 370,832 controls of diverse ancestry, identifying 40 coding variant association signals (P < 2.2 × 10^-7); of these, 16 map outside known risk-associated loci. We make two important observations. First, only five of these signals are driven by low-frequency variants: even for these, effect sizes are modest (odds ratio ≤1.29). Second, when we used large-scale genome-wide association data to fine-map the associated variants in their regional context, accounting for the global enrichment of complex trait associations in coding sequence, compelling evidence for coding variant causality was obtained for only 16 signals. At 13 others, the associated coding variants clearly represent 'false leads' with potential to generate erroneous mechanistic inference. Coding variant associations offer a direct route to biological insight for complex diseases and identification of validated therapeutic targets; however, appropriate mechanistic inference requires careful specification of their causal contribution to disease predisposition.
Preliminary SAGE Simulations of Volcanic Jets Into a Stratified Atmosphere
NASA Astrophysics Data System (ADS)
Peterson, A. H.; Wohletz, K. H.; Ogden, D. E.; Gisler, G. R.; Glatzmaier, G. A.
2007-12-01
The SAGE (SAIC Adaptive Grid Eulerian) code employs adaptive mesh refinement in solving Eulerian equations of complex fluid flow, which is desirable for simulation of volcanic eruptions. The goal of modeling volcanic eruptions is to better develop a code's predictive capabilities in order to understand the dynamics that govern the overall behavior of real eruption columns. To achieve this goal, we focus on the dynamics of underexpanded jets, one of the fundamental physical processes important to explosive eruptions. Previous simulations of laboratory jets modeled in cylindrical coordinates were benchmarked with simulations in CFDLib (Los Alamos National Laboratory), which solves the full Navier-Stokes equations (including the viscous stress tensor), and showed close agreement, indicating that the adaptive mesh refinement used in SAGE may offset the need for explicit calculation of viscous dissipation. We compare gas density contours of these previous simulations, with the same initial conditions in cylindrical and Cartesian geometries, to laboratory experiments to determine both the validity of the model and the robustness of the code. The SAGE results in both geometries are within several percent of the experiments for position and density of the incident (intercepting) and reflected shocks, slip lines, shear layers, and Mach disk. To expand our study into a volcanic regime, we simulate large-scale jets in a stratified atmosphere to establish the code's ability to model a sustained jet into a stable atmosphere.
Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...
2018-06-14
Historically, radiation transport codes have uncorrelated fission emissions. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
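A minimal sketch of the Feynman-Y statistic underlying the Feynman histograms named above, computed from hypothetical gate-wise counts; a purely Poisson (uncorrelated) source gives Y near zero, while correlated fission chains push Y above zero:

```python
# Feynman-Y: excess variance-to-mean ratio of counts in fixed time gates.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=4.0, size=10_000)  # hypothetical counts per gate

y = counts.var() / counts.mean() - 1.0
print(f"Feynman-Y = {y:.3f}")  # ~0 here; correlated sources yield Y > 0
```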
Overview of NASA Multi-dimensional Stirling Convertor Code Development and Validation Effort
NASA Technical Reports Server (NTRS)
Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David
2002-01-01
A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code with the goals of improving loss predictions and identifying component areas for improvements. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU and validation of the code against test data are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experiences have demonstrated large performance gains by varying manifolds and heat exchanger designs to improve flow distributions in the heat exchangers; 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, sensitivity of performance to slight changes in internal geometry, and, in general, the understanding of various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors and moving boundary capability. Preliminary attempts at validation of CFD-ACE models of MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in development of regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this effort.
Toward Supersonic Retropropulsion CFD Validation
NASA Technical Reports Server (NTRS)
Kleb, Bil; Schauerhamer, D. Guy; Trumble, Kerry; Sozer, Emre; Barnhardt, Michael; Carlson, Jan-Renee; Edquist, Karl
2011-01-01
This paper begins the process of verifying and validating computational fluid dynamics (CFD) codes for supersonic retropropulsive flows. Four CFD codes (DPLR, FUN3D, OVERFLOW, and US3D) are used to perform various numerical and physical modeling studies toward the goal of comparing predictions with a wind tunnel experiment specifically designed to support CFD validation. Numerical studies run the gamut in rigor from code-to-code comparisons to observed order-of-accuracy tests. Results indicate that for this complex flowfield, involving time-dependent shocks and vortex shedding, the design order of accuracy is not clearly evident. Also explored is the extent of physical modeling necessary to predict the salient flowfield features found in high-speed Schlieren images and surface pressure measurements taken during the validation experiment. Physical modeling studies include geometric items such as wind tunnel wall and sting mount interference, as well as turbulence modeling that ranges from a RANS (Reynolds-Averaged Navier-Stokes) 2-equation model to DES (Detached Eddy Simulation) models. These studies indicate that tunnel wall interference is minimal for the cases investigated; model mounting hardware effects are confined to the aft end of the model; and sparse grid resolution and turbulence modeling can damp or entirely dissipate the unsteadiness of this self-excited flow.
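A minimal sketch of an observed order-of-accuracy test like those mentioned above, using solutions from three grids refined by a constant ratio r; the sample values are constructed, not taken from the study:

```python
# Richardson-style observed order of accuracy from three grid levels.
import math

def observed_order(f_coarse: float, f_medium: float, f_fine: float, r: float = 2.0) -> float:
    """p = ln|(f_coarse - f_medium) / (f_medium - f_fine)| / ln(r)."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

# constructed drag coefficients converging at second order
print(f"observed order p = {observed_order(1.250, 1.2125, 1.203125):.2f}")  # ~2.00
```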
Validation of a multi-layer Green's function code for ion beam transport
NASA Astrophysics Data System (ADS)
Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence
To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiations is needed. In consequence, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and like its predecessor is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and down shift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.
ERIC Educational Resources Information Center
Minahan, Jessica
2014-01-01
Since its publication in 2012, "The Behavior Code: A Practical Guide to Understanding and Teaching the Most Challenging Students" has helped countless classroom teachers, special educators, and others implement an effective, new approach to teaching focused on skill-building, practical interventions, and purposeful, positive interactions…
ERIC Educational Resources Information Center
Sweeny, Timothy D.; Haroz, Steve; Whitney, David
2013-01-01
Many species, including humans, display group behavior. Thus, perceiving crowds may be important for social interaction and survival. Here, we provide the first evidence that humans use ensemble-coding mechanisms to perceive the behavior of a crowd of people with surprisingly high sensitivity. Observers estimated the headings of briefly presented…
Fiber Optic Distributed Sensors for High-resolution Temperature Field Mapping.
Lomperski, Stephen; Gerardi, Craig; Lisowski, Darius
2016-11-07
The reliability of computational fluid dynamics (CFD) codes is checked by comparing simulations with experimental data. A typical data set consists chiefly of velocity and temperature readings, both ideally having high spatial and temporal resolution to facilitate rigorous code validation. While high resolution velocity data is readily obtained through optical measurement techniques such as particle image velocimetry, it has proven difficult to obtain temperature data with similar resolution. Traditional sensors such as thermocouples cannot fill this role, but the recent development of distributed sensing based on Rayleigh scattering and swept-wave interferometry offers resolution suitable for CFD code validation work. Thousands of temperature measurements can be generated along a single thin optical fiber at hundreds of Hertz. Sensors function over large temperature ranges and within opaque fluids where optical techniques are unsuitable. But this type of sensor is sensitive to strain and humidity as well as temperature and so accuracy is affected by handling, vibration, and shifts in relative humidity. Such behavior is quite unlike traditional sensors and so unconventional installation and operating procedures are necessary to ensure accurate measurements. This paper demonstrates implementation of a Rayleigh scattering-type distributed temperature sensor in a thermal mixing experiment involving two air jets at 25 and 45 °C. We present criteria to guide selection of optical fiber for the sensor and describe installation setup for a jet mixing experiment. We illustrate sensor baselining, which links readings to an absolute temperature standard, and discuss practical issues such as errors due to flow-induced vibration. This material can aid those interested in temperature measurements having high data density and bandwidth for fluid dynamics experiments and similar applications. We highlight pitfalls specific to these sensors for consideration in experiment design and operation.
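The baselining step described above amounts to subtracting a reference scan recorded while the fiber sits at a known temperature, then scaling the residual spectral shift. A minimal sketch, assuming an illustrative sensitivity coefficient (in practice this comes from the interrogator vendor or a calibration):

```python
import numpy as np

K_T_GHZ_PER_K = -0.8    # assumed Rayleigh spectral-shift sensitivity, GHz/K
T_BASELINE_C = 25.0     # known fiber temperature during the baseline scan

def temperature_profile(shift_ghz, baseline_shift_ghz):
    """Convert Rayleigh spectral shifts along the fiber to temperature.

    Subtracting the baseline scan removes strain and handling offsets
    frozen in at installation, as in the baselining procedure above.
    """
    return T_BASELINE_C + (shift_ghz - baseline_shift_ghz) / K_T_GHZ_PER_K

raw = np.array([-16.0, -15.2, -8.1, -0.4, -0.1])   # current scan, 5 points
base = np.array([-0.2, -0.1, 0.0, -0.3, 0.1])      # baseline scan
print(temperature_profile(raw, base))  # ~[44.8 43.9 35.1 25.1 25.3] deg C
```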
Examples of Use of SINBAD Database for Nuclear Data and Code Validation
NASA Astrophysics Data System (ADS)
Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto
2017-09-01
The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include: Air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
2009-01-01
In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
Wheel-Sleeper Impact Model in Rail Vehicles Analysis
NASA Astrophysics Data System (ADS)
Brabie, Dan
The current paper establishes the necessary prerequisites for studying the post-derailment dynamic behavior of high-speed rail vehicles by means of multi-body system (MBS) software. A finite-element (FE) model of one rail vehicle wheel impacting a limited concrete sleeper volume is built in LS-DYNA. A novel simulation scheme is employed for obtaining the necessary wheel-sleeper impact data, transferred to the MBS code as pre-defined look-up tables of the wheel's impulse variation during impact. The FE model is tentatively validated by comparing its indentation marks with a photograph from an authentic derailment showing a continuous impact sequence over three subsequent sleepers. A post-derailment module is developed and implemented in the MBS simulation tool GENSYS, which detects wheel contact with sleepers and applies valid longitudinal, lateral and vertical force resultants based on the existing impact conditions. The accuracy of the MBS code in terms of the wheels' three-dimensional trajectories over 24 consecutive sleepers compares successfully with its FE counterpart for an arbitrary impact scenario. An axle-mounted brake disc is tested as an alternative substitute guidance mechanism after flange-climbing derailments at 100 and 200 km/h on the Swedish high-speed tilting train X 2000. Certain combinations of brake disc geometrical parameters manage to stop the lateral deviation of the wheelsets in circular curve sections at high lateral track plane acceleration.
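At run time, the MBS side of the scheme above reduces to interpolating the pre-computed FE impact results. A schematic sketch (the table values are invented for illustration and are not the paper's data):

```python
import numpy as np

# Hypothetical FE-derived look-up table: vertical impulse transferred to the
# wheel (N*s) versus vertical impact velocity (m/s).
impact_speed = np.array([0.5, 1.0, 2.0, 4.0, 6.0])
vertical_impulse = np.array([1.2e3, 2.6e3, 5.9e3, 1.3e4, 2.1e4])

def impulse_from_table(v):
    """Impulse applied by the MBS post-derailment module for impact speed v."""
    return float(np.interp(v, impact_speed, vertical_impulse))

print(impulse_from_table(3.0))  # linear interpolation between table entries
```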
Drugs, Guns, and Disadvantaged Youths: Co-Occurring Behavior and the Code of the Street
ERIC Educational Resources Information Center
Allen, Andrea N.; Lo, Celia C.
2012-01-01
Guided by Anderson's theory of the code of the street, this study explored social mechanisms linking individual-level disadvantage factors with the adoption of beliefs grounded in the code of the street and with drug trafficking and gun carrying--the co-occurring behavior shaping violence among young men in urban areas. Secondary data were…
Teacher-Child Dyadic Interaction: A Manual for Coding Classroom Behavior. Report Series No. 27.
ERIC Educational Resources Information Center
Brophy, Jere E.; Good, Thomas L.
This manual presents the rationale and coding system for the study of dyadic interaction between teachers and children in classrooms. The introduction notes major differences between this system and others in common use: 1) it is not a universal system that attempts to code all classroom behavior, and 2) the teacher's interactions in his class are…
Infant differential behavioral responding to discrete emotions.
Walle, Eric A; Reschke, Peter J; Camras, Linda A; Campos, Joseph J
2017-10-01
Emotional communication regulates the behaviors of social partners. Research on individuals' responding to others' emotions typically compares responses to a single negative emotion with responses to a neutral or positive emotion. Furthermore, coding of such responses routinely measures surface-level features of the behavior (e.g., approach vs. avoidance) rather than its underlying function (e.g., the goal of the approach or avoidant behavior). This investigation examined infants' responding to others' emotional displays across 5 discrete emotions: joy, sadness, fear, anger, and disgust. Specifically, 16-, 19-, and 24-month-old infants observed an adult communicate a discrete emotion toward a stimulus during a naturalistic interaction. Infants' responses were coded to capture the function of their behaviors (e.g., exploration, prosocial behavior, and security seeking). The results revealed a number of instances indicating that infants use different functional behaviors in response to discrete emotions. Differences in behaviors across emotions were clearest in the 24-month-old infants, though younger infants also demonstrated some differential use of behaviors in response to discrete emotions. This is the first comprehensive study to identify differences in how infants respond with goal-directed behaviors to discrete emotions. Additionally, the inclusion of a function-based coding scheme and interpersonal paradigms may be informative for future emotion research with children and adults. Possible developmental accounts for the observed behaviors and the benefits of coding techniques emphasizing the function of social behavior over its form are discussed.
Language Recognition via Sparse Coding
2016-09-08
a posteriori (MAP) adaptation scheme that further optimizes the discriminative quality of sparse-coded speech features. We empirically validate the...significantly improve the discriminative quality of sparse-coded speech features. In Section 4, we evaluate the proposed approaches against an i-vector
NEAMS Update. Quarterly Report for October - December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, K.
2012-02-16
The Advanced Modeling and Simulation Office within the DOE Office of Nuclear Energy (NE) has been charged with revolutionizing the design tools used to build nuclear power plants during the next 10 years. To accomplish this, the DOE has brought together the national laboratories, U.S. universities, and the nuclear energy industry to establish the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program. The mission of NEAMS is to modernize computer modeling of nuclear energy systems and improve the fidelity and validity of modeling results using contemporary software environments and high-performance computers. NEAMS will create a set of engineering-level codes aimed at designing and analyzing the performance and safety of nuclear power plants and reactor fuels. The truly predictive nature of these codes will be achieved by modeling the governing phenomena at the spatial and temporal scales that dominate the behavior. These codes will be executed within a simulation environment that orchestrates code integration with respect to spatial meshing, computational resources, and execution to give the user a common 'look and feel' for setting up problems and displaying results. NEAMS is building upon a suite of existing simulation tools, including those developed by the federal Scientific Discovery through Advanced Computing and Advanced Simulation and Computing programs. NEAMS also draws upon existing simulation tools for materials and nuclear systems, although many of these are limited in terms of scale, applicability, and portability (their ability to be integrated into contemporary software and hardware architectures). NEAMS investments have directly and indirectly supported additional NE research and development programs, including those devoted to waste repositories, safeguarded separations systems, and long-term storage of used nuclear fuel. NEAMS is organized into two broad efforts, each comprising four elements. The quarterly highlights for October-December 2011 are: (1) Version 1.0 of AMP, the fuel assembly performance code, was tested on the JAGUAR supercomputer and released on November 1, 2011; a detailed discussion of this new simulation tool is given; (2) A coolant sub-channel model and a preliminary UO2 smeared-cracking model were implemented in BISON, the single-pin fuel code; more information on how these models were developed and benchmarked is given; (3) The Object Kinetic Monte Carlo model was implemented to account for nucleation events in meso-scale simulations, and a discussion of the significance of this advance is given; (4) The SHARP neutronics module, PROTEUS, was expanded to be applicable to all types of reactors, and a discussion of the importance of PROTEUS is given; (5) A plan has been finalized for integrating the high-fidelity, three-dimensional reactor code SHARP with both the systems-level code RELAP7 and the fuel assembly code AMP; this is a new initiative; (6) Work began to evaluate the applicability of AMP to the problem of dry storage of used fuel and to define a relevant problem to test the applicability; (7) A code to obtain phonon spectra from the force-constant matrix for a crystalline lattice has been completed.
This important bridge between subcontinuum and continuum phenomena is discussed; (8) Benchmarking was begun on the meso-scale, finite-element fuels code MARMOT to validate its new variable splitting algorithm; (9) A very computationally demanding simulation of diffusion-driven nucleation of new microstructural features has been completed, and an explanation of the difficulty of this simulation is given; (10) Experiments were conducted with deformed steel to validate a crystal plasticity finite-element code for body-centered cubic iron; (11) The Capability Transfer Roadmap was completed and published as an internal laboratory technical report; (12) The AMP fuel assembly code input generator was integrated into the NEAMS Integrated Computational Environment (NiCE), and more details on the planned NEAMS computing environment are given; and (13) The NEAMS program website (neams.energy.gov) is nearly ready to launch.
Tan, Michael; Wilson, Ian; Braganza, Vanessa; Ignatiadis, Sophia; Boston, Ray; Sundararajan, Vijaya; Cook, Mark J; D'Souza, Wendyl J
2015-10-01
We report the diagnostic validity of a selection algorithm for identifying epilepsy cases. Retrospective validation study of International Classification of Diseases 10th Revision Australian Modification (ICD-10AM)-coded hospital records and pharmaceutical data sampled from 300 consecutive potential epilepsy-coded cases and 300 randomly chosen cases without epilepsy from 3/7/2012 to 10/7/2013. Two epilepsy specialists independently validated the diagnosis of epilepsy. A multivariable logistic regression model was fitted to identify the optimum coding algorithm for epilepsy and was internally validated. One hundred fifty-eight out of three hundred (52.6%) epilepsy-coded records and 0/300 (0%) nonepilepsy records were confirmed to have epilepsy. The kappa for interrater agreement was 0.89 (95% CI=0.81-0.97). The model utilizing epilepsy (G40), status epilepticus (G41) and ≥1 antiepileptic drug (AED) conferred the highest positive predictive value of 81.4% (95% CI=73.1-87.9) and a specificity of 99.9% (95% CI=99.9-100.0). The area under the receiver operating curve was 0.90 (95% CI=0.88-0.93). When combined with pharmaceutical data, the precision of case identification for an epilepsy data linkage design was considerably improved, offering the potential for efficient and reasonably accurate case ascertainment in epidemiological studies.
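For readers less familiar with these case-definition metrics, the sketch below computes PPV, sensitivity, and specificity from a 2x2 confusion matrix; the counts are invented for illustration and are not the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard validity metrics for a coding algorithm."""
    ppv = tp / (tp + fp)           # positive predictive value
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return ppv, sensitivity, specificity

ppv, sens, spec = diagnostic_metrics(tp=150, fp=34, tn=299, fn=8)
print(f"PPV={ppv:.1%}  sensitivity={sens:.1%}  specificity={spec:.1%}")
```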
NASA Technical Reports Server (NTRS)
Sutliff, Daniel L.
2014-01-01
The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.
Summary of EASM Turbulence Models in CFL3D With Validation Test Cases
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Gatski, Thomas B.
2003-01-01
This paper summarizes the Explicit Algebraic Stress Model in k-omega form (EASM-ko) and in k-epsilon form (EASM-ke) in the Reynolds-averaged Navier-Stokes code CFL3D. These models have been actively used over the last several years in CFL3D, and have undergone some minor modifications during that time. Details of the equations and method for coding the latest versions of the models are given, and numerous validation cases are presented. This paper serves as a validation archive for these models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell
In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and calculation of criticality parameters such as k_eff.
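A criticality validation of this kind typically condenses a suite of benchmark calculations into a bias and bias uncertainty. A simplified sketch with hypothetical k_eff values; full ANSI/ANS-8.24 practice also folds in benchmark uncertainties and statistical tolerance limits:

```python
import numpy as np

# Hypothetical benchmark suite: calculated vs. evaluated experimental k_eff.
k_calc = np.array([0.9978, 1.0012, 0.9991, 1.0005, 0.9969])
k_exp = np.array([1.0000, 1.0000, 1.0000, 1.0000, 1.0000])

diff = k_calc - k_exp
bias = diff.mean()                # mean code/data bias
bias_sigma = diff.std(ddof=1)     # spread of the bias across benchmarks
print(f"bias = {bias:+.5f} +/- {bias_sigma:.5f}")
```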
Sequential analysis of child pain behavior and maternal responses: an observational study.
Langer, Shelby L; Romano, Joan; Brown, Jonathon D; Nielson, Heather; Ou, Bobby; Rauch, Christina; Zullo, Lirra; Levy, Rona L
2017-09-01
This laboratory-based study examined lagged associations between child pain behavior and maternal responses as a function of maternal catastrophizing (CAT). Mothers completed the parent version of the Pain Catastrophizing Scale. Children participated in a validated water ingestion procedure to induce abdominal discomfort with mothers present. Video recordings of their interactions were edited into 30-second segments and coded by 2 raters for presence of child pain behavior, maternal solicitousness, and nontask conversation. Kappa reliabilities ranged from 0.83 to 0.95. Maternal CAT was positively associated with child pain behavior and maternal solicitousness, P values <0.05. In lagged analyses, child pain behavior during a given segment (T) was positively associated with child pain behavior during the subsequent segment (T + 1), P <0.05. Maternal CAT moderated the association between (1) child pain behavior at T and maternal solicitousness at T + 1, and (2) solicitousness at T and child pain behavior at T + 1, P values <0.05. Mothers higher in CAT responded solicitously at T + 1 irrespective of their child's preceding pain behavior, and their children exhibited pain behavior at T + 1 irrespective of the mother's preceding solicitousness. Mothers lower in CAT were more likely to respond solicitously at T + 1 after child pain behavior, and their children were more likely to exhibit pain behavior at T + 1 after maternal solicitousness. These findings indicate that high CAT mothers and their children exhibit inflexible patterns of maternal solicitousness and child pain behavior, and that such families may benefit from interventions to decrease CAT and develop more adaptive responses.
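The lagged analyses described above rest on tabulating how often one coded state follows another across consecutive segments. A schematic sketch for a single stream of segment codes ('P' = child pain behavior present, '-' = absent; the sequence is invented):

```python
from collections import Counter

def lag1_transition_probs(codes):
    """Empirical lag-1 transition probabilities P(code at T+1 | code at T)."""
    pair_counts = Counter(zip(codes, codes[1:]))
    from_counts = Counter(codes[:-1])
    return {(a, b): n / from_counts[a] for (a, b), n in pair_counts.items()}

print(lag1_transition_probs("P--PP--P-PP-"))
# e.g. ('P', 'P') gives the probability pain behavior persists into T+1
```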
Dai, Lengshi; Shinn-Cunningham, Barbara G
2016-01-01
Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with NHTs can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics and task demands.
Validation of the Classroom Behavior Inventory
ERIC Educational Resources Information Center
Blunden, Dale; And Others
1974-01-01
Factor-analytic methods were used to assess construct validity of the Classroom Behavior Inventory, a scale for rating behaviors associated with hyperactivity. The Classroom Behavior Inventory measures three dimensions of behavior: Hyperactivity, Hostility, and Sociability. Significant concurrent validity was obtained for only one Classroom Behavior…
Lord, Vanessa M; Reiboldt, Wendy; Gonitzke, Dariella; Parker, Emily; Peterson, Caitlin
2018-02-01
In this study, qualitative methods were employed to analyze secondary data from the anonymous postings of a pro-recovery website in an effort to investigate the changes in thinking of binge-eating disorder (BED) sufferers who were able to recover from the disorder, understand more fully how guilt and self-blame affect recovery, and explore the perceived motivators and challenges to recovery. 681 messages from 65 participants pertaining to BED were analyzed from January 1, 2014-January 1, 2015 through thematic analysis. Coding strategies were employed to reveal patterns within the experiences of the participants. The researchers identified three themes surrounding "changes in thinking" from analysis of the message board postings: admitting the disorder, recognizing unhealthy coping behaviors, and seeing recovery. Further analysis of postings suggested that guilt and self-blame hinder recovery by promoting a feedback cycle of binging, which leads to further guilt and self-blame. The data ultimately identified experiences that resulted in or hindered recovery. The experience of validation appeared to result in recovery; those who experienced validation were less inclined to engage in disordered eating behaviors. Conversely, weight loss or attempts at weight loss hindered recovery by ultimately promoting more disordered eating behaviors. This qualitative analysis of message board postings offers authentic, credible data with a unique perspective. Practitioners working in the field of eating disorders such as registered dietitian nutritionists or therapists might use evidence from the data to guide their practice.
Bai, Jinbing; Swanson, Kristen M; Santacroce, Sheila J
2018-01-01
Parent interactions with their child can influence the child's pain and distress during painful procedures. Reliable and valid interaction analysis systems (IASs) are valuable tools for capturing these interactions. The extent to which IASs are used in observational research of parent-child interactions is unknown in pediatric populations. To identify and evaluate studies that focus on assessing psychometric properties of initial iterations/publications of observational coding systems of parent-child interactions during painful procedures. Computerized databases searched included PubMed, CINAHL, PsycINFO, Health and Psychosocial Instruments, and Scopus. Search timeframes covered from inception of each database to January 2017. Studies were included if they reported use or psychometrics of parent-child IASs. First assessment was whether the parent-child IASs were theory-based; next, using the Society of Pediatric Psychology Assessment Task Force criteria, IASs were assigned to one of three categories: well-established, approaching well-established, or promising. A total of 795 studies were identified through computerized searches. Eighteen studies were ultimately determined to be eligible for inclusion in the review and 17 parent-child IASs were identified from these 18 studies. Among the 17 coding systems, 14 were suitable for use in children age 3 years or more; two were theory-based; and 11 included verbal and nonverbal parent behaviors that promoted either child coping or child distress. Four IASs were assessed as well-established; seven approached well-established; and six were promising. Findings indicate a need for the development of theory-based parent-child IASs that consider both verbal and nonverbal parent behaviors during painful procedures. Findings also suggest a need for further testing of those parent-child IASs deemed "approaching well-established" or "promising".
Eckert, Katharina G; Lange, Martin A
2015-03-14
Physical activity questionnaires (PAQ) have been extensively used to determine physical activity (PA) levels. Most PAQ are derived from an energy expenditure-based perspective and assess activities with a certain intensity level. Activities with a moderate or vigorous intensity level are predominantly used to determine a person's PA level in terms of quantity. Studies show that the time spent engaging in moderate- and vigorous-intensity PA does not appropriately reflect the actual PA behavior of older people because they perform more functional, everyday activities. Those functional activities are more likely to be considered low-intensity and represent an important qualitative health-promoting activity. For the elderly, functional, light-intensity activities are of special interest but are assessed differently in terms of quantity and quality. The aim was to analyze the content of PAQ for the elderly. N = 18 sufficiently validated PAQ applicable to adults (60+) were included. Each item (N = 414) was linked to the corresponding code of the International Classification of Functioning, Disability and Health (ICF) using established linking rules. Kappa statistics were calculated to determine rater agreement. Items were linked to 598 ICF codes and 62 different ICF categories. A total of 43.72% of the codes were for sports-related activities and 14.25% for walking-related activities. Only 9.18% of all codes were related to household tasks. Light-intensity, functional activities are emphasized differently and are underrepresented in most cases. Additionally, sedentary activities are underrepresented (5.55%). κ coefficients were acceptable for n = 16 questionnaires (0.48-1.00). There is large inconsistency in how PA in the elderly is understood. Further research should focus (1) on a conceptual understanding of PA in terms of the behavior of the elderly and (2) on developing questionnaires that assess functional, light-intensity PA, as well as sedentary activities, more explicitly.
CACTI: free, open-source software for the sequential coding of behavioral interactions.
Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.
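Interrater reliabilities for coding systems like this are commonly summarized with Cohen's kappa, which corrects raw agreement for chance. A minimal sketch (the code labels are invented motivational-interviewing-style examples, not CACTI's scheme):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal codes to the same events."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c] * pb[c] for c in pa) / n ** 2  # chance agreement
    return (observed - expected) / (1 - expected)

a = ["reflect", "question", "reflect", "affirm", "question", "reflect"]
b = ["reflect", "question", "affirm", "affirm", "question", "reflect"]
print(round(cohens_kappa(a, b), 2))  # 0.75
```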
Development and validation of a registry-based definition of eosinophilic esophagitis in Denmark
Dellon, Evan S; Erichsen, Rune; Pedersen, Lars; Shaheen, Nicholas J; Baron, John A; Sørensen, Henrik T; Vyberg, Mogens
2013-01-01
AIM: To develop and validate a case definition of eosinophilic esophagitis (EoE) in the linked Danish health registries. METHODS: For case definition development, we queried the Danish medical registries from 2006-2007 to identify candidate cases of EoE in Northern Denmark. All International Classification of Diseases-10 (ICD-10) and prescription codes were obtained, and archived pathology slides were obtained and re-reviewed to determine case status. We used an iterative process to select inclusion/exclusion codes, refine the case definition, and optimize sensitivity and specificity. We then re-queried the registries from 2008-2009 to yield a validation set. The case definition algorithm was applied, and sensitivity and specificity were calculated. RESULTS: Of the 51 and 49 candidate cases identified in both the development and validation sets, 21 and 24 had EoE, respectively. Characteristics of EoE cases in the development set [mean age 35 years; 76% male; 86% dysphagia; 103 eosinophils per high-power field (eos/hpf)] were similar to those in the validation set (mean age 42 years; 83% male; 67% dysphagia; 77 eos/hpf). Re-review of archived slides confirmed that the pathology coding for esophageal eosinophilia was correct in greater than 90% of cases. Two registry-based case algorithms based on pathology, ICD-10, and pharmacy codes were successfully generated in the development set, one that was sensitive (90%) and one that was specific (97%). When these algorithms were applied to the validation set, they remained sensitive (88%) and specific (96%). CONCLUSION: Two registry-based definitions, one highly sensitive and one highly specific, were developed and validated for the linked Danish national health databases, making future population-based studies feasible. PMID:23382628
Numerical study of droplet impact and rebound on superhydrophobic surface
NASA Astrophysics Data System (ADS)
Cai, Xuan; Wu, Yanchen; Woerner, Martin; Frohnapfel, Bettina
2017-11-01
Droplet impact and rebound on a superhydrophobic surface is an important process in many applications, among them developing self-cleaning or anti-icing materials and limiting liquid film formation of Diesel Exhaust Fluid (DEF) in exhaust gas pipes. In the latter field, rebound of DEF droplets from the wall is desired as an effective means of avoiding or reducing unwanted solid deposition. Our goal is to numerically study the influence of surface wettability on DEF droplet impact and rebound behavior. We chose a phase-field method, which we implemented in OpenFOAM and validated for wetting-related interfacial flow problems. In the present contribution we first numerically reproduce relevant experimental studies from the literature to validate the code for the droplet impact and rebound problem. There we study droplet-surface contact time, maximum/instantaneous spreading factor and droplet shape evolution. Our numerical results show good agreement with experimental data. Next we investigate for DEF droplets the effects of diameter, impact velocity and surface wettability on rebound behavior and jumping height. Based on the Weber number and the equilibrium contact angle, two regimes are identified. We show that surface wettability is a deciding factor for achieving a rebound event. This work is supported by the Foundation ``Friedrich-und-Elisabeth Boysen Stiftung fuer Forschung und Innovation'' (BOY-127-TP1).
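The Weber number used above to separate the two regimes is the ratio of inertial to capillary forces. A minimal sketch; the droplet properties are rough assumptions for a DEF-like fluid, not values from this work:

```python
def weber_number(density, velocity, diameter, surface_tension):
    """We = rho * v**2 * D / sigma: inertia versus surface tension."""
    return density * velocity ** 2 * diameter / surface_tension

# Assumed properties: ~1090 kg/m^3, 2 mm droplet, sigma ~ 0.075 N/m, 1 m/s.
print(weber_number(1090.0, 1.0, 2e-3, 0.075))  # ~29
```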
Zieve, Garret G; Richardson, Laura P; Katzman, Katherine; Spielvogle, Heather; Whitehouse, Sandy; McCarty, Carolyn A
2017-07-20
Electronic health screening tools for primary care present an opportunity to go beyond data collection to provide education and feedback to adolescents in order to motivate behavior change. However, there is limited research to guide feedback message development. The aim of this study was to explore youth perceptions of and preferences for receiving personalized feedback for multiple health risk behaviors and reinforcement for health promoting behaviors from an electronic health screening tool for primary care settings, using qualitative methodology. In total, 31 adolescents aged 13-18 years completed the screening tool, received the electronic feedback, and subsequently participated in individual, semistructured, qualitative interviews lasting approximately 60 min. Participants were queried about their overall impressions of the tool, perceptions regarding various types of feedback messages, and additional features that would help motivate health behavior change. Using thematic analysis, interview transcripts were coded to identify common themes expressed across participants. Overall, the tool was well-received by participants who perceived it as a way to enhance-but not replace-their interactions with providers. They appreciated receiving nonjudgmental feedback from the tool and responded positively to information regarding the consequences of behaviors, comparisons with peer norms and health guidelines, tips for behavior change, and reinforcement of healthy choices. A small but noteworthy minority of participants dismissed the peer norms as not real or relevant and national guidelines as not valid or reasonable. When prompted for possible adaptations to the tool, adolescents expressed interest in receiving follow-up information, setting health-related goals, tracking their behaviors over time, and communicating with providers electronically between appointments. Adolescents in this qualitative study desired feedback that validates their healthy behavior choices and supports them as independent decision makers by neutrally presenting health information, facilitating goal setting, and offering ongoing technological supports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2017-04-01
This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
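The first of Razorback's coupled equations, point reactor kinetics, can be sketched in a few lines. The one-delayed-group, explicit-Euler toy below uses illustrative parameters and omits the thermal-hydraulic feedback the real code couples in:

```python
BETA, GEN_TIME, DECAY = 0.0073, 1.0e-4, 0.08  # beta, Lambda (s), lambda (1/s)

def step(n, c, rho, dt):
    """Advance neutron population n and delayed precursor concentration c."""
    dn = ((rho - BETA) / GEN_TIME) * n + DECAY * c
    dc = (BETA / GEN_TIME) * n - DECAY * c
    return n + dt * dn, c + dt * dc

n, c = 1.0, BETA / (GEN_TIME * DECAY)   # critical steady state at unit power
for _ in range(500_000):                # 0.5 s at dt = 1 microsecond
    n, c = step(n, c, rho=0.0020, dt=1.0e-6)  # ~0.27 dollar reactivity step
print(f"relative power after 0.5 s: {n:.2f}")  # prompt jump, then slow rise
```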
Computational Modeling and Validation for Hypersonic Inlets
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
1996-01-01
Hypersonic inlet research activity at NASA is reviewed. The basis for the paper is the experimental tests performed with three inlets: the NASA Lewis Research Center Mach 5, the McDonnell Douglas Mach 12, and the NASA Langley Mach 18. Both three-dimensional PNS and NS codes have been used to compute the flow within the three inlets. Modeling assumptions in the codes involve the turbulence model, the nature of the boundary layer, shock wave-boundary layer interaction, and the flow spilled to the outside of the inlet. Use of the codes and the experimental data are helping to develop a clearer understanding of the inlet flow physics and to focus on the modeling improvements required in order to arrive at validated codes.
Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.
Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J
2016-05-01
There are a vast number of smartphone applications (apps) on the market aimed at promoting medication adherence; however, the theory and evidence base underpinning these apps, in terms of established health behavior change techniques, remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change.
Computational Fluid Dynamics Technology for Hypersonic Applications
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2003-01-01
Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state-of-art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Marshall, William BJ J
In the course of criticality code validation, outlier cases are frequently encountered. Historically, the causes of these unexpected results could be diagnosed only through comparison with other similar cases or through the known presence of a unique component of the critical experiment. The sensitivity and uncertainty (S/U) analysis tools available in the SCALE 6.1 code system provide a much broader range of options to examine underlying causes of outlier cases. This paper presents some case studies performed as a part of the recent validation of the KENO codes in SCALE 6.1 using S/U tools to examine potential causes of biases.
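The S/U machinery referenced here propagates nuclear-data covariances through sensitivity coefficients, which is also what flags which benchmark features drive an outlier. A schematic sketch of the sandwich rule (all numbers illustrative):

```python
import numpy as np

# Sensitivity of k_eff to relative cross-section changes for a few
# nuclide-reaction pairs, and their relative covariance matrix.
S = np.array([0.30, -0.12, 0.05])
C = np.array([[4.0e-4, 1.0e-5, 0.0],
              [1.0e-5, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])

dk_over_k = np.sqrt(S @ C @ S)  # "sandwich rule" uncertainty propagation
print(f"nuclear-data-induced k_eff uncertainty: {dk_over_k:.4%}")
```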
Proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification
NASA Technical Reports Server (NTRS)
Ewen, Denney, W. (Editor); Jensen, Thomas (Editor)
2009-01-01
This NASA conference publication contains the proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification, held as part of LICS in Los Angeles, CA, USA, on August 15, 2009. Software certification demonstrates the reliability, safety, or security of software systems in such a way that it can be checked by an independent authority with minimal trust in the techniques and tools used in the certification process itself. It can build on existing validation and verification (V&V) techniques but introduces the notion of explicit software certificates, which contain all the information necessary for an independent assessment of the demonstrated properties. One such example is proof-carrying code (PCC), which is an important and distinctive approach to enhancing trust in programs. It provides a practical framework for independent assurance of program behavior, especially where source code is not available, or the code author and user are unknown to each other. The workshop will address theoretical foundations of logic-based software certification as well as practical examples and work on alternative application domains. Here "certificate" is construed broadly, to include not just mathematical derivations and proofs but also safety and assurance cases, or any formal evidence that supports the semantic analysis of programs: that is, evidence about an intrinsic property of code and its behaviour that can be independently checked by any user, intermediary, or third party. These guarantees mean that software certificates raise trust in the code itself, distinct from and complementary to any existing trust in the creator of the code, the process used to produce it, or its distributor. In addition to the contributed talks, the workshop featured two invited talks, by Kelly Hayhurst and Andrew Appel. The PCC 2009 website can be found at http://ti.arc.nasa.gov/event/pcc09.
Development of a dynamic coupled hydro-geomechanical code and its application to induced seismicity
NASA Astrophysics Data System (ADS)
Miah, Md Mamun
This research describes the importance of hydro-geomechanical coupling in the geologic subsurface environment arising from fluid injection at geothermal plants, large-scale geological CO2 sequestration for climate mitigation, enhanced oil recovery, and hydraulic fracturing during well construction in the oil and gas industries. A sequential computational code is developed to capture the multiphysics interaction behavior by linking the flow simulation code TOUGH2 and the geomechanics modeling code PyLith. The numerical formulation of each code is discussed to demonstrate its modeling capabilities. The computational framework involves sequential coupling and the solution of two sub-problems: fluid flow through fractured and porous media, and reservoir geomechanics. For each time step of the flow calculation, the pressure field is passed to the geomechanics code to compute the effective stress field and fault slip. A simplified permeability model is implemented in the code that accounts for the permeability of porous and saturated rocks subject to confining stresses. The accuracy of the TOUGH-PyLith coupled simulator is tested by simulating Terzaghi's 1D consolidation problem. The modeling capability of coupled poroelasticity is validated by benchmarking it against Mandel's problem. The code is used to simulate both quasi-static and dynamic earthquake nucleation and slip distribution on a fault from the combined effect of far-field tectonic loading and fluid injection, using an appropriate fault constitutive friction model. Results from the quasi-static induced-earthquake simulations show a delayed response in earthquake nucleation, attributed to the increased total stress in the domain and to not accounting for pressure on the fault. This issue is resolved in the final chapter by simulating a single-event earthquake dynamic rupture. Simulation results show that fluid pressure has a positive effect on slip nucleation and subsequent crack propagation. This is confirmed by a sensitivity analysis showing that an increase in injection well distance results in delayed slip nucleation and rupture propagation on the fault.
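The sequential coupling described above can be summarized in a short driver loop; flow_step and mechanics_step below are toy stand-ins for TOUGH2 and PyLith, and the Biot coefficient is an assumed value:

```python
ALPHA_BIOT = 0.9  # assumed Biot effective-stress coefficient

def couple(total_stress, pressure, n_steps, dt, flow_step, mechanics_step):
    """One-way sequential coupling: flow updates pressure, then mechanics
    sees the Biot effective stress sigma' = sigma - alpha * p."""
    slip = 0.0
    for _ in range(n_steps):
        pressure = flow_step(pressure, dt)             # TOUGH2 stand-in
        sigma_eff = total_stress - ALPHA_BIOT * pressure
        slip = mechanics_step(sigma_eff)               # PyLith stand-in
    return pressure, slip

p, slip = couple(
    total_stress=50.0e6, pressure=20.0e6, n_steps=10, dt=3600.0,
    flow_step=lambda p, dt: p + 1.0e4,                       # toy injection ramp
    mechanics_step=lambda s: max(0.0, (35.0e6 - s) * 1e-9),  # toy slip law
)
print(f"pressure = {p:.3e} Pa, fault slip = {slip * 1e3:.1f} mm")
```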
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
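Of the checks listed above, bit-for-bit evaluation is the strictest. A minimal sketch of such a regression comparison (the function and its fallback reporting are our own illustration, not LIVVkit's API):

```python
import numpy as np

def bit_for_bit(test, reference):
    """Return (exact_match, max_relative_difference) for two model outputs."""
    if np.array_equal(test, reference):
        return True, 0.0
    denom = np.maximum(np.abs(reference), np.finfo(float).tiny)
    return False, float(np.max(np.abs(test - reference) / denom))

reference = np.array([273.15, 260.40, 255.00])   # stored regression data
candidate = reference + np.array([0.0, 1e-9, 0.0])
print(bit_for_bit(candidate, reference))  # (False, ~3.8e-12): not bit-for-bit
```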
Validity of administrative coding in identifying patients with upper urinary tract calculi.
Semins, Michelle J; Trock, Bruce J; Matlaga, Brian R
2010-07-01
Administrative databases are increasingly used for epidemiological investigations. We performed a study to assess the validity of ICD-9 codes for upper urinary tract stone disease in an administrative database. We retrieved the records of all inpatients and outpatients at Johns Hopkins Hospital between November 2007 and October 2008 with an ICD-9 code of 592, 592.0, 592.1 or 592.9 as one of the first 3 diagnosis codes. A random number generator selected 100 encounters for further review. We considered a patient to have a true diagnosis of an upper tract stone if the medical records specifically referenced a kidney stone event, or included current or past treatment for a kidney stone. Descriptive and comparative analyses were performed. A total of 8,245 encounters coded as upper tract calculus were identified and 100 were randomly selected for review. Two patients could not be identified within the electronic medical record and were excluded from the study. The positive predictive value of using all ICD-9 codes for an upper tract calculus (592, 592.0, 592.1) to identify subjects with renal or ureteral stones was 95.9%. For 592.0 only the positive predictive value was 85%. However, although the positive predictive value for 592.1 only was 100%, 26 subjects (76%) with a ureteral stone were not appropriately billed with this code. ICD-9 coding for urinary calculi is likely to be sufficiently valid to be useful in studies using administrative data to analyze stone disease. However, ICD-9 coding is not a reliable means to distinguish between subjects with renal and ureteral calculi.
Pang, Jack X Q; Ross, Erin; Borman, Meredith A; Zimmer, Scott; Kaplan, Gilaad G; Heitman, Steven J; Swain, Mark G; Burak, Kelly W; Quan, Hude; Myers, Robert P
2015-09-11
Epidemiologic studies of alcoholic hepatitis (AH) have been hindered by the lack of a validated International Classification of Disease (ICD) coding algorithm for use with administrative data. Our objective was to validate coding algorithms for AH using a hospitalization database. The Hospital Discharge Abstract Database (DAD) was used to identify consecutive adults (≥18 years) hospitalized in the Calgary region with a diagnosis code for AH (ICD-10, K70.1) between 01/2008 and 08/2012. Medical records were reviewed to confirm the diagnosis of AH, defined as a history of heavy alcohol consumption, elevated AST and/or ALT (<300 U/L), serum bilirubin >34 μmol/L, and elevated INR. Subgroup analyses were performed according to the diagnosis field in which the code was recorded (primary vs. secondary) and AH severity. Algorithms that incorporated ICD-10 codes for cirrhosis and its complications were also examined. Of 228 potential AH cases, 122 patients had confirmed AH, corresponding to a positive predictive value (PPV) of 54% (95% CI 47-60%). PPV improved when AH was the primary versus a secondary diagnosis (67% vs. 21%; P < 0.001). Algorithms that included diagnosis codes for ascites (PPV 75%; 95% CI 63-86%), cirrhosis (PPV 60%; 47-73%), and gastrointestinal hemorrhage (PPV 62%; 51-73%) had improved performance; however, the prevalence of these diagnoses in confirmed AH cases was low (29-39%). In conclusion, the low PPV of the diagnosis code for AH suggests that caution is necessary if this hospitalization database is used in large-scale epidemiologic studies of this condition.
Byrd, Gary D.; Winkelstein, Peter
2014-01-01
Objective: Based on the authors' shared interest in the interprofessional challenges surrounding health information management, this study explores the degree to which librarians, informatics professionals, and core health professionals in medicine, nursing, and public health share common ethical behavior norms grounded in moral principles. Methods: Using the “Principlism” framework from a widely cited textbook of biomedical ethics, the authors analyze the statements in the ethical codes for associations of librarians (Medical Library Association [MLA], American Library Association, and Special Libraries Association), informatics professionals (American Medical Informatics Association [AMIA] and American Health Information Management Association), and core health professionals (American Medical Association, American Nurses Association, and American Public Health Association). This analysis focuses on whether and how the statements in these eight codes specify core moral norms (Autonomy, Beneficence, Non-Maleficence, and Justice), core behavioral norms (Veracity, Privacy, Confidentiality, and Fidelity), and other norms that are empirically derived from the code statements. Results: These eight ethical codes share a large number of common behavioral norms based most frequently on the principle of Beneficence, then on Autonomy and Justice, but rarely on Non-Maleficence. The MLA and AMIA codes share the largest number of common behavioral norms, and these two associations also share many norms with the other six associations. Implications: The shared core of behavioral norms among these professions, all grounded in core moral principles, point to many opportunities for building effective interprofessional communication and collaboration regarding the development, management, and use of health information resources and technologies. PMID:25349543
Modeling the rubbing contact in honeycomb seals
NASA Astrophysics Data System (ADS)
Fischer, Tim; Welzenbach, Sarah; Meier, Felix; Werner, Ewald; kyzy, Sonun Ulan; Munz, Oliver
2018-03-01
Metallic honeycomb labyrinth seals are commonly used as sealing systems in gas turbine engines. Because of their capability to withstand high thermo-mechanical loads and oxidation, polycrystalline nickel-based superalloys, such as Hastelloy X and Haynes 214, are used as sealing material. In addition, these materials must exhibit a tolerance against rubbing between the rotating part and the stationary seal component. The tolerance of the sealing material against rubbing preserves the integrity of the rotating part. In this article, the rubbing behavior at the rotor-stator interface is considered numerically. A simulation model is incorporated into the commercial finite element code ABAQUS/explicit and is utilized to simulate a simplified rubbing process. A user-defined interaction routine between the contact surfaces accounts for the thermal and mechanical interfacial behavior. Furthermore, an elasto-plastic constitutive material law captures the extreme temperature conditions and the damage behavior of the alloys. To validate the model, representative quantities of the rubbing process are determined and compared with experimental data from the literature. The simulation results correctly reproduce the observations made on a test rig with a reference stainless steel material (AISI 304). A parametric study using the nickel-based superalloys reveals a clear dependency of the rubbing behavior on the sliding and incursion velocity. Compared to each other, the two superalloys studied exhibit a different rubbing behavior.
FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju
To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts of the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements of, and formulate a structure for, a transient fuel database through leveraging existing resources. It was concluded in discussions of these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus to near-future validation developments as well as long-lasting benefits to NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the incapability of acquiring satisfactory validation data is often a showstopper that must first be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to experimentally generate. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources or with dummy data if nonexistent. The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure by the three parties of subject matter experts, further existing data hunting will be effectively conducted, new experimental data generation will be realistically pursued, knowledgebase schema will be practically designed; and code validation will be confidently planned.
Fernandez, Rosemarie; Pearce, Marina; Grand, James A; Rench, Tara A; Jones, Kerin A; Chao, Georgia T; Kozlowski, Steve W J
2013-11-01
To determine the impact of a low-resource-demand, easily disseminated computer-based teamwork process training intervention on teamwork behaviors and patient care performance in code teams. A randomized comparison trial of computer-based teamwork training versus placebo training was conducted from August 2010 through March 2011. This study was conducted at the simulation suite within the Kado Family Clinical Skills Center, Wayne State University School of Medicine. Participants (n = 231) were fourth-year medical students and first-, second-, and third-year emergency medicine residents at Wayne State University. Each participant was assigned to a team of four to six members (n_teams = 45). Teams were randomly assigned to receive either a 25-minute computer-based training module targeting appropriate resuscitation teamwork behaviors or a placebo training module. Teamwork behaviors and patient care behaviors were video recorded during high-fidelity simulated patient resuscitations and coded by trained raters blinded to condition assignment and study hypotheses. Teamwork behavior items (e.g., "chest radiograph findings communicated to team" and "team member assists with intubation preparation") were standardized before combining to create overall teamwork scores. Similarly, patient care items ("chest radiograph correctly interpreted"; "time to start of compressions") were standardized before combining to create overall patient care scores. Subject matter expert reviews and pilot testing of scenario content, teamwork items, and patient care items provided evidence of content validity. When controlling for team members' medically relevant experience, teams in the training condition demonstrated better teamwork (F [1, 42] = 4.81, p < 0.05; ηp² = 0.10) and patient care (F [1, 42] = 4.66, p < 0.05; ηp² = 0.10) than did teams in the placebo condition. Computer-based team training positively impacts teamwork and patient care during simulated patient resuscitations. This low-resource team training intervention may help to address the dissemination and sustainability issues associated with larger, more costly team training programs.
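A minimal sketch of the item-standardization step described above (the item values and scales are hypothetical; items where lower raw scores are better, such as times, are sign-flipped before averaging):

```python
import numpy as np
from scipy.stats import zscore

# Hypothetical rows = teams, columns = items on mixed scales
# (checklist counts, seconds to start of compressions, binary indicators).
items = np.array([
    [3, 41.0, 1],
    [5, 28.5, 0],
    [4, 35.2, 1],
    [2, 52.3, 1],
])
items[:, 1] *= -1  # time-based item: lower is better, so flip its sign

# Standardize each item across teams, then average into one composite score.
overall_teamwork = zscore(items, axis=0, ddof=0).mean(axis=1)
print(overall_teamwork)
```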
Real-Time Sensor Validation System Developed for Reusable Launch Vehicle Testbed
NASA Technical Reports Server (NTRS)
Jankovsky, Amy L.
1997-01-01
A real-time system for validating sensor health has been developed for the reusable launch vehicle (RLV) program. This system, which is part of the propulsion checkout and control system (PCCS), was designed for use in an integrated propulsion technology demonstrator testbed built by Rockwell International and located at the NASA Marshall Space Flight Center. Work on the sensor health validation system, a result of an industry-NASA partnership, was completed at the NASA Lewis Research Center, then delivered to Marshall for integration and testing. The sensor validation software performs three basic functions: it identifies failed sensors, it provides reconstructed signals for failed sensors, and it identifies off-nominal system transient behavior that cannot be attributed to a failed sensor. The code is initiated by host software before the start of a propulsion system test, and it is called by the host program every control cycle. The output is posted to global memory for use by other PCCS modules. Output includes a list indicating the status of each sensor (i.e., failed, healthy, or reconstructed) and a list of features that are not due to a sensor failure. If a sensor failure is found, the system modifies that sensor's data array by substituting a reconstructed signal, when possible, for use by other PCCS modules.
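A highly simplified sketch of the per-cycle logic the abstract describes (the names, thresholding, and reconstruction source are illustrative assumptions, not the PCCS implementation):

```python
import numpy as np

def validate_sensors(readings, estimates, threshold):
    """One control-cycle pass: flag failed sensors by their residual against a
    redundancy- or model-based estimate, and substitute a reconstructed signal
    for flagged channels. Returns a status list and the output array posted
    to global memory for downstream modules."""
    residuals = np.abs(readings - estimates)
    failed = residuals > threshold
    output = np.where(failed, estimates, readings)   # reconstructed substitution
    status = ["reconstructed" if f else "healthy" for f in failed]
    return status, output

readings = np.array([101.2, 98.7, 250.0])    # third channel implausible
estimates = np.array([100.8, 99.1, 102.3])
print(validate_sensors(readings, estimates, threshold=10.0))
```

A real system would additionally check whether several channels deviate together, which signals off-nominal transient behavior rather than a single sensor failure.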
Scherr, Karen A.; Fagerlin, Angela; Williamson, Lillie D.; Davis, J. Kelly; Fridman, Ilona; Atyeo, Natalie; Ubel, Peter A.
2016-01-01
Background Physicians’ recommendations affect patients’ treatment choices. However, most research relies on physicians’ or patients’ retrospective reports of recommendations, which offer a limited perspective and have limitations such as recall bias. Objective To develop a reliable and valid method to measure the strength of physician recommendations using direct observation of clinical encounters. Methods Clinical encounters (n = 257) were recorded as part of a larger study of prostate cancer decision making. We used an iterative process to create the 5-point Physician Recommendation Coding System (PhyReCS). To determine reliability, research assistants double-coded 50 transcripts. To establish content validity, we used one-way ANOVAs to determine whether relative treatment recommendation scores differed as a function of which treatment patients received. To establish concurrent validity, we examined whether patients’ perceived treatment recommendations matched our coded recommendations. Results The PhyReCS was highly reliable (Krippendorf’s alpha =. 89, 95% CI [.86, .91]). The average relative treatment recommendation score for each treatment was higher for individuals who received that particular treatment. For example, the average relative surgery recommendation score was higher for individuals who received surgery versus radiation (mean difference = .98, SE = .18, p < .001) or active surveillance (mean difference = 1.10, SE = .14, p < .001). Patients’ perceived recommendations matched coded recommendations 81% of the time. Conclusion The PhyReCS is a reliable and valid way to capture the strength of physician recommendations. We believe that the PhyReCS would be helpful for other researchers who wish to study physician recommendations, an important part of patient decision making. PMID:27343015
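The content-validity test described above can be reproduced in miniature with a one-way ANOVA; the scores below are invented for illustration:

```python
from scipy.stats import f_oneway

# Hypothetical relative surgery-recommendation scores, grouped by the
# treatment each patient ultimately received.
surgery      = [1.8, 2.1, 1.5, 2.4, 1.9]
radiation    = [0.9, 1.1, 0.6, 1.0]
surveillance = [0.5, 0.8, 0.7, 0.4]

F, p = f_oneway(surgery, radiation, surveillance)
print(f"F = {F:.2f}, p = {p:.4f}")  # scores should be highest in the surgery group
```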
Calderwood, Michael S.; Kleinman, Ken; Murphy, Michael V.; Platt, Richard; Huang, Susan S.
2014-01-01
Background Deep and organ/space surgical site infections (D/OS SSI) cause significant morbidity, mortality, and costs. Rates are publicly reported and increasingly used as quality metrics affecting hospital payment. Lack of standardized surveillance methods threatens the accuracy of reported data and decreases confidence in comparisons based upon these data. Methods We analyzed data from national validation studies that used Medicare claims to trigger chart review for SSI confirmation after coronary artery bypass graft surgery (CABG) and hip arthroplasty. We evaluated code performance (sensitivity and positive predictive value) to select diagnosis codes that best identified D/OS SSI. Codes were analyzed individually and in combination. Results Analysis included 143 patients with D/OS SSI after CABG and 175 patients with D/OS SSI after hip arthroplasty. For CABG, 9 International Classification of Diseases, 9th Revision (ICD-9) diagnosis codes identified 92% of D/OS SSI, with 1 D/OS SSI identified for every 4 cases with a diagnosis code. For hip arthroplasty, 6 ICD-9 diagnosis codes identified 99% of D/OS SSI, with 1 D/OS SSI identified for every 2 cases with a diagnosis code. Conclusions This standardized and efficient approach for identifying D/OS SSI can be used by hospitals to improve case detection and public reporting. This method can also be used to identify potential D/OS SSI cases for review during hospital audits for data validation. PMID:25734174
Natural language processing of clinical notes for identification of critical limb ischemia.
Afzal, Naveed; Mallipeddi, Vishnu Priya; Sohn, Sunghwan; Liu, Hongfang; Chaudhry, Rajeev; Scott, Christopher G; Kullo, Iftikhar J; Arruda-Olson, Adelaide M
2018-03-01
Critical limb ischemia (CLI) is a complication of advanced peripheral artery disease (PAD) with diagnosis based on the presence of clinical signs and symptoms. However, automated identification of cases from electronic health records (EHRs) is challenging due to the absence of a single definitive International Classification of Diseases (ICD-9 or ICD-10) code for CLI. In this study, we extend a previously validated natural language processing (NLP) algorithm for PAD identification to develop and validate a subphenotyping NLP algorithm (CLI-NLP) for identification of CLI cases from clinical notes. We compared performance of the CLI-NLP algorithm with CLI-related ICD-9 billing codes. The gold standard for validation was human abstraction of clinical notes from EHRs. Compared to billing codes, the CLI-NLP algorithm had higher positive predictive value (PPV) (CLI-NLP 96%, billing codes 67%, p < 0.001), specificity (CLI-NLP 98%, billing codes 74%, p < 0.001), and F1-score (CLI-NLP 90%, billing codes 76%, p < 0.001). The sensitivity of the two methods was similar (CLI-NLP 84%; billing codes 88%; p = 0.12). The CLI-NLP algorithm for identification of CLI from narrative clinical notes in an EHR had excellent PPV and has potential for translation to patient care, as it will enable automated identification of CLI cases for quality projects and clinical decision support tools, and will support a learning healthcare system. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
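The comparison metrics above follow directly from a 2x2 confusion matrix; a small helper shows how PPV, sensitivity, specificity, and F1 relate (the counts are invented, chosen only to roughly reproduce the CLI-NLP percentages):

```python
def confusion_metrics(tp, fp, fn, tn):
    """PPV (precision), sensitivity (recall), specificity, and F1 from
    confusion-matrix counts. Counts below are illustrative, not study data."""
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    f1 = 2 * ppv * sensitivity / (ppv + sensitivity)
    return ppv, sensitivity, specificity, f1

print(confusion_metrics(tp=84, fp=4, fn=16, tn=196))
# -> (~0.95, 0.84, 0.98, ~0.89)
```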
Improvement of COBRA-TF for modeling of PWR cold- and hot-legs during reactor transients
NASA Astrophysics Data System (ADS)
Salko, Robert K.
COBRA-TF is a two-phase, three-field (liquid, vapor, droplets) thermal-hydraulic modeling tool that has been developed by the Pacific Northwest Laboratory under sponsorship of the NRC. The code was developed for Light Water Reactor analysis starting in the 1980s; however, its development has continued to the current time. COBRA-TF still finds widespread use throughout the nuclear engineering field, including nuclear-power vendors, academia, and research institutions. It has been proposed that extension of the COBRA-TF code-modeling region from vessel-only components to Pressurized Water Reactor (PWR) coolant-line regions can lead to improved Loss-of-Coolant Accident (LOCA) analysis. Improved modeling is anticipated due to COBRA-TF's capability to independently model the entrained-droplet flow-field behavior, which has been observed to impact delivery to the core region [1]. Because COBRA-TF was originally developed for vertically-dominated, in-vessel, sub-channel flow, extension of the COBRA-TF modeling region to the horizontal-pipe geometries of the coolant-lines required several code modifications, including: • inclusion of the stratified flow regime in the COBRA-TF flow regime map, along with associated interfacial drag, wall drag, and interfacial heat transfer correlations; • inclusion of a horizontal-stratification force between adjacent mesh cells having unequal levels of stratified flow; and • generation of a new code-input interface for the modeling of coolant-lines. The sheer number of COBRA-TF modifications required turned this project into a code-development project as much as a study of thermal-hydraulics in reactor coolant-lines. The means for achieving these tasks shifted along the way, ultimately leading to the development of a separate, nearly completely independent one-dimensional, two-phase-flow modeling code geared toward reactor coolant-line analysis. This developed code has been named CLAP, for Coolant-Line-Analysis Package. Versions were created both coupled to COBRA-TF and standalone, with the most recent version being a standalone code. This code performs a separate, simplified, 1-D solution of the conservation equations while making special considerations for coolant-line geometry and flow phenomena. The end of this project saw a functional code package that demonstrates a stable numerical solution and that has gone through a series of validation and verification tests using the Two-Phase Testing Facility (TPTF) experimental data [2]. The results indicate that CLAP underperforms RELAP5-MOD3 in predicting the experimental void fraction of the TPTF facility in some cases. There is no apparent pattern, however, pointing to a consistent type of case that the code fails to predict properly (e.g., low-flow, high-flow, discharging to a full vessel, or discharging to an empty vessel). Pressure-profile predictions are sometimes unrealistic, which indicates that there may be a problem with test-case boundary conditions or with the coupling of the continuity and momentum equations in the solution algorithm. The code does predict the flow regime correctly for all cases with the stratification-force model off. Turning the stratification model on can cause the low-flow-case void profiles to over-react to the force and the flow regime to transition out of stratified flow. The code would benefit from an increased amount of validation and verification testing.
The development of CLAP was significant, as it is a cleanly written, logical representation of the reactor coolant-line geometry. It is stable and capable of modeling basic flow physics in the reactor coolant-line. Code development and debugging required the temporary removal of the energy equation and mass-transfer terms in the governing equations. The reintroduction of these terms will allow future coupling to RELAP and re-coupling with COBRA-TF. Adding more applicable entrainment and de-entrainment models would allow the capture of more advanced physics in the coolant-line that can be expected during a LOCA. One of the package's benefits is its ability to be used as a platform for future coolant-line model development and implementation, including capture of the important de-entrainment behavior in reactor hot-legs (steam-binding effect) and flow convection in the upper-plenum region of the vessel.
VLF Trimpi modelling on the path NWC-Dunedin using both finite element and 3D Born modelling
NASA Astrophysics Data System (ADS)
Nunn, D.; Hayakawa, K. B. M.
1998-10-01
This paper investigates the numerical modelling of VLF Trimpis produced by a D-region inhomogeneity on the great circle path. Two different codes are used to model Trimpis on the path NWC-Dunedin. The first is a 2D finite element method (FEM) code, whose solutions are rigorous and valid in the strong scattering or non-Born limit. The second code is a 3D model that invokes the Born approximation. The predicted Trimpis from these codes compare very closely, thus confirming the validity of both models. The modal scattering matrices for both codes are analysed in some detail and are found to have a comparable structure. They indicate strong scattering between the dominant TM modes. Analysis of the scattering matrix from the FEM code shows that departure from linear Born behaviour occurs when the inhomogeneity has a horizontal scale size of about 100 km and a maximum electron density enhancement at 75 km altitude of about 6 electrons.
A measure of short-term visual memory based on the WISC-R coding subtest.
Collaer, M L; Evans, J R
1982-07-01
Adapted the Coding subtest of the WISC-R to provide a measure of visual memory. Three hundred and five children, aged 8 through 12, were administered the Coding test using standard directions. A few seconds after completion, the key was taken away, and each was given a paper with only the digits and asked to write the appropriate matching symbol below each. This was termed "Coding Recall." To provide validity data, a subgroup of 50 Ss also was administered the Attention Span for Letters subtest from the Detroit Tests of Learning Aptitude (as a test of visual memory for sequences of letters) and a Bender Gestalt recall test (as a measure of visual memory for geometric forms). Coding Recall means and standard deviations are reported separately by sex and age level. Implications for clinicians are discussed. Reservations about clinical use of the data are given in view of the possible lack of representativeness of the sample used and the limited reliability and validity of Coding Recall.
Main, Alexandra; Paxton, Alexandra; Dale, Rick
2016-09-01
Dynamic patterns of influence between parents and children have long been considered key to understanding family relationships. Despite this, most observational research on emotion in parent-child interactions examines global behaviors at the expense of exploring moment-to-moment fluctuations in emotions that are important for relational outcomes. Using recurrence quantification analysis (RQA) and growth curve analysis, this investigation explored emotion dynamics during parent-adolescent conflict interactions, focusing not only on concurrently shared emotional states but also on time-lagged synchrony of parents' and adolescents' emotions relative to one another. Mother-adolescent dyads engaged in a 10-min conflict discussion and reported on their satisfaction with the process and outcome of the discussion. Emotions were coded using the Specific Affect Coding System (SPAFF) and were collapsed into the following categories: negativity, positivity, and validation/interest. RQA and growth curve analyses revealed that negative and positive emotions were characterized by a concurrently synchronous pattern across all dyads, with the highest recurrence rates occurring around simultaneity. However, lower levels of concurrent synchrony of negative emotions were associated with higher discussion satisfaction. We also found that patterns of negativity differed with age: Mothers led negativity in dyads with younger adolescents, and adolescents led negativity in dyads with older adolescents. In contrast to negative and positive emotions, validation/interest showed the time-lagged pattern characteristic of turn-taking, and more highly satisfied dyads showed stronger patterns of time-lagged coordination in validation/interest. Our findings underscore the dynamic nature of emotions in parent-adolescent interactions and highlight the important contributions of these moment-to-moment dynamics toward overall interaction quality. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
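A toy version of the time-lagged analysis (a simplified analogue of the diagonal recurrence profile in RQA; the second-by-second emotion codes are invented) might look like:

```python
import numpy as np

def lagged_matching(a, b, max_lag):
    """Fraction of moments where two coded state series match when series b is
    shifted by `lag` steps (positive lag = partner b follows partner a)."""
    a, b = np.asarray(a), np.asarray(b)
    rates = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[:len(a) - lag], b[lag:]
        else:
            x, y = a[-lag:], b[:len(b) + lag]
        rates[lag] = float((x == y).mean())
    return rates

# 0 = negativity, 1 = positivity, 2 = validation/interest
mom  = [0, 0, 1, 2, 2, 1, 0, 2, 1, 1]
teen = [0, 1, 1, 2, 1, 1, 0, 0, 2, 1]
profile = lagged_matching(mom, teen, max_lag=3)
print(profile)  # a peak near lag 0 indicates concurrent synchrony;
                # an off-zero peak indicates turn-taking-like coordination
```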
The Nuclear Energy Knowledge and Validation Center Summary of Activities Conducted in FY16
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans David
The Nuclear Energy Knowledge and Validation Center (NEKVaC) is a new initiative by the Department of Energy (DOE) and Idaho National Laboratory (INL) to coordinate and focus the resources and expertise that exist within the DOE toward solving issues in modern nuclear code validation and knowledge management. In time, code owners, users, and developers will view the NEKVaC as a partner and essential resource for acquiring the best practices and latest techniques for validating codes, providing guidance in planning and executing experiments, facilitating access to and maximizing the usefulness of existing data, and preserving knowledge for continual use by nuclear professionals and organizations for their own validation needs. The scope of the NEKVaC covers many interrelated activities that will need to be cultivated carefully in the near term and managed properly once the NEKVaC is fully functional. Three areas comprise the principal mission: (1) identify and prioritize projects that extend the field of validation science and its application to modern codes, (2) develop and disseminate best practices and guidelines for high-fidelity multiphysics/multiscale analysis code development and associated experiment design, and (3) define protocols for data acquisition and knowledge preservation and provide a portal for access to databases currently scattered among numerous organizations. These mission areas, while each having a unique focus, are interdependent and complementary. Likewise, all activities supported by the NEKVaC, both near term and long term, must possess elements supporting all three areas. This cross-cutting nature is essential to ensuring that activities and supporting personnel do not become “stove-piped” (i.e., so focused on a specific function that the activity itself becomes the objective rather than achieving the larger vision). This report begins with a description of the mission areas; specifically, the role played by each major committee and the types of activities for which they are responsible. It then lists and describes the proposed near-term tasks upon which future efforts can build.
2014-01-01
Background The systematic review of reasons is a new way to obtain comprehensive information about specific ethical topics. One such review was carried out for the question of why post-trial access to trial drugs should or need not be provided. The objective of this study was to empirically validate this review using an author check method. The article also reports on methodological challenges faced by our study. Methods We emailed a questionnaire to the 64 corresponding authors of those papers that were assessed in the review of reasons on post-trial access. The questionnaire consisted of all quotations (“reason mentions”) that were identified by the review to represent a reason in a given author’s publication, together with a set of codings for the quotations. The authors were asked to rate the correctness of the codings. Results We received 19 responses, of which only 13 were completed questionnaires. In total, 98 quotations and their related codes in the 13 questionnaires were checked by the addressees. For 77 quotations (79%), all codings were deemed correct; for 21 quotations (21%), some codings were deemed to need correction. Most corrections were minor and did not imply a complete misunderstanding of the citation. Conclusions This first attempt to validate a review of reasons leads to four crucial methodological questions relevant to the future conduct of such validation studies: 1) How can a description of a reason be deemed incorrect? 2) Do the limited findings of this author check study enable us to determine whether the core results of the analysed systematic review of reasons are valid? 3) Why did the majority of surveyed authors refrain from commenting on our understanding of their reasoning? 4) How can the method for validating reviews of reasons be improved? PMID:25262532
Development of the 3DHZETRN code for space radiation protection
NASA Astrophysics Data System (ADS)
Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert
Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping, with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface, limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism, capable of evaluation in general geometry, is described. Benchmarking against MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes, such as spheres within spherical shells and boxes, will help quantify uncertainty. Connection of 3DHZETRN to general geometry will be discussed.
Quantum Dense Coding About a Two-Qubit Heisenberg XYZ Model
NASA Astrophysics Data System (ADS)
Xu, Hui-Yun; Yang, Guo-Hui
2017-09-01
By taking into account the nonuniform magnetic field, quantum dense coding with thermal entangled states of a two-qubit anisotropic Heisenberg XYZ chain is investigated in detail. We mainly show how the dense coding capacity (χ) varies with the different parameters. It is found that the dense coding capacity χ can be enhanced by decreasing the magnetic field B, the degree of inhomogeneity b, and the temperature T, or by increasing the coupling constant along the z-axis, Jz. In addition, we find that χ remains stable as the anisotropy Δ of the XY plane changes, under certain temperature conditions. By studying the effect of the different parameters on χ, we show that one can properly tune the values of B, b, Jz, and Δ, or adjust the temperature T, to obtain a valid dense coding capacity (χ > 1). Moreover, the temperature plays a key role in adjusting the value of the dense coding capacity χ. A valid dense coding capacity can always be obtained in the low-temperature limit.
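As a numerical sketch, the dense coding capacity of a two-qubit thermal state is commonly computed as χ = 1 + S(ρ_B) − S(ρ_AB), with S the von Neumann entropy in bits. The XYZ-plus-nonuniform-field Hamiltonian below is our assumed parameterization (Jx, Jy, Jz, uniform field B, inhomogeneity b), not code from the paper:

```python
import numpy as np

def dense_coding_capacity(Jx, Jy, Jz, B, b, T):
    """chi = 1 + S(rho_B) - S(rho_AB) for the two-qubit thermal state of an
    XYZ chain in a nonuniform field. In one common convention the XY-plane
    anisotropy enters as Jx = J*(1 + Delta), Jy = J*(1 - Delta)."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    I2 = np.eye(2, dtype=complex)
    H = (Jx * np.kron(sx, sx) + Jy * np.kron(sy, sy) + Jz * np.kron(sz, sz)
         + (B + b) * np.kron(sz, I2) + (B - b) * np.kron(I2, sz))
    w, V = np.linalg.eigh(H)
    pop = np.exp(-w / T)
    pop /= pop.sum()                       # Boltzmann weights (k_B = 1)
    rho = (V * pop) @ V.conj().T           # thermal (Gibbs) state
    rho_B = rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)  # trace out qubit A

    def S(r):                              # von Neumann entropy in bits
        ev = np.linalg.eigvalsh(r)
        ev = ev[ev > 1e-12]
        return float(-(ev * np.log2(ev)).sum())

    return 1.0 + S(rho_B) - S(rho)

# chi > 1 marks a valid dense-coding resource
print(dense_coding_capacity(Jx=1.0, Jy=0.8, Jz=1.2, B=0.5, b=0.2, T=0.3))
```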
miRNAome expression profiles in the gonads of adult Melopsittacus undulatus
Jiang, Lan; Wang, Qingqing; Yu, Jue; Gowda, Vinita; Johnson, Gabriel; Yang, Jianke
2018-01-01
The budgerigar (Melopsittacus undulatus) is one of the most widely studied parrot species, serving as an excellent animal model for behavior and neuroscience research. Until recently, it was unknown how sexual differences in the behavior, physiology, and development of organisms are regulated by differential gene expression. MicroRNAs (miRNAs) are endogenous short non-coding RNA molecules that can post-transcriptionally regulate gene expression and play a critical role in gonadal differentiation as well as early development of animals. However, very little is known about the role gonadal miRNAs play in the early development of birds. Research on the sex-biased expression of miRNAs in avian gonads is limited, and little is known about M. undulatus. In the current study, we sequenced two small non-coding RNA libraries made from the gonads of adult male and female budgerigars using Illumina paired-end sequencing technology. We obtained 254 known and 141 novel miRNAs, and randomly validated five miRNAs. Of these, three miRNAs were differentially expressed and 18 miRNAs were implicated in sexual differentiation, as determined by functional analysis with GO annotation and KEGG pathway analysis. In conclusion, this work is the first report of sex-biased miRNA expression in the budgerigar, and provides additional sequences to the avian miRNAome database which will foster further functional genomic research. PMID:29666766
Creating a culture of mutual respect.
Kaplan, Kathryn; Mestel, Pamela; Feldman, David L
2010-04-01
The Joint Commission mandates that hospitals seeking accreditation have a process to define and address disruptive behavior. Leaders at Maimonides Medical Center, Brooklyn, New York, took the initiative to create a code of mutual respect that not only requires respectful behavior, but also encourages sensitivity and awareness to the causes of frustration that often lead to inappropriate behavior. Steps to implementing the code included selecting code advocates, setting up a system for mediating disputes, tracking and addressing operational system issues, providing training for personnel, developing a formal accountability process, and measuring the results. Copyright 2010 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Ladd, Benjamin O; Garcia, Tracey A; Anderson, Kristen G
2016-09-01
The current study explored whether laboratory-based techniques can provide a strategy for studying client language as a mechanism of behavior change. Specifically, this study examined the potential of a simulation task to elicit healthy talk, or self-motivational statements in favor of healthy behavior, related to marijuana and alcohol use. Participants (N = 84) were adolescents reporting at least 10 lifetime substance use episodes recruited from various community settings in an urban Pacific Northwest setting. Participants completed the Adolescent Simulated Intoxication Digital Elicitation (A-SIDE), a validated paradigm for assessing substance use decision making in peer contexts. Participants responded to 4 types of offers in the A-SIDE: (a) marijuana, (b) food (marijuana control), (c) alcohol, and (d) soda (alcohol control). Using a validated coding scheme adapted for the current study, client language during a structured interview assessing participants' response to the simulated offers was evaluated. Associations between percent healthy talk (PHT, calculated by dividing the number of healthy statements by the sum of all substance-related statements) and cross-sectional outcomes of interest (previous substance use, substance use expectancies, and behavioral willingness) were explored. The frequency of substance-related statements differed in response to offer type; rate of PHT did not. PHT was associated with behavioral willingness to accept the offer. However, PHT was not associated with decontextualized measures of substance use. Associations between PHT and global expectancies were limited. Simulation methods may be useful in investigating the impact of context on self-talk and to systematically explore client language as a mechanism of change. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.
1990-01-01
Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.
Validating LES for Jet Aeroacoustics
NASA Technical Reports Server (NTRS)
Bridges, James
2011-01-01
Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound, and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain in having dreams come true: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and interpreting the massive datasets that result. This paper primarily addresses the former, the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering to validate the computer codes and procedures used to create LES solutions. It also addresses the latter problem in discussing what measures, critical for aeroacoustics, should be used in validating LES codes. These new diagnostic techniques deliver measurements and flow statistics of increasing sophistication and capability, but what of their accuracy? And what are the measures to be used in validation? This paper argues that the issue of accuracy be addressed by cross-facility and cross-disciplinary examination of modern datasets, along with increased reporting of internal quality checks in PIV analysis. Further, it is argued that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound.
Gnjidic, Danijela; Pearson, Sallie-Anne; Hilmer, Sarah N; Basilakis, Jim; Schaffer, Andrea L; Blyth, Fiona M; Banks, Emily
2015-03-30
Increasingly, automated methods are being used to code free-text medication data, but evidence on the validity of these methods is limited. To examine the accuracy of automated coding of previously keyed-in free-text medication data compared with manual coding of original handwritten free-text responses (the 'gold standard'). A random sample of 500 participants (475 with and 25 without medication data in the free-text box) enrolled in the 45 and Up Study was selected. Manual coding involved medication experts keying in free-text responses and coding using Anatomical Therapeutic Chemical (ATC) codes (i.e., chemical substance 7-digit level; chemical subgroup 5-digit; pharmacological subgroup 4-digit; therapeutic subgroup 3-digit). Using keyed-in free-text responses entered by non-experts, the automated approach coded entries using the Australian Medicines Terminology database and assigned corresponding ATC codes. Based on manual coding, 1377 free-text entries were recorded and, of these, 1282 medications were coded to ATCs manually. The sensitivity of automated coding compared with manual coding was 79% (n = 1014) for entries coded at the exact ATC level, and 81.6% (n = 1046), 83.0% (n = 1064) and 83.8% (n = 1074) at the 5, 4 and 3-digit ATC levels, respectively. The sensitivity of automated coding for blank responses was 100% compared with manual coding. Sensitivity of automated coding was highest for prescription medications and lowest for vitamins and supplements, compared with the manual approach. Positive predictive values for automated coding were above 95% for 34 of the 38 individual prescription medications examined. Automated coding for free-text prescription medication data shows very high to excellent sensitivity and positive predictive values, indicating that automated methods can potentially be useful for large-scale, medication-related research.
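The level-wise sensitivities above amount to comparing ATC strings truncated to their first 3, 4, 5, or 7 characters; a small sketch (with invented codes and an assumed one-to-one pairing of entries) illustrates the computation:

```python
def sensitivity_at_level(auto_codes, manual_codes, n_chars):
    """Share of manually coded entries whose automated ATC code agrees when
    both are truncated to the first `n_chars` characters. Entries the
    automated method failed to code count as misses."""
    hits = sum(
        a is not None and a[:n_chars] == m[:n_chars]
        for a, m in zip(auto_codes, manual_codes)
    )
    return hits / len(manual_codes)

manual = ["C10AA05", "N02BE01", "A02BC02"]
auto   = ["C10AA07", "N02BE01", None]   # first agrees only to the 5-char level
print(sensitivity_at_level(auto, manual, 7))  # -> 0.333 (exact ATC)
print(sensitivity_at_level(auto, manual, 5))  # -> 0.667 (chemical subgroup)
```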
Grayson-Sneed, Katelyn A; Smith, Robert C
2018-04-01
Develop a reliable coding method for a Behavioral Health Treatment Model for patients with Medically Unexplained Symptoms (BHTM-MUS). Two undergraduates trained for 30 h coded videotaped interviews from 161 resident-simulated patient (SP) interactions. Trained on 45 videotapes, coders coded 33 (20%) of 161 study set tapes for the BHTM-MUS. Guetzkow's U, Cohen's Kappa, and percent of agreement were used to measure coders' reliability in unitizing and coding residents' skills for eliciting: education and informing (4 yes/no items), motivating (2), treatment statements (5), commitment and goals (2), negotiates plan (8), non-emotion patient-centered skills (4), and patient-centered emotional skills (8). 60 items were dichotomized a priori from analysis of the BHTM-MUS and were reduced to 33 during training. Guetzkow's U ranged from .00 to .082. Kappa ranged from 0.76 to 0.97 for the 7 variables and 33 individual items. The overall kappa was 0.87, and percent of agreement was 95.7%. Percent of agreement by item ranged from 85 to 100%. A highly reliable coding method is recommended to evaluate medical clinicians' behavioral care skills in patients with unexplained symptoms. A way to rate behavioral care in patients with unexplained symptoms. Copyright © 2017 Elsevier B.V. All rights reserved.
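As a minimal illustration of the reliability statistics used here, Cohen's kappa corrects raw percent agreement for chance agreement; the two coders' yes/no item codings below are invented:

```python
def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two coders' categorical codings of the same items."""
    n = len(coder1)
    p_obs = sum(a == b for a, b in zip(coder1, coder2)) / n
    labels = set(coder1) | set(coder2)
    p_exp = sum(
        (coder1.count(l) / n) * (coder2.count(l) / n) for l in labels
    )
    return (p_obs - p_exp) / (1 - p_exp)

c1 = [1, 1, 0, 1, 0, 1, 1, 0]
c2 = [1, 1, 0, 1, 1, 1, 1, 0]
print(cohens_kappa(c1, c2))  # percent agreement is 7/8 = 0.875; kappa ~ 0.71
```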
Green's function methods in heavy ion shielding
NASA Technical Reports Server (NTRS)
Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.
1993-01-01
An analytic solution of the heavy ion transport equation in terms of a Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending the Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.
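In schematic form (our summary of the standard straight-ahead ion transport formulation behind such codes, with ionization energy-loss terms omitted for brevity; not the paper's exact notation), the flux of ion species j obeys

```latex
\[
\left[\frac{\partial}{\partial x} + \sigma_j(E)\right]\phi_j(x,E)
  \;=\; \sum_{k}\int_{E}^{\infty}\sigma_{jk}(E,E')\,\phi_k(x,E')\,dE',
\]
```

and the Green's function \(G_{jk}(x,E,E')\) solves the same equation with boundary condition \(G_{jk}(0,E,E')=\delta_{jk}\,\delta(E-E')\), so that any boundary flux \(f_k(E')\) propagates as \(\phi_j(x,E)=\sum_k\int G_{jk}(x,E,E')\,f_k(E')\,dE'\). This separation is what lets a single code be driven by either space spectra or accelerator beam conditions for laboratory validation.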
NASA Astrophysics Data System (ADS)
Popota, F. D.; Aguiar, P.; España, S.; Lois, C.; Udias, J. M.; Ros, D.; Pavia, J.; Gispert, J. D.
2015-01-01
In this work a comparison between experimental and simulated data using GATE and PeneloPET Monte Carlo simulation packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison was focused on spatial resolution, sensitivity, scatter fraction and counting rates performance. Both GATE and PeneloPET showed reasonable agreement for the spatial resolution when compared to experimental measurements, although they lead to slight underestimations for the points close to the edge. High accuracy was obtained between experiments and simulations of the system’s sensitivity and scatter fraction for an energy window of 350-650 keV, as well as for the counting rate simulations. The latter was the most complicated test to perform since each code demands different specifications for the characterization of the system’s dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes for the full NEMA NU 4-2008 standards for small animal PET imaging systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J
Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
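The computation referenced here has a canonical algebraic form; as a reference point, our rendering of the standard divisive normalization equation from this literature (symbols as usually defined, not reproduced from the paper):

```latex
\[
R_i \;=\; \frac{\gamma\, V_i}{\sigma \;+\; \sum_j \omega_j V_j},
\]
```

where \(R_i\) is the normalized value signal for option \(i\), \(V_i\) its raw input value, \(\gamma\) a gain factor, \(\sigma\) a semi-saturation constant, and \(\omega_j\) weights over the contextual choice set. Context dependence arises because each option's coded value is divided by a sum over all available options.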
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aly, A.; Avramova, Maria; Ivanov, Kostadin
To correctly describe and predict the hydrogen distribution, there is a need for multi-physics coupling to provide accurate three-dimensional azimuthal, radial, and axial temperature distributions in the cladding. Coupled high-fidelity reactor-physics codes with a sub-channel code, as well as with a computational fluid dynamics (CFD) tool, have been used to calculate detailed temperature distributions. These high-fidelity coupled neutronics/thermal-hydraulics code systems are coupled further with the fuel-performance BISON code with a kernel (module) for hydrogen. Both hydrogen migration and precipitation/dissolution are included in the model. Results from this multi-physics analysis are validated utilizing calculations of hydrogen distribution using models informed by data from hydrogen experiments and PIE data.
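For orientation, hydrogen migration models of this kind typically combine Fickian diffusion with thermo-diffusion (the Soret effect); the constitutive form below is our illustration of that common model, not the BISON kernel's published equations:

```latex
\[
\mathbf{J} \;=\; -\,D\,\nabla C \;-\; \frac{D\,Q^{*}\,C}{R\,T^{2}}\,\nabla T,
\]
```

where \(C\) is the hydrogen concentration in solid solution, \(D\) the diffusivity, \(Q^{*}\) the heat of transport, and \(T\) the temperature. Precipitation into hydrides is typically triggered where \(C\) exceeds the terminal solid solubility, and dissolution where it falls below it, which is why accurate three-dimensional cladding temperature fields matter.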
Coding gestural behavior with the NEUROGES-ELAN system.
Lausberg, Hedda; Sloetjes, Han
2009-08-01
We present a coding system combined with an annotation tool for the analysis of gestural behavior. The NEUROGES coding system consists of three modules that progress from gesture kinetics to gesture function. Grounded on empirical neuropsychological and psychological studies, the theoretical assumption behind NEUROGES is that its main kinetic and functional movement categories are differentially associated with specific cognitive, emotional, and interactive functions. ELAN is a free, multimodal annotation tool for digital audio and video media. It supports multileveled transcription and complies with such standards as XML and Unicode. ELAN allows gesture categories to be stored with associated vocabularies that are reusable by means of template files. The combination of the NEUROGES coding system and the annotation tool ELAN creates an effective tool for empirical research on gestural behavior.
Wakschlag, Lauren S; Briggs-Gowan, Margaret J; Hill, Carri; Danis, Barbara; Leventhal, Bennett L; Keenan, Kate; Egger, Helen L; Cicchetti, Domenic; Burns, James; Carter, Alice S
2008-06-01
To examine the validity of the Disruptive Behavior Diagnostic Observation Schedule (DB-DOS), a new observational method for assessing preschool disruptive behavior. A total of 327 behaviorally heterogeneous preschoolers from low-income environments comprised the validation sample. Parent and teacher reports were used to identify children with clinically significant disruptive behavior. The DB-DOS assessed observed disruptive behavior in two domains, problems in Behavioral Regulation and Anger Modulation, across three interactional contexts: Examiner Engaged, Examiner Busy, and Parent. Convergent and divergent validity of the DB-DOS were tested in relation to parent and teacher reports and independently observed behavior. Clinical validity was tested in terms of criterion and incremental validity of the DB-DOS for discriminating disruptive behavior status and impairment, concurrently and longitudinally. DB-DOS scores were significantly associated with reported and independently observed behavior in a theoretically meaningful fashion. Scores from both DB-DOS domains and each of the three DB-DOS contexts contributed uniquely to discrimination of disruptive behavior status, concurrently and predictively. Observed behavior on the DB-DOS also contributed incrementally to prediction of impairment over time, beyond variance explained by meeting DSM-IV disruptive behavior disorder symptom criteria based on parent/teacher report. The multidomain, multicontext approach of the DB-DOS is a valid method for direct assessment of preschool disruptive behavior. This approach shows promise for enhancing accurate identification of clinically significant disruptive behavior in young children and for characterizing subtypes in a manner that can directly inform etiological and intervention research.
NASA Technical Reports Server (NTRS)
Nguyen, Lac; Kenney, Patrick J.
1993-01-01
Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool: an X-Windows-based system employing drop-down menus for program access, model and tree selection, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. These include incorporation of dual-user virtual environment enhancements, tool expansion, and additional behaviors.
Supplementing Public Health Inspection via Social Media
Schomberg, John P.; Haimson, Oliver L.; Hayes, Gillian R.; Anton-Culver, Hoda
2016-01-01
Foodborne illness is prevented by inspection and surveillance conducted by health departments across America. Appropriate restaurant behavior is enforced and monitored via public health inspections. However, surveillance coverage provided by state and local health departments is insufficient in preventing the rising number of foodborne illness outbreaks. To address this need for improved surveillance coverage, we conducted a supplementary form of public health surveillance using social media data: Yelp.com restaurant reviews in the city of San Francisco. Yelp is a social media site where users post reviews and rate restaurants they have personally visited. Presence of keywords related to health code regulations and foodborne illness symptoms, number of restaurant reviews, number of Yelp stars, and restaurant price range were included in a model predicting a restaurant’s likelihood of health code violation, as measured by the assigned San Francisco public health code rating. For a list of major health code violations, see S1 Table. We built the predictive model using 71,360 Yelp reviews of restaurants in the San Francisco Bay Area. The predictive model was able to predict health code violations in 78% of the restaurants receiving serious citations in our pilot study of 440 restaurants. Training and validation data sets each pulled data from 220 restaurants in San Francisco. Keyword analysis of free text within Yelp not only improved detection of high-risk restaurants, but it also served to identify specific risk factors related to health code violation. To further validate our model, we applied the model generated in our pilot study to Yelp data from 1,542 restaurants in San Francisco. The model achieved 91% sensitivity, 74% specificity, area under the receiver operator curve of 98%, and positive predictive value of 29% (given a substandard health code rating prevalence of 10%). When our model was applied to restaurant reviews in New York City, we achieved 74% sensitivity, 54% specificity, area under the receiver operator curve of 77%, and positive predictive value of 25% (given a prevalence of 12%). Model accuracy improved when reviews ranked highest by Yelp were utilized. Our results indicate that public health surveillance can be improved by using social media data to identify restaurants at high risk for health code violation. Additionally, using highly ranked Yelp reviews improves predictive power and limits the number of reviews needed to generate prediction. Use of this approach as an adjunct to current risk ranking of restaurants prior to inspection may enhance detection of those restaurants participating in high-risk practices that may have gone previously undetected. This model represents a step forward in the integration of social media into meaningful public health interventions. PMID:27023681
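The reported positive predictive values follow from sensitivity, specificity, and prevalence via Bayes' rule; a small check (using the study's San Francisco figures) reproduces the published PPV to within the rounding of the inputs:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from sensitivity, specificity, prevalence."""
    tp = sensitivity * prevalence              # true-positive mass
    fp = (1 - specificity) * (1 - prevalence)  # false-positive mass
    return tp / (tp + fp)

# 91% sensitivity, 74% specificity, 10% prevalence of substandard ratings
print(ppv(0.91, 0.74, 0.10))  # -> ~0.28, vs. the reported 29%
```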
Cohen, Alysia; McDonald, Samantha; McIver, Kerry; Pate, Russell; Trost, Stewart
2014-05-01
The purpose of this study was to evaluate the validity and interrater reliability of the Observational System for Recording Activity in Children: Youth Sports (OSRAC:YS). Children (N = 29) participating in a parks and recreation soccer program were observed during regularly scheduled practices. Physical activity (PA) intensity and contextual factors were recorded by momentary time-sampling procedures (10-second observe, 20-second record). Two observers simultaneously observed and recorded children's PA intensity, practice context, social context, coach behavior, and coach proximity. Interrater reliability was based on agreement (Kappa) between the observer's coding for each category, and the Intraclass Correlation Coefficient (ICC) for percent of time spent in MVPA. Validity was assessed by calculating the correlation between OSRAC:YS estimated and objectively measured MVPA. Kappa statistics for each category demonstrated substantial to almost perfect interobserver agreement (Kappa = 0.67-0.93). The ICC for percent time in MVPA was 0.76 (95% C.I. = 0.49-0.90). A significant correlation (r = .73) was observed for MVPA recorded by observation and MVPA measured via accelerometry. The results indicate the OSRAC:YS is a reliable and valid tool for measuring children's PA and contextual factors during a youth soccer practice.
BeiDou Geostationary Satellite Code Bias Modeling Using Fengyun-3C Onboard Measurements.
Jiang, Kecai; Li, Min; Zhao, Qile; Li, Wenwen; Guo, Xiang
2017-10-27
This study validated and investigated elevation- and frequency-dependent systematic biases observed in ground-based code measurements of the Chinese BeiDou navigation satellite system, using the onboard BeiDou code measurement data from the Chinese meteorological satellite Fengyun-3C. Particularly for geostationary earth orbit satellites, sky-view coverage can be achieved over the entire elevation and azimuth angle ranges with the available onboard tracking data, which is more favorable for modeling code biases. Apart from the BeiDou-satellite-induced biases, the onboard BeiDou code multipath effects also indicate pronounced near-field systematic biases that depend only on signal frequency and the line-of-sight directions. To correct these biases, we developed a code correction model that estimates the BeiDou-satellite-induced biases as linear piece-wise functions in different satellite groups and the near-field systematic biases in a grid approach. To validate the code bias model, we carried out orbit determination using single-frequency BeiDou data with and without code bias corrections applied. Orbit precision statistics indicate that those code biases can seriously degrade single-frequency orbit determination. After the correction model was applied, the orbit position errors, 3D root mean square, were reduced from 150.6 to 56.3 cm.
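A minimal sketch of applying a piece-wise linear, elevation-dependent bias correction of this kind (the node values and their spacing are hypothetical, not the estimated BeiDou model):

```python
import numpy as np

# Hypothetical bias nodes for one satellite group: piece-wise linear in
# elevation, applied by linear interpolation between nodes.
elev_nodes = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])  # degrees
bias_nodes = np.array([0.80, 0.55, 0.35, 0.20, 0.10, 0.05, 0.0])  # metres

def corrected_pseudorange(raw_code_obs_m, elevation_deg):
    """Subtract the interpolated satellite-induced code bias (metres)."""
    bias = np.interp(elevation_deg, elev_nodes, bias_nodes)
    return raw_code_obs_m - bias

print(corrected_pseudorange(21_345_678.90, elevation_deg=22.5))
```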
MHD code using multi graphical processing units: SMAUG+
NASA Astrophysics Data System (ADS)
Gyenge, N.; Griffiths, M. K.; Erdélyi, R.
2018-01-01
This paper introduces the Sheffield Magnetohydrodynamics Algorithm Using GPUs (SMAUG+), an advanced numerical code for solving magnetohydrodynamic (MHD) problems using multi-GPU systems. Multi-GPU systems facilitate the development of accelerated codes and enable us to investigate larger model sizes and/or more detailed computational domain resolutions. This is a significant advancement over the parent single-GPU MHD code, SMAUG (Griffiths et al., 2015). Here, we demonstrate the validity of the SMAUG+ code, describe the parallelisation techniques, and investigate performance benchmarks. The initial configuration of the Orszag-Tang vortex simulations is distributed among 4, 16, 64 and 100 GPUs. Furthermore, different simulation box resolutions are applied: 1000 × 1000, 2044 × 2044, 4000 × 4000 and 8000 × 8000. We also tested the code with the Brio-Wu shock tube simulations with a model size of 800, employing up to 10 GPUs. Based on the test results, we observed speed-ups and slow-downs, depending on the granularity and the communication overhead of certain parallel tasks. The main aim of the code development is to provide a massively parallel code without the memory limitation of a single GPU. By using our code, the applied model size can be significantly increased. We demonstrate that we are able to successfully compute numerically valid and large 2D MHD problems.
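The core of such multi-GPU parallelisation is domain decomposition with halo (ghost-cell) exchange between neighbouring subdomains at each time step. The sketch below shows the pattern for a one-dimensional strip decomposition using mpi4py; SMAUG+ itself is CUDA-based, so this is schematic only.

```python
# Schematic strip decomposition with halo exchange (not SMAUG+'s CUDA code).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
nprocs = comm.Get_size()
cart = comm.Create_cart(dims=[nprocs], periods=[True])
below, above = cart.Shift(0, 1)     # (source, dest) ranks along the strip axis

N = 1000                            # global grid rows (e.g. the 1000 x 1000 run)
n = N // nprocs                     # rows owned by this rank (one rank per GPU)
u = np.zeros((n + 2, N))            # two extra ghost rows form the halo

def exchange_halos(u):
    """Trade edge rows with both neighbours so stencils can see ghost data."""
    cart.Sendrecv(u[-2, :], dest=above, recvbuf=u[0, :], source=below)
    cart.Sendrecv(u[1, :], dest=below, recvbuf=u[-1, :], source=above)
```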
Apodaca, Timothy R; Jackson, Kristina M; Borsari, Brian; Magill, Molly; Longabaugh, Richard; Mastroleo, Nadine R; Barnett, Nancy P
2016-02-01
To identify individual therapist behaviors which elicit client change talk or sustain talk in motivational interviewing sessions. Motivational interviewing sessions from a single-session alcohol intervention delivered to college students were audio-taped, transcribed, and coded using the Motivational Interviewing Skill Code (MISC), a therapy process coding system. Participants included 92 college students and eight therapists who provided their treatment. The MISC was used to code 17 therapist behaviors related to the use of motivational interviewing, and client language reflecting movement toward behavior change (change talk), away from behavior change (sustain talk), or unrelated to the target behavior (follow/neutral). Client change talk was significantly more likely to immediately follow some therapist behaviors [affirm (p=.013), open question (p<.001), simple reflection (p<.001), and complex reflection (p<.001)], but significantly less likely to immediately follow others [giving information (p<.001) and closed question (p<.001)]. Sustain talk was significantly more likely to follow therapist use of open questions (p<.001), simple reflections (p<.001), and complex reflections (p<.001), and significantly less likely to occur following therapist use of affirm (p=.012), giving information (p<.001), and closed questions (p<.001). Certain individual therapist behaviors within motivational interviewing can either elicit both client change talk and sustain talk or suppress both types of client language. Affirm was the only therapist behavior that both increased change talk and also reduced sustain talk. Copyright © 2015 Elsevier Inc. All rights reserved.
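The sequential question asked here (how likely is change talk immediately after a given therapist behavior?) amounts to lag-1 transition rates over the coded utterance stream. A toy sketch with illustrative code labels; the study's actual analyses also handled the nesting of utterances within sessions and therapists.

```python
# Lag-1 transition rates from therapist codes to the next client code.
# Code labels (AF, OQ, ...) and the example session are illustrative.
from collections import Counter, defaultdict

def transition_rates(codes, therapist_codes={"AF", "OQ", "CQ", "SR", "CR", "GI"}):
    counts = defaultdict(Counter)
    for prev, nxt in zip(codes, codes[1:]):
        if prev in therapist_codes:
            counts[prev][nxt] += 1
    return {t: {c: n / sum(ct.values()) for c, n in ct.items()}
            for t, ct in counts.items()}

# CT = change talk, ST = sustain talk, FN = follow/neutral
session = ["OQ", "CT", "SR", "CT", "GI", "FN", "CQ", "ST", "CR", "CT"]
print(transition_rates(session))  # e.g. {'OQ': {'CT': 1.0}, ...}
```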
Owens, Mandy D; Rowell, Lauren N; Moyers, Theresa
2017-10-01
Motivational Interviewing (MI) is an evidence-based approach shown to be helpful for a variety of behaviors across many populations. Treatment fidelity is an important tool for understanding how and with whom MI may be most helpful. The Motivational Interviewing Treatment Integrity coding system was recently updated to incorporate new developments in the research and theory of MI, including the relational and technical hypotheses of MI (MITI 4.2). To date, no studies have examined the MITI 4.2 with forensic populations. In this project, twenty-two brief MI interventions with jail inmates were evaluated to test the reliability of the MITI 4.2. Validity of the instrument was explored using regression models to examine the associations between global scores (Empathy, Partnership, Cultivating Change Talk and Softening Sustain Talk) and outcomes. Reliability of this coding system with these data was strong. We found that therapists had lower ratings of Empathy with participants who had more extensive criminal histories. Both Relational and Technical global scores were associated with criminal histories as well as post-intervention ratings of motivation to decrease drug use. Findings indicate that the MITI 4.2 was reliable for coding sessions with jail inmates. Additionally, results provided information related to the relational and technical hypotheses of MI. Future studies can use the MITI 4.2 to better understand the mechanisms behind how MI works with this high-risk group. Published by Elsevier Ltd.
CACTI: Free, Open-Source Software for the Sequential Coding of Behavioral Interactions
Glynn, Lisa H.; Hallgren, Kevin A.; Houck, Jon M.; Moyers, Theresa B.
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.
1983-10-01
Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.
Validation of suicide and self-harm records in the Clinical Practice Research Datalink
Thomas, Kyla H; Davies, Neil; Metcalfe, Chris; Windmeijer, Frank; Martin, Richard M; Gunnell, David
2013-01-01
Aims The UK Clinical Practice Research Datalink (CPRD) is increasingly being used to investigate suicide-related adverse drug reactions. No studies have comprehensively validated the recording of suicide and nonfatal self-harm in the CPRD. We validated general practitioners' recording of these outcomes using linked Office for National Statistics (ONS) mortality and Hospital Episode Statistics (HES) admission data. Methods We identified cases of suicide and self-harm recorded using appropriate Read codes in the CPRD between 1998 and 2010 in patients aged ≥15 years. Suicides were defined as patients with Read codes for suicide recorded within 95 days of their death. International Classification of Diseases codes were used to identify suicides/hospital admissions for self-harm in the linked ONS and HES data sets. We compared CPRD-derived cases/incidence of suicide and self-harm with those identified from linked ONS mortality and HES data, national suicide incidence rates and published self-harm incidence data. Results Only 26.1% (n = 590) of the ‘true’ (ONS-confirmed) suicides were identified using Read codes. Furthermore, only 55.5% of Read code-identified suicides were confirmed as suicide by the ONS data. Of the HES-identified cases of self-harm, 68.4% were identified in the CPRD using Read codes. The CPRD self-harm rates based on Read codes had similar age and sex distributions to rates observed in self-harm hospital registers, although rates were underestimated in all age groups. Conclusions The CPRD recording of suicide using Read codes is unreliable, with significant inaccuracy (over- and under-reporting). Future CPRD suicide studies should use linked ONS mortality data. The under-reporting of self-harm appears to be less marked. PMID:23216533
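The validation arithmetic in such studies is a comparison of code-flagged cases against a linked gold standard; a minimal sketch with illustrative patient IDs:

```python
# Sensitivity and PPV of Read-code case ascertainment against a gold standard
# (here, ONS-confirmed cases). Patient IDs are illustrative.
def sensitivity_ppv(flagged, gold):
    tp = len(flagged & gold)
    sensitivity = tp / len(gold)        # share of true cases captured
    ppv = tp / len(flagged)             # share of flagged cases that are true
    return sensitivity, ppv

read_code_cases = {"p01", "p02", "p03", "p04"}
ons_confirmed = {"p02", "p03", "p05", "p06", "p07", "p08"}
print(sensitivity_ppv(read_code_cases, ons_confirmed))  # (0.333..., 0.5)
```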
Evaluating a Dental Diagnostic Terminology in an Electronic Health Record
White, Joel M.; Kalenderian, Elsbeth; Stark, Paul C.; Ramoni, Rachel L.; Vaderhobli, Ram; Walji, Muhammad F.
2011-01-01
Standardized treatment procedure codes and terms are routinely used in dentistry. Utilization of a diagnostic terminology is common in medicine, but there is not a satisfactory or commonly standardized dental diagnostic terminology available at this time. Recent advances in dental informatics have provided an opportunity for inclusion of diagnostic codes and terms as part of treatment planning and documentation in the patient treatment history. This article reports the results of the use of a diagnostic coding system in a large dental school’s predoctoral clinical practice. A list of diagnostic codes and terms, called Z codes, was developed by dental faculty members. The diagnostic codes and terms were implemented into an electronic health record (EHR) for use in a predoctoral dental clinic. The utilization of diagnostic terms was quantified. The validity of Z code entry was evaluated by comparing the diagnostic term entered to the procedure performed, where valid diagnosis-procedure associations were determined by consensus among three calibrated academically based dentists. A total of 115,004 dental procedures were entered into the EHR during the year sampled. Of those, 43,053 were excluded from this analysis because they represent diagnosis or other procedures unrelated to treatments. Among the 71,951 treatment procedures, 27,973 had diagnoses assigned to them with an overall utilization of 38.9 percent. Of the 147 available Z codes, ninety-three were used (63.3 percent). There were 335 unique procedures provided and 2,127 procedure/diagnosis pairs captured in the EHR. Overall, 76.7 percent of the diagnoses entered were valid. We conclude that dental diagnostic terminology can be incorporated within an electronic health record and utilized in an academic clinical environment. Challenges remain in the development of terms and implementation and ease of use that, if resolved, would improve the utilization. PMID:21546594
42 CFR 488.8 - Federal review of accreditation organizations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... through validation surveys, the State survey agency monitors corrections as specified at § 488.7(b)(3... CMS with electronic data in ASCII comparable code and reports necessary for effective validation and...) Validation review. Following the end of a validation review period, CMS will identify any accreditation...
42 CFR 488.8 - Federal review of accreditation organizations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... through validation surveys, the State survey agency monitors corrections as specified at § 488.7(b)(3... CMS with electronic data in ASCII comparable code and reports necessary for effective validation and...) Validation review. Following the end of a validation review period, CMS will identify any accreditation...
42 CFR 488.8 - Federal review of accreditation organizations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... through validation surveys, the State survey agency monitors corrections as specified at § 488.7(b)(3... CMS with electronic data in ASCII comparable code and reports necessary for effective validation and...) Validation review. Following the end of a validation review period, CMS will identify any accreditation...
Understanding Mixed Code and Classroom Code-Switching: Myths and Realities
ERIC Educational Resources Information Center
Li, David C. S.
2008-01-01
Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan, Lehua; Oldenburg, Curtis M.
TOGA is a numerical reservoir simulator for modeling non-isothermal flow and transport of water, CO2, multicomponent oil, and related gas components for applications including CO2-enhanced oil recovery (CO2-EOR) and geologic carbon sequestration in depleted oil and gas reservoirs. TOGA uses an approach based on the Peng-Robinson equation of state (PR-EOS) to calculate the thermophysical properties of the gas and oil phases including the gas/oil components dissolved in the aqueous phase, and uses a mixing model to estimate the thermophysical properties of the aqueous phase. The phase behavior (e.g., occurrence and disappearance of the three phases, gas + oil + aqueous) and the partitioning of non-aqueous components (e.g., CO2, CH4, and n-oil components) between coexisting phases are modeled using K-values derived from assumptions of equal fugacity that have been demonstrated to be very accurate as shown by comparison to measured data. Models for saturated (water) vapor pressure and water solubility (in the oil phase) are used to calculate the partitioning of the water (H2O) component between the gas and oil phases. All components (e.g., CO2, H2O, and n hydrocarbon components) are allowed to be present in all phases (aqueous, gaseous, and oil). TOGA uses a multiphase version of Darcy's Law to model flow and transport through porous media of mixtures with up to three phases over a range of pressures and temperatures appropriate to hydrocarbon recovery and geologic carbon sequestration systems. Transport of the gaseous and dissolved components is by advection and Fickian molecular diffusion. New methods for phase partitioning and thermophysical property modeling in TOGA have been validated against experimental data published in the literature for describing phase partitioning and phase behavior. Flow and transport have been verified by testing against related TOUGH2 EOS modules and CMG. The code has also been validated against a CO2-EOR experimental core flood involving flow of three phases and 12 components. Results of simulations of a hypothetical 3D CO2-EOR problem involving three phases and multiple components are presented to demonstrate the field-scale capabilities of the new code. This user guide provides instructions for use and sample problems for verification and demonstration.
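A textbook stand-in for the K-value partitioning step described above combines Wilson-correlation K-values with a Rachford-Rice flash. TOGA itself derives K-values from PR-EOS equal-fugacity conditions; the correlation, conditions, and critical properties below are generic illustrations, not TOGA inputs.

```python
# Wilson K-values + Rachford-Rice flash: a generic two-phase partitioning
# sketch, not TOGA's equal-fugacity machinery.
import numpy as np
from scipy.optimize import brentq

# component: (Tc [K], Pc [Pa], acentric factor) -- generic CO2/CH4 properties
PROPS = {"CO2": (304.1, 7.38e6, 0.225), "CH4": (190.6, 4.60e6, 0.011)}

def wilson_k(comp, T, P):
    Tc, Pc, w = PROPS[comp]
    return (Pc / P) * np.exp(5.373 * (1 + w) * (1 - Tc / T))

def flash(z, K):
    """Rachford-Rice: solve for vapor fraction V, then phase compositions."""
    rr = lambda V: np.sum(z * (K - 1) / (1 + V * (K - 1)))
    V = brentq(rr, 1e-9, 1 - 1e-9)
    x = z / (1 + V * (K - 1))           # liquid mole fractions
    return V, x, K * x                  # vapor fraction, liquid, vapor

z = np.array([0.4, 0.6])                # overall CO2/CH4 mole fractions
K = np.array([wilson_k(c, T=250.0, P=6e6) for c in ("CO2", "CH4")])
print(flash(z, K))                      # V ~ 0.63 at these illustrative conditions
```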
Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior.
Panzeri, Stefano; Harvey, Christopher D; Piasini, Eugenio; Latham, Peter E; Fellin, Tommaso
2017-02-08
The two basic processes underlying perceptual decisions (how neural responses encode stimuli, and how they inform behavioral choices) have mainly been studied separately. Thus, although many spatiotemporal features of neural population activity, or "neural codes," have been shown to carry sensory information, it is often unknown whether the brain uses these features for perception. To address this issue, we propose a new framework centered on redefining the neural code as the neural features that carry sensory information used by the animal to drive appropriate behavior; that is, the features that have an intersection between sensory and choice information. We show how this framework leads to a new statistical analysis of neural activity recorded during behavior that can identify such neural codes, and we discuss how to combine intersection-based analysis of neural recordings with intervention on neural activity to determine definitively whether specific neural activity features are involved in a task. Copyright © 2017 Elsevier Inc. All rights reserved.
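A toy version of the proposed intersection screen: keep only candidate neural features that carry information about both the stimulus and the choice. The actual intersection-information measures in this literature are more refined; this sketch just thresholds two plug-in mutual-information estimates on discretized trial data.

```python
# Screen candidate neural features by stimulus AND choice information.
# Threshold value and data layout are illustrative.
import numpy as np

def mutual_info(x, y):
    """Plug-in MI estimate (bits) for two discrete per-trial label arrays."""
    xs, ys = np.unique(x), np.unique(y)
    p = np.array([[np.mean((x == a) & (y == b)) for b in ys] for a in xs])
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

def candidate_codes(features, stimulus, choice, thresh=0.05):
    """features: dict name -> discretized per-trial feature values."""
    return [name for name, f in features.items()
            if mutual_info(f, stimulus) > thresh and mutual_info(f, choice) > thresh]
```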
Seligman, Sarah C; Giovannetti, Tania; Sestito, John; Libon, David J
2014-01-01
Mild functional difficulties have been associated with early cognitive decline in older adults and increased risk for conversion to dementia in mild cognitive impairment, but our understanding of this decline has been limited by a dearth of objective methods. This study evaluated the reliability and validity of a new system to code subtle errors on an established performance-based measure of everyday action and described preliminary findings within the context of a theoretical model of action disruption. Here 45 older adults completed the Naturalistic Action Test (NAT) and neuropsychological measures. NAT performance was coded for overt errors, and subtle action difficulties were scored using a novel coding system. An inter-rater reliability coefficient was calculated. Validity of the coding system was assessed using a repeated-measures ANOVA with NAT task (simple versus complex) and error type (overt versus subtle) as within-group factors. Correlation/regression analyses were conducted among overt NAT errors, subtle NAT errors, and neuropsychological variables. The coding of subtle action errors was reliable and valid, and episodic memory breakdown predicted subtle action disruption. Results suggest that the NAT can be useful in objectively assessing subtle functional decline. Treatments targeting episodic memory may be most effective in addressing early functional impairment in older age.
Stalfors, J; Enoksson, F; Hermansson, A; Hultcrantz, M; Robinson, Å; Stenfeldt, K; Groth, A
2013-04-01
To investigate the internal validity of the diagnosis code used at discharge after treatment of acute mastoiditis. Retrospective national re-evaluation study of patient records from 1993-2007, with comparison to the original ICD codes. All ENT departments at university hospitals and one large county hospital department in Sweden. A total of 1966 records were reviewed for patients with ICD codes for in-patient treatment of acute (529), chronic (44) and unspecified mastoiditis (21) and acute otitis media (1372). ICD codes were reviewed by the authors with a defined protocol for the clinical diagnosis of acute mastoiditis. Those not satisfying the diagnosis were given an alternative diagnosis. Of 529 records with ICD coding for acute mastoiditis, 397 (75%) were found to meet the definition of acute mastoiditis used in this study, while 18% were not diagnosed as having any type of mastoiditis after review. Review of the in-patients treated for acute otitis media identified an additional 60 cases fulfilling the definition of acute mastoiditis. Overdiagnosis was common, and many patients with a diagnostic code indicating acute mastoiditis had been treated for external otitis or otorrhoea with transmyringeal drainage. The internal validity of the diagnosis of acute mastoiditis depends on the use of standardised, well-defined criteria. Reliability of diagnosis is fundamental for the comparison of results from different studies. Inadequate reliability in the diagnosis of acute mastoiditis also affects calculations of incidence rates and statistical power and may also affect the conclusions drawn from the results. © 2013 Blackwell Publishing Ltd.
An efficient code for the simulation of nonhydrostatic stratified flow over obstacles
NASA Technical Reports Server (NTRS)
Pihos, G. G.; Wurtele, M. G.
1981-01-01
The physical model and computational procedure of the code is described in detail. The code is validated in tests against a variety of known analytical solutions from the literature and is also compared against actual mountain wave observations. The code will receive as initial input either mathematically idealized or discrete observational data. The form of the obstacle or mountain is arbitrary.
Development and Construct Validation of the Mentor Behavior Scale
ERIC Educational Resources Information Center
Brodeur, Pascale; Larose, Simon; Tarabulsy, George; Feng, Bei; Forget-Dubois, Nadine
2015-01-01
Researchers suggest that certain supportive behaviors of mentors could increase the benefits of school-based mentoring for youth. However, the literature contains few validated instruments to measure these behaviors. In our present study, we aimed to construct and validate a tool to measure the supportive behaviors of mentors participating in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wemhoff, A P; Burnham, A K
2006-04-05
Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins model were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and the isoconversional kinetic model gives very similar results as the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 C/hr, and further exploration of that effect is warranted.
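One widely used form of the extended Prout-Tompkins rate law is da/dt = Z exp(-E/RT) (1-a)^n (1 - q(1-a))^m, which can be integrated along a thermal ramp; the parameter values below are illustrative, not the calibrated HMX set from this study.

```python
# Extended Prout-Tompkins autocatalytic kinetics along a temperature ramp.
# Z, E, n, m, q and the ramp are illustrative placeholders.
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314            # J/(mol K)
Z, E = 1e19, 2.2e5   # pre-exponential (1/s) and activation energy (J/mol)
n, m, q = 1.0, 1.0, 0.999

def rate(t, a, T):
    k = Z * np.exp(-E / (R * T(t)))
    return k * (1 - a) ** n * (1 - q * (1 - a)) ** m

T_ramp = lambda t: 450.0 + (10.0 / 3600.0) * t   # 10 K/hr ramp from 450 K
sol = solve_ivp(rate, (0.0, 36000.0), [1e-6], args=(T_ramp,), max_step=60.0)
print("extent of reaction at t_end:", sol.y[0, -1])
```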
Neutronic calculation of fast reactors by the EUCLID/V1 integrated code
NASA Astrophysics Data System (ADS)
Koltashev, D. A.; Stakhanova, A. A.
2017-01-01
This article considers neutronic calculation of the fast-neutron lead-cooled reactor BREST-OD-300 with the EUCLID/V1 integrated code. The main goal of developing and applying integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical and thermohydraulic fast reactor calculations under normal and abnormal operating conditions. The EUCLID/V1 code is being developed at the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors. Heat source distributions, control rod movement, reactivity level changes and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains calculations implemented as part of EUCLID/V1 code validation. Transient simulations of the fast-neutron lead-cooled reactor BREST-OD-300 (fuel assembly floating, decompression of a passive feedback system channel) and cross-validation against MCU-FR code results are presented. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
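In one energy group and one dimension, the diffusion problem that DN3D solves in multigroup 3D form reduces to a k-eigenvalue problem that a short power iteration can illustrate; the cross sections below are generic, not BREST-OD-300 data.

```python
# One-group, 1-D slab diffusion k-eigenvalue problem:
#   -D u'' + Sigma_a u = (1/k) nuSigma_f u,  u = 0 at both boundaries.
import numpy as np

D, sig_a, nu_sig_f, L, N = 1.0, 0.08, 0.1, 100.0, 200
h = L / N
# Tridiagonal finite-difference operator for -D u'' + Sigma_a u
A = np.diag([2 * D / h**2 + sig_a] * (N - 1)) \
  + np.diag([-D / h**2] * (N - 2), 1) + np.diag([-D / h**2] * (N - 2), -1)

phi, k = np.ones(N - 1), 1.0
for _ in range(500):                      # power iteration on the fission source
    phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
    k *= phi_new.sum() / phi.sum()        # eigenvalue update from source ratio
    phi = phi_new / np.linalg.norm(phi_new)
print("k-effective ~", round(k, 4))       # analytic: nuSigma_f/(Sigma_a + D B^2)
```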
Chung, Cecilia P; Rohan, Patricia; Krishnaswami, Shanthi; McPheeters, Melissa L
2013-12-30
To review the evidence supporting the validity of billing, procedural, or diagnosis code, or pharmacy claim-based algorithms used to identify patients with rheumatoid arthritis (RA) in administrative and claim databases. We searched the MEDLINE database from 1991 to September 2012 using controlled vocabulary and key terms related to RA and reference lists of included studies were searched. Two investigators independently assessed the full text of studies against pre-determined inclusion criteria and extracted the data. Data collected included participant and algorithm characteristics. Nine studies reported validation of computer algorithms based on International Classification of Diseases (ICD) codes with or without free-text, medication use, laboratory data and the need for a diagnosis by a rheumatologist. These studies yielded positive predictive values (PPV) ranging from 34 to 97% to identify patients with RA. Higher PPVs were obtained with the use of at least two ICD and/or procedure codes (ICD-9 code 714 and others), the requirement of a prescription of a medication used to treat RA, or requirement of participation of a rheumatologist in patient care. For example, the PPV increased from 66 to 97% when the use of disease-modifying antirheumatic drugs and the presence of a positive rheumatoid factor were required. There have been substantial efforts to propose and validate algorithms to identify patients with RA in automated databases. Algorithms that include more than one code and incorporate medications or laboratory data and/or required a diagnosis by a rheumatologist may increase the PPV. Copyright © 2013 Elsevier Ltd. All rights reserved.
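The higher-PPV algorithms described above translate directly into a case-finding rule over claims records, as in this sketch; the record layout and DMARD list are illustrative.

```python
# Case-finding rule: at least two ICD-9 714.x diagnosis claims plus at least
# one DMARD prescription. Record layout and drug list are illustrative.
DMARDS = {"methotrexate", "sulfasalazine", "hydroxychloroquine", "leflunomide"}

def is_ra_case(patient):
    ra_dx = [c for c in patient["dx_codes"] if c.startswith("714")]
    on_dmard = any(rx in DMARDS for rx in patient["rx_names"])
    return len(ra_dx) >= 2 and on_dmard

patient = {"dx_codes": ["714.0", "401.9", "714.0"], "rx_names": ["methotrexate"]}
print(is_ra_case(patient))  # True
```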
Structural Behavior of Concrete Beams Reinforced with Basalt Fiber Reinforced Polymer (BFRP) Bars
NASA Astrophysics Data System (ADS)
Ovitigala, Thilan
The main challenge for civil engineers is to provide sustainable, environmentally friendly and financially feasible structures to society. Finding new materials, such as fiber reinforced polymer (FRP), that can fulfill the above requirements is a must. FRP material was expensive and limited to niche markets such as the space and aviation industries in the 1960s. Over time, it became cheaper and spread to other industries, such as sporting goods in the 1980s-1990s, and then to the infrastructure industry. Design and construction guidelines are available for carbon fiber reinforced polymer (CFRP), aramid fiber reinforced polymer (AFRP) and glass fiber reinforced polymer (GFRP), and these materials are currently used in structural applications. Since FRP is a linear elastic, brittle material, design guidelines for steel reinforcement are not valid for FRP materials. Corrosion of steel reinforcement affects the durability of concrete structures. FRP reinforcement is identified as an alternative to steel reinforcement in corrosive environments. Although basalt fiber reinforced polymer (BFRP) has many advantages over other FRP materials, only limited studies have been done. These studies did not include the larger BFRP bar diameters that are mostly used in practice. Therefore, larger beam sizes with larger BFRP reinforcement bar diameters are needed to investigate the flexural and shear behavior of BFRP reinforced concrete beams. Also, the shear behavior of BFRP reinforced concrete beams has not yet been studied. Experimental testing of the mechanical properties and bond strength of BFRP bars and of the flexural and shear behavior of BFRP reinforced concrete beams is needed to include BFRP reinforcement bars in the design codes. This study mainly focuses on the use of BFRP bars as internal reinforcement. The test results for the mechanical properties of BFRP reinforcement bars, the bond strength of BFRP reinforcement bars, and the flexural and shear behavior of concrete beams reinforced with BFRP bars are presented and verified against other research studies and against existing design codes and guidelines for other FRP bars. Based on the experimental results, analytical equations were developed and existing equations were modified to predict the actual structural behavior of FRP-bar-reinforced concrete beams with reasonable accuracy.
Leifker, Feea R; White, Kaitlin Hanley; Blandon, Alysia Y; Marshall, Amy D
2015-01-01
We examined the impact of PTSD symptom severity on emotional reactions to one's own and one's partner's intimacy behaviors. Heterosexual, community couples in which at least one partner reported elevated symptoms of PTSD were video-recorded discussing a relationship problem and self-reported their emotions immediately before and after the discussion. Each partner's intimacy behaviors were coded. Actor-Partner Interdependence Models indicate that, among those with greater PTSD symptom severity, partners' caring, understanding, and validation were associated with increased negative emotions, particularly fear. Among those with greater PTSD severity, provision of caring was associated with decreased anger, guilt, and sadness. Therefore, the receipt of intimacy was associated with increased negative emotions among individuals with elevated PTSD symptoms while provision of intimacy was associated with decreased negative emotions. Existing treatments for PTSD should consider the emotional context of provision and receipt of intimacy to more fully address relationship problems among couples dealing with PTSD. Copyright © 2014 Elsevier Ltd. All rights reserved.
Veenstra-VanderWeele, Jeremy; Muller, Christopher L; Iwamoto, Hideki; Sauer, Jennifer E; Owens, W Anthony; Shah, Charisma R; Cohen, Jordan; Mannangatti, Padmanabhan; Jessen, Tammy; Thompson, Brent J; Ye, Ran; Kerr, Travis M; Carneiro, Ana M; Crawley, Jacqueline N; Sanders-Bush, Elaine; McMahon, Douglas G; Ramamoorthy, Sammanda; Daws, Lynette C; Sutcliffe, James S; Blakely, Randy D
2012-04-03
Fifty years ago, increased whole-blood serotonin levels, or hyperserotonemia, first linked disrupted 5-HT homeostasis to Autism Spectrum Disorders (ASDs). The 5-HT transporter (SERT) gene (SLC6A4) has been associated with whole blood 5-HT levels and ASD susceptibility. Previously, we identified multiple gain-of-function SERT coding variants in children with ASD. Here we establish that transgenic mice expressing the most common of these variants, SERT Ala56, exhibit elevated, p38 MAPK-dependent transporter phosphorylation, enhanced 5-HT clearance rates and hyperserotonemia. These effects are accompanied by altered basal firing of raphe 5-HT neurons, as well as 5HT(1A) and 5HT(2A) receptor hypersensitivity. Strikingly, SERT Ala56 mice display alterations in social function, communication, and repetitive behavior. Our efforts provide strong support for the hypothesis that altered 5-HT homeostasis can impact risk for ASD traits and provide a model with construct and face validity that can support further analysis of ASD mechanisms and potentially novel treatments.
NASA Astrophysics Data System (ADS)
Wu, F.; Wu, T.-H.; Li, X.-Y.
2018-03-01
This article aims to present a systematic indentation theory on a half-space of multi-ferroic composite medium with transverse isotropy. The effect of sliding friction between the indenter and substrate is taken into account. The cylindrical flat-ended indenter is assumed to be electrically/magnetically conducting or insulating, which leads to four sets of mixed boundary-value problems. The indentation forces in the normal and tangential directions are related to the Coulomb friction law. For each case, the integral equations governing the contact behavior are developed by means of the generalized method of potential theory, and the corresponding coupling field is obtained in terms of elementary functions. The effect of sliding on the contact behavior is investigated. Finite element method (FEM) in the context of magneto-electro-elasticity is developed to discuss the validity of the analytical solutions. The obtained analytical solutions may serve as benchmarks to various simplified analyses and numerical codes and as a guide for future experimental studies.
Nanoscale magnetic ratchets based on shape anisotropy
NASA Astrophysics Data System (ADS)
Cui, Jizhai; Keller, Scott M.; Liang, Cheng-Yen; Carman, Gregory P.; Lynch, Christopher S.
2017-02-01
Controlling magnetization using piezoelectric strain through the magnetoelectric effect offers several orders of magnitude reduction in energy consumption for spintronic applications. However strain is a uniaxial effect and, unlike directional magnetic field or spin-polarized current, cannot induce a full 180° reorientation of the magnetization vector when acting alone. We have engineered novel ‘peanut’ and ‘cat-eye’ shaped nanomagnets on piezoelectric substrates that undergo repeated deterministic 180° magnetization rotations in response to individual electric-field-induced strain pulses by breaking the uniaxial symmetry using shape anisotropy. This behavior can be likened to a magnetic ratchet, advancing magnetization clockwise with each piezostrain trigger. The results were validated using micromagnetics implemented in a multiphysics finite elements code to simulate the engineered spatial and temporal magnetic behavior. The engineering principles start from a target device function and proceed to the identification of shapes that produce the desired function. This approach opens a broad design space for next generation magnetoelectric spintronic devices.
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
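The blackboard-and-plugin structure modeled here as CSP processes can be caricatured with one mediating process and per-plugin channels; the Python sketch below only illustrates the communication pattern, not Cougaar's Java implementation or the paper's CSP formalism.

```python
# Toy blackboard mediating per-plugin channels (CSP-style message pattern).
import queue, threading, time

blackboard_state = []

def blackboard(channels):
    """Single mediating process: all plugin traffic flows through here."""
    while True:
        for ch in channels:
            try:
                msg = ch["to_bb"].get_nowait()
            except queue.Empty:
                continue
            blackboard_state.append(msg)        # publish onto the blackboard
            ch["from_bb"].put(("ack", msg))     # reply on the plugin's channel
        time.sleep(0.01)

def plugin(name, ch):
    ch["to_bb"].put((name, "task"))             # send to the blackboard
    print(name, ch["from_bb"].get())            # blocking receive of the reply

channels = [{"to_bb": queue.Queue(), "from_bb": queue.Queue()} for _ in range(2)]
threading.Thread(target=blackboard, args=(channels,), daemon=True).start()
for i, ch in enumerate(channels):
    plugin(f"plugin{i}", ch)
```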
Comparison of analysis and experiment for dynamics of low-contact-ratio spur gears
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Rebbechi, Brian; Zakrajsek, James J.; Townsend, Dennis P.; Lin, Hsiang Hsi
1991-01-01
Low-contact-ratio spur gears were tested in the NASA gear-noise rig to study gear dynamics including dynamic load, tooth bending stress, vibration, and noise. The experimental results were compared with predictions from a NASA gear dynamics code to validate the code as a design tool for predicting transmission vibration and noise. Analytical predictions and experimental data for gear-tooth dynamic loads and tooth-root bending stress were compared at 28 operating conditions. Strain gage data were used to compute the normal load between meshing teeth and the bending stress at the tooth root for direct comparison with the analysis. The computed and measured waveforms for dynamic load and stress were compared for several test conditions. These are very similar in shape, which means the analysis successfully simulates the physical behavior of the test gears. The predicted peak value of the dynamic load agrees with the measurement results within an average error of 4.9 percent, except at low-torque, high-speed conditions. Predictions of peak dynamic root stress are generally within 10 to 15 percent of the measured values.
DAMAGE MODELING OF INJECTION-MOLDED SHORT- AND LONG-FIBER THERMOPLASTICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Kunc, Vlastimil; Bapanapalli, Satish K.
2009-10-30
This article applies the recent anisotropic rotary diffusion – reduced strain closure (ARD-RSC) model for predicting fiber orientation and a new damage model for injection-molded long-fiber thermoplastics (LFTs) to analyze progressive damage leading to total failure of injection-molded long-glass-fiber/polypropylene (PP) specimens. The ARD-RSC model was implemented in a research version of the Autodesk Moldflow Plastics Insight (MPI) processing code, and it has been used to simulate injection-molding of a long-glass-fiber/PP plaque. The damage model combines micromechanical modeling with a continuum damage mechanics description to predict the nonlinear behavior due to plasticity coupled with damage in LFTs. This model has been implemented in the ABAQUS finite element code via user-subroutines and has been used in the damage analyses of tensile specimens removed from the injection-molded long-glass-fiber/PP plaques. Experimental characterization and mechanical testing were performed to provide input data to support and validate both process modeling and damage analyses. The predictions are in agreement with the experimental results.
Molecular Tagging Velocimetry Development for In-situ Measurement in High-Temperature Test Facility
NASA Technical Reports Server (NTRS)
Andre, Matthieu A.; Bardet, Philippe M.; Burns, Ross A.; Danehy, Paul M.
2015-01-01
The High Temperature Test Facility, HTTF, at Oregon State University (OSU) is an integral-effect test facility designed to model the behavior of a Very High Temperature Gas Reactor (VHTR) during a Depressurized Conduction Cooldown (DCC) event. It also has the ability to conduct limited investigations into the progression of a Pressurized Conduction Cooldown (PCC) event, in addition to phenomena occurring during normal operations. Both of these phenomena will be studied with in-situ velocity field measurements. Experimental measurements of velocity are critical to provide proper boundary conditions to validate CFD codes, as well as to develop correlations for system-level codes, such as RELAP5 (http://www4vip.inl.gov/relap5/). Such data will be the first acquired in the HTTF and will introduce a diagnostic with numerous other applications in the field of nuclear thermal hydraulics. A laser-based optical diagnostic under development at The George Washington University (GWU) is presented; the technique is demonstrated with velocity data obtained in ambient-temperature air, and adaptation to high-pressure, high-temperature flow is discussed.
Modeling Vortex Generators in a Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2011-01-01
A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
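A schematic of the vane source-term idea: compute a lift force from the vane's planform area and incidence, and distribute it over the user-specified cell range. The thin-airfoil lift estimate and force direction below are assumptions for illustration, not the Wind-US model's exact formulation.

```python
# Vane-type vortex-generator body force, distributed over selected cells.
# Lift-slope model and force direction are illustrative assumptions.
import numpy as np

def vg_body_force(rho, vel, area, incidence_rad, n_cells):
    """Per-cell momentum source for one vane (thin-airfoil lift estimate)."""
    q = 0.5 * rho * np.dot(vel, vel)          # local dynamic pressure
    cl = 2.0 * np.pi * incidence_rad          # thin-airfoil lift slope (assumed)
    lift = q * area * cl
    # direct the force normal to the local velocity, in the vane's lift plane
    lift_dir = np.array([-vel[1], vel[0], 0.0]) / np.linalg.norm(vel[:2])
    return lift * lift_dir / n_cells          # spread over the cell range

print(vg_body_force(rho=1.2, vel=np.array([200.0, 0.0, 0.0]),
                    area=1e-3, incidence_rad=np.radians(16), n_cells=8))
```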
NASA Astrophysics Data System (ADS)
Kim, Kunhwi; Rutqvist, Jonny; Nakagawa, Seiji; Birkholzer, Jens
2017-11-01
This paper presents coupled hydro-mechanical modeling of hydraulic fracturing processes in complex fractured media using a discrete fracture network (DFN) approach. The individual physical processes in the fracture propagation are represented by separate program modules: the TOUGH2 code for multiphase flow and mass transport based on the finite volume approach; and the rigid-body-spring network (RBSN) model for mechanical and fracture-damage behavior, which are coupled with each other. Fractures are modeled as discrete features, of which the hydrological properties are evaluated from the fracture deformation and aperture change. The verification of the TOUGH-RBSN code is performed against a 2D analytical model for single hydraulic fracture propagation. Subsequently, modeling capabilities for hydraulic fracturing are demonstrated through simulations of laboratory experiments conducted on rock-analogue (soda-lime glass) samples containing a designed network of pre-existing fractures. Sensitivity analyses are also conducted by changing the modeling parameters, such as viscosity of injected fluid, strength of pre-existing fractures, and confining stress conditions. The hydraulic fracturing characteristics attributed to the modeling parameters are investigated through comparisons of the simulation results.
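The feedback from fracture deformation to hydrological properties typically runs through the cubic law, where transmissivity scales with the cube of the hydraulic aperture; a generic sketch, not the TOUGH-RBSN property model itself.

```python
# Cubic law: fracture transmissivity from hydraulic aperture.
def fracture_transmissivity(aperture_m, mu=1.0e-3):
    """Volumetric flow per unit width per unit pressure gradient (cubic law)."""
    return aperture_m**3 / (12.0 * mu)

for b in (1e-5, 5e-5, 1e-4):   # opening the fracture 10x -> 1000x the flow
    print(b, fracture_transmissivity(b))
```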
Direct simulations of chemically reacting turbulent mixing layers, part 2
NASA Technical Reports Server (NTRS)
Metcalfe, Ralph W.; Mcmurtry, Patrick A.; Jou, Wen-Huei; Riley, James J.; Givi, Peyman
1988-01-01
The results of direct numerical simulations of chemically reacting turbulent mixing layers are presented. This work extends earlier three-dimensional simulations of cold reacting flows to a more detailed study, plus the development, validation, and use of codes to simulate chemically reacting shear layers with heat release. Additional analysis of earlier simulations showed good agreement with self-similarity theory and laboratory data. Simulations with a two-dimensional code including the effects of heat release showed that the rate of chemical product formation, the thickness of the mixing layer, and the amount of mass entrained into the layer all decrease with increasing rates of heat release. Subsequent three-dimensional simulations showed similar behavior, in agreement with laboratory observations. Baroclinic torques and thermal expansion in the mixing layer were found to produce changes in the flame vortex structure that act to diffuse the pairing vortices, resulting in a net reduction in vorticity. Previously unexplained anomalies observed in the mean velocity profiles of reacting jets and mixing layers were shown to result from vorticity generation by baroclinic torques.
Development of MPS Method for Analyzing Melt Spreading Behavior and MCCI in Severe Accidents
NASA Astrophysics Data System (ADS)
Yamaji, Akifumi; Li, Xin
2016-08-01
Spreading of molten core (corium) on the reactor containment vessel floor and molten corium-concrete interaction (MCCI) are important phenomena in the late phase of a severe accident for assessment of containment integrity and managing the severe accident. Severe accident research at Waseda University has been advancing to show that simulations with the moving particle semi-implicit (MPS) method (one of the particle methods) can greatly improve the analytical capability and mechanistic understanding of melt behavior in severe accidents. MPS models have been developed and verified for calculations of the radiation and thermal fields, solid-liquid phase transition, buoyancy, and the temperature dependency of viscosity, to simulate phenomena such as spreading of corium, ablation of concrete by the corium, crust formation, and cooling of the corium by top flooding. Validations have been conducted against experiments such as FARO L26S, ECOKATS-V1, Theofanous, and SPREAD for spreading, and SURC-2, SURC-4, SWISS-1, and SWISS-2 for MCCI. These validations cover melt spreading behaviors and MCCI by mixtures of molten oxides (including prototypic UO2-ZrO2), metals, and water. Generally, the analytical results show good agreement with the experiments with respect to the leading edge of the spreading melt and the concrete ablation front history. The MPS results indicate that crust formation may play an important role in melt spreading and MCCI. As future work, a code for two-dimensional MCCI experiment simulation with the MPS method needs to be developed, which will be able to simulate anisotropic ablation of concrete.
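Particle interactions in MPS are weighted by a kernel, commonly w(r) = re/r - 1 inside an effective radius re and zero outside, with the particle number density given by the sum of those weights; a minimal illustration:

```python
# Standard MPS kernel weight and particle number density (illustrative only).
import numpy as np

def mps_weight(r, re):
    r = np.asarray(r, dtype=float)
    return np.where((r > 0) & (r < re), re / np.maximum(r, 1e-12) - 1.0, 0.0)

def number_density(positions, i, re):
    """n_i = sum of kernel weights over the neighbours of particle i."""
    d = np.linalg.norm(positions - positions[i], axis=1)
    d[i] = np.inf                       # exclude the particle itself
    return mps_weight(d, re).sum()

pts = np.random.default_rng(0).random((100, 2))
print(number_density(pts, 0, re=0.2))
```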
The Deceptive Mean: Conceptual Scoring of Cloze Entries Differentially Advantages More Able Readers
ERIC Educational Resources Information Center
O'Toole, J. M.; King, R. A. R.
2011-01-01
The "cloze" test is one possible investigative instrument for predicting text comprehensibility. Conceptual coding of student replacement of deleted words has been considered to be more valid than exact coding, partly because conceptual coding seemed fairer to poorer readers. This paper reports a quantitative study of 447 Australian…
Tracking Holland Interest Codes: The Case of South African Field Guides
ERIC Educational Resources Information Center
Watson, Mark B.; Foxcroft, Cheryl D.; Allen, Lynda J.
2007-01-01
Holland believes that specific personality types seek out matching occupational environments and his theory codes personality and environment according to a six letter interest typology. Since 1985 there have been numerous American studies that have queried the validity of Holland's coding system. Research in South Africa is scarcer, despite…
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show that up to an 8.5x improvement at the selected kernel level was achieved with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
The Modeling of Advanced BWR Fuel Designs with the NRC Fuel Depletion Codes PARCS/PATHS
Ward, Andrew; Downar, Thomas J.; Xu, Y.; ...
2015-04-22
The PATHS (PARCS Advanced Thermal Hydraulic Solver) code was developed at the University of Michigan in support of U.S. Nuclear Regulatory Commission research to solve the steady-state, two-phase, thermal-hydraulic equations for a boiling water reactor (BWR) and to provide thermal-hydraulic feedback for BWR depletion calculations with the neutronics code PARCS (Purdue Advanced Reactor Core Simulator). The simplified solution methodology, including a three-equation drift flux formulation and an optimized iteration scheme, yields very fast run times in comparison to conventional thermal-hydraulic systems codes used in the industry, while still retaining sufficient accuracy for applications such as BWR depletion calculations. Lastly, the capability to model advanced BWR fuel designs with part-length fuel rods and heterogeneous axial channel flow geometry has been implemented in PATHS, and the code has been validated against previously benchmarked advanced core simulators as well as BWR plant and experimental data. We describe the modifications to the codes and the results of the validation in this paper.
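The drift-flux closure at the heart of such a simplified solver gives void fraction directly from superficial velocities, alpha = jg / (C0 (jg + jf) + Vgj); the distribution parameter C0 and drift velocity Vgj below are generic illustrative values, not PATHS correlations.

```python
# Drift-flux void fraction from superficial velocities (generic constants).
def void_fraction(j_g, j_f, C0=1.13, Vgj=0.24):
    """j_g, j_f: gas/liquid superficial velocities (m/s)."""
    return j_g / (C0 * (j_g + j_f) + Vgj)

print(void_fraction(j_g=1.5, j_f=2.0))  # ~0.36
```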
NASA Astrophysics Data System (ADS)
Kotchenova, Svetlana Y.; Vermote, Eric F.; Matarrese, Raffaella; Klemm, Frank J., Jr.
2006-09-01
A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which enables accounting for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The obtained results have shown a good agreement of 0.7% in comparison with the Monte Carlo code, 0.2% for Coulson's tabulated values, and 0.001-0.002 for the 400-550 nm region for the MOBY reflectances. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm.
Computation of Thermally Perfect Compressible Flow Properties
NASA Technical Reports Server (NTRS)
Witte, David W.; Tatum, Kenneth E.; Williams, S. Blake
1996-01-01
A set of compressible flow relations for a thermally perfect, calorically imperfect gas is derived for a value of cp (specific heat at constant pressure) expressed as a polynomial function of temperature, and developed into a computer program referred to as the Thermally Perfect Gas (TPG) code. The code is available free from the NASA Langley Software Server at URL http://www.larc.nasa.gov/LSS. The code produces tables of compressible flow properties similar to those found in NACA Report 1135. Unlike the NACA Report 1135 tables, which are valid only in the calorically perfect temperature regime, the TPG code results are also valid in the thermally perfect, calorically imperfect temperature regime, giving the TPG code a considerably larger range of temperature application. Accuracy of the TPG code in the calorically perfect and in the thermally perfect, calorically imperfect temperature regimes is verified by comparisons with the methods of NACA Report 1135. The advantages of the TPG code compared to the thermally perfect, calorically imperfect method of NACA Report 1135 are its applicability to any type of gas (monatomic, diatomic, triatomic, or polyatomic) or any specified mixture of gases, ease of use, and tabulated results.
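The thermally perfect relations follow once cp(T) is a polynomial: gamma = cp/(cp - R) and the speed of sound then vary with temperature rather than being constants. A sketch with an illustrative linear cp fit for air, not the TPG code's own coefficients:

```python
# Thermally perfect gas: temperature-dependent gamma and speed of sound.
# The cp(T) polynomial is an illustrative fit, not TPG's coefficients.
import numpy as np

R = 287.05                                   # J/(kg K), air
cp_coeffs = [960.0, 0.13]                    # cp(T) = a0 + a1*T (illustrative)

def cp(T):
    return np.polyval(cp_coeffs[::-1], T)

def gamma(T):
    return cp(T) / (cp(T) - R)               # thermally perfect relation

def speed_of_sound(T):
    return np.sqrt(gamma(T) * R * T)

for T in (300.0, 1000.0, 2000.0):
    print(T, round(gamma(T), 4), round(speed_of_sound(T), 1))
```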
Validation of the SINDA/FLUINT code using several analytical solutions
NASA Technical Reports Server (NTRS)
Keller, John R.
1995-01-01
The Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA/FLUINT) code has often been used to determine the transient and steady-state response of various thermal and fluid flow networks. While this code is an often used design and analysis tool, the validation of this program has been limited to a few simple studies. For the current study, the SINDA/FLUINT code was compared to four different analytical solutions. The thermal analyzer portion of the code (conduction and radiative heat transfer, SINDA portion) was first compared to two separate solutions. The first comparison examined a semi-infinite slab with a periodic surface temperature boundary condition. Next, a small, uniform temperature object (lumped capacitance) was allowed to radiate to a fixed temperature sink. The fluid portion of the code (FLUINT) was also compared to two different analytical solutions. The first study examined a tank filling process by an ideal gas in which there is both control volume work and heat transfer. The final comparison considered the flow in a pipe joining two infinite reservoirs of pressure. The results of all these studies showed that for the situations examined here, the SINDA/FLUINT code was able to match the results of the analytical solutions.
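The lumped-capacitance radiation benchmark has a closed form when the sink is taken at 0 K: 1/T^3 = 1/T0^3 + 3*sigma*eps*A*t/(m*c). A sketch comparing a numerical integration against that curve, with illustrative property values; a SINDA network solution would be checked the same way.

```python
# Lumped-capacitance body radiating to a 0 K sink: numeric vs. closed form.
# Property values are illustrative placeholders.
import numpy as np
from scipy.integrate import solve_ivp

sigma = 5.670e-8                       # W/(m^2 K^4)
eps, A, m, c, T0 = 0.8, 0.01, 0.1, 900.0, 600.0

def dTdt(t, T):
    return -eps * sigma * A * T**4 / (m * c)

t_end = 5000.0
num = solve_ivp(dTdt, (0, t_end), [T0], dense_output=True)
T_num = num.sol(t_end)[0]
T_exact = (1 / T0**3 + 3 * eps * sigma * A * t_end / (m * c)) ** (-1 / 3)
print(T_num, T_exact)   # should agree to within default solver tolerance (~0.1%)
```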
Kotchenova, Svetlana Y; Vermote, Eric F
2007-07-10
This is the second part of the validation effort of the recently developed vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), primarily used for the calculation of look-up tables in the Moderate Resolution Imaging Spectroradiometer (MODIS) atmospheric correction algorithm. The 6SV1 code was tested against a Monte Carlo code and Coulson's tabulated values for molecular and aerosol atmospheres bounded by different Lambertian and anisotropic surfaces. The code was also tested in scalar mode against the scalar code SHARM to resolve the previous 6S accuracy issues in the case of an anisotropic surface. All test cases were characterized by good agreement between the 6SV1 and the other codes: The overall relative error did not exceed 0.8%. The study also showed that ignoring the effects of radiation polarization in the atmosphere led to large errors in the simulated top-of-atmosphere reflectances: The maximum observed error was approximately 7.2% for both Lambertian and anisotropic surfaces.
On the error statistics of Viterbi decoding and the performance of concatenated codes
NASA Technical Reports Server (NTRS)
Miller, R. L.; Deutsch, L. J.; Butman, S. A.
1981-01-01
Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.
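The interplay between burst errors and the outer code can be sketched as follows: the (255, 223) Reed-Solomon code corrects up to t = (255 − 223)/2 = 16 eight-bit symbols per codeword, so decoder failure depends on how many symbols the Viterbi bursts touch rather than on the raw bit-error count. The burst positions below are hypothetical, not simulation output.

```python
N, K = 255, 223
T_CORRECTABLE = (N - K) // 2   # 16 correctable symbol errors per codeword

def symbols_hit(start_bit, length_bits, symbol_bits=8):
    """RS symbol indices touched by one contiguous burst of bit errors."""
    first = start_bit // symbol_bits
    last = (start_bit + length_bits - 1) // symbol_bits
    return set(range(first, last + 1))

def codeword_decodable(bursts):
    """True if all burst-afflicted symbols fit within the RS error budget."""
    bad = set()
    for start, length in bursts:
        bad |= symbols_hit(start, length)
    return len(bad) <= T_CORRECTABLE

# Hypothetical Viterbi decoder bursts (start bit, burst length in bits):
bursts = [(12, 30), (511, 45), (1400, 20)]
print("RS codeword decodable:", codeword_decodable(bursts))
```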
Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L.
2017-05-01
MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of the new models added, changes to existing models, the effect of code changes during this code development cycle (rev 6342 to rev 9496), and a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as the User Guide and Reference Manuals.
Jones, Natalie; Schneider, Gary; Kachroo, Sumesh; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's Mini-Sentinel pilot program initially aimed to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest (HOIs) from administrative and claims data. This paper summarizes the process and findings of the algorithm review of pulmonary fibrosis and interstitial lung disease. PubMed and Iowa Drug Information Service Web searches were conducted to identify citations applicable to the pulmonary fibrosis/interstitial lung disease HOI. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify pulmonary fibrosis and interstitial lung disease, including validation estimates of the coding algorithms. Our search revealed a deficiency of literature focusing on pulmonary fibrosis and interstitial lung disease algorithms and validation estimates. Only five studies provided codes; none provided validation estimates. Because interstitial lung disease includes a broad spectrum of diseases, including pulmonary fibrosis, the scope of these studies varied, as did the corresponding diagnostic codes used. Research needs to be conducted on designing validation studies to test pulmonary fibrosis and interstitial lung disease algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
Schneider, Gary; Kachroo, Sumesh; Jones, Natalie; Crean, Sheila; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's Mini-Sentinel pilot program initially aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest from administrative and claims data. This article summarizes the process and findings of the algorithm review of anaphylaxis. PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the anaphylaxis health outcome of interest. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify anaphylaxis and including validation estimates of the coding algorithms. Our search revealed limited literature focusing on anaphylaxis that provided administrative and claims data-based algorithms and validation estimates. Only four studies identified via literature searches provided validated algorithms; however, two additional studies were identified by Mini-Sentinel collaborators and were incorporated. The International Classification of Diseases, Ninth Revision, codes varied, as did the positive predictive value, depending on the cohort characteristics and the specific codes used to identify anaphylaxis. Research needs to be conducted on designing validation studies to test anaphylaxis algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
1994-08-01
volume H1. The report is accompanied by a set of diskettes containing the data appropriate to all the test cases. These diskettes are available ... GERMANY. PURPOSE OF THE TEST: The tests are part of a larger effort to establish a database of experimental measurements for missile configurations
Reconceptualizing Children's Suggestibility: Bidirectional and Temporal Properties
ERIC Educational Resources Information Center
Gilstrap, Livia L.; Ceci, Stephen J.
2005-01-01
Forty-one children (3 to 7 years) were exposed to a staged event and later interviewed by 1 of 41 professional interviewers. All interviews were coded with a detailed, mutually exclusive, and exhaustive coding scheme capturing adult behaviors (leading questions vs. neutral) and child behaviors (acquiescence vs. denial) in a temporally organized…
FAPRS Manual: Manual for the Functional Analytic Psychotherapy Rating Scale
ERIC Educational Resources Information Center
Callaghan, Glenn M.; Follette, William C.
2008-01-01
The Functional Analytic Psychotherapy Rating Scale (FAPRS) is a behavioral coding system designed to capture those essential client and therapist behaviors that occur during Functional Analytic Psychotherapy (FAP). The FAPRS manual presents the purpose and rules for documenting essential aspects of FAP. The FAPRS codes are exclusive and exhaustive…
Melt layer behavior of metal targets irradiated by powerful plasma streams
NASA Astrophysics Data System (ADS)
Bandura, A. N.; Byrka, O. V.; Chebotarev, V. V.; Garkusha, I. E.; Makhlaj, V. A.; Solyakov, D. G.; Tereshin, V. I.; Wuerz, H.
2002-12-01
In this paper melt layer erosion of metal targets under pulsed high-heat loads is studied. Experiments with steel, copper, aluminum, and titanium samples were carried out in two plasma accelerator devices with different time durations of the heat load. The surfaces of the resolidified melt layers show considerable roughness, with microcraters and ridge-like relief. For each material the mass loss was determined. Melt layer erosion by melt motion was clearly identified. However, it is masked by boiling, bubble expansion and collapse, and by the formation of a Kelvin-Helmholtz instability. The experimental results can be used for validation of numerical codes which model melt layer erosion of metallic armour materials in off-normal events in tokamaks.
Application of Probability Methods to Assess Crash Modeling Uncertainty
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behavior and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section is the focus of this paper. The results of a probabilistic analysis using finite element simulations are compared with experimental data.
Evaluation of alternative shielding materials for the F-127 transport package
NASA Astrophysics Data System (ADS)
Gual, Maritza R.; Mesquita, Amir Z.; Pereira, Cláubia
2018-03-01
Lead is used as the radiation shielding material in Nordion's F-127 source shipping container, which is used for transport and storage of the GammaBeam-127 cobalt-60 source at the Nuclear Technology Development Center (CDTN) in Belo Horizonte, Brazil. As alternatives, Th, Tl, and WC have been evaluated as radiation shielding materials. The goal is to check their behavior with regard to shielding and dose. The Monte Carlo MCNPX code is used for the simulations. In the MCNPX calculation, a cylinder was used as the exclusion surface instead of a sphere. Validation of the MCNPX gamma dose calculations was carried out through comparison with experimental measurements. The results show that tungsten carbide (WC) is a better γ-ray shielding material than lead.
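For a rough sense of why the denser material wins, the sketch below compares uncollided narrow-beam transmission exp(−μx) through lead and WC slabs. The attenuation coefficients are approximate textbook-scale values for 1.25 MeV photons, not the cross sections used in the MCNPX runs, and the simple exponential ignores the buildup, geometry, and scattering that MCNPX treats.

```python
import math

MU = {            # approximate linear attenuation coefficients at 1.25 MeV, 1/cm
    "Pb": 0.66,   # lead, density ~ 11.35 g/cm^3
    "WC": 0.90,   # tungsten carbide, density ~ 15.6 g/cm^3
}

def transmission(material, thickness_cm):
    """Uncollided fraction of a narrow gamma beam through a slab."""
    return math.exp(-MU[material] * thickness_cm)

for mat in MU:
    for x_cm in (5.0, 10.0, 15.0):
        print(f"{mat}: {x_cm:4.1f} cm -> I/I0 = {transmission(mat, x_cm):.2e}")
```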
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I., E-mail: tominaga@konan-u.ac.jp, E-mail: sshibata@post.kek.jp, E-mail: Sergei.Blinnikov@itep.ru
We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids that are involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM) which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering to the publicly available SHDOM code. Our code adopts a mixed-frame approach; the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.
The Fast Scattering Code (FSC): Validation Studies and Program Guidelines
NASA Technical Reports Server (NTRS)
Tinetti, Ana F.; Dunn, Mark H.
2011-01-01
The Fast Scattering Code (FSC) is a frequency domain noise prediction program developed at the NASA Langley Research Center (LaRC) to simulate the acoustic field produced by the interaction of known, time-harmonic incident sound with bodies of arbitrary shape and surface impedance immersed in a potential flow. The code uses the equivalent source method (ESM) to solve an exterior 3-D Helmholtz boundary value problem (BVP) by expanding the scattered acoustic pressure field into a series of point sources distributed on a fictitious surface placed inside the actual scatterer. This work provides additional code validation studies and illustrates the range of code parameters that produce accurate results with minimal computational costs. Systematic noise prediction studies are presented in which monopole-generated incident sound is scattered by simple geometric shapes: spheres (acoustically hard and soft surfaces), oblate spheroids, a flat disk, and flat plates with various edge topologies. Comparisons of FSC simulations with analytical results and experimental data are presented.
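A minimal sketch of the ESM idea follows, assuming a sound-soft sphere and a free-space monopole (the FSC itself handles general impedance surfaces and a background potential flow): point sources on an interior fictitious surface are given strengths, found by least squares, that cancel the incident pressure at collocation points on the scatterer.

```python
import numpy as np

k = 2.0 * np.pi                    # acoustic wavenumber (illustrative)
a = 1.0                            # scatterer (sphere) radius
src = np.array([0.0, 0.0, 2.5])    # monopole driving the incident field

def G(x, y):
    """Free-space Helmholtz Green's function exp(ikr) / (4 pi r)."""
    r = np.linalg.norm(x - y, axis=-1)
    return np.exp(1j * k * r) / (4.0 * np.pi * r)

rng = np.random.default_rng(0)
def sphere_points(n, radius):
    """n roughly uniform points on a sphere of the given radius."""
    v = rng.normal(size=(n, 3))
    return radius * v / np.linalg.norm(v, axis=1, keepdims=True)

surf = sphere_points(400, a)           # collocation points on the scatterer
eq_src = sphere_points(100, 0.6 * a)   # equivalent sources inside it

# Solve A c = -p_inc in the least-squares sense (soft boundary: p_total = 0).
A = G(surf[:, None, :], eq_src[None, :, :])
p_inc = G(surf, src)
c, *_ = np.linalg.lstsq(A, -p_inc, rcond=None)

# Scattered pressure at an exterior field point:
x = np.array([3.0, 0.0, 0.0])
print("boundary residual:", np.abs(A.dot(c) + p_inc).max())
print("p_scat at x:", G(x[None, :], eq_src).dot(c))
```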
An Overview of Starfish: A Table-Centric Tool for Interactive Synthesis
NASA Technical Reports Server (NTRS)
Tsow, Alex
2008-01-01
Engineering is an interactive process that requires intelligent interaction at many levels. My thesis [1] advances an engineering discipline for high-level synthesis and architectural decomposition that integrates perspicuous representation, designer interaction, and mathematical rigor. Starfish, the software prototype for the design method, implements a table-centric transformation system for reorganizing control-dominated system expressions into high-level architectures. Based on the digital design derivation (DDD) system, a designer-guided synthesis technique that applies correctness-preserving transformations to synchronous data flow specifications expressed as co-recursive stream equations, Starfish enhances user interaction and extends the reachable design space by incorporating four innovations: behavior tables, serialization tables, data refinement, and operator retiming. Behavior tables express systems of co-recursive stream equations as a table of guarded signal updates. Developers and users of the DDD system used manually constructed behavior tables to help them decide which transformations to apply and how to specify them. These design exercises produced several formally constructed hardware implementations: the FM9001 microprocessor, an SECD machine for evaluating LISP, and the SchemEngine, a garbage-collected machine for interpreting a byte-code representation of compiled Scheme programs. Bose and Tuna, two of DDD's developers, have subsequently commercialized the design derivation methodology at Derivation Systems, Inc. (DSI). DSI has formally derived and validated PCI bus interfaces and a Java byte-code processor; they further executed a contract to prototype SPIDER, NASA's ultra-reliable communications bus. To date, most derivations from DDD and DRS have targeted hardware due to its synchronous design paradigm. However, Starfish expressions are independent of the synchronization mechanism; there is no commitment to hardware or globally broadcast clocks. Though software back-ends for design derivation are limited to the DDD stream interpreter, targeting synchronous or real-time software is not substantively different from targeting hardware.
Spohr, Stephanie A; Taxman, Faye S; Rodriguez, Mayra; Walters, Scott T
2016-06-01
Although substance use is common among people in the U.S. criminal justice system, treatment initiation remains an ongoing problem. This study assessed the reliability and predictive validity of the Motivational Interviewing Treatment Integrity 3.1.1 (MITI) coding instrument in a community corrections sample. We used data from 80 substance-using clients who were participating in a clinical trial of MI in a probation setting. We analyzed 124 MI counseling sessions using the MITI, a coding system for documenting MI fidelity. Bivariate associations and logistic regression modeling were used to determine if MI-consistent behaviors predicted substance use or treatment initiation at a 2-month follow-up. We found a high level of agreement between coders on behavioral utterance counts. Counselors met at least beginning proficiency on most MITI summary scores. Probationers who initiated treatment at 2-month follow-up had significantly higher ratings of clinician empathy and MI spirit than clients who did not initiate treatment. Other MITI summary scores were not significantly different between clients who had initiated treatment and those who did not. MI spirit and empathy ratings were entered into a forward logistic regression in which MI spirit significantly predicted 2-month treatment initiation (χ²(1) = 4.10, p < .05, R² = .05) but counselor empathy did not. MITI summary scores did not predict substance use at 2-month follow-up. Counselor MI-consistent relational skills were an important predictor of client treatment initiation. Counselor behaviors such as empathy and MI spirit may be important for developing client rapport with people in a probation setting. Copyright © 2015. Published by Elsevier Inc.
Dobson-Belaire, Wendy; Goodfield, Jason; Borrelli, Richard; Liu, Fei Fei; Khan, Zeba M
2018-01-01
Using diagnosis code-based algorithms is the primary method of identifying patient cohorts for retrospective studies; nevertheless, many databases lack reliable diagnosis code information. To develop precise algorithms based on medication claims/prescriber visits (MCs/PVs) to identify psoriasis (PsO) patients and psoriatic patients with arthritic conditions (PsO-AC), a proxy for psoriatic arthritis, in Canadian databases lacking diagnosis codes. Algorithms were developed using medications with narrow indication profiles in combination with prescriber specialty to define PsO and PsO-AC. For a 3-year study period from July 1, 2009, algorithms were validated using the PharMetrics Plus database, which contains both adjudicated medication claims and diagnosis codes. Positive predictive value (PPV), negative predictive value (NPV), sensitivity, and specificity of the developed algorithms were assessed using diagnosis code as the reference standard. Chosen algorithms were then applied to Canadian drug databases to profile the algorithm-identified PsO and PsO-AC cohorts. In the selected database, 183,328 patients were identified for validation. The highest PPVs for PsO (85%) and PsO-AC (65%) occurred when a predictive algorithm of two or more MCs/PVs was compared with the reference standard of one or more diagnosis codes. NPV and specificity were high (99%-100%), whereas sensitivity was low (≤30%). Reducing the number of MCs/PVs or increasing diagnosis claims decreased the algorithms' PPVs. We have developed an MC/PV-based algorithm to identify PsO patients with a high degree of accuracy, but accuracy for PsO-AC requires further investigation. Such methods allow researchers to conduct retrospective studies in databases in which diagnosis codes are absent. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Schweizer, Marin L.; Eber, Michael R.; Laxminarayan, Ramanan; Furuno, Jon P.; Popovich, Kyle J.; Hota, Bala; Rubin, Michael A.; Perencevich, Eli N.
2013-01-01
BACKGROUND AND OBJECTIVE Investigators and medical decision makers frequently rely on administrative databases to assess methicillin-resistant Staphylococcus aureus (MRSA) infection rates and outcomes. The validity of this approach remains unclear. We sought to assess the validity of the International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) code for infection with drug-resistant microorganisms (V09) for identifying culture-proven MRSA infection. DESIGN Retrospective cohort study. METHODS All adults admitted to 3 geographically distinct hospitals between January 1, 2001, and December 31, 2007, were assessed for presence of incident MRSA infection, defined as an MRSA-positive clinical culture obtained during the index hospitalization, and presence of the V09 ICD-9-CM code. The κ statistic was calculated to measure the agreement between presence of MRSA infection and assignment of the V09 code. Sensitivities, specificities, positive predictive values, and negative predictive values were calculated. RESULTS There were 466,819 patients discharged during the study period. Of the 4,506 discharged patients (1.0%) who had the V09 code assigned, 31% had an incident MRSA infection, 20% had prior history of MRSA colonization or infection but did not have an incident MRSA infection, and 49% had no record of MRSA infection during the index hospitalization or the previous hospitalization. The V09 code identified MRSA infection with a sensitivity of 24% (range, 21%–34%) and positive predictive value of 31% (range, 22%–53%). The agreement between assignment of the V09 code and presence of MRSA infection had a κ coefficient of 0.26 (95% confidence interval, 0.25–0.27). CONCLUSIONS In its current state, the ICD-9-CM code V09 is not an accurate predictor of MRSA infection and should not be used to measure rates of MRSA infection. PMID:21460469
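For reference, the agreement statistic used here is Cohen's κ. The sketch below computes it from a 2×2 table, with counts roughly reconstructed from the percentages reported in this abstract (so they are approximations, not the study's exact data).

```python
def cohens_kappa(tp, fp, fn, tn):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n
    # Chance agreement from the marginal proportions of the two ratings:
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no = ((fn + tn) / n) * ((fp + tn) / n)
    p_chance = p_yes + p_no
    return (p_obs - p_chance) / (1.0 - p_chance)

# Counts approximated from the abstract (4,506 V09-coded, 31% true positives,
# 24% sensitivity, 466,819 discharges):
print(f"kappa = {cohens_kappa(tp=1397, fp=3109, fn=4424, tn=457889):.2f}")
```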
Foam on Tile Impact Modeling for the STS-107 Investigation
NASA Technical Reports Server (NTRS)
Stellingwerf, R. F.; Robinson, J. H.; Richardson, S.; Evans, S. W.; Stallworth, R.; Hovater, M.
2004-01-01
Following the breakup of the Space Shuttle Columbia during reentry, a NASA/Contractor investigation team was formed to examine the probable damage inflicted on Orbiter Thermal Protection System elements by impact of External Tank insulating foam projectiles. The authors formed a working subgroup within the larger team to apply the Smooth Particle Hydrodynamics code SPHC to the damage estimation problem. Numerical models of the Orbiter's tiles and of the Tank's foam were constructed and used as inputs into the code. Material properties needed to properly model the tiles and foam were obtained from other working subgroups who performed tests on these items for this purpose. Two- and three-dimensional models of the tiles were constructed, including the glass outer layer, the main body of LI-900 insulation, the densified lower layer of LI-900, the Nomex felt mounting layer, and the Aluminum 2024 vehicle skin. A model for the BX-250 foam including porous compression, elastic rebound, and surface erosion was developed. Code results for the tile damage and foam behavior were extensively validated through comparison with Southwest Research Institute foam-on-tile impact experiments carried out in 1999. These tests involved small projectiles striking individual tiles and small tile arrays. Following code and model validation, we simulated impacts of larger foam projectiles on examples of tile systems used on the Orbiter. Results for impacts on the main landing gear door are presented in this paper, including effects of impacts at several angles and of rapidly rotating projectiles. General results suggest that foam impacts on tiles at about 500 mph could cause appreciable damage if the impact angle is greater than about 20 degrees. Some variations of the foam properties, such as increased brittleness or increased density, could increase damage in some cases. Rotation up to 17 rps failed to increase the damage for the two cases considered. This does not rule out other cases in which the rotational energy might lead to an increase in tile damage, but suggests that in most cases rotation will not be an important factor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
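The following sketch illustrates the kind of bit-for-bit regression comparison such a toolkit automates; it is a generic example using NumPy archives with made-up file and variable names, not LIVVkit's actual API.

```python
import numpy as np

# Create two small demo outputs standing in for model runs (hypothetical names):
np.savez("reference.npz", thickness=np.ones((4, 4)), velocity=np.arange(8.0))
np.savez("test_run.npz", thickness=np.ones((4, 4)), velocity=np.arange(8.0) + 1e-12)

def bit_for_bit(test_file, ref_file):
    """Compare every variable in two result archives; flag any difference."""
    test, ref = np.load(test_file), np.load(ref_file)
    report = {}
    for name in sorted(set(test.files) | set(ref.files)):
        if name not in test.files or name not in ref.files:
            report[name] = "missing variable"
        elif test[name].shape != ref[name].shape:
            report[name] = "shape mismatch"
        elif np.array_equal(test[name], ref[name]):
            report[name] = "bit-for-bit"
        else:
            report[name] = f"max abs diff = {np.abs(test[name] - ref[name]).max():.3e}"
    return report

for var, status in bit_for_bit("test_run.npz", "reference.npz").items():
    print(f"{var:12s} {status}")
```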
Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code
NASA Technical Reports Server (NTRS)
Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.
2003-01-01
Numerical modeling of the Pulsed Inductive Thruster with the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and of the code's capability to capture the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation with the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and the mass-injection scheme were investigated and shown to produce only trivial changes in the overall performance. An idealized model for these energy levels and propellants indicates that the energy expended in internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.
Beringer, Richard M; Greenwood, Rosemary; Kilpatrick, Nicky
2014-02-01
Measuring perioperative behavior changes requires validated objective rating scales. We developed a simple score for children's behavior during induction of anesthesia (Pediatric Anesthesia Behavior score) and assessed its reliability, concurrent validity, and predictive validity. Data were collected as part of a wider observational study of perioperative behavior changes in children undergoing general anesthesia for elective dental extractions. One hundred and two healthy children aged 2-12 years were recruited. Previously validated behavioral scales were used as follows: the modified Yale Preoperative Anxiety Scale (m-YPAS); the induction compliance checklist (ICC); the Pediatric Anesthesia Emergence Delirium scale (PAED); and the Post-Hospitalization Behavior Questionnaire (PHBQ). The Pediatric Anesthesia Behavior (PAB) score was independently measured by two investigators to allow assessment of interobserver reliability. Concurrent validity was assessed by examining the correlation between the PAB score, the m-YPAS, and the ICC. Predictive validity was assessed by examining the association between the PAB score, the PAED scale, and the PHBQ. The PAB score correlated strongly with both the m-YPAS (P < 0.001) and the ICC (P < 0.001). PAB score was significantly associated with the PAED score (P = 0.031) and with the PHBQ (P = 0.034). Two independent investigators recorded identical PAB scores for 94% of children and overall, there was close agreement between scores (Kappa coefficient of 0.886 [P < 0.001]). The PAB score is simple to use and may predict which children are at increased risk of developing postoperative behavioral disturbance. This study provides evidence for its reliability and validity. © 2013 John Wiley & Sons Ltd.
Specifications and programs for computer software validation
NASA Technical Reports Server (NTRS)
Browne, J. C.; Kleir, R.; Davis, T.; Henneman, M.; Haller, A.; Lasseter, G. L.
1973-01-01
Three software products developed during the study are reported: (1) the FORTRAN Automatic Code Evaluation System, (2) the Specification Language System, and (3) the Array Index Validation System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
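In the same spirit, a composite agreement metric can be sketched as a weighted combination of per-observable discrepancies normalized by combined uncertainties; the formulas and numbers below are generic illustrations under that assumption, not the paper's exact definitions or data.

```python
import numpy as np

def observable_error(sim, exp, sigma_sim, sigma_exp):
    """RMS simulation/experiment discrepancy, normalized by combined uncertainty."""
    chi = (np.asarray(sim) - np.asarray(exp)) / np.hypot(sigma_sim, sigma_exp)
    return np.sqrt(np.mean(chi**2))

def composite_metric(errors, weights):
    """Weighted average of per-observable normalized errors (global agreement)."""
    errors, weights = np.asarray(errors), np.asarray(weights)
    return float((weights * errors).sum() / weights.sum())

# Hypothetical profiles for two observables, weighted by how directly each
# observable constrains the model:
e1 = observable_error([1.0, 0.8, 0.5], [1.1, 0.7, 0.55], 0.05, 0.08)
e2 = observable_error([2.1, 1.9, 1.2], [2.0, 2.0, 1.4], 0.10, 0.15)
print("composite agreement metric:", composite_metric([e1, e2], weights=[1.0, 0.5]))
```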
Motivational interviewing in health care: results of a brief training in endocrinology.
Bean, Melanie K; Biskobing, Diane; Francis, Gary L; Wickham, Edmond
2012-09-01
Despite the importance of lifestyle change in disease management and the growing evidence supporting motivational interviewing (MI) as an effective counseling method to promote behavioral change, to date there are few published reports about MI training in graduate medical education. The study aimed to pilot the feasibility and effectiveness of a brief MI training intervention for endocrinology fellows and other providers. We used a pretest/posttest design to evaluate a brief MI training for 5 endocrinology fellows and 9 other providers. All participants completed subjective assessments of perceived confidence and beliefs about behavioral counseling at pretest and posttest. Objective assessment of MI was conducted using fellows' audiotaped patient encounters, which were coded using a validated tool for adherence to MI before and after the training. Paired t tests examined changes in objective and subjective assessments. The training intervention was well received and feasible in the endocrinology setting. At posttest, participants reported increased endorsement of the MI spirit and improved confidence in MI skills. Objective assessment revealed relative improvements in MI skills across several domains. However, most domains, as assessed by a validated tool, did not reach competency level after the training intervention. Although more intensive training may be needed to develop MI competence, the results of our pilot study suggest that brief, targeted MI training has short-term efficacy and is well received by endocrinology fellows and other providers.
Rück, Christian; Larsson, K Johan; Lind, Kristina; Perez-Vigil, Ana; Isomura, Kayoko; Sariaslan, Amir; Lichtenstein, Paul; Mataix-Cols, David
2015-06-22
The usefulness of cases diagnosed in administrative registers for research purposes is dependent on diagnostic validity. This study aimed to investigate the validity and inter-rater reliability of recorded diagnoses of tic disorders and obsessive-compulsive disorder (OCD) in the Swedish National Patient Register (NPR). Chart review of randomly selected register cases and controls. 100 tic disorder cases and 100 OCD cases were randomly selected from the NPR based on codes from the International Classification of Diseases (ICD) 8th, 9th and 10th editions, together with 50 epilepsy and 50 depression control cases. The obtained psychiatric records were blindly assessed by 2 senior psychiatrists according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) and ICD-10. Positive predictive value (PPV; cases diagnosed correctly divided by the sum of true positives and false positives). Between 1969 and 2009, the NPR included 7286 tic disorder and 24,757 OCD cases. The vast majority (91.3% of tic cases and 80.1% of OCD cases) are coded with the most recent ICD version (ICD-10). For tic disorders, the PPV was high across all ICD versions (PPV=89% in ICD-8, 86% in ICD-9 and 97% in ICD-10). For OCD, only ICD-10 codes had high validity (PPV=91-96%). None of the epilepsy or depression control cases were wrongly diagnosed as having tic disorders or OCD, respectively. Inter-rater reliability was outstanding for both tic disorders (κ=1) and OCD (κ=0.98). The validity and reliability of ICD codes for tic disorders and OCD in the Swedish NPR is generally high. We propose simple algorithms to further increase the confidence in the validity of these codes for epidemiological research. Published by the BMJ Publishing Group Limited.
Goode, N; Salmon, P M; Taylor, N Z; Lenné, M G; Finch, C F
2017-10-01
One factor potentially limiting the uptake of Rasmussen's (1997) Accimap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of Accimap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable for both coding attempts at the system level (M T1 = 68.8%; M T2 = 73.9%), and were poor at the descriptor level (M T1 = 58.5%; M T2 = 64.1%). Mean criterion referenced validity scores at the system level were acceptable (M T1 = 73.9%; M T2 = 75.3%). However, they were not consistently acceptable at the descriptor level (M T1 = 67.6%; M T2 = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factors classification schemes are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Dowdy, Erin; Harrell-Williams, Leigh; Dever, Bridget V.; Furlong, Michael J.; Moore, Stephanie; Raines, Tara; Kamphaus, Randy W.
2016-01-01
Increasingly, schools are implementing school-based screening for risk of behavioral and emotional problems; hence, foundational evidence supporting the predictive validity of screening instruments is important to assess. This study examined the predictive validity of the Behavior Assessment System for Children-2 Behavioral and Emotional Screening…
Transport calculations and accelerator experiments needed for radiation risk assessment in space.
Sihver, Lembit
2008-01-01
The major uncertainties in space radiation risk estimates for humans are associated with the poor knowledge of the biological effects of low- and high-LET radiation, with a smaller contribution coming from the characterization of the space radiation field and its primary interactions with the shielding and the human body. However, to decrease the uncertainties on the biological effects and increase the accuracy of the risk coefficients for charged-particle radiation, the initial charged-particle spectra from the Galactic Cosmic Rays (GCRs) and the Solar Particle Events (SPEs), and the radiation transport through the shielding material of the space vehicle and the human body, must be better estimated. Since it is practically impossible to measure all primary and secondary particles from all possible position-projectile-target-energy combinations needed for a correct risk assessment in space, accurate particle and heavy ion transport codes must be used. These codes are also needed when estimating the risk for radiation-induced failures in advanced microelectronics, such as single-event effects, and the efficiency of different shielding materials. It is therefore important that the models and transport codes be carefully benchmarked and validated to make sure they fulfill preset accuracy criteria, e.g., to be able to predict particle fluence, dose, and energy distributions within a certain accuracy. When validating the accuracy of the transport codes, both space- and ground-based accelerator experiments are needed. The efficiency of passive shielding and protection of electronic devices should also be tested in accelerator experiments and compared to simulations using different transport codes. In this paper different multipurpose particle and heavy ion transport codes are presented, different concepts of shielding and protection are discussed, and future accelerator experiments needed for testing and validating codes and shielding materials are described.
Das, Ashavaree; Sarkar, Madhurima
2014-09-01
Understanding health information-seeking behaviors and barriers to care and access among pregnant women can potentially moderate the consistent negative associations between poverty, low levels of literacy, and negative maternal and child health outcomes in India. Our seminal study explores health information needs, health information-seeking behaviors, and perceived information support of low-income pregnant women in rural India. Using the Wilson Model of health information-seeking framework, we designed a culturally tailored guided interview to assess information-seeking behaviors and barriers to information seeking among pregnant women. We used a local informant and health care worker to recruit 14 expectant women for two focus group interviews lasting 45 minutes to an hour each. Thirteen other related individuals including husbands, mothers, mothers-in-law, and health care providers were also recruited by hospital counselors for in-depth interviews regarding their pregnant wives/daughters and daughters-in-law. Interviews were transcribed and analyzed by coding the data into thematic categories. The data were coded manually and emerging themes included pregnancy-related knowledge and misconceptions and personal, societal, and structural barriers, as well as risk perceptions and self-efficacy. Lack of access to health care and pregnancy-related health information led participants to rely heavily on information and misconceptions about pregnancy gleaned from elder women, friends, and mothers-in-law and husbands. Doctors and para-medical staff were only consulted during complications. All women faced personal, societal, and structural level barriers, including feelings of shame and embarrassment, fear of repercussion for discussing their pregnancies with their doctors, and inadequate time with their doctors. Lack of access and adequate health care information were of primary concern to pregnant women and their families. Our study can help inform policies and multi-sectoral approaches that are being taken by the Indian government to reduce maternal and child morbidity and burdens.
Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability
NASA Technical Reports Server (NTRS)
Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian
2011-01-01
The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission-relevant problems: (1) shock/turbulent boundary layer interaction on a compression corner, (2) shock/turbulent boundary layer interaction due to an impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high-speed return to Earth. A validation-based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well-established correlations is also included in this effort. A thorough review of the literature in search of validation experiments was performed, which identified a scarcity of ground-based validation experiments at hypersonic conditions. In particular, a shortage of usable experimental data at flight-like enthalpies and Reynolds numbers was found. The uncertainty was quantified using metrics that measure the discrepancy between model predictions and experimental data. The discrepancy data are statistically analyzed and investigated for physics-based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission-relevant problem is found in the four companion papers.
Exploration of Uncertainty in Glacier Modelling
NASA Technical Reports Server (NTRS)
Thompson, David E.
1999-01-01
There are procedures and methods for verification of coding algebra and for validation of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.
Preliminary Validity of the Eyberg Child Behavior Inventory With Filipino Immigrant Parents
Coffey, Dean M.; Javier, Joyce R.; Schrager, Sheree M.
2016-01-01
Filipinos are an understudied minority affected by significant behavioral health disparities. We evaluate evidence for the reliability, construct validity, and convergent validity of the Eyberg Child Behavior Inventory (ECBI) in 6- to 12- year old Filipino children (N = 23). ECBI scores demonstrated high internal consistency, supporting a single-factor model (pre-intervention α =.91; post-intervention α =.95). Results document convergent validity with the Child Behavior Checklist Externalizing scale at pretest (r = .54, p < .01) and posttest (r = .71, p < .001). We conclude that the ECBI is a promising tool to measure behavior problems in Filipino children. PMID:27087739
Whalen, Diana J; Scott, Lori N; Jakubowski, Karen P; McMakin, Dana L; Hipwell, Alison E; Silk, Jennifer S; Stepp, Stephanie D
2014-01-01
Developmental theories of borderline personality disorder (BPD) posit that transactions between child characteristics and adverse environments, especially those in the context of the parent-child relationship, shape and maintain symptoms of the disorder over time. However, very little empirical work has investigated the role of parenting and parent-child transactions that may predict BPD severity over time. We examined maternal and dyadic affective behaviors during a mother-adolescent conflict discussion task as predictors of the course of BPD severity scores across 3 years in a diverse, at-risk sample of girls (N = 74) oversampled for affective instability and their biological mothers. Adolescent girls completed a structured conflict discussion task with their mothers at age 16. Girls' self-reported BPD severity scores were assessed annually from ages 15 to 17. Mother-adolescent interactions were coded using a global rating system of maternal and dyadic affective behaviors. Results from multilevel linear mixed models indicated that positive maternal affective behavior (i.e., supportive/validating behavior, communication skills, autonomy-promoting behavior, and positive affect) and positive dyadic affective behaviors (i.e., satisfaction and positive escalation) were associated with decreases in girls' BPD severity scores over time. Dyadic negative escalation was associated with higher overall levels of BPD severity scores, but negative maternal affective behavior (i.e., negative affect, dominance, conflict, and denial) was not. These findings suggest that the mother-daughter context is an important protective factor in shaping the course of BPD severity scores during adolescence and may be valuable in assessment, intervention, and prevention efforts.
Pump CFD code validation tests
NASA Technical Reports Server (NTRS)
Brozowski, L. A.
1993-01-01
Pump CFD code validation tests were accomplished by obtaining nonintrusive flow characteristic data at key locations in generic current liquid rocket engine turbopump configurations. Data were obtained with a laser two-focus (L2F) velocimeter at scaled design flow. Three components were surveyed: a 1970's-designed impeller, a 1990's-designed impeller, and a four-bladed unshrouded inducer. Two-dimensional velocities were measured upstream and downstream of the two impellers. Three-dimensional velocities were measured upstream, downstream, and within the blade row of the unshrouded inducer.
NASA Astrophysics Data System (ADS)
Dartevelle, S.
2006-12-01
Large-scale volcanic eruptions are inherently hazardous events that cannot be characterized by detailed and accurate in situ measurements; hence, volcanic explosive phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. However, code verification and validation remains a necessary step, particularly as volcanologists increasingly use numerical results for mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to formally achieve, while, in the context of 'real world' explosive volcanism, the Validation step is nearly impossible. Hence, instead of validating a computer code against the whole large-scale unconstrained volcanic phenomenology, we suggest focusing on the key physics which control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key phenomenologies separately. Herein, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase-CFD FORTRAN codes which have recently been redeveloped to meet the strict Quality Assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Dept of Energy. GMFIX solves the Navier-Stokes and energy partial differential equations for each phase, with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multiphase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the Mach disk; the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes this code ideal for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We also validated GMFIX by comparing velocity profiles and turbulence quantities against well-constrained analog experiments. The velocity profiles agree with the analog ones, as do those of the production of turbulent quantities. Overall, the Verification and Validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase and supersonic flows and jets.
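The single-phase limit of the Mach disk trend quoted above is often expressed with the empirical fit x/d ≈ 0.67√K (commonly attributed to Crist et al.); the sketch below evaluates that fit to show the downstream shift with pressure ratio. The prefactor applies to the ideal single-phase case, and particle loading moves the disk upstream, which is the second trend the GMFIX runs are checked against.

```python
import math

def mach_disk_distance(K, d_nozzle=1.0):
    """Mach disk standoff x ~ 0.67 * sqrt(K) * d, K = stagnation/ambient pressure."""
    return 0.67 * math.sqrt(K) * d_nozzle

for K in (5, 20, 100, 500):
    print(f"K = {K:4d}   x_Mach/d ~ {mach_disk_distance(K):5.1f}")
```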
VIC: A Computer Analysis of Verbal Interaction Category Systems.
ERIC Educational Resources Information Center
Kline, John A.; And Others
VIC is a computer program for the analysis of verbal interaction category systems, especially the Flanders interaction analysis system. The observer codes verbal behavior on coding sheets for later machine scoring. A matrix is produced by the program showing the number and percentages of times that a particular cell describes classroom behavior.…
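A minimal sketch of the matrix computation follows, assuming Flanders-style integer category codes: adjacent code pairs from an observation sequence are tallied into transition cells and reported as counts and percentages. The coded sequence is made up for illustration.

```python
from collections import Counter

codes = [5, 5, 4, 8, 3, 5, 5, 9, 4, 8, 8, 3]   # one hypothetical observation sequence

# Each (row, column) cell counts how often one category follows another:
pairs = Counter(zip(codes, codes[1:]))
total = sum(pairs.values())

print("cell    count  percent")
for (row, col), n in sorted(pairs.items()):
    print(f"({row},{col})   {n:3d}   {100.0 * n / total:5.1f}%")
```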
Committed to the Honor Code: An Investment Model Analysis of Academic Integrity
ERIC Educational Resources Information Center
Dix, Emily L.; Emery, Lydia F.; Le, Benjamin
2014-01-01
Educators worldwide face challenges surrounding academic integrity. The development of honor codes can promote academic integrity, but understanding how and why honor codes affect behavior is critical to their successful implementation. To date, research has not examined how students' "relationship" to an honor code predicts…
One Speaker, Two Languages. Cross-Disciplinary Perspectives on Code-Switching.
ERIC Educational Resources Information Center
Milroy, Lesley, Ed.; Muysken, Pieter, Ed.
Fifteen articles review code-switching in the four major areas: policy implications in specific institutional and community settings; perspectives of social theory of code-switching as a form of speech behavior in particular social contexts; the grammatical analysis of code-switching, including factors that constrain switching even within a…
Jolley, Rachel J; Jetté, Nathalie; Sawka, Keri Jo; Diep, Lucy; Goliath, Jade; Roberts, Derek J; Yipp, Bryan G; Doig, Christopher J
2015-01-01
Objective Administrative health data are important for health services and outcomes research. We optimised and validated in intensive care unit (ICU) patients an International Classification of Disease (ICD)-coded case definition for sepsis, and compared this with an existing definition. We also assessed the definition's performance in non-ICU (ward) patients. Setting and participants All adults (aged ≥18 years) admitted to a multisystem ICU with general medicosurgical ICU care from one of three tertiary care centres in the Calgary region in Alberta, Canada, between 1 January 2009 and 31 December 2012 were included. Research design Patient medical records were randomly selected and linked to the discharge abstract database. In ICU patients, we validated the Canadian Institute for Health Information (CIHI) ICD-10-CA (Canadian Revision)-coded definition for sepsis and severe sepsis against a reference standard medical chart review, and optimised this algorithm through examination of other conditions apparent in sepsis. Measures Sensitivity (Sn), specificity (Sp), positive predictive value (PPV) and negative predictive value (NPV) were calculated. Results Sepsis was present in 604 of 1001 ICU patients (60.4%). The CIHI ICD-10-CA-coded definition for sepsis had Sn (46.4%), Sp (98.7%), PPV (98.2%) and NPV (54.7%); and for severe sepsis had Sn (47.2%), Sp (97.5%), PPV (95.3%) and NPV (63.2%). The optimised ICD-coded algorithm for sepsis increased Sn by 25.5% and NPV by 11.9% with slightly lowered Sp (85.4%) and PPV (88.2%). For severe sepsis both Sn (65.1%) and NPV (70.1%) increased, while Sp (88.2%) and PPV (85.6%) decreased slightly. Conclusions This study demonstrates that sepsis is highly undercoded in administrative data, thus under-ascertaining the true incidence of sepsis. The optimised ICD-coded definition has a higher validity with higher Sn and should be preferentially considered if used for surveillance purposes. PMID:26700284
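The four measures reported throughout these validation studies come from a 2×2 table of the case definition against the chart-review reference standard; the sketch below computes them, with counts approximately reconstructed from the ICU sepsis results quoted above (604 of 1001 patients with sepsis, Sn 46.4%, Sp 98.7%).

```python
def validity_measures(tp, fp, fn, tn):
    """Standard validity measures for an ICD-coded case definition."""
    return {
        "sensitivity": tp / (tp + fn),   # true cases the algorithm finds
        "specificity": tn / (tn + fp),   # non-cases it correctly excludes
        "PPV": tp / (tp + fp),           # flagged records that are true cases
        "NPV": tn / (tn + fn),           # unflagged records truly negative
    }

for name, value in validity_measures(tp=280, fp=5, fn=324, tn=392).items():
    print(f"{name:12s} {value:.3f}")
```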
Ladner, Travis R; Greenberg, Jacob K; Guerrero, Nicole; Olsen, Margaret A; Shannon, Chevis N; Yarbrough, Chester K; Piccirillo, Jay F; Anderson, Richard C E; Feldstein, Neil A; Wellons, John C; Smyth, Matthew D; Park, Tae Sung; Limbrick, David D
2016-05-01
OBJECTIVE Administrative billing data may facilitate large-scale assessments of treatment outcomes for pediatric Chiari malformation Type I (CM-I). Validated International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code algorithms for identifying CM-I surgery are critical prerequisites for such studies but are currently only available for adults. The objective of this study was to validate two ICD-9-CM code algorithms using hospital billing data to identify pediatric patients undergoing CM-I decompression surgery. METHODS The authors retrospectively analyzed the validity of two ICD-9-CM code algorithms for identifying pediatric CM-I decompression surgery performed at 3 academic medical centers between 2001 and 2013. Algorithm 1 included any discharge diagnosis code of 348.4 (CM-I), as well as a procedure code of 01.24 (cranial decompression) or 03.09 (spinal decompression or laminectomy). Algorithm 2 restricted this group to the subset of patients with a primary discharge diagnosis of 348.4. The positive predictive value (PPV) and sensitivity of each algorithm were calculated. RESULTS Among 625 first-time admissions identified by Algorithm 1, the overall PPV for CM-I decompression was 92%. Among the 581 admissions identified by Algorithm 2, the PPV was 97%. The PPV for Algorithm 1 was lower in one center (84%) compared with the other centers (93%-94%), whereas the PPV of Algorithm 2 remained high (96%-98%) across all subgroups. The sensitivity of Algorithms 1 (91%) and 2 (89%) was very good and remained so across subgroups (82%-97%). CONCLUSIONS An ICD-9-CM algorithm requiring a primary diagnosis of CM-I has excellent PPV and very good sensitivity for identifying CM-I decompression surgery in pediatric patients. These results establish a basis for utilizing administrative billing data to assess pediatric CM-I treatment outcomes.
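The two screens are simple enough to state as code. The sketch below is a hypothetical Python rendering of the two billing-data algorithms as described in the abstract; the record layout (a dict carrying ordered diagnosis codes and procedure codes) is an assumption for illustration.

```python
CM_I_DX = "348.4"                      # CM-I discharge diagnosis (per abstract)
DECOMPRESSION = {"01.24", "03.09"}     # cranial / spinal decompression codes

def algorithm_1(admission):
    """Any discharge diagnosis of 348.4 plus a decompression procedure."""
    return (CM_I_DX in admission["diagnoses"]
            and bool(DECOMPRESSION & set(admission["procedures"])))

def algorithm_2(admission):
    """Algorithm 1 restricted to a primary (first-listed) diagnosis of 348.4."""
    return admission["diagnoses"][:1] == [CM_I_DX] and algorithm_1(admission)

admission = {"diagnoses": ["348.4", "331.4"], "procedures": ["01.24"]}
print(algorithm_1(admission), algorithm_2(admission))   # True True
```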
A Radiation Shielding Code for Spacecraft and Its Validation
NASA Technical Reports Server (NTRS)
Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.
2000-01-01
The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The composition of the incident radiation is modified by atomic and nuclear reactions with the spacecraft and radiation-shield materials. High-energy heavy ions are fragmented into less massive reaction products, and further products arise from direct knockout of shield constituents or from de-excitation. An overview of the computational procedures and the database that describes these interactions is given. Validation of the code against recent Monte Carlo benchmarks and against laboratory and flight measurements is also included.
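To make the deterministic (as opposed to Monte Carlo) flavour of such a transport code concrete, the following toy sketch solves a two-component straight-ahead approximation: a primary heavy-ion flux attenuates with shield depth while feeding a single lighter fragment species. The cross sections and fragment multiplicity are invented numbers, and the real code couples many species over an energy grid; this is only a caricature of the approach, not HZETRN itself.

```python
import math

# Invented illustrative constants (per g/cm^2 of shield):
sigma_p = 0.05    # primary heavy-ion removal cross section
sigma_f = 0.02    # fragment removal cross section
nu = 1.8          # mean fragments produced per primary interaction

def fluxes(depth):
    """Primary and fragment flux (relative to unit incident primary flux)
    at areal depth `depth`, from the analytic solution of the coupled
    straight-ahead equations  phi_p' = -sigma_p*phi_p  and
    phi_f' = -sigma_f*phi_f + nu*sigma_p*phi_p  with phi_f(0) = 0."""
    phi_p = math.exp(-sigma_p * depth)
    phi_f = (nu * sigma_p / (sigma_p - sigma_f)
             * (math.exp(-sigma_f * depth) - math.exp(-sigma_p * depth)))
    return phi_p, phi_f

for x in (0, 5, 10, 20):
    p, f = fluxes(x)
    print(f"{x:>3} g/cm^2: primary {p:.3f}, fragments {f:.3f}")
```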
Experimental aerothermodynamic research of hypersonic aircraft
NASA Technical Reports Server (NTRS)
Cleary, Joseph W.
1987-01-01
The 2-D and 3-D advanced computer codes being developed for use in the design of hypersonic aircraft such as the National Aero-Space Plane require comparison of computational results with a broad spectrum of experimental data before the validity of the codes can be fully assessed. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. The objective, therefore, is to provide the hypersonic experimental database required for validating advanced computational fluid dynamics (CFD) codes and for developing the more thorough understanding of the flow physics these codes demand. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-Foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.
Monte Carlo tests of the ELIPGRID-PC algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, J.R.
1995-04-01
The standard tool for calculating the probability of detecting pockets of contamination called hot spots has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within ±0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
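The Monte Carlo check used here is conceptually simple: place an elliptical hot spot at a random position and orientation relative to the sampling grid, count the trial a success if any grid node falls inside the ellipse, and average over trials. The Python sketch below estimates that probability for a square grid; the parameter names and the square (rather than rectangular or triangular) grid are assumptions for illustration, not the report's actual test cases.

```python
import math, random

def detection_probability(a, b, spacing, trials=100_000, rng=random):
    """Monte Carlo estimate of the probability that a square sampling grid
    with node spacing `spacing` detects an elliptical hot spot with
    semi-axes a and b (detection = at least one node inside the ellipse).
    Position and orientation are randomised each trial; by periodicity the
    centre only needs to range over a single grid cell."""
    n = int(max(a, b) // spacing) + 2    # how many nodes out we must check
    hits = 0
    for _ in range(trials):
        cx, cy = rng.uniform(0, spacing), rng.uniform(0, spacing)
        theta = rng.uniform(0, math.pi)
        cos_t, sin_t = math.cos(theta), math.sin(theta)
        found = False
        for i in range(-n, n + 1):
            for j in range(-n, n + 1):
                dx, dy = i * spacing - cx, j * spacing - cy
                u = dx * cos_t + dy * sin_t      # rotate into ellipse frame
                v = -dx * sin_t + dy * cos_t
                if (u / a) ** 2 + (v / b) ** 2 <= 1.0:
                    found = True
                    break
            if found:
                break
        hits += found
    return hits / trials

# e.g. a thin ellipse relative to the grid spacing, the hard case noted above
print(detection_probability(a=0.6, b=0.1, spacing=1.0))
```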
Validation of multi-temperature nozzle flow code NOZNT
NASA Technical Reports Server (NTRS)
Park, Chul; Lee, Seung-Ho
1993-01-01
A computer code NOZNT (Nozzle in n-Temperatures), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against five existing sets of experimental data. The code accounts for: a) the differences among various temperatures, i.e., the translational-rotational temperature, the vibrational temperatures of individual molecular species, and the electron-electronic temperature; b) radiative cooling; and c) the effects of impurities. The experimental data considered are: 1) sodium line-reversal data and 2) electron temperature and density data, both obtained in a shock tunnel; and 3) spectroscopic emission data, 4) electron-beam data on vibrational temperature, and 5) mass-spectrometric species concentration data, all obtained in arc-jet wind tunnels. It is shown that impurities are most likely responsible for the phenomena observed in shock tunnels. For the arc-jet flows, impurities are inconsequential, and the NOZNT code is validated by numerically reproducing the experimental data.
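The central multi-temperature effect such a code resolves is the vibrational temperature decoupling from, and finally freezing above, the falling translational temperature in a fast expansion. The sketch below caricatures this: a harmonic-oscillator vibrational energy relaxes toward equilibrium through a Landau-Teller source term while the translational temperature and the relaxation time follow invented profiles. NOZNT itself solves the fully coupled one-dimensional flow equations, so every constant here is illustrative.

```python
import math

theta_v = 3371.0          # characteristic vibrational temperature of N2, K

def e_vib(T):
    """Harmonic-oscillator vibrational energy per unit gas constant (in K)."""
    return theta_v / (math.exp(theta_v / T) - 1.0)

def T_trans(t):
    """Invented translational-temperature decay along the expansion, K."""
    return 7000.0 * math.exp(-t / 2e-5) + 300.0

def tau(t):
    """Invented Landau-Teller relaxation time, lengthening as density falls."""
    return 1e-6 * math.exp(t / 2e-5)

dt, t = 1e-7, 0.0
ev = e_vib(T_trans(0.0))                   # start in equilibrium
while t < 2e-4:
    ev += dt * (e_vib(T_trans(t)) - ev) / tau(t)   # Landau-Teller relaxation
    t += dt

T_v = theta_v / math.log(theta_v / ev + 1.0)       # invert e_vib for T_v
print(f"T_trans = {T_trans(t):.0f} K, T_vib = {T_v:.0f} K (vibration frozen hot)")
```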