Management System for EMR Work Study Program.
ERIC Educational Resources Information Center
Columbia County Board of Public Instruction, Lake City, FL. Exceptional Child Education Dept.
A computerized information management system involving the specification of objectives, the coding of teacher evaluations of students, and a variety of possible outputs has been used in a work study program for educable mentally retarded adolescents. Instructional objectives are specified and coded by number and category. Evaluation is by means of…
NASA Astrophysics Data System (ADS)
Zhou, Abel; White, Graeme L.; Davidson, Rob
2018-02-01
Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated using Monte Carlo (MC) methods. Our recently reported work modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids computed by the recently reported new MC codes against experimental results and results previously reported in the literature. The results show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined with this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of either mammographic or general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.
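For context, the quantities reported above are conventionally defined as follows (a standard formulation offered for the reader's convenience; the authors' exact definitions may differ in detail), where P and S denote the primary and scatter contributions at the image receptor without the grid, and P' and S' the corresponding contributions with the grid in place:

```latex
T_p = \frac{P'}{P}, \qquad
T_s = \frac{S'}{S}, \qquad
T_t = \frac{P' + S'}{P + S}, \qquad
\mathrm{SPR} = \frac{S'}{P'} .
```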
A Novel Code System for Revealing Sources of Students' Difficulties with Stoichiometry
ERIC Educational Resources Information Center
Gulacar, Ozcan; Overton, Tina L.; Bowman, Charles R.; Fynewever, Herb
2013-01-01
A coding scheme is presented and used to evaluate the solutions of seventeen students working on twenty-five stoichiometry problems in a think-aloud protocol. The stoichiometry problems are evaluated as a series of sub-problems (e.g., empirical formulas, mass percent, or balancing chemical equations), and the coding scheme was used to categorize each…
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation, and code application. Model development has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed PDF model development, and chemical kinetics development. It is expected that this work will continue under the new grant.
Watkins, Sharon
2017-01-01
Objectives: The primary objective of this study was to identify patients with heat-related illness (HRI) using codes for heat-related injury diagnosis and external cause of injury in 3 administrative data sets: emergency department (ED) visit records, hospital discharge records, and death certificates. Methods: We obtained data on ED visits, hospitalizations, and deaths for Florida residents for May 1 through October 31, 2005-2012. To identify patients with HRI, we used codes from the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to search data on ED visits and hospitalizations and codes from the International Classification of Diseases, Tenth Revision (ICD-10) to search data on deaths. We stratified the results by data source and whether the HRI was work related. Results: We identified 23,981 ED visits, 4,816 hospitalizations, and 140 deaths in patients with non–work-related HRI and 2,979 ED visits, 415 hospitalizations, and 23 deaths in patients with work-related HRI. The most common diagnosis codes among patients were for severe HRI (heat exhaustion or heatstroke). The proportion of patients with a severe HRI diagnosis increased with data source severity. If ICD-9-CM code E900.1 and ICD-10 code W92 (excessive heat of man-made origin) were used as exclusion criteria for HRI, 5.0% of patients with non–work-related deaths, 3.0% of patients with work-related ED visits, and 1.7% of patients with work-related hospitalizations would have been removed. Conclusions: Using multiple data sources and all diagnosis fields may improve the sensitivity of HRI surveillance. Future studies should evaluate the impact of converting ICD-9-CM to ICD-10-CM codes on HRI surveillance of ED visits and hospitalizations. PMID:28379784
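The case-definition logic described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the record layout and the ICD-9-CM code families used for inclusion (992.x for effects of heat, E900.x for accidents caused by excessive heat) are assumptions; only E900.1 and W92 are named explicitly in the abstract.

```python
# Illustrative sketch of an HRI case definition with an exclusion criterion.
# Assumptions: each record carries lists of diagnosis codes and external-cause codes,
# plus a work-relatedness flag; the inclusion code families are examples only.
HRI_PREFIXES = ("992", "E900")   # assumed ICD-9-CM families for heat-related illness
MAN_MADE_HEAT = {"E900.1"}       # named in the abstract (ICD-10 analogue: W92)

def is_hri(record, exclude_man_made=False):
    """Flag an ED visit or hospitalization record as heat-related illness."""
    codes = record.get("dx_codes", []) + record.get("e_codes", [])
    if exclude_man_made and any(c in MAN_MADE_HEAT for c in codes):
        return False
    return any(c.startswith(HRI_PREFIXES) for c in codes)

records = [
    {"dx_codes": ["992.5"], "e_codes": ["E900.0"], "work_related": True},
    {"dx_codes": ["486"],   "e_codes": [],         "work_related": False},
]
work_related_hri = [r for r in records if is_hri(r) and r["work_related"]]
```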
Critical evaluation of reverse engineering tool Imagix 4D!
Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay
2016-01-01
Legacy code is difficult to comprehend. Various commercial reengineering tools are available, each with its own working style, capabilities, and shortcomings. The available tools focus on visualizing static rather than dynamic behavior, which makes the work of those engaged in software product maintenance, code understanding, and reengineering/reverse engineering difficult. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found Imagix 4D useful because it generates the most pictorial representations in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving a few samples of source code. The behavior of the tool was analyzed on multiple small codes and a large code, the gcc C parser. The large-code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at the preprocessing level. The utility of Imagix 4D in preparing decision density and complexity metrics for a large code was found to be useful in determining how much reengineering is required. Imagix 4D showed limitations in dynamic visualization, flow chart separation (for large code), and parsing loops. The outcome of the evaluation will eventually help in upgrading Imagix 4D and points to the need for full-featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those interested in the realm of software reengineering tool building.
Nested polynomial trends for the improvement of Gaussian process-based predictors
NASA Astrophysics Data System (ADS)
Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.
2017-10-01
The role of simulation keeps increasing for the sensitivity analysis and uncertainty quantification of complex systems. Such numerical procedures are generally based on processing a huge number of code evaluations. When the computational cost of one particular evaluation of the code is high, such direct approaches based on the computer code only are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. For deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency, and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of the mean function, based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of the method, this work compares its efficiency to alternative approaches on a series of examples.
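To make the idea of a non-constant mean (trend) function concrete, the following sketch fits a low-order polynomial trend by least squares and models the residuals with a Gaussian process. This is a simple trend-kriging baseline on assumed toy data, not the authors' nested-polynomial parametrization.

```python
# Minimal trend-kriging sketch: polynomial mean fitted to a few "code evaluations",
# Gaussian process on the residuals. Toy data; not the nested-polynomial method itself.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 1))        # 20 expensive code evaluations
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 0] ** 2    # stand-in for the code output

phi = PolynomialFeatures(degree=2)
trend = LinearRegression().fit(phi.fit_transform(X), y)   # polynomial mean function
residuals = y - trend.predict(phi.transform(X))

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3))
gp.fit(X, residuals)                                      # GP models what the trend misses

X_new = np.linspace(-1, 1, 5).reshape(-1, 1)
prediction = trend.predict(phi.transform(X_new)) + gp.predict(X_new)
```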
Methods of treating complex space vehicle geometry for charged particle radiation transport
NASA Technical Reports Server (NTRS)
Hill, C. W.
1973-01-01
Current methods of treating complex geometry models for space radiation transport calculations are reviewed. The geometric techniques used in three computer codes are outlined. Evaluations of geometric capability and speed are provided for these codes. Although no code development work is included, several suggestions for significantly improving complex geometry codes are offered.
Examples of Use of SINBAD Database for Nuclear Data and Code Validation
NASA Astrophysics Data System (ADS)
Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto
2017-09-01
The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V, and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.
Chriqui, Jamie F; Leider, Julien; Thrun, Emily; Nicholson, Lisa M; Slater, Sandy
2016-01-01
Communities across the United States have been reforming their zoning codes to create pedestrian-friendly neighborhoods with increased street connectivity, mixed use and higher density, open space, transportation infrastructure, and a traditional neighborhood structure. Zoning code reforms include new urbanist zoning such as the SmartCode, form-based codes, transects, transportation- and pedestrian-oriented developments, and traditional neighborhood developments. The objective was to examine the relationship of zoning code reforms and more active living-oriented zoning provisions with adult active travel to work via walking, biking, or public transit. Zoning codes effective as of 2010 were compiled for 3,914 municipal-level jurisdictions located in 471 counties and 2 consolidated cities in 48 states and the District of Columbia, collectively covering 72.9% of the U.S. population. Zoning codes were evaluated for the presence of code reform zoning and nine pedestrian-oriented zoning provisions (1 = yes): sidewalks, crosswalks, bike-pedestrian connectivity, street connectivity, bike lanes, bike parking, bike-pedestrian trails/paths, mixed-use development, and other walkability/pedestrian orientation. A zoning scale reflected the number of provisions addressed (out of 10). Five continuous outcome measures were constructed using 2010-2014 American Community Survey municipal-level 5-year estimates to assess the percentage of workers walking, biking, walking or biking, taking public transit to work, or engaged in any active travel to work. Regression models controlled for municipal-level socioeconomic characteristics and a GIS-constructed walkability scale and were clustered on county with robust standard errors. Adjusted models indicated that several measures were statistically associated (p < 0.05 or lower) with increased rates of walking, biking, or any active travel to work: code reform zoning, bike parking (street furniture), bike lanes, bike-pedestrian trails/paths, other walkability, mixed-use zoning, and a higher score on the zoning scale. Public transit use was associated with code reform zoning and a number of zoning measures in Southern jurisdictions but not in non-Southern jurisdictions. As jurisdictions revisit their zoning and land use policies, they may want to evaluate the pedestrian orientation of their zoning codes so that they can plan for pedestrian improvements that will help encourage active travel to work.
Light Water Reactor Sustainability Program Status Report on the Grizzly Code Enhancements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novascone, Stephen R.; Spencer, Benjamin W.; Hales, Jason D.
2013-09-01
This report summarizes work conducted during fiscal year 2013 toward developing a full capability for evaluating fracture contour J-integrals in the Grizzly code. This is a progress report on ongoing work. During the next fiscal year, this capability will be completed, and Grizzly will be capable of evaluating these contour integrals for 3D geometry, including the effects of thermal stress and large deformation. A usable, limited capability has been developed, which can evaluate these integrals on 2D geometry without considering the effects of material nonlinearity, thermal stress, or large deformation. This report presents an overview of the approach used, along with a demonstration of the current capability in Grizzly, including a comparison with an analytical solution.
Evaluation of candidate working fluid formulations for the electrothermal-chemical wind tunnel
NASA Technical Reports Server (NTRS)
Akyurtlu, Jale F.; Akyurtlu, Ates
1993-01-01
A new hypersonic test facility that can simulate conditions typical of atmospheric flight at Mach numbers up to 20 is currently under study at the NASA/LaRC Hypersonic Propulsion Branch. In the proposed research, it was suggested that a combustion-augmented electrothermal wind tunnel concept may be applied to the planned hypersonic testing facility. The purpose of the current investigation is to evaluate some candidate working fluid formulations that may be used in the chemical-electrothermal wind tunnel. The efforts in the initial phase of this research were concentrated on acquiring the code used by GASL to model the electrothermal wind tunnel and testing it using the conditions of the GASL simulation. The early version of the general chemical kinetics code (GCKP84) was obtained from NASA, and the latest updated version of the code (LSENS) was obtained from its author, Dr. Bittker. Both codes are installed on a personal computer with a 25 MHz 486 processor and 16 MB of RAM. Since the available memory was not sufficient to debug LSENS, GCKP84 was used for the current work.
Structural design, analysis, and code evaluation of an odd-shaped pressure vessel
NASA Astrophysics Data System (ADS)
Rezvani, M. A.; Ziada, H. H.
1992-12-01
An effort to design, analyze, and evaluate a rectangular pressure vessel is described. Normally pressure vessels are designed in circular or spherical shapes to prevent stress concentrations. In this case, because of operational limitations, the choice of vessels was limited to a rectangular pressure box with a removable cover plate. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code is used as a guideline for pressure containments whose width or depth exceeds 15.24 cm (6.0 in.) and where pressures will exceed 103.4 kPa (15.0 lbf/in²). This evaluation used Section 8 of this Code, hereafter referred to as the Code. The dimensions and working pressure of the subject vessel fall within the pressure vessel category of the Code. The Code design guidelines and rules do not directly apply to this vessel. Therefore, finite-element methodology was used to analyze the pressure vessel, and the Code then was used in qualifying the vessel to be stamped to the Code. Section 8, Division 1 of the Code was used for evaluation. This action was justified by selecting a material for which fatigue damage would not be a concern. The stress analysis results were then checked against the Code, and the thicknesses adjusted to satisfy Code requirements. Although not directly applicable, the Code design formulas for rectangular vessels were also considered and presented.
Turbulence modeling for hypersonic flight
NASA Technical Reports Server (NTRS)
Bardina, Jorge E.
1992-01-01
The objective of the present work is to develop, verify, and incorporate two-equation turbulence models that account for the effect of compressibility at high speeds into a three-dimensional Reynolds-averaged Navier-Stokes code, and to provide documented model descriptions and numerical procedures so that they can be implemented into the National Aerospace Plane (NASP) codes. A summary of accomplishments follows: (1) four codes were tested and evaluated against a flat plate boundary layer flow and an external supersonic flow; (2) a code named RANS was chosen because of its speed, accuracy, and versatility; (3) the code was extended from thin boundary layer to full Navier-Stokes; (4) the k-omega two-equation turbulence model was implemented into the base code; (5) a 24-degree laminar compression corner flow was simulated and compared to other numerical simulations; and (6) work is in progress on writing the numerical method of the base code, including the turbulence model.
The NJOY Nuclear Data Processing System, Version 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macfarlane, Robert; Muir, Douglas W.; Boicourt, R. M.
The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenthal, Andrew
The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stove pipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.
Schmitz, Matthew; Forst, Linda
2016-02-15
Inclusion of information about a patient's work, industry, and occupation in the electronic health record (EHR) could facilitate occupational health surveillance, better health outcomes, prevention activities, and identification of workers' compensation cases. The US National Institute for Occupational Safety and Health (NIOSH) has developed an autocoding system for "industry" and "occupation" based on 1990 Bureau of Census codes; its effectiveness requires evaluation in conjunction with promoting the mandatory addition of these variables to the EHR. The objectives of the study were to evaluate the intercoder reliability of NIOSH's Industry and Occupation Computerized Coding System (NIOCCS) when applied to data collected in a community survey conducted under the Affordable Care Act, and to determine the proportion of records that are autocoded using NIOCCS. Standard Occupational Classification (SOC) codes are used by several federal agencies in databases that capture demographic, employment, and health information to harmonize variables related to work activities among these data sources. A total of 359 industry and occupation responses were hand coded by 2 investigators, who came to a consensus on every code. The same variables were autocoded using NIOCCS at the high and moderate criteria levels. Kappa was .84 for agreement between hand coders and for the hand coder consensus code versus NIOCCS high confidence level codes for the first 2 digits of the SOC code. For 4 digits, NIOCCS coding versus investigator coding ranged from kappa=.56 to .70. In this study, NIOCCS was able to autocode 31%-36% of entered variables at the "high confidence" level and 49%-58% at the "medium confidence" level. Autocoding (production) rates are somewhat lower than those reported by NIOSH. Agreement between manually coded and autocoded data is "substantial" at the 2-digit level, but only "fair" to "good" at the 4-digit level. This work serves as a baseline for the performance of NIOCCS by investigators in the field. Further field testing will clarify NIOCCS effectiveness in terms of ability to assign codes and coding accuracy and will clarify its value as inclusion of these occupational variables in the EHR is promoted.
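For readers less familiar with the agreement statistic quoted above, Cohen's kappa compares the observed agreement p_o with the agreement p_e expected by chance (the standard two-rater definition; the study does not state which kappa variant was used):

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```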
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance of generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work is contributed that substantially enhances existing performance analysis formulations. Major contributions include substantial computational complexity reduction, including a priori BER accuracy bounding, and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
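As background to the code families discussed above, length-N Walsh-Hadamard spreading sequences can be generated with the standard Sylvester construction; the sketch below only illustrates that construction and its orthogonality property, and is unrelated to the authors' BER analysis.

```python
# Sketch: length-N Walsh-Hadamard spreading sequences via the Sylvester construction
# (N a power of two). Illustrative only; not the authors' BER evaluation.
import numpy as np

def walsh_hadamard(N: int) -> np.ndarray:
    """Return an N x N matrix whose rows are +/-1 Walsh-Hadamard sequences."""
    assert N > 0 and (N & (N - 1)) == 0, "N must be a power of two"
    H = np.array([[1]])
    while H.shape[0] < N:
        H = np.block([[H, H], [H, -H]])   # H_{2n} = [[H, H], [H, -H]]
    return H

codes = walsh_hadamard(8)
assert np.array_equal(codes @ codes.T, 8 * np.eye(8))   # rows are mutually orthogonal
```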
Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Lee, C. H.
The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.
CFD and Neutron codes coupling on a computational platform
NASA Astrophysics Data System (ADS)
Cerroni, D.; Da Vià, R.; Manservisi, S.; Menghini, F.; Scardovelli, R.
2017-01-01
In this work we investigate the thermal-hydraulic behavior of a PWR nuclear reactor core, evaluating the power generation distribution while taking into account the local temperature field. The temperature field, evaluated using a self-developed CFD module, is exchanged with a neutron code, DONJON-DRAGON, which updates the macroscopic cross sections and evaluates the new neutron flux. From the updated neutron flux the new peak factor is evaluated and the new temperature field is computed. The exchange of data between the two codes is achieved through their inclusion in the computational platform SALOME, an open-source tool developed by the collaborative project NURESAFE. The numerical libraries MEDmem, included in the SALOME platform, are used in this work for the projection of computational fields from one problem to the other. The two problems are driven by a common supervisor that can access the computational fields of both systems: at every time step, the temperature field is extracted from the CFD problem and set into the neutron problem. After this iteration the new power peak factor is projected back into the CFD problem and the new time step can be computed. Several computational examples, in which both neutron and thermal-hydraulic quantities are parametrized, are finally reported in this work.
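The coupling loop described above can be summarized schematically as follows; the cfd, neutronics, and projector objects and their methods are hypothetical placeholders, not the actual SALOME/MEDmem or DONJON-DRAGON interfaces.

```python
# Schematic supervisor-driven coupling loop (illustrative; interfaces are hypothetical).
def run_coupled(cfd, neutronics, projector, n_steps, dt):
    for step in range(n_steps):
        temperature = cfd.temperature_field()                    # from the CFD module
        neutronics.set_temperature(projector.to_neutron_mesh(temperature))
        neutronics.update_cross_sections()                       # macroscopic XS update
        flux = neutronics.solve_flux()
        peak_factor = neutronics.power_peak_factor(flux)
        cfd.set_power_distribution(projector.to_cfd_mesh(peak_factor))
        cfd.advance(dt)                                          # compute the next time step
```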
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
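The essence of the probabilistic enhancement, sampling uncertain inputs from assigned distributions and propagating them through the deterministic calculation, can be sketched as follows. The distributions and the dose_model function are placeholders chosen for illustration, not RESRAD or RESRAD-BUILD internals.

```python
# Generic Monte Carlo propagation of parameter uncertainty through a deterministic
# dose model (illustrative placeholders; not the RESRAD/RESRAD-BUILD implementation).
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Step 3 of the procedure: assign parameter distributions (illustrative choices only)
soil_density = rng.triangular(1.2, 1.5, 1.8, size=N)                 # g/cm^3
ingestion_rate = rng.lognormal(mean=np.log(100), sigma=0.4, size=N)  # g/yr

def dose_model(rho, ing):
    """Stand-in deterministic dose calculation (arbitrary units)."""
    return 0.01 * ing / rho

doses = dose_model(soil_density, ingestion_rate)
print("mean dose:", doses.mean(), "  95th percentile:", np.percentile(doses, 95))
```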
Statistical Analysis of CFD Solutions from the Third AIAA Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.; Hemsch, Michael J.
2007-01-01
The first AIAA Drag Prediction Workshop, held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third Drag Prediction Workshop focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This work evaluated the effect of grid refinement on the code-to-code scatter for the clean attached flow test cases and the separated flow test cases.
Development of an Aeroelastic Code Based on an Euler/Navier-Stokes Aerodynamic Solver
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.; Srivastava, Rakesh; Keith, Theo G., Jr.; Stefko, George L.; Janus, Mark J.
1996-01-01
This paper describes the development of an aeroelastic code (TURBO-AE) based on an Euler/Navier-Stokes unsteady aerodynamic analysis. A brief review of the relevant research in the area of propulsion aeroelasticity is presented. The paper briefly describes the original Euler/Navier-Stokes code (TURBO) and then details the development of the aeroelastic extensions. The aeroelastic formulation is described. The modeling of the dynamics of the blade using a modal approach is detailed, along with the grid deformation approach used to model the elastic deformation of the blade. The work-per-cycle approach used to evaluate aeroelastic stability is described. Representative results used to verify the code are presented. The paper concludes with an evaluation of the development thus far, and some plans for further development and validation of the TURBO-AE code.
Repair, Evaluation, Maintenance, and Rehabilitation Research Program. Lock Accident Study
1990-09-01
2006-04-01
Prepared by the Research and Animal Care Branch, Code 2351, of the Biosciences Division, Code 235, SSC San Diego. This is a work of the United… In this study, we have evaluated peer… sharks, skates, and rays) and teleost fishes (modern bony fishes) and provide recommendations for research to address remaining issues. Clear responses
Deployment of the OSIRIS EM-PIC code on the Intel Knights Landing architecture
NASA Astrophysics Data System (ADS)
Fonseca, Ricardo
2017-10-01
Electromagnetic particle-in-cell (EM-PIC) codes such as OSIRIS have found widespread use in modelling the highly nonlinear and kinetic processes that occur in several relevant plasma physics scenarios, ranging from astrophysical settings to high-intensity laser-plasma interaction. Being computationally intensive, these codes require large-scale HPC systems and a continuous effort in adapting the algorithm to new hardware and computing paradigms. In this work, we report on our efforts in deploying the OSIRIS code on the new Intel Knights Landing (KNL) architecture. Unlike the previous generation (Knights Corner), these boards are standalone systems and introduce several new features, including the new AVX-512 instructions and on-package MCDRAM. We will focus on the parallelization and vectorization strategies followed, as well as memory management, and present a detailed evaluation of code performance in comparison with the CPU code. This work was partially supported by Fundação para a Ciência e a Tecnologia (FCT), Portugal, through Grant No. PTDC/FIS-PLA/2940/2014.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.
2002-09-11
The calculations presented compare the performance of three Monte Carlo codes, PENELOPE-1999, MCNP-4C, and PITS, for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is an equivalent water cylinder for the three codes, but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes in the shape of hollow cylinders was initially selected for PENELOPE and MCNP because of its superior representation of the actual shape and dimensions of a cell and for its improved computer-time efficiency compared to spherical internal volumes. Some of the transfer points and energy transfers that constitute a radiation track may actually fall in the space between spheres, which would be outside the spherical scoring volume. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time when using this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important for addressing dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation, even with such small geometries and low energies involved, which are far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes, and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering workflows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
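As a toy illustration of the kind of property such an analysis synthesizes, consider a summation loop whose invariant relates the accumulator to the loop index; the example and its assertions are ours, not taken from the paper, and are written in Python rather than C for brevity.

```python
# Toy Floyd-Hoare example: the asserted loop invariant is what a deductive analysis
# would aim to synthesize automatically (our example, not from the paper).
def sum_first_n(n: int) -> int:
    assert n >= 0                                           # precondition
    total, i = 0, 0
    while i < n:
        assert total == i * (i - 1) // 2 and 0 <= i <= n    # loop invariant
        total += i
        i += 1
    assert total == n * (n - 1) // 2                        # postcondition (invariant with i == n)
    return total
```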
Subgroup A: nuclear model codes report to the Sixteenth Meeting of the WPEC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talou, P.; Chadwick, M. B.; Dietrich, F. S.
2004-01-01
The Subgroup A activities focus on the development of nuclear reaction models and codes used in evaluation work for nuclear reactions from the unresolved energy region up to the pion production threshold, and for target nuclides from the low teens and heavier. Much of the effort is devoted by each participant to the continuing development of their own institution's codes. Progress in this arena is reported in detail for each code in the present document. EMPIRE-II is publicly accessible. The release of the TALYS code has been announced for the ND2004 Conference in Santa Fe, NM, October 2004. McGNASH is still under development and is not expected to be released in the very near future. In addition, Subgroup A members have demonstrated a growing interest in working on common modeling and code capabilities, which would significantly reduce the amount of duplicate work, help manage efficiently the growing lines of existing codes, and render code inter-comparison much easier. A recent and important activity of Subgroup A has therefore been to develop the framework and the first bricks of the ModLib library, which consists of mostly independent pieces of code written in Fortran 90 (and above) to be used in existing and future nuclear reaction codes. Significant progress in the development of ModLib has been made during the past year. Several physics modules have been added to the library, and a few more have been planned in detail for the coming year.
Final report for the Tera Computer TTI CRADA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, G.S.; Pavlakos, C.; Silva, C.
1997-01-01
Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.
Amoroso, P J; Smith, G S; Bell, N S
2000-04-01
Accurate injury cause data are essential for injury prevention research. U.S. military hospitals, unlike civilian hospitals, use the NATO STANAG system for cause-of-injury coding. Reported deficiencies in civilian injury cause data suggested a need to specifically evaluate the STANAG. The Total Army Injury and Health Outcomes Database (TAIHOD) was used to evaluate worldwide Army injury hospitalizations, especially STANAG Trauma, Injury, and Place of Occurrence coding. We conducted a review of hospital procedures at Tripler Army Medical Center (TAMC) including injury cause and intent coding, potential crossover between acute injuries and musculoskeletal conditions, and data for certain hospital patients who are not true admissions. We also evaluated the use of free-text injury comment fields in three hospitals. Army-wide review of injury records coding revealed full compliance with cause coding, although nonspecific codes appeared to be overused. A small but intensive single hospital records review revealed relatively poor intent coding but good activity and cause coding. Data on specific injury history were present on most acute injury records and 75% of musculoskeletal conditions. Place of Occurrence coding, although inherently nonspecific, was over 80% accurate. Review of text fields produced additional details of the injuries in over 80% of cases. STANAG intent coding specificity was poor, while coding of cause of injury was at least comparable to civilian systems. The strengths of military hospital data systems are an exceptionally high compliance with injury cause coding, the availability of free text, and capture of all population hospital records without regard to work-relatedness. Simple changes in procedures could greatly improve data quality.
Large-area sheet task advanced dendritic web growth development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.
1982-01-01
The computer code for calculating web temperature distribution was expanded to provide a graphics output in addition to numerical and punch card output. The new code was used to examine various modifications of the J419 configuration and, on the basis of the results, a new growth geometry was designed. Additionally, several mathematically defined temperature profiles were evaluated for the effects of the free boundary (growth front) on the thermal stress generation. Experimental growth runs were made with modified J419 configurations to complement the modeling work. A modified J435 configuration was evaluated.
Comparison of the thermal neutron scattering treatment in MCNP6 and GEANT4 codes
NASA Astrophysics Data System (ADS)
Tran, H. N.; Marchix, A.; Letourneau, A.; Darpentigny, J.; Menelle, A.; Ott, F.; Schwindling, J.; Chauvin, N.
2018-06-01
To ensure the reliability of simulation tools, verification and comparison should be made regularly. This paper describes the work performed to compare the neutron transport treatment in MCNP6.1 and GEANT4-10.3 in the thermal energy range. This work focuses on the thermal neutron scattering processes for several potential materials that would be involved in the neutron source designs of Compact Accelerator-based Neutron Sources (CANS), such as beryllium metal, beryllium oxide, polyethylene, graphite, para-hydrogen, light water, heavy water, aluminium and iron. Both the thermal scattering law and the free gas model, coming from the evaluated data library ENDF/B-VII, were considered. It was observed that the GEANT4.10.03-patch2 version was not able to properly account for the coherent elastic process occurring in crystal lattices. This bug is treated in this work, and the fix should be included in the next release of the code. Cross section sampling and integral tests have been performed for both simulation codes, showing a fair agreement between the two codes for most of the materials except for iron and aluminium.
First benchmark of the Unstructured Grid Adaptation Working Group
NASA Technical Reports Server (NTRS)
Ibanez, Daniel; Barral, Nicolas; Krakos, Joshua; Loseille, Adrien; Michal, Todd; Park, Mike
2017-01-01
Unstructured grid adaptation is a technology that holds the potential to improve the automation and accuracy of computational fluid dynamics and other computational disciplines. Difficulty producing the highly anisotropic elements necessary for simulation on complex curved geometries that satisfies a resolution request has limited this technology's widespread adoption. The Unstructured Grid Adaptation Working Group is an open gathering of researchers working on adapting simplicial meshes to conform to a metric field. Current members span a wide range of institutions including academia, industry, and national laboratories. The purpose of this group is to create a common basis for understanding and improving mesh adaptation. We present our first major contribution: a common set of benchmark cases, including input meshes and analytic metric specifications, that are publicly available to be used for evaluating any mesh adaptation code. We also present the results of several existing codes on these benchmark cases, to illustrate their utility in identifying key challenges common to all codes and important differences between available codes. Future directions are defined to expand this benchmark to mature the technology necessary to impact practical simulation workflows.
Preserving privacy of online digital physiological signals using blind and reversible steganography.
Shiu, Hung-Jr; Lin, Bor-Sing; Huang, Chien-Hung; Chiang, Pei-Ying; Lei, Chin-Laung
2017-11-01
Physiological signals such as electrocardiograms (ECG) and electromyograms (EMG) are widely used to diagnose diseases. Presently, the Internet offers numerous cloud storage services which enable digital physiological signals to be uploaded for convenient access and use, and numerous online databases of medical signals have been built. The data in them must be processed in a manner that preserves patients' confidentiality. A reversible error-correcting-coding strategy is adopted to transform digital physiological signals into a new bit-stream, using a matrix in which the Hamming code is embedded to carry secret messages or private information. The shared keys are the matrix and the version of the Hamming code. An online open database, the MIT-BIH arrhythmia database, was used to test the proposed algorithms. The time complexity, capacity, and robustness are evaluated, and comparisons with related work are presented. This work proposes a reversible, low-payload steganographic scheme for preserving the privacy of physiological signals. An (n, m)-Hamming code is used to insert (n - m) secret bits into n bits of a cover signal. The number of embedded bits per modification is higher than in comparable methods, the computation is efficient, and the scheme is secure. Unlike other Hamming-code based schemes, the proposed scheme is both reversible and blind. Copyright © 2017 Elsevier B.V. All rights reserved.
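The (n, m)-Hamming embedding mentioned above can be illustrated with the (7,4) code: 3 secret bits are carried by 7 cover bits while flipping at most one of them, and the receiver recovers the secret blindly from the parity-check matrix alone. The sketch below shows only this matrix-embedding step under our own assumptions; it does not reproduce the paper's full reversible scheme for restoring the original signal.

```python
# Illustrative (7,4) Hamming matrix embedding: hide 3 secret bits in 7 cover bits
# by flipping at most one bit; extraction needs only the parity-check matrix H.
# Not the authors' implementation, and the exact-reversibility step is omitted.
import numpy as np

# Column j of H is the binary representation of j + 1, so a nonzero syndrome
# difference directly indexes the single bit that must be flipped.
H = np.array([[(j + 1) >> k & 1 for j in range(7)] for k in range(3)])

def embed(cover_bits: np.ndarray, secret_bits: np.ndarray) -> np.ndarray:
    """Embed 3 secret bits into 7 cover bits, changing at most one bit."""
    syndrome = H @ cover_bits % 2
    diff = syndrome ^ secret_bits
    stego = cover_bits.copy()
    if diff.any():
        position = int(diff[0] + 2 * diff[1] + 4 * diff[2]) - 1   # column index in H
        stego[position] ^= 1
    return stego

def extract(stego_bits: np.ndarray) -> np.ndarray:
    """Blindly recover the 3 secret bits from 7 stego bits."""
    return H @ stego_bits % 2

cover = np.array([1, 0, 1, 1, 0, 0, 1])
secret = np.array([1, 0, 1])
stego = embed(cover, secret)
assert np.array_equal(extract(stego), secret)
assert np.sum(stego != cover) <= 1
```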
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, R. E.
An accurate representation of the scattering of neutrons by the materials used to build cold sources at neutron scattering facilities is important for the initial design and optimization of a cold source, and for the analysis of experimental results obtained using the cold source. In practice, this requires a good representation of the physics of scattering from the material, a method to convert this into observable quantities (such as scattering cross sections), and a method to use the results in a neutron transport code (such as the MCNP Monte Carlo code). At Los Alamos, the authors have been developing these capabilities over the last ten years. The final set of cold-moderator evaluations, together with evaluations for conventional moderator materials, was released in 1994. These materials have been processed into MCNP data files using the NJOY Nuclear Data Processing System. Over the course of this work, they were able to develop a new module for NJOY called LEAPR, based on the LEAP + ADDELT code from the UK as modified by D.J. Picton for cold-moderator calculations. Much of the physics for methane came from Picton's work. The liquid hydrogen work was originally based on a code using the Young-Koppel approach that went through a number of hands in Europe (including Rolf Neef and Guy Robert). It was generalized and extended for LEAPR, and depends strongly on work by Keinert and Sax of the University of Stuttgart. Thus, their collection of cold-moderator scattering kernels is truly an international effort, and they are glad to be able to return the enhanced evaluations and processing techniques to the international community. In this paper, they give sections on the major cold moderator materials (namely, solid methane, liquid methane, and liquid hydrogen), using each section to introduce the relevant physics for that material and to show typical results.
NASA Rotor 37 CFD Code Validation: Glenn-HT Code
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2010-01-01
In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.
Heuristic rules embedded genetic algorithm for in-core fuel management optimization
NASA Astrophysics Data System (ADS)
Alim, Fatih
The objective of this study was to develop a unique methodology and a practical tool for designing the loading pattern (LP) and burnable poison (BP) pattern for a given Pressurized Water Reactor (PWR) core. Because of the large number of possible combinations for the fuel assembly (FA) loading in the core, the design of the core configuration is a complex optimization problem. It requires finding an optimal FA arrangement and BP placement in order to achieve maximum cycle length while satisfying the safety constraints. Genetic Algorithms (GA) have already been used to solve this problem for LP optimization for both the PWR and the Boiling Water Reactor (BWR). The GA, which is a stochastic method, works with a group of solutions and uses random variables to make decisions. Based on the theory of evolution, the GA involves natural selection and reproduction of the individuals in the population for the next generation. The GA works by creating an initial population, evaluating it, and then improving the population by using the evolutionary operators. To solve this optimization problem, an LP optimization package, the GARCO (Genetic Algorithm Reactor Code Optimization) code, was developed in the framework of this thesis. This code is applicable to all types of PWR cores having different geometries and structures, with an unlimited number of FA types in the inventory. To reach this goal, an innovative GA was developed by modifying the classical representation of the genotype. To obtain the best result in a shorter time, not only the representation but also the algorithm was changed, to make use of in-core fuel management heuristic rules. The improved GA code was tested to demonstrate and verify the advantages of the new enhancements. The developed methodology is explained in this thesis and preliminary results are shown for the VVER-1000 reactor hexagonal geometry core and the TMI-1 PWR. The core physics code used for the VVER in this research is Moby-Dick, which was developed to analyze the VVER by SKODA Inc. The SIMULATE-3 code, which is an advanced two-group nodal code, is used to analyze the TMI-1.
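A generic version of the loop just described (create an initial population, evaluate it, and improve it with selection, crossover, and mutation) is sketched below. This is a didactic skeleton with a toy bit-string fitness, not GARCO itself; in GARCO the genotype encodes a fuel-assembly loading pattern and the fitness comes from a core physics code such as Moby-Dick or SIMULATE-3.

```python
# Didactic genetic-algorithm skeleton (not GARCO): initialize, evaluate, then evolve
# the population with truncation selection, one-point crossover, and mutation.
import random

def genetic_algorithm(init_individual, fitness, crossover, mutate,
                      pop_size=50, generations=100, mutation_rate=0.1):
    population = [init_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]                 # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = crossover(a, b)
            if random.random() < mutation_rate:
                child = mutate(child)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

# Toy usage: maximize the number of ones in a 20-bit string.
best = genetic_algorithm(
    init_individual=lambda: [random.randint(0, 1) for _ in range(20)],
    fitness=sum,
    crossover=lambda a, b: a[:10] + b[10:],
    mutate=lambda x: [1 - b if random.random() < 0.05 else b for b in x],
)
```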
2013-01-01
Background: The active recruitment of health workers from developing countries to developed countries has become a major threat to global health. In an effort to manage this migration, the 63rd World Health Assembly adopted the World Health Organization (WHO) Global Code of Practice on the International Recruitment of Health Personnel in May 2010. While the Code has been lauded as the first globally-applicable regulatory framework for health worker recruitment, its impact has yet to be evaluated. We offer the first empirical evaluation of the Code’s impact on national and sub-national actors in Australia, Canada, United Kingdom and United States of America, which are the English-speaking developed countries with the greatest number of migrant health workers. Methods: 42 key informants from across government, civil society and private sectors were surveyed to measure their awareness of the Code, knowledge of specific changes resulting from it, overall opinion on the effectiveness of non-binding codes, and suggestions to improve this Code’s implementation. Results: 60% of respondents believed their colleagues were not aware of the Code, and 93% reported that no specific changes had been observed in their work as a result of the Code. 86% reported that the Code has not had any meaningful impact on policies, practices or regulations in their countries. Conclusions: This suggests a gap between awareness of the Code among stakeholders at global forums and the awareness and behaviour of national and sub-national actors. Advocacy and technical guidance for implementing the Code are needed to improve its impact on national decision-makers. PMID:24228827
Analysis of film cooling in rocket nozzles
NASA Technical Reports Server (NTRS)
Woodbury, Keith A.; Karr, Gerald R.
1992-01-01
Progress during the reporting period is summarized. Analysis of film cooling in rocket nozzles by computational fluid dynamics (CFD) computer codes is desirable for two reasons. First, it allows prediction of resulting flow fields within the rocket nozzle, in particular the interaction of the coolant boundary layer with the main flow. This facilitates evaluation of potential cooling configurations with regard to total thrust, etc., before construction and testing of any prototype. Secondly, CFD simulation of film cooling allows for assessment of the effectiveness of the proposed cooling in limiting nozzle wall temperature rises. This latter objective is the focus of the current work. The desired objective is to use the Finite Difference Navier Stokes (FDNS) code to predict wall heat fluxes or wall temperatures in rocket nozzles. As prior work has revealed that the FDNS code is deficient in the thermal modeling of boundary conditions, the first step is to correct these deficiencies in the FDNS code. Next, these changes must be tested against available data. Finally, the code will be used to model film cooling of a particular rocket nozzle. The third task of this research, using the modified code to compute the flow of hot gases through a nozzle, is described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.
The purpose of this work is the optical modeling and evaluation of the physical performance of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, which is capable of modeling propagation, absorption and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for the physical evaluations in the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.
Implementation of Energy Code Controls Requirements in New Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.; Hatten, Mike
Most state energy codes in the United States are based on one of two national model codes: ANSI/ASHRAE/IES 90.1 (Standard 90.1) or the International Code Council (ICC) International Energy Conservation Code (IECC). Since 2004, covering the last four cycles of Standard 90.1 updates, about 30% of all new requirements have been related to building controls. These requirements can be difficult to implement, and verification is beyond the expertise of most building code officials, yet the assumption in studies that measure the savings from energy codes is that they are implemented and working correctly. The objective of the current research is to evaluate the degree to which high-impact controls requirements included in commercial energy codes are properly designed, commissioned and implemented in new buildings. This study also evaluates the degree to which these control requirements are realizing their savings potential. This was done using a three-step process. The first step involved interviewing commissioning agents to get a better understanding of their activities as they relate to energy-code-required controls measures. The second involved field audits of a sample of commercial buildings to determine whether the code-required control measures are being designed, commissioned and correctly implemented and functioning in new buildings. The third step includes compilation and analysis of the information gathered during the first two steps. Information gathered during these activities could be valuable to code developers, energy planners, designers, building owners, and building officials.
Large Eddy Simulations and Turbulence Modeling for Film Cooling
NASA Technical Reports Server (NTRS)
Acharya, Sumanta
1999-01-01
The objective of the research is to perform Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) for film cooling process, and to evaluate and improve advanced forms of the two equation turbulence models for turbine blade surface flow analysis. The DNS/LES were used to resolve the large eddies within the flow field near the coolant jet location. The work involved code development and applications of the codes developed to the film cooling problems. Five different codes were developed and utilized to perform this research. This report presented a summary of the development of the codes and their applications to analyze the turbulence properties at locations near coolant injection holes.
Nakamura, Brad J; Selbo-Bruns, Alexandra; Okamura, Kelsie; Chang, Jaime; Slavin, Lesley; Shimabukuro, Scott
2014-02-01
The purpose of this small pilot study was three-fold: (a) to begin development of a coding scheme for supervisor and therapist skill acquisition, (b) to preliminarily investigate a pilot train-the-trainer paradigm for skill development, and (c) to evaluate self-reported versus observed indicators of skill mastery in that pilot program. Participants included four supervisor-therapist dyads (N = 8) working with public mental health sector youth. Master trainers taught cognitive-behavioral therapy techniques to supervisors, who in turn trained therapists on these techniques. Supervisor and therapist skill acquisition and supervisor use of teaching strategies were repeatedly assessed through coding of scripted role-plays with a multiple-baseline across participants and behaviors design. The coding system, the Practice Element Train the Trainer - Supervisor/Therapist Versions of the Therapy Process Observational Coding System for Child Psychotherapy, was developed and evaluated through the course of the investigation. The coding scheme demonstrated excellent reliability (ICCs [1,2] = 0.81-0.91) across 168 video recordings. As calculated through within-subject effect sizes, supervisor and therapist participants, respectively, evidenced skill improvements related to teaching and performing therapy techniques. Self-reported indicators of skill mastery were inflated in comparison to observed skill mastery. Findings lend initial support for further developing an evaluative approach for a train-the-trainer effort focused on disseminating evidence-based practices. Published by Elsevier Ltd.
New procedures to evaluate visually lossless compression for display systems
NASA Astrophysics Data System (ADS)
Stolitzka, Dale F.; Schelkens, Peter; Bruylants, Tim
2017-09-01
Visually lossless image coding in isochronous display streaming or plesiochronous networks reduces link complexity and power consumption and increases available link bandwidth. A new set of codecs developed within the last four years promises a new level of coding quality, but requires new evaluation techniques that are sufficiently sensitive to the small artifacts or color variations induced by this new breed of codecs. This paper begins with a summary of the new ISO/IEC 29170-2, a procedure for evaluation of lossless coding, and reports the new work by JPEG to extend the procedure in two important ways: for HDR content and for evaluating the differences between still images, panning images and image sequences. ISO/IEC 29170-2 relies on processing test images through a well-defined process chain for subjective, forced-choice psychophysical experiments. The procedure sets an acceptable quality level equal to one just noticeable difference. Traditional image and video coding evaluation techniques, such as those used for television evaluation, have not proven sufficiently sensitive to the small artifacts that may be induced by this breed of codecs. In 2015, JPEG received new requirements to expand evaluation of visually lossless coding for high dynamic range images, slowly moving images (i.e., panning), and image sequences. These requirements are the basis for the new amendments of the ISO/IEC 29170-2 procedures described in this paper. These amendments promise to be highly useful for the new content in television and cinema mezzanine networks. The amendments passed the final ballot in April 2017 and are on track to be published in 2018.
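As an illustration of the forced-choice logic underlying such evaluations, the sketch below analyses hypothetical 2AFC trials: an image is treated as visually lossless when observers cannot pick out the coded version significantly better than chance. The trial counts, chance level, and significance threshold are assumptions for illustration; they are not the normative ISO/IEC 29170-2 criteria.

```python
# Illustrative sketch of a two-alternative forced-choice (2AFC) analysis for a
# "visually lossless" decision. Trial counts and the alpha level are assumptions,
# not the normative ISO/IEC 29170-2 procedure.
from math import comb

def binomial_p_value(correct, trials, chance=0.5):
    """One-sided exact binomial p-value: probability of >= `correct` successes
    under pure guessing."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

def visually_lossless(correct, trials, alpha=0.05):
    """Flag an image as visually lossless if observers cannot pick out the
    coded image significantly better than chance."""
    return binomial_p_value(correct, trials) > alpha

# Example: 18 correct identifications out of 30 forced-choice trials.
print(round(binomial_p_value(18, 30), 3), visually_lossless(18, 30))
```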
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onishi, Yasuo
Four Japan Atomic Energy Agency (JAEA) researchers visited Pacific Northwest National Laboratory (PNNL) for seven working days and evaluated the suitability and adaptability of FLESCOT to a JAEA supercomputer system to effectively simulate cesium behavior in dam reservoirs, river mouths, and coastal areas in Fukushima contaminated by the Fukushima Daiichi nuclear accident. PNNL showed the following to the JAEA visitors during the seven-working-day period: the FLESCOT source code; the user's manual; and a FLESCOT description covering program structure, algorithm, solver, boundary condition handling, data definition, input and output methods, and how to run the code. During the visit, JAEA had access to FLESCOT to run with an input data set to evaluate the capacity and feasibility of adapting it to a JAEA supercomputer with massive parallel processors. As a part of this evaluation, PNNL ran FLESCOT for sample cases of the contaminant migration simulation to further describe FLESCOT in action. JAEA and PNNL researchers also evaluated the time spent in each subroutine of FLESCOT, and the JAEA researcher implemented some initial parallelization schemes in FLESCOT. Based on this code evaluation, JAEA and PNNL determined that FLESCOT is applicable to Fukushima lakes/dam reservoirs, river mouth areas, and coastal water, and that it is feasible to implement parallelization for the JAEA supercomputer. In addition, PNNL and JAEA researchers discussed molecular modeling approaches to cesium adsorption mechanisms to enhance the JAEA molecular modeling activities. PNNL and JAEA also discussed specific collaboration on molecular and computational modeling activities.
Cognitive Evaluation Theory: An Experimental Test of Processes and Outcomes.
1981-06-26
Report documentation fragment: Organizational Effectiveness Research Program, Office of Naval Research (Code 452); report date June 26, 1981. Cognitive evaluation theory is presented as a framework for explaining the detrimental effects of performance-contingent rewards on intrinsically motivated behaviors, together with a review of the literature. The work was supported in part by the Organizational Effectiveness Research Program, Office of Naval Research (Code 452), under Contract No. N0014-79-C-0750, NR 170-892.
Benchmarking NNWSI flow and transport codes: COVE 1 results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayden, N.K.
1985-06-01
The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of the codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs.
Practical moral codes in the transgenic organism debate.
Cooley, D R; Goreham, Gary; Youngs, George A
2004-01-01
In one study funded by the United States Department of Agriculture, people from North Dakota were interviewed to discover which moral principles they use in evaluating the morality of transgenic organisms and their introduction into markets. It was found that although the moral codes the human subjects employed were very similar, their views on transgenics were vastly different. In this paper, the codes that were used by the respondents are developed, compared to that of the academically composed Belmont Report, and then modified to create the more practical Common Moral Code. At the end, it is shown that the Common Moral Code has inherent inconsistency flaws that might be resolvable, but would require extensive work on the definition of terms and principles. However, the effort is worthwhile, especially if it results in a common moral code that all those involved in the debate are willing to use in negotiating a resolution to their differences.
Effective Identification of Similar Patients Through Sequential Matching over ICD Code Embedding.
Nguyen, Dang; Luo, Wei; Venkatesh, Svetha; Phung, Dinh
2018-04-11
Evidence-based medicine often involves the identification of patients with similar conditions, which are often captured in ICD (International Classification of Diseases (World Health Organization 2013)) code sequences. With no satisfying prior solutions for matching ICD-10 code sequences, this paper presents a method which effectively captures the clinical similarity among routine patients who have multiple comorbidities and complex care needs. Our method leverages the recent progress in representation learning of individual ICD-10 codes, and it explicitly uses the sequential order of codes for matching. Empirical evaluation on a state-wide cancer data collection shows that our proposed method achieves significantly higher matching performance compared with state-of-the-art methods ignoring the sequential order. Our method better identifies similar patients in a number of clinical outcomes including readmission and mortality outlook. Although this paper focuses on ICD-10 diagnosis code sequences, our method can be adapted to work with other codified sequence data.
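A minimal sketch of the sequence-aware matching idea is given below: ICD-10 codes are mapped to embedding vectors and two patient sequences are compared with dynamic time warping over cosine distances, so code order influences the similarity. The embeddings here are random placeholders standing in for learned representations, and DTW is used as a generic sequential matcher rather than the paper's specific method.

```python
# Sketch of sequence-aware patient matching over ICD-10 code embeddings:
# cosine distances between code vectors combined by dynamic time warping (DTW).
# The embeddings are random placeholders standing in for learned representations.
import numpy as np

rng = np.random.default_rng(0)
EMBED = {c: rng.normal(size=16) for c in ["I21", "I50", "E11", "N18", "J18", "C34"]}

def cosine_dist(a, b):
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def dtw_distance(seq_a, seq_b):
    """Classic DTW over pairwise embedding distances of two ICD code sequences."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = cosine_dist(EMBED[seq_a[i - 1]], EMBED[seq_b[j - 1]])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two hypothetical patients: similar comorbidities, different temporal order.
patient_a = ["E11", "N18", "I50", "I21"]
patient_b = ["E11", "I21", "I50", "N18"]
print(round(dtw_distance(patient_a, patient_b), 3))
```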
NASA Technical Reports Server (NTRS)
Shankar, V.; Rowell, C.; Hall, W. F.; Mohammadian, A. H.; Schuh, M.; Taylor, K.
1992-01-01
Accurate and rapid evaluation of radar signature for alternative aircraft/store configurations would be of substantial benefit in the evolution of integrated designs that meet radar cross-section (RCS) requirements across the threat spectrum. Finite-volume time domain methods offer the possibility of modeling the whole aircraft, including penetrable regions and stores, at longer wavelengths on today's gigaflop supercomputers and at typical airborne radar wavelengths on the teraflop computers of tomorrow. A structured-grid finite-volume time domain computational fluid dynamics (CFD)-based RCS code has been developed at the Rockwell Science Center, and this code incorporates modeling techniques for general radar absorbing materials and structures. Using this work as a base, the goal of the CFD-based CEM effort is to define, implement and evaluate various code development issues suitable for rapid prototype signature prediction.
NASA Astrophysics Data System (ADS)
Cassan, Arnaud
2017-07-01
The exoplanet detection rate from gravitational microlensing has grown significantly in recent years thanks to a great enhancement of resources and improved observational strategy. Current observatories include ground-based wide-field and/or robotic world-wide networks of telescopes, as well as space-based observatories such as the Spitzer and Kepler/K2 satellites. This results in a large quantity of data to be processed and analysed, which is a challenge for modelling codes because of the complexity of the parameter space to be explored and the intensive computations required to evaluate the models. In this work, I present a method that computes the quadrupole and hexadecapole approximations of the finite-source magnification more efficiently than previously available codes, with routines about six times and four times faster, respectively. The quadrupole takes just about twice the time of a point-source evaluation, which argues for generalizing its use to large portions of the light curves. The corresponding routines are available as open-source Python codes.
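For orientation, the sketch below shows the point-source point-lens magnification A(u) = (u^2 + 2) / (u sqrt(u^2 + 4)) and a brute-force finite-source value obtained by averaging A over a uniform source disk of radius rho. This numerical average is the kind of expensive evaluation that the quadrupole and hexadecapole approximations are designed to replace; the sketch does not reproduce the optimized routines described above, and the parameter values are arbitrary.

```python
# Simplified illustration (not the paper's optimized multipole routines): the
# point-source point-lens magnification and a brute-force finite-source
# magnification obtained by averaging over a uniform source disk of radius rho.
import numpy as np

def point_source_magnification(u):
    return (u**2 + 2.0) / (u * np.sqrt(u**2 + 4.0))

def finite_source_magnification(u0, rho, n_r=60, n_phi=120):
    """Uniform-brightness disk average of the point-source magnification."""
    r = np.linspace(0.0, rho, n_r + 1)[1:]             # radial samples on the disk
    phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    R, PHI = np.meshgrid(r, phi)
    u = np.sqrt(u0**2 + R**2 + 2.0 * u0 * R * np.cos(PHI))   # lens-point distance
    weights = R                                        # area element r dr dphi
    return np.sum(point_source_magnification(u) * weights) / np.sum(weights)

u0, rho = 0.05, 0.02                                   # arbitrary example values
print(point_source_magnification(u0), finite_source_magnification(u0, rho))
```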
Evaluation of Subgrid-Scale Models for Large Eddy Simulation of Compressible Flows
NASA Technical Reports Server (NTRS)
Blaisdell, Gregory A.
1996-01-01
The objective of this project was to evaluate and develop subgrid-scale (SGS) turbulence models for large eddy simulations (LES) of compressible flows. During the first phase of the project results from LES using the dynamic SGS model were compared to those of direct numerical simulations (DNS) of compressible homogeneous turbulence. The second phase of the project involved implementing the dynamic SGS model in a NASA code for simulating supersonic flow over a flat-plate. The model has been successfully coded and a series of simulations has been completed. One of the major findings of the work is that numerical errors associated with the finite differencing scheme used in the code can overwhelm the SGS model and adversely affect the LES results. Attached to this overview are three submitted papers: 'Evaluation of the Dynamic Model for Simulations of Compressible Decaying Isotropic Turbulence'; 'The effect of the formulation of nonlinear terms on aliasing errors in spectral methods'; and 'Large-Eddy Simulation of a Spatially Evolving Compressible Boundary Layer Flow'.
Employment and residential characteristics in relation to automated external defibrillator locations
Griffis, Heather M.; Band, Roger A; Ruther, Matthew; Harhay, Michael; Asch, David A.; Hershey, John C.; Hill, Shawndra; Nadkarni, Lindsay; Kilaru, Austin; Branas, Charles C.; Shofer, Frances; Nichol, Graham; Becker, Lance B.; Merchant, Raina M.
2015-01-01
Background Survival from out-of-hospital cardiac arrest (OHCA) is generally poor and varies by geography. Variability in automated external defibrillator (AED) locations may be a contributing factor. To inform optimal placement of AEDs, we investigated AED access in a major US city relative to demographic and employment characteristics. Methods and Results This was a retrospective analysis of a Philadelphia AED registry (2,559 total AEDs). Data from the 2010 US Census and the Local Employment Dynamics (LED) database were used by ZIP code. AED access was calculated as the weighted areal percentage of each ZIP code covered by a 400 meter radius around each AED. Of 47 ZIP codes, only 9% (4) were high AED service areas. In 26% (12) of ZIP codes, less than 35% of the area was covered by AED service areas. Higher AED access ZIP codes were more likely to have a moderately populated residential area (p=0.032), higher median household income (p=0.006), and higher paying jobs (p=0.008). Conclusions The locations of AEDs vary across specific ZIP codes; select residential and employment characteristics explain some of the variation. Further work on evaluating OHCA locations, AED use and availability, and OHCA outcomes could inform AED placement policies. Optimizing the placement of AEDs through this work may help to increase survival. PMID:26856232
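A sketch of the areal-coverage calculation is shown below: the fraction of a ZIP code polygon lying within 400 m of any AED, computed with shapely. The geometry is a toy square and three invented AED locations; real data would use projected, meter-based coordinates for the registry points and ZIP code boundaries.

```python
# Sketch of the areal-coverage metric: fraction of a ZIP code polygon that falls
# within 400 m of any AED. Toy geometry with invented AED locations (meters).
from shapely.geometry import Point, Polygon
from shapely.ops import unary_union

# Hypothetical 1 km x 1 km ZIP code area and three AED locations.
zip_polygon = Polygon([(0, 0), (1000, 0), (1000, 1000), (0, 1000)])
aed_points = [Point(200, 200), Point(500, 800), Point(900, 400)]

service_area = unary_union([p.buffer(400) for p in aed_points])   # 400 m radius buffers
covered = zip_polygon.intersection(service_area).area
coverage_fraction = covered / zip_polygon.area

print(f"AED coverage of ZIP code area: {coverage_fraction:.1%}")
```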
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.
In this work the performance of two neutron spectrum unfolding codes, based on iterative procedures and on artificial neural networks respectively, is evaluated. The first code, based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron spectrometry and dosimetry with artificial neural networks (NSDann), is a code designed using neural network technology. The artificial intelligence approach of the neural network does not solve mathematical equations. By using the knowledge stored in the synaptic weights of a properly trained neural network, the code is able to unfold the neutron spectrum and to simultaneously calculate 15 dosimetric quantities, needing as input only the count rates measured with a Bonner sphere system. Similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. The differences between these codes are as follows: the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure. In NSDUAZ, a programming routine was designed to calculate 7 IAEA instrument survey meters using the fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in the neural network approach it is possible to reduce the number of count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called Neutron Spectrometry and dosimetry computer tool was designed. The results obtained with this package are shown. The codes mentioned here are freely available upon request to the authors.
NASA Astrophysics Data System (ADS)
Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.
2013-07-01
In this work the performance of two neutron spectrum unfolding codes, based on iterative procedures and on artificial neural networks respectively, is evaluated. The first code, based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron spectrometry and dosimetry with artificial neural networks (NSDann), is a code designed using neural network technology. The artificial intelligence approach of the neural network does not solve mathematical equations. By using the knowledge stored in the synaptic weights of a properly trained neural network, the code is able to unfold the neutron spectrum and to simultaneously calculate 15 dosimetric quantities, needing as input only the count rates measured with a Bonner sphere system. Similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. The differences between these codes are as follows: the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure. In NSDUAZ, a programming routine was designed to calculate 7 IAEA instrument survey meters using the fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in the neural network approach it is possible to reduce the number of count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called Neutron Spectrometry and dosimetry computer tool was designed. The results obtained with this package are shown. The codes mentioned here are freely available upon request to the authors.
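The sketch below illustrates the neural-network unfolding idea in its simplest form: a small multilayer perceptron maps the count rates of a 7-sphere Bonner spectrometer to a 60-bin spectrum. The weights are random placeholders standing in for a properly trained network, and the normalization and activations are assumptions; this is not the NSDann code.

```python
# Minimal numpy sketch of neural-network spectrum unfolding: an MLP maps 7 Bonner
# sphere count rates to a 60-bin spectrum. Weights are random placeholders
# standing in for a properly trained network (this is not NSDann).
import numpy as np

rng = np.random.default_rng(1)
N_SPHERES, N_HIDDEN, N_BINS = 7, 20, 60

W1, b1 = rng.normal(scale=0.3, size=(N_HIDDEN, N_SPHERES)), np.zeros(N_HIDDEN)
W2, b2 = rng.normal(scale=0.3, size=(N_BINS, N_HIDDEN)), np.zeros(N_BINS)

def unfold(count_rates):
    """Forward pass: sigmoid hidden layer, softplus output to keep fluence >= 0."""
    x = np.asarray(count_rates, dtype=float)
    x = x / x.sum()                                  # simple normalization of the input
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))         # sigmoid activations
    return np.log1p(np.exp(W2 @ h + b2))             # softplus keeps bins non-negative

spectrum = unfold([120.0, 310.0, 450.0, 390.0, 260.0, 150.0, 80.0])
print(spectrum.shape, round(float(spectrum.sum()), 3))
```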
Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller
NASA Astrophysics Data System (ADS)
Perdikis, S.; Leeb, R.; Williamson, J.; Ramsay, A.; Tavella, M.; Desideri, L.; Hoogerwerf, E.-J.; Al-Khodairy, A.; Murray-Smith, R.; Millán, J. d. R.
2014-06-01
Objective. While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. Approach. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. Main results. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. Significance. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.
Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller.
Perdikis, S; Leeb, R; Williamson, J; Ramsay, A; Tavella, M; Desideri, L; Hoogerwerf, E-J; Al-Khodairy, A; Murray-Smith, R; Millán, J D R
2014-06-01
While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.
Statistical evaluation of PACSTAT random number generation capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piepel, G.F.; Toland, M.R.; Harty, H.
1988-05-01
This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
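As an illustration of the kind of statistical check involved in verifying a random number generator, the sketch below applies a chi-square uniformity test to binned U(0,1) samples. The bin count and critical value are illustrative choices and do not reproduce PACSTAT's actual verification suite.

```python
# Illustration of one basic RNG check: a chi-square uniformity test on binned
# U(0,1) samples. Bin count and critical value are illustrative only.
import random

def chi_square_uniformity(samples, n_bins=20):
    counts = [0] * n_bins
    for u in samples:
        counts[min(int(u * n_bins), n_bins - 1)] += 1
    expected = len(samples) / n_bins
    return sum((c - expected) ** 2 / expected for c in counts)

samples = [random.random() for _ in range(10000)]
stat = chi_square_uniformity(samples)
# For 19 degrees of freedom, the 5% critical value is about 30.1.
print(f"chi-square = {stat:.1f}; {'pass' if stat < 30.1 else 'fail'} at the 5% level")
```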
NASA Astrophysics Data System (ADS)
Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret
2003-12-01
A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
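The sketch below illustrates the kind of global optimality a dynamic-programming strategy provides for event localization: a sequence of spectral vectors is segmented into K events so that the total within-segment distortion is minimized. The cost used here (squared deviation from the segment mean) is a stand-in for the temporal-decomposition model error, not the paper's formulation.

```python
# Generic dynamic-programming segmentation of a sequence of spectral vectors
# into K segments ("events") minimizing total within-segment distortion. The
# cost is a simple stand-in for a temporal-decomposition model error.
import numpy as np

def segment_cost(X, i, j):
    """Distortion of frames i..j-1 approximated by their mean vector."""
    seg = X[i:j]
    return float(((seg - seg.mean(axis=0)) ** 2).sum())

def optimal_segmentation(X, K):
    n = len(X)
    cost = np.full((K + 1, n + 1), np.inf)
    back = np.zeros((K + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for k in range(1, K + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = cost[k - 1, i] + segment_cost(X, i, j)
                if c < cost[k, j]:
                    cost[k, j], back[k, j] = c, i
    # Recover the optimal event boundaries by backtracking.
    bounds, j = [], n
    for k in range(K, 0, -1):
        bounds.append(back[k, j])
        j = back[k, j]
    return sorted(bounds)[1:], cost[K, n]

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(m, 0.1, size=(30, 10)) for m in (0.0, 1.0, 2.0)])
print(optimal_segmentation(X, K=3))   # expected internal boundaries near [30, 60]
```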
Progress towards a world-wide code of conduct
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, J.A.N.; Berleur, J.
1994-12-31
In this paper the work of the International Federation for Information Processing (IFIP) Task Group on Ethics is described and the recommendations presented to the General Assembly are reviewed. While a common code of ethics or conduct has not been recommended for consideration by the member societies of IFIP, a set of guidelines for the establishment and evaluation of codes has been produced and procedures for the assistance of code development have been established within IFIP. This paper proposes that the data collected by the Task Group and the proposed guidelines can be used as a tool for the study of codes of practice, providing a teachable, learnable educational module in courses related to the ethics of computing and computation, and looks at the next steps in bringing ethical awareness to the IT community.
[Representation of knowledge in respiratory medicine: ontology should help the coding process].
Blanc, F-X; Baneyx, A; Charlet, J; Housset, B
2010-09-01
Access to medical knowledge is a major issue for health professionals and requires the development of terminologies. The objective of the reported work was to construct an ontology of respiratory medicine, i.e. an organized and formalized terminology composed of specific knowledge. The purpose is to help the medico-economic coding process and to represent the relevant knowledge about the patient. Our research covers the whole life cycle of an ontology, from the development of a methodology, to building it from texts, to its use in an operational system. A computerized tool, based on the ontology, allows both medico-economic coding and graphical medical coding; the latter will be used to index hospital reports. Our ontology counts 1913 concepts and contains all the knowledge included in the PMSI part of the SPLF thesaurus. Our tool has been evaluated and showed a recall of 80% and an accuracy of 85% regarding the medico-economic coding. The work presented in this paper justifies the approach that has been used. It must be continued on a large scale to validate our coding principles and the possibility of making enquiries on patient reports for clinical research. Copyright © 2010. Published by Elsevier Masson SAS.
PAREMD: A parallel program for the evaluation of momentum space properties of atoms and molecules
NASA Astrophysics Data System (ADS)
Meena, Deep Raj; Gadre, Shridhar R.; Balanarayan, P.
2018-03-01
The present work describes a code for evaluating the electron momentum density (EMD), its moments and the associated Shannon information entropy for a multi-electron molecular system. The code works specifically for electronic wave functions obtained from traditional electronic structure packages such as GAMESS and GAUSSIAN. For the momentum space orbitals, the general expression for Gaussian basis sets in position space is analytically Fourier transformed to momentum space Gaussian basis functions. The molecular orbital coefficients of the wave function are taken as an input from the output file of the electronic structure calculation. The analytic expressions of EMD are evaluated over a fine grid and the accuracy of the code is verified by a normalization check and a numerical kinetic energy evaluation which is compared with the analytic kinetic energy given by the electronic structure package. Apart from electron momentum density, electron density in position space has also been integrated into this package. The program is written in C++ and is executed through a Shell script. It is also tuned for multicore machines with shared memory through OpenMP. The program has been tested for a variety of molecules and correlated methods such as CISD, Møller-Plesset second order (MP2) theory and density functional methods. For correlated methods, the PAREMD program uses natural spin orbitals as an input. The program has been benchmarked for a variety of Gaussian basis sets for different molecules showing a linear speedup on a parallel architecture.
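A minimal sketch of the underlying transform is given below for the simplest case: the analytic momentum-space form of a normalized s-type Gaussian primitive, and an EMD assembled from hypothetical MO coefficients, followed by the normalization check mentioned above. Contracted functions, higher angular momentum, and multiple centers handled by the actual program are omitted.

```python
# Sketch of the momentum-space transform for a single normalized s-type Gaussian
# primitive and an EMD built from hypothetical MO coefficients. Real codes handle
# contracted basis sets and higher angular momentum; only the s-type case is shown.
import numpy as np

def s_primitive_momentum(p, alpha):
    """Momentum-space form of a normalized s-type Gaussian exp(-alpha r^2):
    phi(p) = (1/(2*pi*alpha))**(3/4) * exp(-p**2 / (4*alpha))."""
    return (1.0 / (2.0 * np.pi * alpha)) ** 0.75 * np.exp(-p**2 / (4.0 * alpha))

def emd(p, exponents, mo_coeffs, occupations):
    """EMD as sum over occupied MOs of occ * |sum_mu c_mu * phi_mu(p)|^2
    (s functions at a single center, so phase factors drop out)."""
    basis = np.array([s_primitive_momentum(p, a) for a in exponents])   # (nbas, npts)
    mos = mo_coeffs @ basis                                             # (nmo, npts)
    return np.einsum("i,ij->j", occupations, mos**2)

p_grid = np.linspace(0.0, 6.0, 200)
exponents = np.array([13.0, 1.96, 0.44])        # hypothetical s exponents (a.u.)
mo_coeffs = np.array([[0.0, 0.0, 1.0]])         # one hypothetical MO = single normalized primitive
rho_p = emd(p_grid, exponents, mo_coeffs, occupations=np.array([2.0]))

# Normalization check: integral of 4*pi*p^2*rho(p) dp should be close to the
# electron count (2 here) since the chosen MO is a single normalized primitive.
dp = p_grid[1] - p_grid[0]
print(round(float(np.sum(4.0 * np.pi * p_grid**2 * rho_p) * dp), 3))
```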
NASA Technical Reports Server (NTRS)
Lawton, Pat
2004-01-01
The objective of this work was to support the design of improved IUE NEWSIPS high dispersion extraction algorithms. The purpose was to evaluate the use of the Linearized Image (LIHI) file versus the Re-Sampled Image (SIHI) file, to evaluate various extraction approaches, and to design algorithms for the evaluation of IUE high dispersion spectra. It was concluded that the use of the Re-Sampled Image (SIHI) file was acceptable. Since the Gaussian profile worked well for the core and the Lorentzian profile worked well for the wings, the Voigt profile was chosen for use in the extraction algorithm. It was found that the gamma and sigma parameters varied significantly across the detector, so gamma and sigma masks for the SWP detector were developed. Extraction code was written.
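Since the Voigt profile (Gaussian core, Lorentzian wings) was the line shape chosen, the sketch below evaluates it via the Faddeeva function w(z) as implemented in scipy. The sigma and gamma values are placeholders; in the work described they vary across the detector and would be taken from the derived masks.

```python
# Sketch of evaluating a Voigt profile (Gaussian core, Lorentzian wings) via the
# Faddeeva function w(z). The sigma/gamma values are placeholders; in practice
# they would be taken from the per-detector masks described above.
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma, center=0.0, amplitude=1.0):
    """Voigt profile: convolution of a Gaussian (std sigma) and a Lorentzian
    (half-width gamma), computed as Re[w(z)] / (sigma * sqrt(2*pi))."""
    z = ((x - center) + 1j * gamma) / (sigma * np.sqrt(2.0))
    return amplitude * np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-100.0, 100.0, 20001)
profile = voigt(x, sigma=1.2, gamma=0.8)
# Area is close to 1 (normalized profile); the small deficit comes from the
# truncated Lorentzian wings outside the sampled window.
print(round(float(np.sum(profile) * (x[1] - x[0])), 4))
```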
Ruffing, T; Huchzermeier, P; Muhm, M; Winkler, H
2014-05-01
Precise coding is an essential requirement in order to generate a valid DRG. The aim of our study was to evaluate the quality of the initial coding of surgical procedures, and to introduce our "hybrid model" of a surgical specialist supervising medical coding together with a non-physician for case auditing. The department's DRG-responsible physician, as a surgical specialist, has profound knowledge both in surgery and in DRG coding. At a Level 1 hospital, 1000 coded cases of surgical procedures were checked. In our department, this model of a DRG-responsible physician who is both surgeon and coder has proven itself for many years. The initial surgical DRG coding had to be corrected by the DRG-responsible physician in 42.2% of cases. On average, one hour per working day was necessary. The implementation of a DRG-responsible physician is a simple, effective way to connect medical and business expertise without interface problems. Permanent feedback promotes both medical and economic sensitivity, improving coding quality.
Report on Automated Semantic Analysis of Scientific and Engineering Codes
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Follen, Greg (Technical Monitor)
2001-01-01
The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors-like the MCO error-are to be quickly and inexpensively avoided in the future. This report evaluates the progress made with this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.
MO-F-CAMPUS-T-05: SQL Database Queries to Determine Treatment Planning Resource Usage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, C; Gladstone, D
2015-06-15
Purpose: A radiation oncology clinic's treatment capacity is traditionally thought to be limited by the number of machines in the clinic. As the number of fractions per course decreases and the number of adaptive plans increases, the question of how many treatment plans a clinic can produce becomes increasingly important. This work seeks to lay the groundwork for assessing treatment planning resource usage. Methods: Care path templates were created using the Aria 11 care path interface. Care path tasks included key steps in the treatment planning process from the completion of CT simulation through the first radiation treatment. SQL Server Management Studio was used to run SQL queries to extract task completion time stamps along with care path template information and diagnosis codes from the Aria database. Six months of planning cycles were evaluated. Elapsed time was evaluated in terms of work hours within Monday through Friday, 7 am to 5 pm. Results: For the 195 validated treatment planning cycles, the average time for planning and MD review was 22.8 hours. Of those cases, 33 were categorized as urgent. The average planning time for urgent plans was 5 hours. A strong correlation was observed between diagnosis code and the range of elapsed planning time, as well as between elapsed time and select diagnosis codes. It was also observed that tasks were more likely to be completed on the date due than by the time that they were due. Follow-up confirmed that most users did not look at the due time. Conclusion: Evaluation of elapsed planning time and other tasks suggests that care paths should be adjusted to allow for different contouring and planning times for certain diagnosis codes and for urgent cases. Additional clinic training around task due times versus dates, or a structuring of care paths around due dates, is also needed.
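A sketch of the elapsed-time metric is given below: working hours between two task time stamps, counting only Monday through Friday, 7 am to 5 pm. It is a stand-alone illustration in Python, not the actual SQL run against the Aria database, and the example time stamps are invented.

```python
# Sketch of the elapsed-time metric used above: working hours between two task
# timestamps, counting only Monday-Friday, 7 am to 5 pm. Stand-alone illustration;
# not the actual SQL against the Aria database.
from datetime import datetime, timedelta

WORK_START, WORK_END = 7, 17   # 7 am to 5 pm

def working_hours_between(start, end):
    """Sum the overlap of [start, end] with each weekday's 7 am - 5 pm window."""
    total = 0.0
    day = start.replace(hour=0, minute=0, second=0, microsecond=0)
    while day <= end:
        if day.weekday() < 5:                              # Monday=0 ... Friday=4
            window_start = day.replace(hour=WORK_START)
            window_end = day.replace(hour=WORK_END)
            overlap = min(end, window_end) - max(start, window_start)
            total += max(overlap.total_seconds(), 0.0) / 3600.0
        day += timedelta(days=1)
    return total

# Example: CT simulation completed Friday 3 pm, plan approved Monday 10 am.
sim_done = datetime(2015, 3, 6, 15, 0)
plan_approved = datetime(2015, 3, 9, 10, 0)
print(working_hours_between(sim_done, plan_approved), "working hours")
```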
Applying a rateless code in content delivery networks
NASA Astrophysics Data System (ADS)
Suherman; Zarlis, Muhammad; Parulian Sitorus, Sahat; Al-Akaidi, Marwan
2017-09-01
A content delivery network (CDN) allows internet providers to locate their services and to map their coverage onto networks without necessarily owning them. CDNs are part of the current internet infrastructure, supporting multi-server applications, especially social media. Various works have been proposed to improve CDN performance. Since accesses to social media servers tend to be short but frequent, adding redundancy to the transmitted packets, so that lost packets do not degrade information integrity, may improve service performance. This paper examines the implementation of a rateless code in the CDN infrastructure. The NS-2 evaluations show that the rateless code is able to reduce packet loss by up to 50%.
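The sketch below shows a minimal Luby-Transform-style rateless encoder and peeling decoder over equal-length packet payloads, illustrating how added redundancy lets a receiver recover the original blocks from a subset of encoded packets. The degree distribution is a crude placeholder rather than a tuned robust soliton distribution, and this is not the specific code evaluated in the NS-2 experiments.

```python
# Minimal Luby-Transform-style rateless encoder/decoder over packet payloads.
# The degree distribution is a crude placeholder, not a tuned robust soliton.
import random

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(source_blocks, n_encoded, seed=0):
    """Each encoded packet is the XOR of a random subset of source blocks."""
    rng = random.Random(seed)
    packets = []
    for _ in range(n_encoded):
        degree = rng.choice([1, 1, 2, 2, 2, 3, 4])     # crude degree distribution
        idx = rng.sample(range(len(source_blocks)), degree)
        payload = bytes(len(source_blocks[0]))
        for i in idx:
            payload = xor(payload, source_blocks[i])
        packets.append((tuple(idx), payload))
    return packets

def decode(packets, n_source):
    """Peeling decoder: resolve degree-1 packets, substitute recovered blocks back."""
    work = [[set(idx), bytearray(payload)] for idx, payload in packets]
    recovered = {}
    progress = True
    while progress and len(recovered) < n_source:
        progress = False
        for idx, payload in work:
            for i in [j for j in idx if j in recovered]:   # substitute known blocks
                payload[:] = xor(payload, recovered[i])
                idx.discard(i)
            if len(idx) == 1:
                i = idx.pop()
                if i not in recovered:
                    recovered[i] = bytes(payload)
                    progress = True
    return [recovered.get(i) for i in range(n_source)]

blocks = [b"CDN pkt0", b"CDN pkt1", b"CDN pkt2", b"CDN pkt3", b"CDN pkt4", b"CDN pkt5"]
encoded = encode(blocks, n_encoded=14, seed=7)
received = encoded[2:]                           # simulate the loss of two encoded packets
print(decode(received, n_source=len(blocks)))    # None marks any unrecovered block
```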
Code Lavender: Cultivating Intentional Acts of Kindness in Response to Stressful Work Situations.
Davidson, Judy E; Graham, Patricia; Montross-Thomas, Lori; Norcross, William; Zerbi, Giovanna
Providing healthcare can be stressful. Gone unchecked, clinicians may experience decreased compassion and increased burnout or secondary traumatic stress. Code Lavender is designed to increase acts of kindness after stressful workplace events occur. The objective was to test the feasibility of providing Code Lavender. It was hypothesized that, after stressful events in the workplace, staff would provide, receive, and recommend Code Lavender to others, and that the provision of Code Lavender would improve Professional Quality of Life Scale (ProQoL) scores, general job satisfaction, and feeling cared for in the workplace. The design was a pilot program with testing and evaluation. Staff and physicians on four hospital units were informed of the availability of the Code Lavender kit, which includes words of comfort, chocolate, lavender essential oil, and employee health referral information. Feasibility data and ProQoL scores were collected at baseline and three months. At baseline, 48% (n = 164) reported a stressful event at work in the last three months. Post-intervention, 51% reported experiencing a stressful workplace event, with 32% receiving a Code Lavender kit from their co-workers as a result (n = 83). Of those who received the Code Lavender intervention, 100% found it helpful and 84% would recommend it to others. No significant changes were demonstrated before and after the intervention in ProQoL scores or job satisfaction; however, the emotion of feeling cared for improved. The results warrant continuation and further dissemination of Code Lavender. The investigators have received requests to expand the program, implying positive reception of the intervention. Additional interventions are needed to overcome workplace stressors. A more intense peer support program is being tested. Copyright © 2017. Published by Elsevier Inc.
Assessment of Current Jet Noise Prediction Capabilities
NASA Technical Reports Server (NTRS)
Hunter, Craig A.; Bridges, James E.; Khavaran, Abbas
2008-01-01
An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented, and represents the state of the art in semi-empirical acoustic prediction codes, where virtual sources are attributed to various aspects of noise generation in each jet. These sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined with the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated, JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-Averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, substantial justification of the experimental datasets used in the evaluations was made. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3-octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it out of experimental uncertainty at cooler, lower speed conditions. Jet3D did not predict changes in directivity at the downstream angles. The statistical code JeNo v1 was within experimental uncertainty in predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. The shortcomings addressed here give direction for future work relevant to the statistical-based prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.
An Analysis of the Changes in Communication Techniques in the Italian Codes of Medical Deontology.
Conti, Andrea Alberto
2017-04-28
The code of deontology of the Italian National Federation of the Colleges of Physicians, Surgeons and Dentists (FNOMCeO) contains the principles and rules to which the professional medical practitioner must adhere. This work identifies and analyzes the medical-linguistic choices and the expressive techniques present in the different editions of the code, and evaluates their purpose and function, focusing on the first appearance and the subsequent frequency of key terms. Various aspects of the formal and expressive revisions of the eight editions of the Codes of Medical Deontology published after the Second World War (from 1947/48 to 2014) are here presented, starting from a brief comparison with the first edition of 1903. Formal characteristics, choices of medical terminology and the introduction of new concepts and communicative attitudes are here identified and evaluated. This paper, in presenting a quantitative and epistemological analysis of variations, modifications and confirmations in the different editions of the Italian code of medical deontology over the last century, enucleates and demonstrates the dynamic paradigm of changing attitudes in the medical profession. This analysis shows the evolution in medical-scientific communication as embodied in the Italian code of medical deontology. This code, in its adoption, changes and adaptations, as evidenced in its successive editions, bears witness to the expressions and attitudes pertinent to and characteristic of the deontological stance of the medical profession during the twentieth century.
Adaptive Wavelet Coding Applied in a Wireless Control System.
Gama, Felipe O S; Silveira, Luiz F Q; Salazar, Andrés O
2017-12-13
Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications because of its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding whose code rate and signal constellation can vary according to the fading level, and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus Eb/N0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show performance gains obtained by inserting the adaptive wavelet coding in a control loop with nodes interconnected by a wireless link. These results enable the use of this technique in a wireless-link control loop.
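The sketch below shows only the adaptation logic: picking a (code rate, constellation) pair from an estimate of the current channel condition. The thresholds and mode table are hypothetical, and the wavelet encoding and decoding themselves are not implemented here.

```python
# Sketch of the adaptation logic only: pick a (wavelet code rate, constellation)
# pair from the estimated channel condition. Thresholds and modes are hypothetical;
# the wavelet encoding/decoding itself is not shown.
from dataclasses import dataclass

@dataclass
class TxMode:
    code_rate: float        # wavelet coding rate
    constellation: str      # signal constellation

# Hypothetical mode table: more protection and a smaller constellation in deep fading.
MODES = [
    (5.0,  TxMode(code_rate=1/4, constellation="BPSK")),    # Eb/N0 below 5 dB
    (12.0, TxMode(code_rate=1/2, constellation="QPSK")),    # 5-12 dB
    (99.0, TxMode(code_rate=3/4, constellation="16-QAM")),  # above 12 dB
]

def select_mode(ebn0_db_estimate):
    """Return the transmission mode for the current fading-level estimate."""
    for threshold, mode in MODES:
        if ebn0_db_estimate < threshold:
            return mode
    return MODES[-1][1]

for ebn0 in (3.0, 8.5, 17.0):
    print(ebn0, select_mode(ebn0))
```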
2004-01-01
Report front-matter and glossary fragments: Evidence Based Research, Inc., 1595 Spring Hill Road, Suite 250, Vienna, VA 22182-2216; Wheatleyg@je.jfcom.mil; Mr. J. Wilder, U.S. Army Training (United States). Under "Collaboration C2 Metrics," the collaboration metrics described are noted to have evolved out of work done by Evidence Based Research, Inc. Acronym list fragments: a set of planning considerations (Enemy, Troops, Terrain, Time, and Civil considerations); OOTW (Operations Other Than War); PESTLE (Political, Economic, Social, Technological, ...).
The feasibility of adapting a population-based asthma-specific job exposure matrix (JEM) to NHANES.
McHugh, Michelle K; Symanski, Elaine; Pompeii, Lisa A; Delclos, George L
2010-12-01
To determine the feasibility of applying a job exposure matrix (JEM) for classifying exposures to 18 asthmagens in the National Health and Nutrition Examination Survey (NHANES), 1999-2004. We cross-referenced 490 National Center for Health Statistics job codes used to develop the 40 NHANES occupation groups with 506 JEM job titles and assessed homogeneity in asthmagen exposure across job codes within each occupation group. In total, 399 job codes corresponded to one JEM job title, 32 to more than one job title, and 59 were not in the JEM. Three occupation groups had the same asthmagen exposure across job codes, 11 had no asthmagen exposure, and 26 groups had heterogeneous exposures across jobs codes. The NHANES classification of occupations limits the use of the JEM to evaluate the association between workplace exposures and asthma and more refined occupational data are needed to enhance work-related injury/illness surveillance efforts.
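A sketch of the crosswalk and homogeneity check is shown below: job codes are mapped to their JEM asthmagen-exposure sets, and each occupation group is flagged as homogeneous, heterogeneous, unexposed, or not in the JEM. The job codes, groups, and exposures are invented examples, not the actual JEM or NHANES groupings.

```python
# Sketch of the job-code crosswalk and exposure-homogeneity check. All codes,
# groups, and exposures below are invented examples.
JEM = {                         # job code -> set of asthmagen exposures
    "4512": {"cleaning agents"},
    "4513": {"cleaning agents"},
    "7720": {"flour dust", "enzymes"},
    "7721": set(),              # job title in the JEM, no asthmagen exposure
}

OCCUPATION_GROUPS = {           # survey occupation group -> member job codes
    "Cleaning & building services": ["4512", "4513"],
    "Food preparation": ["7720", "7721"],
    "Other": ["9999"],          # job code with no JEM counterpart
}

def classify(groups, jem):
    for group, codes in groups.items():
        profiles = [frozenset(jem[c]) for c in codes if c in jem]
        missing = [c for c in codes if c not in jem]
        if not profiles:
            status = "not in JEM"
        elif len(set(profiles)) == 1:
            status = "homogeneous exposure" if profiles[0] else "no asthmagen exposure"
        else:
            status = "heterogeneous exposure"
        note = f" (unmapped codes: {missing})" if missing else ""
        print(f"{group}: {status}{note}")

classify(OCCUPATION_GROUPS, JEM)
```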
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Huber, Frank W.
1992-01-01
The current status of the activities and future plans of the Turbine Technology Team of the Consortium for Computational Fluid Dynamics is reviewed. The activities of the Turbine Team focus on developing and enhancing codes and models, obtaining data for code validation and general understanding of flows through turbines, and developing and analyzing the aerodynamic designs of turbines suitable for use in the Space Transportation Main Engine fuel and oxidizer turbopumps. Future work will include the experimental evaluation of the oxidizer turbine configuration, the development, analysis, and experimental verification of concepts to control secondary and tip losses, and the aerodynamic design, analysis, and experimental evaluation of turbine volutes.
Toward a New Evaluation of Neutron Standards
Carlson, Allan D.; Pronyaev, Vladimir G.; Capote, Roberto; ...
2016-02-03
Measurements related to neutron cross section standards and certain prompt neutron fission spectra are being evaluated. In addition to the standard cross sections, investigations of reference data that are not as well known as the standards are being considered. We discuss procedures and codes for performing this work. A number of libraries will use the results of this standards evaluation for new versions of their libraries. Most of these data have applications in neutron dosimetry.
A Fast Optimization Method for General Binary Code Learning.
Shen, Fumin; Zhou, Xiang; Yang, Yang; Song, Jingkuan; Shen, Heng; Tao, Dacheng
2016-09-22
Hashing, or binary code learning, has been recognized as a way to accomplish efficient near-neighbor search, and has thus attracted broad interest in recent retrieval, vision and learning studies. One main challenge of learning to hash arises from the involvement of discrete variables in binary code optimization. While the widely used continuous relaxation may achieve high learning efficiency, the pursued codes are typically less effective due to accumulated quantization error. In this work, we propose a novel binary code optimization method, dubbed Discrete Proximal Linearized Minimization (DPLM), which directly handles the discrete constraints during the learning process. Specifically, the discrete (thus nonsmooth, nonconvex) problem is reformulated as minimizing the sum of a smooth loss term and a nonsmooth indicator function. The resulting problem is then efficiently solved by an iterative procedure in which each iteration admits an analytical discrete solution, and which is thus shown to converge very fast. In addition, the proposed method supports a large family of empirical loss functions, which is particularly instantiated in this work by both a supervised and an unsupervised hashing loss, together with the bit uncorrelation and balance constraints. In particular, the proposed DPLM with a supervised ℓ2 loss encodes the whole NUS-WIDE database into 64-bit binary codes within 10 seconds on a standard desktop computer. The proposed approach is extensively evaluated on several large-scale datasets and the generated binary codes are shown to achieve very promising results on both retrieval and classification tasks.
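A minimal sketch of the proximal-linearized idea is given below: the smooth loss is linearized at the current binary codes and the resulting subproblem has the closed-form sign solution applied at each iteration. The quadratic loss and fixed projection used here are illustrative stand-ins, not the paper's supervised or unsupervised hashing objectives or its constraint handling.

```python
# Minimal sketch of a discrete proximal-linearized iteration for binary codes:
# linearize a smooth loss at the current codes and take the closed-form sign
# update. The quadratic loss f(B) = ||B - X W||_F^2 is only an illustration.
import numpy as np

rng = np.random.default_rng(3)
n, d, r = 200, 32, 16                      # samples, feature dim, code length
X = rng.normal(size=(n, d))
W = rng.normal(size=(d, r))                # a fixed (hypothetical) projection

def grad_f(B):
    return 2.0 * (B - X @ W)               # gradient of ||B - XW||_F^2

B = np.sign(rng.normal(size=(n, r)))       # initial binary codes in {-1, +1}
mu = 4.0                                   # proximal parameter (step size 1/mu)
for _ in range(20):
    B_new = np.sign(B - grad_f(B) / mu)    # analytical solution of the linearized subproblem
    B_new[B_new == 0] = 1                  # keep codes strictly in {-1, +1}
    if np.array_equal(B_new, B):
        break
    B = B_new

print("loss:", round(float(np.sum((B - X @ W) ** 2)), 2))
```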
Reporting of occupational injury and illness in the semiconductor manufacturing industry.
McCurdy, S A; Schenker, M B; Samuels, S J
1991-01-01
In the United States, occupational illness and injury cases meeting specific reporting criteria are recorded on company Occupational Safety and Health Administration (OSHA) 200 logs; case description data are submitted to participating state agencies for coding and entry in the national Supplementary Data System (SDS). We evaluated completeness of reporting (the percentage of reportable cases that were recorded in the company OSHA 200 log) in the semiconductor manufacturing industry by reviewing 1984 company health clinic records from 10 manufacturing sites of member companies of a national semiconductor manufacturing industry trade association. Of 416 randomly selected work-related cases, 101 met OSHA reporting criteria. Reporting completeness was 60 percent and was lowest for occupational illnesses (44 percent). Case-description data from 150 reported cases were submitted twice to state coding personnel to evaluate coding reliability. Reliability was high (kappa 0.82-0.93) for the "nature," "affected body part," "source," and "type" variables. Coding for the SDS appears reliable; reporting completeness may be improved by use of a stepwise approach by company personnel responsible for reporting decisions.
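For reference, the sketch below computes Cohen's kappa, the agreement statistic reported above for the duplicate coding of case descriptions; the two sets of hypothetical codes are invented to fall in a similar agreement range.

```python
# Small sketch of Cohen's kappa for duplicate-coding reliability checks.
# The contingency counts below are hypothetical.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(labels_a) | set(labels_b)) / n**2
    return (observed - expected) / (1.0 - expected)

# Hypothetical "nature of injury" codes assigned twice to the same 10 cases.
first_pass  = ["sprain", "burn", "sprain", "derm", "burn", "sprain", "derm", "burn", "sprain", "derm"]
second_pass = ["sprain", "burn", "sprain", "derm", "burn", "sprain", "burn", "burn", "sprain", "derm"]
print(round(cohens_kappa(first_pass, second_pass), 2))   # about 0.85
```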
NASA Technical Reports Server (NTRS)
Radhakrishnan, K.
1984-01-01
The efficiency and accuracy of several algorithms recently developed for the efficient numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = A T^N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and is shown to result in increased efficiency.
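The rate-constant caching strategy can be illustrated directly, as in the sketch below: k = A T^N exp(-E/RT) is recomputed only when the temperature has drifted by more than delta T since the last evaluation. The reaction parameters and delta T value are illustrative, and the automatic delta T expression derived in the work is not reproduced.

```python
# Sketch of the rate-constant caching strategy: recompute k = A*T**N*exp(-E/(R*T))
# only when the temperature has drifted by more than delta_T since the last
# evaluation. Reaction parameters and delta_T are illustrative.
import math

R = 8.314  # J/(mol K)

class RateConstantCache:
    def __init__(self, A, N, E, delta_T):
        self.A, self.N, self.E, self.delta_T = A, N, E, delta_T
        self.T_last = None
        self.k_cached = None

    def k(self, T):
        """Return the rate constant, reusing the cached value while |T - T_last| <= delta_T."""
        if self.T_last is None or abs(T - self.T_last) > self.delta_T:
            self.k_cached = self.A * T**self.N * math.exp(-self.E / (R * T))
            self.T_last = T
        return self.k_cached

rate = RateConstantCache(A=1.0e9, N=0.5, E=1.2e5, delta_T=2.0)
for T in (1500.0, 1500.8, 1501.9, 1503.5, 1510.0):
    print(T, f"{rate.k(T):.4e}")     # recomputed only at 1500.0, 1503.5, 1510.0
```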
Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas
NASA Astrophysics Data System (ADS)
Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.
2017-10-01
KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li>6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF developed for low aspect ratio plasmas is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doucet, M.; Durant Terrasson, L.; Mouton, J.
2006-07-01
Criticality safety evaluations implement requirements to demonstrate sufficient subcritical margins outside of the reactor environment, for example in fuel fabrication plants. Basic criticality data (i.e., criticality standards) are used in the determination of subcritical margins for all processes involving plutonium or enriched uranium. There are several international criticality standards, e.g., ARH-600, which is one that the US nuclear industry relies on. The French Nuclear Safety Authority (DGSNR and its advising body IRSN) has requested AREVA NP to review the criticality standards used for the evaluation of its Low Enriched Uranium fuel fabrication plants with CRISTAL V0, the recently updated French criticality evaluation package. Criticality safety is a concern for every phase of the fabrication process including UF6 cylinder storage, UF6-UO2 conversion, powder storage, pelletizing, rod loading, assembly fabrication, and assembly transportation. Until 2003, the accepted criticality standards were based on the French CEA work performed in the late seventies with the APOLLO1 cell/assembly computer code. APOLLO1 is a spectral code, used for evaluating the basic characteristics of fuel assemblies for reactor physics applications, which has been enhanced to perform criticality safety calculations. Throughout the years, CRISTAL, starting with APOLLO1 and MORET 3 (a 3D Monte Carlo code), has been improved to account for the growth of its qualification database and for increasing user requirements. Today, CRISTAL V0 is an up-to-date computational tool incorporating a modern basic microscopic cross section set based on JEF2.2 and the comprehensive APOLLO2 and MORET 4 codes. APOLLO2 is well suited for criticality standards calculations as it includes a sophisticated self-shielding approach, a Pij flux determination, and a 1D transport (Sn) process. CRISTAL V0 is the result of more than five years of development work focusing on theoretical approaches and the implementation of user-friendly graphical interfaces. Due to its comprehensive physical simulation and thanks to its broad qualification database with more than a thousand benchmark/calculation comparisons, CRISTAL V0 provides outstanding and reliable accuracy for criticality evaluations for configurations covering the entire fuel cycle (i.e., from enrichment, pellet/assembly fabrication, and transportation to fuel reprocessing). After a brief description of the calculation scheme and the physics algorithms used in this code package, results for the various fissile media encountered in a UO2 fuel fabrication plant will be detailed and discussed. (authors)
Design implications for task-specific search utilities for retrieval and re-engineering of code
NASA Astrophysics Data System (ADS)
Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif
2017-05-01
The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
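As a rough illustration of the implicit relevance feedback idea, the sketch below re-ranks search results by blending a base relevance score with observed retention actions; the scoring rule, weights, and file names are our own assumptions, not the system described in the paper.

    from collections import Counter

    # Hypothetical retention log: results the developer kept open, copied
    # code from, or bookmarked during past work tasks.
    retention_log = ["repo/a.py", "repo/a.py", "repo/b.py"]
    retention_counts = Counter(retention_log)

    def rerank(results, base_scores, alpha=0.5):
        """Blend the engine's base relevance score with implicit feedback."""
        return sorted(results,
                      key=lambda r: base_scores[r] + alpha * retention_counts[r],
                      reverse=True)

    results = ["repo/a.py", "repo/b.py", "repo/c.py"]
    base_scores = {"repo/a.py": 0.4, "repo/b.py": 0.6, "repo/c.py": 0.5}
    print(rerank(results, base_scores))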
Context-sensitive trace inlining for Java.
Häubl, Christian; Wimmer, Christian; Mössenböck, Hanspeter
2013-12-01
Method inlining is one of the most important optimizations in method-based just-in-time (JIT) compilers. It widens the compilation scope and therefore allows optimizing multiple methods as a whole, which increases performance. However, if method inlining is used too frequently, the compilation time increases and too much machine code is generated. This has negative effects on performance. Trace-based JIT compilers only compile frequently executed paths, so-called traces, instead of whole methods. This may result in faster compilation, less generated machine code, and better optimized machine code. In previous work, we implemented a trace recording infrastructure and a trace-based compiler for Java by modifying the Java HotSpot VM. Based on this work, we evaluate the effect of trace inlining on performance and the amount of generated machine code. Trace inlining has several major advantages when compared to method inlining. First, trace inlining is more selective than method inlining, because only frequently executed paths are inlined. Second, the recorded traces may capture information about virtual calls, which simplifies inlining. A third advantage is that trace information is context sensitive, so that different method parts can be inlined depending on the specific call site. These advantages allow more aggressive inlining while the amount of generated machine code is still reasonable. We evaluate several inlining heuristics on the benchmark suites DaCapo 9.12 Bach, SPECjbb2005, and SPECjvm2008 and show that our trace-based compiler achieves an up to 51% higher peak performance than the method-based Java HotSpot client compiler. Furthermore, we show that the large compilation scope of our trace-based compiler has a positive effect on other compiler optimizations such as constant folding or null check elimination.
FIELD VALIDATION OF EXPOSURE ASSESSMENT MODELS. VOLUME 1. DATA
This is the first of two volumes describing work done to evaluate the PAL-DS model, a Gaussian diffusion code modified to account for dry deposition and settling. This first volume describes the experimental techniques employed to dispense, collect, and measure depositing (zinc s...
Application of Advanced Multi-Core Processor Technologies to Oceanographic Research
2014-09-30
Jordan Stanway are taking on the work of analyzing their code, and we are working on the Robot Operating System (ROS) and MOOS-DB systems to evaluate...Linux/GNU operating system that should reduce the time required to build the kernel and userspace significantly. This part of the work is vital to...the platform to be used not only as a service, but also as a private deployable package. As much as possible, this system was built using operating
NASA Technical Reports Server (NTRS)
Gronoff, Guillaume; Norman, Ryan B.; Mertens, Christopher J.
2014-01-01
The ability to evaluate the cosmic ray environment at Mars is of interest for future manned exploration. To support exploration, tools must be developed to accurately assess the radiation environment in both free space and on planetary surfaces. The primary tool NASA uses to quantify radiation exposure behind shielding materials is the space radiation transport code, HZETRN. In order to build confidence in HZETRN, code benchmarking against Monte Carlo radiation transport codes is often used. This work compares the dose calculations at Mars by HZETRN and the Geant4 application Planetocosmics. The dose at ground and the energy deposited in the atmosphere by galactic cosmic ray protons and alpha particles have been calculated for the Curiosity landing conditions. In addition, this work has considered Solar Energetic Particle events, allowing for the comparison of varying input radiation environments. The results for protons and alpha particles show very good agreement between HZETRN and Planetocosmics.
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
From Novice to Expert: Problem Solving in ICD-10-PCS Procedural Coding
Rousse, Justin Thomas
2013-01-01
The benefits of converting to ICD-10-CM/PCS have been well documented in recent years. One of the greatest challenges in the conversion, however, is how to train the workforce in the code sets. The International Classification of Diseases, Tenth Revision, Procedure Coding System (ICD-10-PCS) has been described as a language requiring higher-level reasoning skills because of the system's increased granularity. Training and problem-solving strategies required for correct procedural coding are unclear. The objective of this article is to propose that the acquisition of rule-based logic will need to be augmented with self-evaluative and critical thinking. Awareness of how this process works is helpful for established coders as well as for a new generation of coders who will master the complexities of the system. PMID:23861674
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldrich, Robb; Butterfield, Karla
With funding from the Building America Program, part of the U.S. Department of Energy Building Technologies Office, the Consortium for Advanced Residential Buildings (CARB) worked with BrightBuilt Home (BBH) to evaluate and optimize building systems. CARB’s work focused on a home built by Black Bros. Builders in Lincolnville, Maine (International Energy Conservation Code Climate Zone 6). As with most BBH projects to date, modular boxes were built by Keiser Homes in Oxford, Maine.
2010-04-01
Layer Interaction, Real Gas, Radiation and Plasma Phenomena in Contemporary CFD Codes. Michael S. Holden, PhD, CUBRC, Inc., 4455 Genesee Street, Buffalo, NY 14225, USA. Figure 17: Transition in Hypervelocity Flows: CUBRC Focus – Fully Duplicated Ground Test (flight programs referenced: HyFly, Navy EMRG, Reentry-F, X-43, HIFiRE-2).
A Flexible and Non-instrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
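To make the MC/DC obligations concrete, the following toy sketch precomputes, for each condition of a decision, the vector pairs that demonstrate its independent effect on the outcome, and then checks them against condition vectors observed at run time; it illustrates the general metric only and is not the paper's preprocessing tool.

    from itertools import product

    def mcdc_pairs(decision, n_conditions):
        """For each condition index, collect the vector pairs that differ only
        in that condition and flip the decision outcome."""
        pairs = {i: [] for i in range(n_conditions)}
        vectors = list(product([False, True], repeat=n_conditions))
        for i in range(n_conditions):
            for v in vectors:
                w = list(v)
                w[i] = not w[i]
                w = tuple(w)
                if decision(*v) != decision(*w):
                    pairs[i].append((v, w))
        return pairs

    # Example decision with three conditions: (a and b) or c
    decision = lambda a, b, c: (a and b) or c
    pairs = mcdc_pairs(decision, 3)

    # At run time only the condition vectors actually observed are recorded.
    observed = {(True, True, False), (False, True, False),
                (True, False, False), (True, True, True)}

    for i, candidate_pairs in pairs.items():
        covered = any(v in observed and w in observed for v, w in candidate_pairs)
        print(f"condition {i}: MC/DC {'covered' if covered else 'not covered'}")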
Code of Federal Regulations, 2011 CFR
2011-01-01
... to an amendment published at 75 FR 78153, Dec. 15, 2010. Access device means any card, plate, code... evaluation of employability skills coupled with counseling on how and where to search for employment. If combined with work experience, employment search or training, an assessment of this nature could constitute...
Area Array Technology Evaluations for Space and Military Applications
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza
1996-01-01
The Jet Propulsion Laboratory (JPL) is currently assessing the use of Area Array Packaging (AAP) for National Aeronautics and Space Administration (NASA) spaceflight applications. This work is being funded through NASA Headquarters, Code Q. The paper discusses the background, objectives, and uses of AAP.
NASA Astrophysics Data System (ADS)
Prakash, Ram; Gai, Sudhir L.; O'Byrne, Sean; Brown, Melrose
2016-11-01
The flow over a 'tick'-shaped configuration is simulated using two Direct Simulation Monte Carlo codes: the DS2V code of Bird and the code from Sandia National Laboratories, called SPARTA. The configuration creates a flow field in which the flow initially expands but is then affected by the adverse pressure gradient induced by a compression surface. The flow field is challenging in the sense that the full flow domain comprises localized areas spanning the continuum and transitional regimes. The present work focuses on the capability of SPARTA to model such flow conditions and on a comparative evaluation against results from DS2V. An extensive grid adaptation study is performed using both codes on a model with a sharp leading edge, and the converged results are then compared. The computational predictions are evaluated in terms of surface parameters such as heat flux, shear stress, pressure and velocity slip. SPARTA consistently predicts higher values for these surface properties. The skin friction predictions of both codes give no indication of separation, but the velocity slip plots indicate incipient separation behavior at the corner. The differences in the results are attributed to the flow resolution at the leading edge, which dictates the downstream flow characteristics.
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement engineering activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
Hess, J A; Mootz, R D
1999-06-01
Resource-based relative value scales (RBRVS) have become a standard method for identifying costs and determining reimbursement for physician services. Development of RBRVS systems and methods are reviewed, and the RBRVS concept of physician "work" is defined. Results of work and time inputs from chiropractic physicians are compared with those reported by osteopathic and medical specialties. Last, implications for reimbursement of chiropractic fee services are discussed. Total work, intraservice work, and time inputs for clinical vignettes reported by chiropractic, osteopathic, and medical physicians are compared. Data for chiropractic work and time reports were drawn from a national random sample of chiropractors conducted as part of a 1997 workers' compensation chiropractic fee schedule development project. Medical and osteopathic inputs were drawn from RBRVS research conducted at Harvard University under a federal contract reported in 1990. Both data sets used the same or similar clinical vignettes and similar methods. Comparisons of work and time inputs are made for clinical vignettes to assess whether work reported by chiropractors is of similar magnitude and variability as work reported by other specialties. Chiropractic inputs for vignettes related to evaluation and management services are similar to those reported by medical specialists and osteopathic physicians. The range of variation between chiropractic work input and other specialties is of similar magnitude to that within other specialties. Chiropractors report greater work input for radiologic interpretation and lower work input for manipulation services. Chiropractors seem to perform similar total "work" for evaluation and management services as other specialties. No basis exists for excluding chiropractors from using evaluation and management codes for reimbursement purposes on grounds of dissimilar physician time or work estimates. Greater work input by chiropractors in radiology interpretation may be related to a greater importance placed on findings in care planning. Consistently higher reports for osteopathic work input on manipulation are likely attributable to differences in reference vignettes used in the respective populations. Research with a common reference vignette used for manipulation providers is recommended, as is development of a single generic approach to coding for manipulation services.
Comprehensive Model of Single Particle Pulverized Coal Combustion Extended to Oxy-Coal Conditions
Holland, Troy; Fletcher, Thomas H.
2017-02-22
Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive CFD simulations are valuable tools in evaluating and deploying oxy-fuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive simulations require physically realistic submodels with low computational requirements. In particular, comprehensive char oxidation and gasification models have been developed that describe multiple reaction and diffusion processes. Our work extends a comprehensive char conversion code (CCK), which treats surface oxidation and gasification reactions as well as processes such as film diffusion, pore diffusion, ash encapsulation, and annealing. In this work several submodels in the CCK code were updated with more realistic physics or otherwise extended to function in oxy-coal conditions. Improved submodels include the annealing model, the swelling model, the mode of burning parameter, and the kinetic model, as well as the addition of the chemical percolation devolatilization (CPD) model. We compare the results of the char combustion model to oxy-coal data, and further compare them to parallel data sets near conventional conditions. A potential method to apply the detailed code in CFD work is given.
What if pediatric residents could bill for their outpatient services?
Ng, M; Lawless, S T
2001-10-01
We prospectively studied the billing and coding practices of pediatric residents in outpatient clinics and extrapolated our results to assess the financial implications of billing inaccuracies. Using Medicare as a common measure of "currency," we also used the relative value unit (RVU) and ambulatory payment class methodologies as means of assessing the productivity and financial value of resident-staffed pediatric clinics. Residents were asked to voluntarily submit shadow billing forms and documentation of outpatient clinic visits. Documentation of work was assessed by a blinded reviewer, and Current Procedural Terminology evaluation and management codes were assigned. Comparisons between resident codes and calculated codes were made. Financial implications of physician productivity were calculated in terms of dollar amounts and RVUs. Resource intensity was measured using the ambulatory payment class methodology. A total of 344 charts were reviewed. Coding agreement for health maintenance visits was 86%, whereas agreement for acute care visits was 38%. Eighty-three percent of coding disagreement in the latter group resulted from undercoding by residents. Errors accounted for a 4.79% difference in potential reimbursement for all visit types and a 19.10% difference for acute care visits. No significant differences in shadow billing discrepancies were found between different levels of training. Residents were predicted to generate $67 230, $87 593, and $96 072 in Medicare revenue in the outpatient clinic setting during each successive year of training. On average, residents generated 1.17 +/- 0.01 and 0.81 +/- 0.02 work RVUs for each health maintenance visit and office visit, respectively. Annual productivity from outpatient clinic settings was estimated at 548, 735, and 893 work RVUs in postgraduate levels 1, 2, and 3, respectively. When pediatric residents are not trained adequately in proper coding practices, the potential for billing discrepancies is high and potential reimbursement differences may be substantial. Discussion of financial issues should be considered in curriculum development.
NASA Astrophysics Data System (ADS)
Cai, Huai-yu; Dong, Xiao-tong; Zhu, Meng; Huang, Zhan-hua
2018-01-01
Wavefront coding as an athermalization technique can effectively ensure stable imaging of an optical system over a large temperature range, while offering a compact structure and low cost. Simulating properties such as the PSF and MTF of a wavefront coding athermal system under several typical temperature gradient distributions helps characterize its behavior in non-ideal temperature environments and verify the system design targets. In this paper, we utilize the interoperability of data between SolidWorks and ZEMAX to simplify the traditional structural/thermal/optical integrated analysis process. We design and build the optical model and the corresponding mechanical model of an infrared imaging wavefront coding athermal system. Axial and radial temperature gradients of different magnitudes are applied to the whole system in SolidWorks, yielding the changes in curvature, refractive index and lens spacing. The deformed model is then imported into ZEMAX for ray tracing, and the resulting changes in the PSF and MTF of the optical system are obtained. Finally, we discuss and evaluate the consistency of the PSF (MTF) of the wavefront coding athermal system and the restorability of the image, which provides a basis and reference for the optimal design of such systems. The results show that the single-material infrared wavefront coding athermal system tolerates axial temperature gradients up to a temperature fluctuation of 60°C, much higher than its tolerance to radial temperature gradients.
Relativity Screens for Misvalued Medical Services: Impact on Noninvasive Diagnostic Radiology.
Rosenkrantz, Andrew B; Silva, Ezequiel; Hawkins, C Matthew
2017-11-01
In 2006, the AMA/Specialty Society Relative Value Scale Update Committee (RUC) introduced ongoing relativity screens to identify potentially misvalued medical services for payment adjustments. We assess the impact of these screens upon the valuation of noninvasive diagnostic radiology services. Data regarding relativity screens and relative value unit (RVU) changes were obtained from the 2016 AMA Relativity Assessment Status Report. All global codes in the 2016 Medicare Physician Fee Schedule with associated work RVUs were classified as noninvasive diagnostic radiology services versus remaining services. The frequency of having ever undergone a screen was compared between the two groups. Screened radiology codes were further evaluated regarding the RVU impact of subsequent revaluation. Of noninvasive diagnostic radiology codes, 46.0% (201 of 437) were screened versus 22.2% (1,460 of 6,575) of remaining codes (P < .001). Most common screens for which radiology codes were identified as potentially misvalued were (1) high expenditures (27.5%) and (2) high utilization (25.6%). The modality and body region most likely to be identified in a screen were CT (82.1%) and breast (90.9%), respectively. Among screened radiology codes, work RVUs, practice expense RVUs, and nonfacility total RVUs decreased in 20.3%, 65.9%, and 75.3%, respectively. All screened CT, MRI, brain, and spine codes exhibited decreased total RVUs. Policymakers' ongoing search for potentially misvalued medical services has disproportionately impacted noninvasive diagnostic radiology services, risking the introduction of unintended or artificial shifts in physician practice. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
De Vito, David; Al-Aidroos, Naseem; Fenske, Mark J
2017-05-01
Stimuli appearing as visual distractors subsequently receive more negative affective evaluations than novel items or prior targets of attention. Leading accounts question whether this distractor devaluation effect occurs through evaluative codes that become associated with distractors as a mere artefact of attention-task instructions, or through affective consequences of attentional inhibition when applied to prevent distractor interference. Here we test opposing predictions arising from the evaluative-coding and devaluation-by-inhibition hypotheses using an electrophysiological marker of attentional inhibition in a task that requires participants to avoid interference from abstract-shape distractors presented while maintaining a uniquely-colored stimulus in memory. Consistent with prior research, distractors that matched the colour of the stimulus being held in memory elicited a Pd component of the event-related potential waveform, indicating that their processing was being actively suppressed. Subsequent affective evaluations revealed that memory-matching distractors also received more negative ratings than non-matching distractors or previously-unseen shapes. Moreover, Pd magnitude was greater on trials in which the memory-matching distractors were later rated negatively than on trials preceding positive ratings. These results support the devaluation-by-inhibition hypothesis and strongly suggest that fluctuations in stimulus inhibition are closely associated with subsequent affective evaluations. In contrast, none of the evaluative-coding based predictions were confirmed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Alternative Formats to Achieve More Efficient Energy Codes for Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Rosenberg, Michael I.; Halverson, Mark A.
2013-01-26
This paper identifies and examines several formats or structures that could be used to create the next generation of more efficient energy codes and standards for commercial buildings. Pacific Northwest National Laboratory (PNNL) is funded by the U.S. Department of Energy’s Building Energy Codes Program (BECP) to provide technical support to the development of ANSI/ASHRAE/IES Standard 90.1. While the majority of PNNL’s ASHRAE Standard 90.1 support focuses on developing and evaluating new requirements, a portion of its work involves consideration of the format of energy standards. In its current working plan, the ASHRAE 90.1 committee has approved an energy goal of 50% improvement in Standard 90.1-2013 relative to Standard 90.1-2004, and will likely be considering higher improvement targets for future versions of the standard. To cost-effectively achieve the 50% goal in a manner that can gain stakeholder consensus, formats other than prescriptive must be considered. Alternative formats that include reducing the reliance on prescriptive requirements may make it easier to achieve these aggressive efficiency levels in new codes and standards. The focus on energy code and standard formats is meant to explore approaches to presenting the criteria that will foster compliance, enhance verification, and stimulate innovation while saving energy in buildings. New formats may also make it easier for building designers and owners to design and build the levels of efficiency called for in the new codes and standards. This paper examines a number of potential formats and structures, including prescriptive, performance-based (with sub-formats of performance equivalency and performance targets), capacity constraint-based, and outcome-based. The paper also discusses the pros and cons of each format from the viewpoint of code users and of code enforcers.
Child Support Enforcement: A Framework for Evaluating Costs, Benefits, and Effects.
1991-03-01
efforts to gain and enforce child support awards might yield additional collections on behalf of these children, they would surely entail additional...framework for evaluating the full costs and net effects of child support enforcement. This framework could assist your office and others in planning...following results of our developmental work: (1) models of the child support enforcement system activities...
A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation
NASA Astrophysics Data System (ADS)
Yoshida, Toshio
In the software development process for embedded real-time systems, the design of the task cooperation process is very important. The cooperation of such tasks is specified by task cooperation patterns. Adopting unsuitable task cooperation patterns has a serious impact on system performance, quality, and extensibility. In order to prevent rework caused by inadequate task cooperation performance, it is necessary to verify task cooperation patterns at an early stage of software development. However, such verification is very difficult at a stage where the task program codes are not yet complete. Therefore, we propose a verification method that uses skeleton task program codes together with a real-time kernel that records all events during software execution, such as system calls issued by the task code, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
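A minimal sketch of the idea follows, using Python as a stand-in for embedded task code: skeleton tasks contain only the cooperation-defining calls, and a recording stub logs each event with a timestamp. The system-call names and the queue-based cooperation pattern are hypothetical, not taken from the paper.

    import time

    EVENT_LOG = []  # (timestamp, task, event) tuples, mimicking a recording kernel

    def record(task, event):
        EVENT_LOG.append((time.monotonic(), task, event))

    # Skeleton task bodies: no application logic yet, only the calls that
    # define the cooperation pattern (here, a simple send/receive pair).
    def producer(queue):
        record("producer", "snd_dtq")   # hypothetical system-call name
        queue.append("message")

    def consumer(queue):
        record("consumer", "rcv_dtq")   # hypothetical system-call name
        return queue.pop(0) if queue else None

    queue = []
    producer(queue)
    consumer(queue)
    for t, task, ev in EVENT_LOG:
        print(f"{t:.6f}  {task:10s} {ev}")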
Evaluating the performance of parallel subsurface simulators: An illustrative example with PFLOTRAN
Hammond, G E; Lichtner, P C; Mills, R T
2014-01-01
To better inform the subsurface scientist on the expected performance of parallel simulators, this work investigates performance of the reactive multiphase flow and multicomponent biogeochemical transport code PFLOTRAN as it is applied to several realistic modeling scenarios run on the Jaguar supercomputer. After a brief introduction to the code's parallel layout and code design, PFLOTRAN's parallel performance (measured through strong and weak scalability analyses) is evaluated in the context of conceptual model layout, software and algorithmic design, and known hardware limitations. PFLOTRAN scales well (with regard to strong scaling) for three realistic problem scenarios: (1) in situ leaching of copper from a mineral ore deposit within a 5-spot flow regime, (2) transient flow and solute transport within a regional doublet, and (3) a real-world problem involving uranium surface complexation within a heterogeneous and extremely dynamic variably saturated flow field. Weak scalability is discussed in detail for the regional doublet problem, and several difficulties with its interpretation are noted. PMID:25506097
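For reference, strong and weak scaling efficiencies of the kind reported in such analyses can be computed as follows; the wall-clock times in the example are invented for illustration only.

    def strong_scaling_efficiency(t_serial, t_parallel, n_procs):
        """Fixed total problem size: achieved speedup divided by ideal speedup."""
        return (t_serial / t_parallel) / n_procs

    def weak_scaling_efficiency(t_base, t_scaled):
        """Problem size grows with process count: ideal runtime stays flat."""
        return t_base / t_scaled

    # Hypothetical wall-clock times in seconds.
    print(strong_scaling_efficiency(t_serial=1000.0, t_parallel=140.0, n_procs=8))
    print(weak_scaling_efficiency(t_base=120.0, t_scaled=135.0))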
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
NASA Technical Reports Server (NTRS)
Nguyen, H. L.; Ying, S.-J.
1990-01-01
Jet-A spray combustion has been evaluated in gas turbine combustion with the use of propane chemical kinetics as the first approximation for the chemical reactions. Here, the numerical solutions are obtained by using the KIVA-2 computer code. The KIVA-2 code is the most developed of the available multidimensional combustion computer programs for application to the in-cylinder combustion dynamics of internal combustion engines. The released version of KIVA-2 assumes that 12 chemical species are present; the code uses an Arrhenius kinetic-controlled combustion model governed by a four-step global chemical reaction and six equilibrium reactions. The researchers' efforts involve the addition of Jet-A thermophysical properties and the implementation of detailed reaction mechanisms for propane oxidation. Three different detailed reaction mechanism models are considered. The first model consists of 131 reactions and 45 species. This is considered the full mechanism, which is developed through the study of the chemical kinetics of propane combustion in an enclosed chamber. The full mechanism is evaluated by comparing calculated ignition delay times with available shock tube data. However, these detailed reactions occupy too much computer memory and CPU time for the computation. Therefore, the full mechanism serves only as a benchmark case by which to evaluate other simplified models. Two possible simplified models were tested in the existing computer code KIVA-2 for the same conditions as used with the full mechanism. One model is obtained through a sensitivity analysis using LSENS, the general kinetics and sensitivity analysis program code of D. A. Bittker and K. Radhakrishnan. This model consists of 45 chemical reactions and 27 species. The other model is based on the work published by C. K. Westbrook and F. L. Dryer.
Mr.CAS-A minimalistic (pure) Ruby CAS for fast prototyping and code generation
NASA Astrophysics Data System (ADS)
Ragni, Matteo
Computer Algebra System (CAS) packages on the market offer complete solutions for the manipulation of analytical models. However, exporting a model that implements specific algorithms for a particular platform, target language, or numerical library is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separate from the code generation rules, where best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.
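The separation between expressions and code-generation rules can be sketched as follows; this is a conceptual Python illustration of the idea, not Mr.CAS's actual Ruby API, and the rule set and expression classes are hypothetical.

    class Sym:
        def __init__(self, name): self.name = name

    class BinOp:
        def __init__(self, op, a, b): self.op, self.a, self.b = op, a, b

    # Code-generation rules live apart from the expression tree, so the same
    # tree can be emitted for different target languages.
    RULES = {
        "c":      {"pow": lambda a, b: f"pow({a}, {b})", "mul": lambda a, b: f"({a} * {b})"},
        "python": {"pow": lambda a, b: f"({a} ** {b})",  "mul": lambda a, b: f"({a} * {b})"},
    }

    def gen(expr, target):
        if isinstance(expr, Sym):
            return expr.name
        return RULES[target][expr.op](gen(expr.a, target), gen(expr.b, target))

    expr = BinOp("mul", Sym("m"), BinOp("pow", Sym("v"), Sym("2")))  # m * v^2
    print(gen(expr, "c"))       # (m * pow(v, 2))
    print(gen(expr, "python"))  # (m * (v ** 2))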
A Lossless hybrid wavelet-fractal compression for welding radiographic images.
Mekhalfa, Faiza; Avanaki, Mohammad R N; Berkani, Daoud
2016-01-01
In this work a lossless wavelet-fractal image coder is proposed. The process starts by compressing and decompressing the original image using wavelet transformation and fractal coding algorithm. The decompressed image is removed from the original one to obtain a residual image which is coded by using Huffman algorithm. Simulation results show that with the proposed scheme, we achieve an infinite peak signal to noise ratio (PSNR) with higher compression ratio compared to typical lossless method. Moreover, the use of wavelet transform speeds up the fractal compression algorithm by reducing the size of the domain pool. The compression results of several welding radiographic images using the proposed scheme are evaluated quantitatively and compared with the results of Huffman coding algorithm.
NASA's Use of Human Behavior Models for Concept Development and Evaluation
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2012-01-01
Overview of NASA's use of computational approaches and methods, particularly human performance models, to support research goals, with a focus on examples of the methods used in Code TH and TI at NASA Ames, followed by an in-depth review of MIDAS' current FAA work.
Clients' Preferences for Small Groups vs. Individual Testing.
ERIC Educational Resources Information Center
Backman, Margaret E.; And Others
Test takers' preferences for group versus individual administration of the Micro-TOWER System of Vocational Evaluation are reported. The system was administered to 211 clients at a vocational rehabilitation center, and consisted of work samples measuring the following job skills: record checking, filing, lamp assembly, message-taking, zip coding,…
Optimal shielding thickness for galactic cosmic ray environments
NASA Astrophysics Data System (ADS)
Slaba, Tony C.; Bahadori, Amir A.; Reddell, Brandon D.; Singleterry, Robert C.; Clowdsley, Martha S.; Blattnig, Steve R.
2017-02-01
Models have been extensively used in the past to evaluate and develop material optimization and shield design strategies for astronauts exposed to galactic cosmic rays (GCR) on long duration missions. A persistent conclusion from many of these studies was that passive shielding strategies are inefficient at reducing astronaut exposure levels and the mass required to significantly reduce the exposure is infeasible, given launch and associated cost constraints. An important assumption of this paradigm is that adding shielding mass does not substantially increase astronaut exposure levels. Recent studies with HZETRN have suggested, however, that dose equivalent values actually increase beyond ∼20 g/cm2 of aluminum shielding, primarily as a result of neutron build-up in the shielding geometry. In this work, various Monte Carlo (MC) codes and 3DHZETRN are evaluated in slab geometry to verify the existence of a local minimum in the dose equivalent versus aluminum thickness curve near 20 g/cm2. The same codes are also evaluated in polyethylene shielding, where no local minimum is observed, to provide a comparison between the two materials. Results are presented so that the physical interactions driving build-up in dose equivalent values can be easily observed and explained. Variation of transport model results for light ions (Z ≤ 2) and neutron-induced target fragments, which contribute significantly to dose equivalent for thick shielding, is also highlighted and indicates that significant uncertainties are still present in the models for some particles. The 3DHZETRN code is then further evaluated over a range of related slab geometries to draw closer connection to more realistic scenarios. Future work will examine these related geometries in more detail.
[How does our specialty present itself on the Internet?].
Rechenberg, U; Josten, C; Grüner, S; Klima, S
2013-08-01
The vast amount of information on the internet about orthopaedic and trauma surgery topics is often unclear, and its reliability, independence, and expertise are difficult to verify. The aim of this work is to evaluate German-language internet sites with orthopaedic and trauma surgical content. Over a period of two months (May to June 2012), websites on 20 common orthopaedic and trauma surgical diseases were analysed using the Google search engine. The first ten search results were evaluated against the HON code principles (Health On the Net Foundation). Furthermore, the qualification of the authors of the first 50 websites was evaluated. The 1,000 highest-ranked websites on Google were classified by authorship: academic, commercial, media, non-medical, physicians, non-profit and miscellaneous. Only 194 of 200 websites could be evaluated by the HON code principles. Overall, 188 websites complied with the principle of transparency, followed by privacy with 150 sites and authority with 134 sites. Only 90 websites give information about financial disclosure. Medical articles from Wikipedia appear most frequently. The second part of this work shows that non-profit sites and sites by physicians are the most frequent; academic and commercial sites give the fewest results, with 93 and 85 online hits respectively. In summary, most websites offering medical information are of inadequate quality, a finding in accord with several US publications. Wikipedia holds a top ranking on the internet for medical information and fulfils almost all of the HON code principles. It is possible for professionals to publish better medical online information about orthopaedic and trauma surgical issues. Georg Thieme Verlag KG Stuttgart · New York.
Back injuries among union carpenters in Washington State, 1989-2003.
Lipscomb, Hester J; Cameron, Wilfrid; Silverstein, Barbara
2008-06-01
There is limited information on occupational back pain specific to carpenters despite their known exposures to recognized occupational risk factors and limited opportunities for modified work due to the predominantly heavy nature of their work. By combining union records with worker's compensation claims, we describe work-related back injuries, including associated medical diagnoses, among a well-defined cohort of union carpenters between 1989 and 2003. High risk subgroups were explored based on age, gender, union tenure, and predominant type of work. Paid lost time claims were contrasted to less serious events, and injuries sustained from overexertion activities were contrasted with those sustained through more acute trauma. Back injuries occurred at an overall rate of 6.2/200,000 hours worked. Most injuries were coded in the compensation records as sprains, but there was little agreement between these nature of injury codes and ICD9 diagnosis codes. Injury rates declined most significantly over time for injuries secondary to overexertion. In multivariate analyses, we observed similar patterns of risk for the types of claims evaluated despite disparate mechanisms and severity. Those who worked predominantly in residential carpentry or drywall installation were consistently at greatest risk. Overexertion injuries from manual materials handling activities are responsible for the largest burden of back injuries among these carpenters, but a growing proportion of injuries result from acute traumatic events. Interventions are called for which specifically address risk among residential carpenters and drywall installers. These data provide additional evidence that Bureau of Labor Statistics data underestimate work-related injuries. Copyright 2008 Wiley-Liss, Inc.
Empirical Evaluation of Hunk Metrics as Bug Predictors
NASA Astrophysics Data System (ADS)
Ferzund, Javed; Ahsan, Syed Nadeem; Wotawa, Franz
Reducing the number of bugs is a crucial issue during software development and maintenance. Software process and product metrics are good indicators of software complexity. These metrics have been used to build bug predictor models to help developers maintain the quality of software. In this paper we empirically evaluate the use of hunk metrics as predictors of bugs. We present a technique for bug prediction that works at the smallest units of code change, called hunks. We build bug prediction models using random forests, an efficient machine learning classifier. Hunk metrics are used to train the classifier, and each hunk metric is evaluated for its bug prediction capabilities. Our classifier can classify individual hunks as buggy or bug-free with 86% accuracy, 83% buggy hunk precision and 77% buggy hunk recall. We find that history-based and change-level hunk metrics are better predictors of bugs than code-level hunk metrics.
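A minimal sketch of hunk-level bug prediction with a random forest is shown below; the four features and the synthetic labels are stand-ins for the paper's hunk metrics and bug links, not its actual data set.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import precision_score, recall_score

    # Synthetic stand-ins for hunk metrics (e.g. lines added, lines deleted,
    # prior fixes touching the file, author change history).
    rng = np.random.default_rng(1)
    X = rng.random((500, 4))
    y = (X[:, 0] + 0.5 * X[:, 2] + 0.2 * rng.standard_normal(500) > 0.9).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    clf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print("precision", precision_score(y_te, pred), "recall", recall_score(y_te, pred))
    print("feature importances", clf.feature_importances_)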
NASA Astrophysics Data System (ADS)
Palma, V.; Carli, M.; Neri, A.
2011-02-01
In this paper a Multi-view Distributed Video Coding scheme for mobile applications is presented. Specifically, a new fusion technique between temporal and spatial side information in the Zernike moments domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of very low-complexity video encoders compared to their traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder. To improve this quality, a spatial view compensation/prediction in the Zernike moments domain is applied. Spatial and temporal motion activity have been fused together to obtain the overall side information. The proposed method has been evaluated through rate-distortion performance for different inter-view and temporal estimation quality conditions.
Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim
2016-01-01
Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
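The late-integration idea, one model per data source combined by a meta-learner, can be sketched with scikit-learn's stacking machinery; the synthetic features below stand in for structured and textual EHR data and do not reproduce the paper's setup.

    import numpy as np
    from sklearn.compose import ColumnTransformer
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 400
    X_struct = rng.random((n, 5))    # stand-in structured features (labs, demographics)
    X_text = rng.random((n, 20))     # stand-in text features (e.g. tf-idf of notes)
    X = np.hstack([X_struct, X_text])
    y = (X_struct[:, 0] + X_text[:, 0] > 1.0).astype(int)   # toy target for one code

    struct_cols = list(range(5))
    text_cols = list(range(5, 25))

    # Each base model sees only its own data source; a logistic-regression
    # meta-learner combines their predictions (late integration).
    base = [
        ("structured", make_pipeline(ColumnTransformer([("s", "passthrough", struct_cols)]),
                                     RandomForestClassifier(n_estimators=50, random_state=0))),
        ("text",       make_pipeline(ColumnTransformer([("t", "passthrough", text_cols)]),
                                     LogisticRegression(max_iter=1000))),
    ]
    late = StackingClassifier(estimators=base, final_estimator=LogisticRegression())
    print("late-integration CV accuracy:", cross_val_score(late, X, y, cv=3).mean())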
Aiello, Francesco A; Judelson, Dejah R; Messina, Louis M; Indes, Jeffrey; FitzGerald, Gordon; Doucet, Danielle R; Simons, Jessica P; Schanzer, Andres
2016-08-01
Vascular surgery procedural reimbursement depends on accurate procedural coding and documentation. Despite the critical importance of correct coding, there has been a paucity of research focused on the effect of direct physician involvement. We hypothesize that direct physician involvement in procedural coding will lead to improved coding accuracy, increased work relative value unit (wRVU) assignment, and increased physician reimbursement. This prospective observational cohort study evaluated procedural coding accuracy of fistulograms at an academic medical institution (January-June 2014). All fistulograms were coded by institutional coders (traditional coding) and by a single vascular surgeon whose codes were verified by two institution coders (multidisciplinary coding). The coding methods were compared, and differences were translated into revenue and wRVUs using the Medicare Physician Fee Schedule. Comparison between traditional and multidisciplinary coding was performed for three discrete study periods: baseline (period 1), after a coding education session for physicians and coders (period 2), and after a coding education session with implementation of an operative dictation template (period 3). The accuracy of surgeon operative dictations during each study period was also assessed. An external validation at a second academic institution was performed during period 1 to assess and compare coding accuracy. During period 1, traditional coding resulted in a 4.4% (P = .004) loss in reimbursement and a 5.4% (P = .01) loss in wRVUs compared with multidisciplinary coding. During period 2, no significant difference was found between traditional and multidisciplinary coding in reimbursement (1.3% loss; P = .24) or wRVUs (1.8% loss; P = .20). During period 3, traditional coding yielded a higher overall reimbursement (1.3% gain; P = .26) than multidisciplinary coding. This increase, however, was due to errors by institution coders, with six inappropriately used codes resulting in a higher overall reimbursement that was subsequently corrected. Assessment of physician documentation showed improvement, with decreased documentation errors at each period (11% vs 3.1% vs 0.6%; P = .02). Overall, between period 1 and period 3, multidisciplinary coding resulted in a significant increase in additional reimbursement ($17.63 per procedure; P = .004) and wRVUs (0.50 per procedure; P = .01). External validation at a second academic institution was performed to assess coding accuracy during period 1. Similar to institution 1, traditional coding revealed an 11% loss in reimbursement ($13,178 vs $14,630; P = .007) and a 12% loss in wRVU (293 vs 329; P = .01) compared with multidisciplinary coding. Physician involvement in the coding of endovascular procedures leads to improved procedural coding accuracy, increased wRVU assignments, and increased physician reimbursement. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
NDEC: A NEA platform for nuclear data testing, verification and benchmarking
NASA Astrophysics Data System (ADS)
Díez, C. J.; Michel-Sendis, F.; Cabellos, O.; Bossant, M.; Soppera, N.
2017-09-01
The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. This diagnosis is based on the results of the different computational codes and routines which carry out the aforementioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, incorporating them into its working scheme. Hence, this paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O`Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
An evaluation of hospital discharge records as a tool for serious work related injury surveillance.
Alamgir, H; Koehoorn, M; Ostry, A; Tompa, E; Demers, P
2006-04-01
To identify and describe work related serious injuries among sawmill workers in British Columbia, Canada using hospital discharge records, and compare the agreement and capturing patterns of the work related indicators available in the hospital discharge records. Hospital discharge records were extracted from 1989 to 1998 for a cohort of sawmill workers. Work related injuries were identified from these records using International Classification of Disease (ICD-9) external cause of injury codes, which have a fifth digit, and sometimes a fourth digit, indicating place of occurrence, and the responsibility of payment schedule, which identifies workers' compensation as being responsible for payment. The most frequent causes of work related hospitalisations were falls, machinery related, overexertion, struck against, cutting or piercing, and struck by falling objects. Almost all cases of machinery related, struck by falling object, and caught in or between injuries were found to be work related. Overall, there was good agreement between the two indicators (ICD-9 code and payment schedule) for identifying work relatedness of injury hospitalisations (kappa = 0.75, p < 0.01). There was better concordance between them for injuries, such as struck against, drowning/suffocation/foreign body, fire/flame/natural/environmental, and explosions/firearms/hot substance/electric current/radiation, and poor concordance for injuries, such as machinery related, struck by falling object, overexertion, cutting or piercing, and caught in or between. Hospital discharge records are collected for administrative reasons, and thus are readily available. Depending on the coding reliability and validity, hospital discharge records represent an alternative and independent source of information for serious work related injuries. The study findings support the use of hospital discharge records as a potential surveillance system for such injuries.
An evaluation of hospital discharge records as a tool for serious work related injury surveillance
Alamgir, H; Koehoorn, M; Ostry, A; Tompa, E; Demers, P
2006-01-01
Objectives To identify and describe work related serious injuries among sawmill workers in British Columbia, Canada using hospital discharge records, and compare the agreement and capturing patterns of the work related indicators available in the hospital discharge records. Methods Hospital discharge records were extracted from 1989 to 1998 for a cohort of sawmill workers. Work related injuries were identified from these records using International Classification of Disease (ICD‐9) external cause of injury codes, which have a fifth digit, and sometimes a fourth digit, indicating place of occurrence, and the responsibility of payment schedule, which identifies workers' compensation as being responsible for payment. Results The most frequent causes of work related hospitalisations were falls, machinery related, overexertion, struck against, cutting or piercing, and struck by falling objects. Almost all cases of machinery related, struck by falling object, and caught in or between injuries were found to be work related. Overall, there was good agreement between the two indicators (ICD‐9 code and payment schedule) for identifying work relatedness of injury hospitalisations (kappa = 0.75, p < 0.01). There was better concordance between them for injuries, such as struck against, drowning/suffocation/foreign body, fire/flame/natural/environmental, and explosions/firearms/hot substance/electric current/radiation, and poor concordance for injuries, such as machinery related, struck by falling object, overexertion, cutting or piercing, and caught in or between. Conclusions Hospital discharge records are collected for administrative reasons, and thus are readily available. Depending on the coding reliability and validity, hospital discharge records represent an alternative and independent source of information for serious work related injuries. The study findings support the use of hospital discharge records as a potential surveillance system for such injuries. PMID:16556751
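The kappa statistic used in the two records above to quantify agreement between the ICD-9 external-cause indicator and the payment-schedule indicator can be computed directly from a 2x2 cross-tabulation. A minimal sketch with illustrative counts (not the study's data):

```python
# Cohen's kappa for agreement between two binary indicators of
# work-relatedness (ICD-9 external-cause code vs. payer = workers'
# compensation). The 2x2 counts below are made up for illustration.
import numpy as np

#                  payer: work-related   payer: not work-related
table = np.array([[120,                  15],    # ICD-9: work-related
                  [10,                   255]])  # ICD-9: not work-related

n = table.sum()
p_observed = np.trace(table) / n                 # proportion of agreement
row_marg = table.sum(axis=1) / n
col_marg = table.sum(axis=0) / n
p_expected = np.dot(row_marg, col_marg)          # agreement expected by chance

kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"observed agreement = {p_observed:.2f}, kappa = {kappa:.2f}")
```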
NASA Astrophysics Data System (ADS)
Nightingale, James; Wang, Qi; Grecos, Christos; Goma, Sergio
2014-02-01
High Efficiency Video Coding (HEVC), the latest video compression standard (also known as H.265), can deliver video streams of comparable quality to the current H.264 Advanced Video Coding (H.264/AVC) standard with a 50% reduction in bandwidth. Research into SHVC, the scalable extension to the HEVC standard, is still in its infancy. One important area for investigation is whether, given the greater compression ratio of HEVC (and SHVC), the loss of packets containing video content will have a greater impact on the quality of delivered video than is the case with H.264/AVC or its scalable extension H.264/SVC. In this work we empirically evaluate the layer-based, in-network adaptation of video streams encoded using SHVC in situations where dynamically changing bandwidths and datagram loss ratios require the real-time adaptation of video streams. Through extensive experimentation, we establish a comprehensive set of benchmarks for SHVC-based high-definition video streaming in loss-prone network environments such as those commonly found in mobile networks. Among other results, we highlight that packet losses of only 1% can lead to a substantial reduction in PSNR of over 3 dB and to error propagation across more than 130 pictures following the one in which the loss occurred. This work is one of the earliest studies in this area to report benchmark evaluation results for the effects of datagram loss on SHVC picture quality, and it offers empirical and analytical insights into SHVC adaptation to lossy, mobile networking conditions.
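The picture-quality metric behind the reported 3 dB degradation is peak signal-to-noise ratio. As a minimal illustration (not part of the study's tooling), a luma-plane PSNR can be computed as below; the frame arrays are random placeholders standing in for 8-bit video frames.

```python
# PSNR between a reference picture and a decoded picture.
import numpy as np

def psnr(reference: np.ndarray, decoded: np.ndarray, peak: float = 255.0) -> float:
    """PSNR in dB: 10*log10(peak^2 / MSE)."""
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
dec = np.clip(ref.astype(int) + np.random.normal(0, 4, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(ref, dec):.2f} dB")
```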
Benoit, M F; Ma, J F; Upperman, B A
2017-02-01
In 1992, Congress implemented a relative value unit (RVU) payment system to set reimbursement for all procedures covered by Medicare. In 1997, data showed that a significant gender bias existed in reimbursement for gynecologic compared with urologic procedures. The present study was performed to compare work and total RVUs for gender-specific procedures effective January 2015 and to evaluate whether time has narrowed the gender-based RVU disparity. Using the 2015 CPT codes, we compared work and total RVUs for 50 pairs of gender-specific procedures. We also evaluated 2015 procedure-related provider compensation. The groups were matched so that the procedures were anatomically similar. We also compared the 2015 RVU and fee schedules with those of 1997. Evaluation of work RVUs for the paired procedures revealed that in 36 cases (72%), the male procedure had a higher wRVU and tRVU than its female counterpart. For total fee/reimbursement, 42 (84%) male-specific procedures were compensated at a higher rate than the paired female procedures. On average, male-specific surgeries were reimbursed at a rate 27.67% higher than female-specific surgeries. Work RVUs for female procedures have increased only minimally from 1997 to 2015. Time and effort have trended towards resolution of some gender-related differences in procedure valuation, but there are still significant RVU and compensation differences that should be further reviewed and modified, as surgical time and effort are highly correlated. Copyright © 2016. Published by Elsevier Inc.
Comparison of turbulence models and CFD solution options for a plain pipe
NASA Astrophysics Data System (ADS)
Canli, Eyub; Ates, Ali; Bilir, Sefik
2018-06-01
The present paper is partly a status report on ongoing PhD work concerning turbulent flow in a thick-walled pipe for the analysis of conjugate heat transfer. An ongoing effort on CFD investigation of this problem using cylindrical coordinates and dimensionless governing equations is described alongside a literature review. The PhD work will be conducted using an in-house developed code. However, it needs preliminary evaluation by means of the commercial codes available in the field. Accordingly, ANSYS CFD was utilized in order to evaluate mesh structure needs and to assess the turbulence models and solution options in terms of computational cost versus the significance of differences. The present work contains a literature survey, an arrangement of the governing equations of the PhD work, the CFD essentials of the preliminary analysis, and findings about the mesh structure and solution options. The mesh element number was varied between 5,000 and 320,000. The k-ɛ, k-ω, Spalart-Allmaras and Viscous-Laminar models were compared. The Reynolds number was varied between 1,000 and 50,000. As may be expected from the literature, k-ɛ yields more favorable results near the pipe axis and k-ω yields more convenient results near the wall. However, k-ɛ is found sufficient to capture the turbulent structures for a conjugate heat transfer problem in a thick-walled plain pipe.
Fully-Coupled Fluid/Structure Vibration Analysis Using MSC/NASTRAN
NASA Technical Reports Server (NTRS)
Fernholz, Christian M.; Robinson, Jay H.
1996-01-01
MSC/NASTRAN's performance in the solution of fully-coupled fluid/structure problems is evaluated. NASTRAN is used to perform normal modes (SOL 103) and forced-response analyses (SOL 108, 111) on cylindrical and cubic fluid/structure models. Bulk data file cards unique to the specification of a fluid element are discussed and analytic partially-coupled solutions are derived for each type of problem. These solutions are used to evaluate NASTRAN's solutions for accuracy. Appendices to this work include NASTRAN data presented in fringe plot form, FORTRAN source code listings written in support of this work, and NASTRAN data file usage requirements for each analysis.
Exarchakis, Georgios; Lücke, Jörg
2017-11-01
Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
Outdoor Test Facility and Related Facilities | Photovoltaic Research | NREL
The Outdoor Test Facility (OTF) is used to test advanced or emerging photovoltaic (PV) technologies under simulated, accelerated indoor and outdoor conditions and to evaluate prototype, pre-commercial, and commercial PV modules. One of the major roles of researchers at the OTF is to work with industry to develop uniform, consensus standards and codes for testing PV…
Design and optimization of a portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processor Units (GPUs), exploiting aggressive data-parallelism and delivering higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent, making it tedious and error-prone to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.
Evaluating pharmacy leader development through the seven action logics.
Philip, Achsah; Desai, Avani; Nguyen, Phouc Anne; Birney, Patrick; Colavecchia, Anthony; Karralli, Rusol; Smith, Lindsey; Lorimer, Dirk; Burgess, Gwen; Munch, Kyle; Daniel, Nelvin; Lionetti, Jason; Garey, Kevin W
2016-01-15
Pharmacy leader development over time was analyzed using the seven action logics. As part of an ongoing leadership seminar series, students were required to select a visionary pharmacy leader and conduct a structured interview to evaluate pharmacy leaders' action logics. A standardized questionnaire comprising 13 questions was created by the class. Questions addressed leadership qualities during the leaders' early years, education years, and work years. Transcripts were then coded by two separate trained investigators based on the leader's stage of life to provide a score for each action logic individually over time. Kappa coefficient was used to evaluate interrater agreement. A total of 14 leaders were interviewed. All leaders were currently employed and had won national awards for their contributions to pharmacy practice. Overall, there was 82% agreement between the two evaluators' scores for the various characteristics. Action logics changed based on the leaders' life stage. Using aggregate data from all leader interviews, a progression from lower-order action logics (opportunist, diplomat, expert) to higher-order action logics (strategist, alchemist) was found. Ten leaders (71%) were diplomats during their early years. Six leaders (43%) were experts during their education years, and 4 (29%) were strategists or alchemists. During the third life stage analyzed (the work years), 6 leaders (43%) were strategists, and 2 were alchemists. During their work years, all leaders had a percentage of their answers coded as alchemist (range, 5-22%). Throughout their professional careers, pharmacy leaders continually develop skills through formal education and mentorship that follow action logics. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Evans, Austin Lewis
1987-01-01
A computer code to model the steady-state performance of a monogroove heat pipe for the NASA Space Station is presented, including the effects on heat pipe performance of a screen in the evaporator section which deals with transient surges in the heat input. Errors in a previous code have been corrected, and the new code adds additional loss terms in order to model several different working fluids. Good agreement with existing performance curves is obtained. From a preliminary evaluation of several of the radiator design parameters it is found that an optimum fin width could be achieved but that structural considerations limit the thickness of the fin to a value above optimum.
Parallel DSMC Solution of Three-Dimensional Flow Over a Finite Flat Plate
NASA Technical Reports Server (NTRS)
Nance, Robert P.; Wilmoth, Richard G.; Moon, Bongki; Hassan, H. A.; Saltz, Joel
1994-01-01
This paper describes a parallel implementation of the direct simulation Monte Carlo (DSMC) method. Runtime library support is used for scheduling and execution of communication between nodes, and domain decomposition is performed dynamically to maintain a good load balance. Performance tests are conducted using the code to evaluate various remapping and remapping-interval policies, and it is shown that a one-dimensional chain-partitioning method works best for the problems considered. The parallel code is then used to simulate the Mach 20 nitrogen flow over a finite-thickness flat plate. It is shown that the parallel algorithm produces results which compare well with experimental data. Moreover, it yields significantly faster execution times than the scalar code, as well as very good load-balance characteristics.
Prestes, R C; Silva, L B; Torri, A M P; Kubota, E H; Rosa, C S; Roman, S S; Kempka, A P; Demiate, I M
2015-07-01
The objective of this work was to evaluate the effect of adding different starches (native and modified) on the physicochemical, sensory, structural and microbiological characteristics of low-fat chicken mortadella. Two formulations containing native cassava and regular corn starch, coded CASS (5.0 % of cassava starch) and CORN (5.0 % of regular corn starch), and one formulation produced with physically treated starch coded as MOD1 (2.5 % of Novation 2300) and chemically modified starch coded as MOD2 (2.5 % of Thermtex) were studied. The following tests were performed: physicochemical characterization (moisture, ash, protein, starch and lipid contents, and water activity); cooling, freezing and reheating losses; texture (texture profile test); color coordinates (L*, a*, b*, C and h); microbiological evaluation; sensory evaluation (multiple comparison and preference test); and histological evaluation (light microscopy). There was no significant difference (p > 0.05) for ash, protein, cooling loss, cohesiveness or in the preference test for the tested samples. The other evaluated parameters showed significant differences (p < 0.05). Histological study allowed for a qualitative evaluation between the physical properties of the food and its microscopic structure. The best results were obtained for formulation MOD2 (2.5 % Thermtex). The addition of modified starch resulted in a better performance than the native starch in relation to the evaluated technological parameters, mainly in relation to reheating losses, which demonstrated the good interaction between the modified starch in the structure of the product and the possibility of the application of this type of starch in other types of functional meat products.
Modeling chemical gradients in sediments under losing and gaining flow conditions: The GRADIENT code
NASA Astrophysics Data System (ADS)
Boano, Fulvio; De Falco, Natalie; Arnon, Shai
2018-02-01
Interfaces between sediments and water bodies often represent biochemical hotspots for nutrient reactions and are characterized by steep concentration gradients of different reactive solutes. Vertical profiles of these concentrations are routinely collected to obtain information on nutrient dynamics, and simple codes have been developed to analyze these profiles and determine the magnitude and distribution of reaction rates within sediments. However, existing publicly available codes do not consider the potential contribution of water flow in the sediments to nutrient transport, and their applications to field sites with significant water-borne nutrient fluxes may lead to large errors in the estimated reaction rates. To fill this gap, the present work presents GRADIENT, a novel algorithm to evaluate distributions of reaction rates from observed concentration profiles. GRADIENT is a Matlab code that extends a previously published framework to include the role of nutrient advection, and provides robust estimates of reaction rates in sediments with significant water flow. This work discusses the theoretical basis of the method and shows its performance by comparing the results to a series of synthetic data and to laboratory experiments. The results clearly show that in systems with losing or gaining fluxes, the inclusion of such fluxes is critical for estimating local and overall reaction rates in sediments.
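A minimal sketch of the underlying idea (not the GRADIENT code itself): for a steady one-dimensional profile governed by D C'' - q C' + R(z) = 0, the local reaction rate can be recovered from a measured profile as R(z) = q C'(z) - D C''(z). The synthetic profile, the coefficient values, and the sign convention for the flux in the snippet below are illustrative assumptions.

```python
# Recovering a local reaction-rate distribution from a concentration profile
# when advection (losing/gaining flow) matters.
import numpy as np

z = np.linspace(0.0, 0.10, 101)          # depth below the interface [m]
C = 0.5 * np.exp(-z / 0.02)              # synthetic concentration profile [mol/m^3]
D = 1.0e-9                               # effective diffusion coefficient [m^2/s]
q = 1.0e-6                               # downward (losing) flux [m/s], sign convention assumed

dC = np.gradient(C, z)                   # first derivative C'(z)
d2C = np.gradient(dC, z)                 # second derivative C''(z)

R = q * dC - D * d2C                     # local production/consumption rate
integrated = np.sum(0.5 * (R[1:] + R[:-1]) * np.diff(z))   # trapezoidal depth integral
print("depth-integrated rate:", integrated)
```

Neglecting the advective term (setting q = 0) in a site with significant losing or gaining flow is exactly the error source the abstract warns about.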
Aerodynamic Interference Due to MSL Reaction Control System
NASA Technical Reports Server (NTRS)
Dyakonov, Artem A.; Schoenenberger, Mark; Scallion, William I.; VanNorman, John W.; Novak, Luke A.; Tang, Chun Y.
2009-01-01
An investigation of the effectiveness of the reaction control system (RCS) of the Mars Science Laboratory (MSL) entry capsule during atmospheric flight has been conducted. The reason for the investigation is that MSL is designed to fly a lifting, actively guided entry with hypersonic bank maneuvers; therefore, an understanding of RCS effectiveness is required. In the course of the study, several jet configurations were evaluated using the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA) code, the Data Parallel Line Relaxation (DPLR) code, the Fully Unstructured 3D (FUN3D) code and the Overset Grid Flowsolver (OVERFLOW) code. Computations indicated that some of the proposed configurations might induce aero-RCS interactions sufficient to impede and even overwhelm the intended control torques. It was found that the maximum potential for aero-RCS interference exists around peak dynamic pressure along the trajectory. The present analysis largely relies on computational methods. Ground testing, flight data and computational analyses are required to fully understand the problem. At the time of this writing, some experimental work spanning the Mach number range 2.5 through 4.5 has been completed and used to establish preliminary levels of confidence for the computations. As a result of the present work, a final RCS configuration has been designed so as to minimize aero-interference effects, and it is the design baseline for the MSL entry capsule.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman, M.; Members of the Cross Sections Evaluation Working Group
2009-06-01
In December 2006, the Cross Section Evaluation Working Group (CSEWG) of the United States released the new ENDF/B-VII.0 library. This represented a considerable achievement, as it was the first major release since 1990, when ENDF/B-VI was made publicly available. The two libraries have been released in the same format, ENDF-6, which was originally developed for the ENDF/B-VI library. In the early stage of work on the VII-th generation of the library, CSEWG made the important decision to use the same formats. This decision was adopted even though it was argued that it would be timely to modernize the formats, and several interesting ideas were proposed. After careful deliberation, CSEWG concluded that actual implementation would require considerable resources to modify processing codes and to guarantee the high quality of the files processed by these codes. In view of this, the idea of format modernization was postponed and the ENDF-6 format was adopted for the new ENDF/B-VII library. In several other areas related to ENDF we did our best to move beyond established tradition and achieve maximum modernization. Thus, the 'Big Paper' on ENDF/B-VII.0 was published, also in December 2006, as a Special Issue of Nuclear Data Sheets 107 (2006) 2931-3060. The new web retrieval and plotting system for ENDF-6 formatted data, Sigma, was developed by the NNDC and released in 2007. An extensive paper was published on the advanced tool for nuclear reaction data evaluation, EMPIRE, in 2007. This effort was complemented with the release of an updated set of ENDF checking codes in 2009. As the final item on this list, a major revision of the ENDF-6 Formats Manual was made. This work started in 2006 and came to fruition in 2009, as documented in the present report.
Musshauser, Doris; Bader, Angelika; Wildt, Beatrice; Hochleitner, Margarethe
2006-09-01
The aim of the present study was to evaluate the physical and mental health status of female workers from five different occupational groups and to identify possible sociodemographic and gender-coded family-related factors as well as work characteristics influencing women's health. The identified predictors of health status were subjected to a gender-sensitive analysis and their relations to one another are discussed. A total of 1083 female hospital workers, including medical doctors, technical and administrative personnel, nurses and a group mainly consisting of scientific personnel and psychologists, completed a questionnaire measuring work- and family-related variables, sociodemographic data and the Short-Form 36 Health Questionnaire (SF-36). Data were analysed by multivariate regression analyses. Female medical doctors reported the highest scores for all physical health dimensions except General Health. Mental health status was generally low among administrative personnel, while the heterogeneous group, "others", scored highest on all mental health component scores. A series of eight regression analyses was performed. Three variables contributed highly significantly to all SF-36 subscale scores: age, satisfaction with work schedule, and the unpaid work variable. Age had the strongest influence on all physical dimensions except General Health (beta=-0.17) and had no detectable influence on mental health scores. The unpaid work variable (beta=-0.23; p<0.001) exerted a stronger influence on General Health than did age. Nevertheless, these variables were limited predictors of physical and mental health status. In all occupational groups, the amount of time spent daily on child care and household tasks, as a traditional gender-coded factor, and satisfaction with work schedule were the only contributors to mental health among the working women in this study. Traditional sociodemographic data had no effect on mental health status. In addition to age, these factors were shown to be the only predictors of the physical health status of female workers. Gender-coded factors matter. These findings underline the importance of including gender-coded family- and work-related variables in medical research, over and above basic sociodemographic data, in order to describe study populations more clearly.
Using Qualitative Methods to Evaluate a Family Behavioral Intervention for Type 1 Diabetes
Herbert, Linda Jones; Sweenie, Rachel; Kelly, Katherine Patterson; Holmes, Clarissa; Streisand, Randi
2013-01-01
Introduction The objectives of this study were to qualitatively evaluate a dyadic adolescent-parent type 1 diabetes (T1D) program developed to prevent deterioration in diabetes care among adolescents with T1D and provide recommendations for program refinement. Method Thirteen adolescent-parent dyads who participated in the larger RCT, the TeamWork Project, were interviewed regarding their perceptions of their participation in the program and current T1D challenges. Interviews were transcribed and coded to establish broad themes. Results Adolescents and parents thought the TeamWork Project sessions were helpful and taught them new information. Five themes catalog findings from the qualitative interviews: TeamWork content, TeamWork structure, transition of responsibility, current and future challenges, and future intervention considerations. Discussion Addressing T1D challenges as a parent-adolescent dyad via a behavioral clinic program is helpful to families during adolescence. Findings highlight the utility of qualitative evaluation to tailor interventions for the unique challenges related to pediatric chronic illness. PMID:24269281
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shingledecker, John P
2007-01-01
Creep-rupture experiments were conducted on tubes of HR6W and Haynes 230, candidate ultrasupercritical (USC) alloys, to evaluate the effects of cold work and recrystallization during high-temperature service. These creep tests were performed by internally pressurizing cold-bent boiler tubes at 775 C for times up to 8000 hours. The bends were fabricated with cold-work levels beyond the current ASME Boiler and Pressure Vessel (ASME B&PV) Code Section I limits for austenitic stainless steels. Destructive metallographic evaluation of the crept tube bends was used to determine the effects of cold work and the degree of recrystallization. The metallographic analysis, combined with an evaluation of the creep and rupture data, suggests that solid-solution strengthened nickel-based alloys can be fabricated for high-temperature service at USC conditions utilizing levels of cold work higher than the levels currently allowed for austenitic stainless steels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, Matthew; Yin, Shengjun; Stevens, Gary
2012-01-01
In past years, the authors have undertaken various studies of nozzles in both boiling water reactors (BWRs) and pressurized water reactors (PWRs) located in the reactor pressure vessel (RPV) adjacent to the core beltline region. Those studies described stress and fracture mechanics analyses performed to assess various RPV nozzle geometries, which were selected based on their proximity to the core beltline region, i.e., those nozzle configurations that are located close enough to the core region such that they may receive sufficient fluence prior to end-of-life (EOL) to require evaluation of embrittlement as part of the RPV analyses associated with pressure-temperature (P-T) limits. In this paper, additional stress and fracture analyses are summarized that were performed for additional PWR nozzles with the following objectives: To expand the population of PWR nozzle configurations evaluated, which was limited in the previous work to just two nozzles (one inlet and one outlet nozzle). To model and understand differences in stress results obtained for an internal pressure load case using a two-dimensional (2-D) axi-symmetric finite element model (FEM) vs. a three-dimensional (3-D) FEM for these PWR nozzles. In particular, the ovalization (stress concentration) effect of two intersecting cylinders, which is typical of RPV nozzle configurations, was investigated. To investigate the applicability of previously recommended linear elastic fracture mechanics (LEFM) hand solutions for calculating the Mode I stress intensity factor for a postulated nozzle corner crack under pressure loading for these PWR nozzles. These analyses were performed to further expand earlier work completed to support potential revision and refinement of Title 10 to the U.S. Code of Federal Regulations (CFR), Part 50, Appendix G, Fracture Toughness Requirements, and are intended to supplement similar evaluations of nozzles presented at the 2008, 2009, and 2011 Pressure Vessels and Piping (PVP) Conferences. This work is also relevant to the ongoing efforts of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (B&PV) Code, Section XI, Working Group on Operating Plant Criteria (WGOPC) to incorporate nozzle fracture mechanics solutions into a revision to ASME B&PV Code, Section XI, Nonmandatory Appendix G.
Calculation evaluation of multiplying properties of LWR with thorium fuel
NASA Astrophysics Data System (ADS)
Shamanin, I. V.; Grachev, V. M.; Knyshev, V. V.; Bedenko, S. V.; Novikova, N. G.
2017-01-01
This work presents the results of a computational study of the multiplying properties of a unit cell and an LWR fuel assembly loaded with fuel pellets of a high-temperature gas-cooled thorium reactor. The calculations show that thorium can be used effectively in an LWR; in this case the required amount of fissile isotope is 2.45 times smaller than in the standard LWR loading. The research and numerical experiments were carried out using the verified calculation code MCU5, modern libraries of evaluated nuclear data, and multigroup approximations.
Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L
2014-05-01
Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias of exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate, in an epidemiological context, the performance of a computer algorithm that translates narrative job descriptions into a standard classification of jobs (2010 Standard Occupational Classification). The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in the Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that is in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probabilities of exceeding OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on the presence of ambiguity in the assigned job classification (κ = 0.5-0.8). Thus, the success of automated coding appears to depend on the setting and the type of exposure being assessed. Our overall recommendation is that automated translation of short narrative descriptions of jobs for exposure assessment is feasible in some settings and essential for large cohorts, especially if combined with manual coding to both assess the reliability of coding and further refine the coding algorithm.
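As a toy illustration of the general idea (not the algorithm evaluated in the study), the sketch below fuzzy-matches free-text job descriptions against a small hypothetical title-to-SOC crosswalk and reports raw agreement with manual codes; the crosswalk, titles and codes are placeholders.

```python
# Map free-text job descriptions to SOC-like codes by fuzzy matching against
# a hypothetical crosswalk, then compare with manually assigned codes.
import difflib
from typing import Optional

crosswalk = {                      # hypothetical title -> SOC code lookup
    "registered nurse": "29-1141",
    "truck driver": "53-3032",
    "roofer": "47-2181",
    "software developer": "15-1252",
}

def auto_code(narrative: str) -> Optional[str]:
    """Return the SOC code of the closest crosswalk title, if any match is close enough."""
    match = difflib.get_close_matches(narrative.lower().strip(),
                                      crosswalk.keys(), n=1, cutoff=0.6)
    return crosswalk[match[0]] if match else None

narratives = ["Registered nurse (ICU)", "long-haul truck driver", "roofing worker"]
manual = ["29-1141", "53-3032", "47-2181"]           # gold-standard manual codes
auto = [auto_code(text) for text in narratives]

agreement = sum(a == m for a, m in zip(auto, manual)) / len(manual)
print(auto, f"raw agreement = {agreement:.2f}")
```

Note that the third narrative fails to match, mirroring the abstract's point that automated coding degrades when job descriptions are ambiguous or phrased unlike the reference titles.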
Context-aware and locality-constrained coding for image categorization.
Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun
2014-01-01
Improving the coding strategy for BOF (Bag-of-Features) based feature design has drawn increasing attention in recent image categorization work. However, ambiguity in the coding procedure still impedes its further development. In this paper, we introduce a context-aware and locality-constrained coding (CALC) approach that uses context information for describing objects in a discriminative way. This is achieved by learning a word-to-word co-occurrence prior and imposing this context information on locality-constrained coding. First, the local context of each category is evaluated by learning a word-to-word co-occurrence matrix representing the spatial distribution of local features in a neighboring region. Then, the learned co-occurrence matrix is used for measuring the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and context space, while introducing feature weights, is proposed. This novel coding strategy not only preserves semantic information in coding, but also has the ability to alleviate the noise distortion of each class. Extensive experiments on several available datasets (Scene-15, Caltech101, and Caltech256) are conducted to validate the superiority of our algorithm by comparing it with baselines and recently published methods. Experimental results show that our method significantly improves the performance of the baselines and achieves comparable and even better performance than the state of the art.
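For orientation, the locality-constrained part of such a coding strategy can be written in a few lines; the sketch below implements a generic LLC-style encoding with a locality adaptor and omits the paper's context-aware co-occurrence term. The codebook, descriptor and parameter values are illustrative assumptions, not the authors' settings.

```python
# Locality-constrained coding of one descriptor over a codebook:
# minimize ||x - B^T c||^2 + lam * ||d * c||^2  subject to  sum(c) = 1,
# where d penalizes codewords far from x.
import numpy as np

def locality_constrained_code(x, B, lam=1e-4, sigma=1.0):
    """Encode descriptor x (d,) over codebook B (K, d) with a locality penalty."""
    dist = np.linalg.norm(B - x, axis=1)          # distance to each codeword
    d = np.exp(dist / sigma)                      # locality adaptor
    z = B - x                                     # codebook shifted by the descriptor
    C = z @ z.T                                   # data covariance (K, K)
    C += lam * np.diag(d ** 2)                    # locality regularization
    c = np.linalg.solve(C, np.ones(len(B)))       # solve (C + lam*diag(d^2)) c = 1
    return c / c.sum()                            # enforce the sum-to-one constraint

rng = np.random.default_rng(1)
codebook = rng.normal(size=(64, 128))             # K=64 codewords, 128-D SIFT-like features
descriptor = rng.normal(size=128)
code = locality_constrained_code(descriptor, codebook)
print(code.shape, code.sum())                     # (64,) and ~1.0
```

The context-aware extension described in the abstract would additionally reweight codewords by a context distance learned from the word-to-word co-occurrence matrix.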
Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.
Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh
2018-01-01
Systematic evaluation of Health Information Technology (HIT) and of users' views leads to the modification and development of these technologies in accordance with users' needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire and analyzed using SPSS version 16.0. In the staff's view, among the advantages of coding software, reducing coding time had the highest mean score (3.82) and cost reduction the lowest (3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of using coding software. In general, the results of this study showed that coding software has deficiencies in some cases. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, support for selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.
Zuckerman, Joseph D; Kubiak, Eric N; Immerman, Igor; Dicesare, Paul
2005-04-01
The impact of strict enforcement of Section 405 of the New York State Public Health Code to restrict resident work to eighty hours per week and the adoption of a similar policy by the Accreditation Council on Graduate Medical Education in 2002 for orthopaedic residency training have not been evaluated. Adoption of these rules has created accreditation as well as staffing problems and has generated controversy in the surgical training community. The purposes of this study were (1) to evaluate the attitudes of orthopaedic residents and attending surgeons toward the Code 405 work-hour regulations and the effect of those regulations on the perceived quality of residency training, quality of life, and patient care and (2) to quantify the effect of the work-hour restrictions on the actual number of hours worked. We administered a thirty-four-question Likert-style questionnaire to forty-eight orthopaedic surgery residents (postgraduate years [PGY]-2 through 5) and a similar twenty-nine-question Likert-style questionnaire to thirty-nine orthopaedic attending surgeons. All questionnaires were collected anonymously and analyzed. Additionally, resident work hours before and after strict enforcement of the Code 405 regulations were obtained from resident time sheets. The average weekly work hours decreased from 89.25 to 74.25 hours for PGY-2 residents and from 86.5 to 73.25 hours for PGY-3 residents, and they increased from 61.5 to 68.5 hours for PGY-4 residents. Residents at all levels felt that they had increased time available for reading. There was general agreement between attending and resident surgeons that their operating experience had been negatively impacted. Senior residents thought that their education had been negatively affected, while junior residents thought that their operating experience in general had been negatively affected. Senior residents and attending surgeons felt that continuity of care had been negatively impacted. All agreed that quality of life for the residents had improved and that residents were more rested. On the basis of the survey data, the implementation of the new work-hour restrictions was found to result in a decrease in the number of hours worked per week for PGY-2 and PGY-3 residents and in an increase in work hours for PGY-4 residents. This could explain the definite difference between the attitudes expressed by the senior residents and those of the junior residents. Senior residents felt that their education was negatively impacted by the work rules, while junior residents expressed a more neutral view. However, senior residents did not believe that their operative experience was as negatively impacted as did junior residents. Although junior and senior residents and attending surgeons agreed that resident quality of life had improved, we were not able to determine whether this offset the perceived negative impact on education, continuity of care, and operative experience.
NASA Astrophysics Data System (ADS)
Del Lama, L. S.; Cunha, D. M.; Poletti, M. E.
2017-08-01
The presence and morphology of microcalcification clusters are key early indicators of breast carcinoma. However, the visualization of these structures may be hampered by overlapping tissue, even in digital mammography systems. Although digital mammography is the current standard for breast cancer diagnosis, further improvements are needed to address some of these physical limitations. One possible solution is the dual-energy (DE) technique, which is able to highlight specific lesions or cancel out the tissue background. In this sense, this work aimed to evaluate several quantities of interest in radiation applications and compare those values with those reported in the literature, in order to validate a modified PENELOPE code for digital mammography applications. Specifically, the scatter-to-primary ratio (SPR), the scatter fraction (SF) and the normalized mean glandular dose (DgN) were evaluated by simulation, and the resulting values were compared with those found in earlier studies. Our results show good agreement for the evaluated quantities, equal to or better than 5% for the scatter- and dose-related quantities when compared with the literature. Finally, a DE imaging chain was simulated and the visualization of microcalcifications was investigated.
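For reference, the two scatter quantities compared above are linked by a simple identity when primary and scattered photons are tallied separately at the detector. The short snippet below uses illustrative counts only, not simulation output.

```python
# Relations between scatter-to-primary ratio (SPR) and scatter fraction (SF):
# SPR = S/P, SF = S/(S+P) = SPR/(1+SPR).
P, S = 8.0e5, 3.2e5          # primary and scattered photons reaching the detector (made up)

spr = S / P
sf = S / (S + P)
assert abs(sf - spr / (1.0 + spr)) < 1e-12   # the two definitions are consistent
print(f"SPR = {spr:.2f}, SF = {sf:.2f}")
```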
Lorkowski, Jacek; Mrzygłód, Mirosław; Kotela, Ireneusz; Kiełbasiewicz-Lorkowska, Ewa; Teul, Iwona
2013-01-01
According to the verdict of the Supreme Court in 2005, an employer may dismiss an employee if their conduct (including dress) exposes the employer to losses or threatens his interests. The aim of the study was a holistic assessment of the pleiotropic effects of high-heeled, pointed shoes on the health of the feet of women wearing them at work in accordance with the existing rules of the "business dress code". A holistic multidisciplinary analysis was performed, taking into account: 1) women employed in banks and other large corporations (82 persons); 2) a 2D FEM computer model, developed by the authors, of a foot deformed by pointed high-heeled shoes; 3) web sites found by searching for the phrase "business dress code". Over 60% of the women wore high-heeled shoes in the office. The following was found among those walking to work in high heels: 1) a reduction in quality of life in about 70% of cases, through the periodic occurrence of pain and reduced functional capacity of the feet; 2) an increase in pressure on the plantar side of the forefoot by at least a factor of two; 3) the continued effect of the forces deforming the forefoot. 1. An evolutionary change of "dress code" shoes is necessary in order to reduce the non-physiological overloading of the feet and the disability that follows from it. 2. These changes are particularly urgent in patients with so-called "sensitive feet".
Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic
NASA Astrophysics Data System (ADS)
Li, Yan; Dai, Shifang; Wu, Weiwei
2016-12-01
Recently, with the soaring of traffic among optical network units (ONUs), network coding (NC) is becoming an appealing technique for improving the performance of passive optical networks (PONs) carrying such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the coding condition to be established, and such passive, uncertain waiting severely limits the benefit of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of codable inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed to adapt to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
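The coding condition referred to above is the familiar bidirectional-exchange case of network coding: when ONU A holds a packet destined for ONU B and ONU B holds one destined for ONU A, the OLT can broadcast a single XOR-coded packet instead of two. A toy sketch of that XOR operation (not the paper's scheduling algorithm; payloads are made up) follows.

```python
# XOR network coding of one inter-ONU packet pair; each ONU decodes the
# broadcast by XOR-ing with the copy of its own upstream packet.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    n = max(len(a), len(b))                       # zero-pad to a common length
    a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a_to_b = b"payload from ONU A"
pkt_b_to_a = b"payload from ONU B!!"

coded = xor_bytes(pkt_a_to_b, pkt_b_to_a)         # one downstream transmission replaces two

recovered_at_b = xor_bytes(coded, pkt_b_to_a)     # ONU B uses its own buffered packet
recovered_at_a = xor_bytes(coded, pkt_a_to_b)     # ONU A does the same
assert recovered_at_b.rstrip(b"\x00") == pkt_a_to_b
assert recovered_at_a.rstrip(b"\x00") == pkt_b_to_a
```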
Deploying electromagnetic particle-in-cell (EM-PIC) codes on Xeon Phi accelerators boards
NASA Astrophysics Data System (ADS)
Fonseca, Ricardo
2014-10-01
The complexity of the phenomena involved in several relevant plasma physics scenarios, where highly nonlinear and kinetic processes dominate, makes purely theoretical descriptions impossible. Further understanding of these scenarios requires detailed numerical modeling, but fully relativistic particle-in-cell codes such as OSIRIS are computationally intensive. The quest towards Exaflop computer systems has led to the development of HPC systems based on add-on accelerator cards, such as GPGPUs and, more recently, the Xeon Phi accelerators that power the current number 1 system in the world. These cards, also referred to as Intel Many Integrated Core Architecture (MIC) devices, offer peak theoretical performance of >1 TFlop/s for general-purpose calculations on a single board, and are receiving significant attention as an attractive alternative to CPUs for plasma modeling. In this work we report on our efforts towards the deployment of an EM-PIC code on a Xeon Phi architecture system. We focus on the parallelization and vectorization strategies followed, and present a detailed performance evaluation of code performance in comparison with the CPU code.
NASA Technical Reports Server (NTRS)
Pratt, D. T.; Radhakrishnan, K.
1986-01-01
The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
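As a hedged, minimal illustration of the kind of stiff kinetics problem these solvers target (not the report's own code), SciPy's LSODA driver, a descendant of the LSODE lineage mentioned above, handles the classic Robertson test system, whose rate constants span nine orders of magnitude:

```python
# Integrating the stiff Robertson kinetics problem with an LSODA-type solver.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Three-species stiff reaction system."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
             0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
             3.0e7 * y2 ** 2]

sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="LSODA", rtol=1e-6, atol=[1e-8, 1e-12, 1e-8],
                t_eval=np.logspace(-5, 5, 11))
print(sol.t[-1], sol.y[:, -1])     # species fractions approach equilibrium
```

An explicit, non-stiff method applied to the same system would need prohibitively small steps during equilibration, which is precisely the inefficiency the report's discussion of stiffness detection and method choice addresses.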
Evaluating Quality Circles in a College of Further Education. Manchester Monographs 31.
ERIC Educational Resources Information Center
Atkinson, Tim
Quality circles (QCs) are small volunteer groups of workers who meet weekly with a trained leader operating to a strict code of conduct. They use the techniques of brainstorming, cause-and-effect classification, Pareto analysis, and presentation to consider work-related problems and recommend solutions to management. QCs have been tried in educational…
Developments in the Evaluation of Work-Based Learning: A UK Perspective
ERIC Educational Resources Information Center
Murdoch, Ian J.
2004-01-01
UK higher education institutions are now expected to be able to demonstrate that they are adhering to the Code of Practice for the Assurance of Academic Quality and Standards in Higher Education in Placement Learning. The responsibility for ensuring that a placement provides an adequate opportunity for its intended learning outcomes rests with the…
Air Traffic Controller Working Memory: Considerations in Air Traffic Control Tactical Operations
1993-09-01
Contents include: Information Processing System; 2. Air Traffic Controller Memory; 2.1 Memory Codes; 2.1.1 Visual Codes; 2.1.2 Phonetic Codes; 2.1.3 Semantic Codes. … raise an awareness of the memory requirements of ATC tactical operations by presenting information on working memory processes that are relevant to … working memory permeates every aspect of the controller's ability to process air traffic information and control live traffic.
Development of a CFD code for casting simulation
NASA Technical Reports Server (NTRS)
Murph, Jesse E.
1992-01-01
The task of developing a computational fluid dynamics (CFD) code to accurately model the mold filling phase of a casting operation was accomplished in a systematic manner. First the state-of-the-art was determined through a literature search, a code search, and participation with casting industry personnel involved in consortium startups. From this material and inputs from industry personnel, an evaluation of the currently available codes was made. It was determined that a few of the codes already contained sophisticated CFD algorithms and further validation of one of these codes could preclude the development of a new CFD code for this purpose. With industry concurrence, ProCAST was chosen for further evaluation. Two benchmark cases were used to evaluate the code's performance using a Silicon Graphics Personal Iris system. The results of these limited evaluations (because of machine and time constraints) are presented along with discussions of possible improvements and recommendations for further evaluation.
Evaluation of three coding schemes designed for improved data communication
NASA Technical Reports Server (NTRS)
Snelsire, R. W.
1974-01-01
Three coding schemes designed for improved data communication are evaluated. Four block codes are evaluated relative to a quality function, which depends on both the amount of data rejected and the error rate. The Viterbi maximum-likelihood decoding algorithm is reviewed as a decoding procedure. The evaluation is obtained by simulating the system on a digital computer. Short-constraint-length, rate 1/2 quick-look codes are studied, and their performance is compared to that of general nonsystematic codes.
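For readers unfamiliar with the decoding procedure named above, the sketch below shows a generic hard-decision Viterbi decoder for the textbook rate-1/2, constraint-length-3 convolutional code with generators (7, 5) octal; it is illustrative only and is not one of the quick-look codes evaluated in the report.

```python
# Encoder and hard-decision Viterbi decoder for the (7,5) rate-1/2 code.
import itertools

G = (0b111, 0b101)                      # generator polynomials (7, 5) octal
K = 3                                   # constraint length -> 4 trellis states

def encode(bits):
    state, out = 0, []
    for u in bits:
        reg = (u << (K - 1)) | state    # shift-register contents [u, s1, s2]
        out += [bin(reg & g).count("1") & 1 for g in G]   # parity per generator
        state = reg >> 1                # new state = two most recent inputs
    return out

def viterbi_decode(received, n_bits):
    n_states = 1 << (K - 1)
    metrics = [0] + [float("inf")] * (n_states - 1)       # encoder starts in state 0
    paths = [[] for _ in range(n_states)]
    for t in range(n_bits):
        r = received[2 * t:2 * t + 2]
        new_metrics = [float("inf")] * n_states
        new_paths = [None] * n_states
        for state, u in itertools.product(range(n_states), (0, 1)):
            reg = (u << (K - 1)) | state
            expected = [bin(reg & g).count("1") & 1 for g in G]
            branch = sum(a != b for a, b in zip(expected, r))   # Hamming distance
            nxt = reg >> 1
            cand = metrics[state] + branch
            if cand < new_metrics[nxt]:                  # keep the survivor path
                new_metrics[nxt] = cand
                new_paths[nxt] = paths[state] + [u]
        metrics, paths = new_metrics, new_paths
    return paths[metrics.index(min(metrics))]            # best path over all end states

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(msg)
coded[3] ^= 1                           # inject a single channel bit error
assert viterbi_decode(coded, len(msg)) == msg            # the error is corrected
```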
Beronius, Anna; Molander, Linda; Zilliacus, Johanna; Rudén, Christina; Hanberg, Annika
2018-05-28
The Science in Risk Assessment and Policy (SciRAP) web-based platform was developed to promote and facilitate structure and transparency in the evaluation of ecotoxicity and toxicity studies for hazard and risk assessment of chemicals. The platform includes sets of criteria and a colour-coding tool for evaluating the reliability and relevance of individual studies. The SciRAP method for evaluating in vivo toxicity studies was first published in 2014 and the aim of the work presented here was to evaluate and develop that method further. Toxicologists and risk assessors from different sectors and geographical areas were invited to test the SciRAP criteria and tool on a specific set of in vivo toxicity studies and to provide feedback concerning the scientific soundness and user-friendliness of the SciRAP approach. The results of this expert assessment were used to refine and improve both the evaluation criteria and the colour-coding tool. It is expected that the SciRAP web-based platform will continue to be developed and enhanced to keep up to date with the needs of end-users. Copyright © 2018 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Davis, S. J.; Egolf, T. A.
1980-07-01
Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free-field conditions. Results of the correlation show that the Farrassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. The results also suggest that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done in identifying and incorporating additional noise mechanisms into the code.
Beller, Elaine; Clark, Justin; Tsafnat, Guy; Adams, Clive; Diehl, Heinz; Lund, Hans; Ouzzani, Mourad; Thayer, Kristina; Thomas, James; Turner, Tari; Xia, Jun; Robinson, Karen; Glasziou, Paul
2018-05-19
Systematic reviews (SR) are vital to health care, but have become complicated and time-consuming, due to the rapid expansion of evidence to be synthesised. Fortunately, many tasks of systematic reviews have the potential to be automated or may be assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, to exchange data and results. Therefore, we initiated the International Collaboration for the Automation of Systematic Reviews (ICASR), to successfully put all the parts of automation of systematic review production together. The first meeting was held in Vienna in October 2015. We established a set of principles to enable tools to be developed and integrated into toolkits. This paper sets out the principles devised at that meeting, which cover the need for improvement in efficiency of SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high quality standards, flexibility of use and combining components, the need for collaboration and varied skills, the desire for open source, shared code and evaluation, and a requirement for replicability through rigorous and open evaluation. Automation has great potential to improve the speed of systematic reviews. Considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort, which will allow the integration of work by separate teams and build on the experience, code and evaluations done by the many teams working across the globe.
Ogunrin, Olubunmi A; Ogundiran, Temidayo O; Adebamowo, Clement
2013-01-02
The formulation and implementation of national ethical regulations to protect research participants is fundamental to ethical conduct of research. Ethics education and capacity are inadequate in developing African countries. This study was designed to develop a module for online training in research ethics based on the Nigerian National Code of Health Research Ethics and assess its ease of use and reliability among biomedical researchers in Nigeria. This was a three-phased evaluation study. Phase one involved development of an online training module based on the Nigerian Code of Health Research Ethics (NCHRE) and uploading it to the Collaborative Institutional Training Initiative (CITI) website while the second phase entailed the evaluation of the module for comprehensibility, readability and ease of use by 45 Nigerian biomedical researchers. The third phase involved modification and re-evaluation of the module by 30 Nigerian biomedical researchers and determination of test-retest reliability of the module using Cronbach's alpha. The online module was easily accessible and comprehensible to 95% of study participants. There were significant differences in the pretest and posttest scores of study participants during the evaluation of the online module (p = 0.001) with correlation coefficients of 0.9 and 0.8 for the pretest and posttest scores respectively. The module also demonstrated excellent test-retest reliability and internal consistency as shown by Cronbach's alpha coefficients of 0.92 and 0.84 for the pretest and posttest respectively. The module based on the Nigerian Code was developed, tested and made available online as a valuable tool for training in cultural and societal relevant ethical principles to orient national and international biomedical researchers working in Nigeria. It would complement other general research ethics and Good Clinical Practice modules. Participants suggested that awareness of the online module should be increased through seminars, advertisement on government websites and portals used by Nigerian biomedical researchers, and incorporation of the Code into the undergraduate medical training curriculum.
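For readers unfamiliar with the reliability statistic used above, Cronbach's alpha can be computed directly from an items-by-respondents score matrix. The sketch below is generic and uses invented data; it is not drawn from the study's instrument.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Toy example: five respondents answering four items on a 1-5 scale.
demo = [[3, 4, 3, 4], [2, 2, 3, 2], [4, 5, 4, 5], [3, 3, 3, 4], [1, 2, 2, 1]]
print(round(cronbach_alpha(demo), 2))
```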
Finite element modelling of crash response of composite aerospace sub-floor structures
NASA Astrophysics Data System (ADS)
McCarthy, M. A.; Harte, C. G.; Wiggenraad, J. F. M.; Michielsen, A. L. P. J.; Kohlgrüber, D.; Kamoulakos, A.
Composite energy-absorbing structures for use in aircraft are being studied within a European Commission research programme (CRASURV - Design for Crash Survivability). One of the aims of the project is to evaluate the current capabilities of crashworthiness simulation codes for composites modelling. This paper focuses on the computational analysis, using explicit finite element analysis, of a number of quasi-static and dynamic tests carried out within the programme. It describes the design of the structures, the analysis techniques used, and the results of the analyses in comparison to the experimental test results. It has been found that current multi-ply shell models are capable of modelling the main energy-absorbing processes at work in such structures. However, some deficiencies exist, particularly in modelling fabric composites. Developments within the finite element code are taking place as a result of this work which will enable better representation of composite fabrics.
Evaluation of icing drag coefficient correlations applied to iced propeller performance prediction
NASA Technical Reports Server (NTRS)
Miller, Thomas L.; Shaw, R. J.; Korkan, K. D.
1987-01-01
Evaluation of three empirical icing drag coefficient correlations is accomplished through application to a set of propeller icing data. The various correlations represent the best means currently available for relating drag rise to various flight and atmospheric conditions for both fixed-wing and rotating airfoils, and the work presented here illustrates and evaluates one such application of the latter case. The origins of each of the correlations are discussed, and their apparent capabilities and limitations are summarized. These correlations have been made an integral part of a computer code, ICEPERF, which has been designed to calculate iced propeller performance. Comparison with experimental propeller icing data shows generally good agreement, with the quality of the predicted results seen to be directly related to the radial icing extent of each case. The code's capability to properly predict thrust coefficient, power coefficient, and propeller efficiency is shown to be strongly dependent on the choice of correlation selected, as well as upon proper specification of radial icing extent.
Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior
2011-09-23
Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
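To make the island-selection step above concrete, here is a simplified greedy sketch: candidate exon regions arrive with precomputed tail probabilities (the Markov-chain p-values described above), and non-overlapping regions are accepted in order of significance while a Benjamini-Hochberg-style bound keeps the estimated false discovery rate below a target. The data layout and the specific FDR rule are illustrative assumptions, not the paper's exact criterion.

```python
def select_islands(candidates, fdr=0.05):
    """candidates: list of (start, end, p_value) tuples for regions inside
    coding exons, where p_value is the Markov-chain tail probability of the
    observed CpG count. Greedily accept non-overlapping regions, most
    significant first, while a step-up-style bound on accepted p-values holds."""
    ranked = sorted(candidates, key=lambda c: c[2])
    m = len(ranked)
    accepted = []
    for start, end, p in ranked:
        if any(start < e and s < end for s, e, _ in accepted):
            continue                      # overlaps an island already chosen
        k = len(accepted) + 1
        if p <= fdr * k / m:              # Benjamini-Hochberg-style threshold
            accepted.append((start, end, p))
        else:
            break                         # later candidates are even less significant
    return sorted(accepted)

print(select_islands([(10, 60, 1e-6), (50, 90, 0.3), (200, 260, 1e-4), (400, 440, 0.02)]))
```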
Overview and Current Status of Analyses of Potential LEU Design Concepts for TREAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Kontogeorgakos, D. C.; Papadias, D. D.
2015-10-01
Neutronic and thermal-hydraulic analyses have been performed to evaluate the performance of different low-enriched uranium (LEU) fuel design concepts for the conversion of the Transient Reactor Test Facility (TREAT) from its current high-enriched uranium (HEU) fuel. TREAT is an experimental reactor developed to generate high neutron flux transients for the testing of nuclear fuels. The goal of this work was to identify an LEU design which can maintain the performance of the existing HEU core while continuing to operate safely. A wide variety of design options were considered, with a focus on minimizing peak fuel temperatures and optimizing the power coupling between the TREAT core and test samples. Designs were also evaluated to ensure that they provide sufficient reactivity and shutdown margin for each control rod bank. Analyses were performed using the core loading and experiment configuration of historic M8 Power Calibration experiments (M8CAL). The Monte Carlo code MCNP was utilized for steady-state analyses, and transient calculations were performed with the point kinetics code TREKIN. Thermal analyses were performed with the COMSOL multi-physics code. Using the results of this study, a new LEU Baseline design concept is being established, which will be evaluated in detail in a future report.
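The transient calculations above rely on a point kinetics treatment (the TREKIN code); for orientation, a minimal one-delayed-group point kinetics integration is sketched below. The kinetics parameters, the reactivity step, and the explicit Euler scheme are illustrative choices only and are not taken from the TREAT analyses.

```python
def point_kinetics(rho, beta=0.007, Lambda=1e-4, lam=0.08, t_end=5.0, dt=1e-4):
    """One-delayed-group point kinetics:
        dn/dt = (rho - beta)/Lambda * n + lam * C
        dC/dt = beta/Lambda * n - lam * C
    integrated with a simple explicit Euler scheme (adequate for a sketch)."""
    n = 1.0
    C = beta / (Lambda * lam)             # equilibrium precursor level for n = 1
    history = []
    for step in range(int(t_end / dt)):
        t = step * dt
        dn = ((rho(t) - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dn * dt
        C += dC * dt
        history.append((t, n))
    return history

# A modest step insertion of 0.2 dollars of reactivity at t = 1 s.
trace = point_kinetics(lambda t: 0.2 * 0.007 if t > 1.0 else 0.0)
print(trace[-1])   # (time, relative power) at the end of the run
```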
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1992-01-01
Work performed during the reporting period is summarized. Techniques for constructing robustly good trellis codes for use with sequential decoding were developed. The robustly good trellis codes provide a much better trade-off between free distance and distance profile. The unequal error protection capabilities of convolutional codes were studied. The problem of finding good large constraint length, low rate convolutional codes for deep space applications was investigated. A formula for computing the free distance of 1/n convolutional codes was discovered. Double memory (DM) codes, codes with two memory units per bit position, were studied; a search for optimal DM codes is being conducted. An algorithm for constructing convolutional codes from a given quasi-cyclic code was developed. Papers based on the above work are included in the appendix.
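The report mentions a formula for the free distance of rate-1/n convolutional codes; as a cross-check, the free distance can also be found numerically by a shortest-path search over the code trellis. The sketch below uses the (7,5) rate-1/2 code purely as an example and is not the formula derived in the report.

```python
import heapq

def free_distance(K=3, g=(0b111, 0b101)):
    """Minimum Hamming weight of any path that leaves the all-zero state
    and first returns to it (Dijkstra over the code trellis)."""
    n_states = 1 << (K - 1)

    def step(state, bit):
        reg = (bit << (K - 1)) | state
        w = sum(bin(reg & gen).count("1") & 1 for gen in g)
        return reg >> 1, w

    # Force a departure from the zero state with an input '1'.
    start, w0 = step(0, 1)
    dist = {start: w0}
    heap = [(w0, start)]
    best = float("inf")
    while heap:
        d, s = heapq.heappop(heap)
        if d > dist.get(s, float("inf")):
            continue
        for b in (0, 1):
            ns, w = step(s, b)
            nd = d + w
            if ns == 0:                   # merged back into the all-zero path
                best = min(best, nd)
            elif nd < dist.get(ns, float("inf")):
                dist[ns] = nd
                heapq.heappush(heap, (nd, ns))
    return best

print(free_distance())   # 5 for the (7,5) code
```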
Methodology for Evaluating Cost-effectiveness of Commercial Energy Code Changes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Liu, Bing
This document lays out the U.S. Department of Energy’s (DOE’s) method for evaluating the cost-effectiveness of energy code proposals and editions. The evaluation is applied to provisions or editions of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Standard 90.1 and the International Energy Conservation Code (IECC). The method follows standard life-cycle cost (LCC) economic analysis procedures. Cost-effectiveness evaluation requires three steps: 1) evaluating the energy and energy cost savings of code changes, 2) evaluating the incremental and replacement costs related to the changes, and 3) determining the cost-effectiveness of energy code changes based on those costs and savings over time.
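The three steps above reduce to a standard life-cycle cost comparison; a minimal sketch of the arithmetic is given below. The discount rate, analysis period, and cost figures are placeholders, not values from the DOE methodology.

```python
def present_value(annual_amount, rate, years):
    """Present value of a level annual cash flow (uniform series)."""
    return annual_amount * (1 - (1 + rate) ** -years) / rate

def lcc_net_savings(annual_energy_cost_savings, incremental_first_cost,
                    replacement_costs, rate=0.03, years=30):
    """Step 1: discounted energy cost savings; step 2: incremental and
    replacement costs; step 3: cost-effective if savings exceed costs.
    replacement_costs: list of (year, cost) pairs for future replacements."""
    savings = present_value(annual_energy_cost_savings, rate, years)
    repl = sum(cost / (1 + rate) ** year for year, cost in replacement_costs)
    net = savings - incremental_first_cost - repl
    return net, net > 0

# Placeholder proposal: $1,200/yr savings, $15,000 first cost, one $4,000 replacement.
print(lcc_net_savings(1200.0, 15000.0, [(15, 4000.0)]))
```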
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viktor K. Decyk
The UCLA work on this grant was to design and help implement an object-oriented version of the GTC code, which is written in Fortran90. The GTC code is the main global gyrokinetic code used in this project, and over the years multiple, incompatible versions have evolved. The reason for this effort is to allow multiple authors to work together on GTC and to simplify future enhancements to GTC. The effort was designed to proceed incrementally. Initially, an upper layer of classes (derived types and methods) was implemented which called the original GTC code 'under the hood.' The derived types pointed to data in the original GTC code, and the methods called the original GTC subroutines. The original GTC code was modified only very slightly. This allowed one to define (and refine) a set of classes which described the important features of the GTC code in a new, more abstract way, with a minimum of implementation. Furthermore, classes could be added one at a time, and at the end of each day, the code continued to work correctly. This work was done in close collaboration with Y. Nishimura from UC Irvine and Stefan Ethier from PPPL. Ten classes were ultimately defined and implemented: gyrokinetic and drift kinetic particles, scalar and vector fields, a mesh, jacobian, FLR, equilibrium, interpolation, and particle species descriptors. In the second stage of this development, some of the scaffolding was removed. The constructors in the class objects now allocated the data, and the array data in the original GTC code was removed. This isolated the components and allowed multiple instantiations of the objects to be created, in particular, multiple ion species. Again, the work was done incrementally, one class at a time, so that the code was always working properly. This work was done in close collaboration with Y. Nishimura and W. Zhang from UC Irvine and Stefan Ethier from PPPL. The third stage of this work was to integrate the capabilities of the various versions of the GTC code into one flexible and extensible version. To do this, we developed a methodology to implement Design Patterns in Fortran90. Design Patterns are abstract solutions to generic programming problems, which allow one to handle increased complexity. This work was done in collaboration with Henry Gardner, a computer scientist (and former plasma physicist) from the Australian National University. As an example, the Strategy Pattern is being used in GTC to support multiple solvers. This new code is currently being used in the study of energetic particles. A document describing the evolution of the GTC code to this new object-oriented version is available to users of GTC.
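The Strategy Pattern mentioned above decouples a computation from the particular algorithm that performs it; the GTC work applied it in Fortran90, but the idea is easier to show compactly in Python. The class and method names here are illustrative only and do not correspond to GTC's actual types.

```python
from abc import ABC, abstractmethod

class FieldSolver(ABC):
    """Strategy interface: any concrete solver can be swapped in at run time."""
    @abstractmethod
    def solve(self, charge_density):
        ...

class SpectralSolver(FieldSolver):
    def solve(self, charge_density):
        return [0.5 * rho for rho in charge_density]          # stand-in arithmetic

class IterativeSolver(FieldSolver):
    def solve(self, charge_density):
        return [0.5 * rho + 0.01 for rho in charge_density]   # stand-in arithmetic

class Field:
    """The field object delegates to whichever solver strategy it was given."""
    def __init__(self, solver: FieldSolver):
        self.solver = solver

    def update_potential(self, charge_density):
        return self.solver.solve(charge_density)

# The same Field code runs unchanged with either solver.
rho = [1.0, 2.0, 3.0]
print(Field(SpectralSolver()).update_potential(rho))
print(Field(IterativeSolver()).update_potential(rho))
```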
The Italian experience on T/H best estimate codes: Achievements and perspectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alemberti, A.; D`Auria, F.; Fiorino, E.
1997-07-01
Thermalhydraulic system codes are complex tools developed to simulate power plant behavior during off-normal conditions. Among the objectives of code calculations, the evaluation of safety margins, operator training, and the optimization of the plant design and of the emergency operating procedures are those most often considered in the field of nuclear safety. The first generation of codes was developed in the United States at the end of the 1960s. Since that time, different research groups all over the world started the development of their own codes. At the beginning of the 1980s, the second generation codes were proposed; these differ from the first generation codes owing to the number of balance equations solved (six instead of three), the sophistication of the constitutive models and of the adopted numerics. The capabilities of available computers have been fully exploited over the years. The authors then summarize some of the major steps in the process of developing, modifying, and advancing the capabilities of the codes. They touch on the fact that Italian, and for that matter non-American, researchers have not been intimately involved in much of this work. They then describe the application of these codes in Italy, even though there are no nuclear power plants operating or under construction there at this time. Much of this effort is directed at the general question of plant safety in the face of transient-type events.
Ensemble coding remains accurate under object and spatial visual working memory load.
Epstein, Michael L; Emmanouil, Tatiana A
2017-10-01
A number of studies have provided evidence that the visual system statistically summarizes large amounts of information that would exceed the limitations of attention and working memory (ensemble coding). However, the necessity of working memory resources for ensemble coding has not yet been tested directly. In the current study, we used a dual-task design to test the effect of object and spatial visual working memory load on size averaging accuracy. In Experiment 1, we tested participants' accuracy in comparing the mean size of two sets under various levels of object visual working memory load. Although the accuracy of average size judgments depended on the difference in mean size between the two sets, we found no effect of working memory load. In Experiment 2, we tested the same average size judgment while participants were under spatial visual working memory load, again finding no effect of load on averaging accuracy. Overall, our results reveal that ensemble coding can proceed unimpeded and highly accurately under both object and spatial visual working memory load, providing further evidence that ensemble coding reflects a basic perceptual process distinct from that of individual object processing.
How collaboration in therapy becomes therapeutic: the therapeutic collaboration coding system.
Ribeiro, Eugénia; Ribeiro, António P; Gonçalves, Miguel M; Horvath, Adam O; Stiles, William B
2013-09-01
The quality and strength of the therapeutic collaboration, the core of the alliance, is reliably associated with positive therapy outcomes. The urgent challenge for clinicians and researchers is constructing a conceptual framework to integrate the dialectical work that fosters collaboration, with a model of how clients make progress in therapy. We propose a conceptual account of how collaboration in therapy becomes therapeutic. In addition, we report on the construction of a coding system - the therapeutic collaboration coding system (TCCS) - designed to analyse and track on a moment-by-moment basis the interaction between therapist and client. Preliminary evidence is presented regarding the coding system's psychometric properties. The TCCS evaluates each speaking turn and assesses whether and how therapists are working within the client's therapeutic zone of proximal development, defined as the space between the client's actual therapeutic developmental level and their potential developmental level that can be reached in collaboration with the therapist. We applied the TCCS to five cases: a good and a poor outcome case of narrative therapy, a good and a poor outcome case of cognitive-behavioural therapy, and a dropout case of narrative therapy. The TCCS offers markers that may help researchers better understand the therapeutic collaboration on a moment-to-moment basis and may help therapists better regulate the relationship. © 2012 The British Psychological Society.
Development and Evaluation of a Hyperbaric Toxic Gas Monitor (SubTox) for Disabled Submarines
2013-08-01
The Evolution of Random Number Generation in MUVES
2017-01-01
This report traces the evolution of random number generation in MUVES, including the mathematical basis and statistical justification for the algorithms used in the code. The working code provided produces results identical to those of the current MUVES implementation. Earlier generators had questionable numerical and statistical properties, and the development of the modern system is traced through software change requests.
Experiences of employees with arm, neck or shoulder complaints: a focus group study
2014-01-01
Background: Many people suffer from complaints of the arm, neck or shoulder (CANS). CANS causes significant work problems, including absenteeism (sickness absence), presenteeism (decreased work productivity) and, ultimately, job loss. There is a need for intervention programs for people suffering from CANS. Management of symptoms and workload, and improving the workstyle, could be important factors in the strategy to deal with CANS. The objective of this study is to evaluate the experienced problems of employees with CANS, as a first step in an intervention mapping process aimed at adaptation of an existing self-management program to the characteristics of employees suffering from CANS. Methods: A qualitative study comprising three focus group meetings with 15 employees suffering from CANS. Based on a question guide, participants were asked about experiences in relation to continuing work despite their complaints. Data were analysed using content analysis with an open-coding system. During selective coding, general themes and patterns were identified and relationships between the codes were examined. Results: Participants suffering from CANS often have to deal with pain, disability, fatigue, misunderstanding and stress at work. Some needs of the participants were identified, i.e. disease-specific information, exercises, muscle relaxation, working with pain, influence of the work and/or social environment, and personal factors (including workstyle). Conclusions: Employees suffering from CANS search for ways to deal with their complaints in daily life and at work. This study reveals several recurring problems and the results endorse the multi-factorial origin of CANS. Participants generally experience problems similar to those of employees with other types of complaints or chronic diseases, e.g. related to their illness, insufficient communication, working together with healthcare professionals, colleagues and management, and workplace adaptations. These topics will be addressed in the adaptation of an existing self-management program to the characteristics of employees suffering from CANS. PMID:24779360
Experiences of employees with arm, neck or shoulder complaints: a focus group study.
Hutting, Nathan; Heerkens, Yvonne F; Engels, Josephine A; Staal, J Bart; Nijhuis-van der Sanden, Maria W G
2014-04-29
Many people suffer from complaints of the arm, neck or shoulder (CANS). CANS causes significant work problems, including absenteeism (sickness absence), presenteeism (decreased work productivity) and, ultimately, job loss. There is a need for intervention programs for people suffering from CANS. Management of symptoms and workload, and improving the workstyle, could be important factors in the strategy to deal with CANS. The objective of this study is to evaluate the experienced problems of employees with CANS, as a first step in an intervention mapping process aimed at adaptation of an existing self-management program to the characteristics of employees suffering from CANS. A qualitative study comprising three focus group meetings with 15 employees suffering from CANS. Based on a question guide, participants were asked about experiences in relation to continuing work despite their complaints. Data were analysed using content analysis with an open-coding system. During selective coding, general themes and patterns were identified and relationships between the codes were examined. Participants suffering from CANS often have to deal with pain, disability, fatigue, misunderstanding and stress at work. Some needs of the participants were identified, i.e. disease-specific information, exercises, muscle relaxation, working with pain, influence of the work and/or social environment, and personal factors (including workstyle). Employees suffering from CANS search for ways to deal with their complaints in daily life and at work. This study reveals several recurring problems and the results endorse the multi-factorial origin of CANS. Participants generally experience problems similar to those of employees with other types of complaints or chronic diseases, e.g. related to their illness, insufficient communication, working together with healthcare professionals, colleagues and management, and workplace adaptations. These topics will be addressed in the adaptation of an existing self-management program to the characteristics of employees suffering from CANS.
Beatty, Garrett F; Cranley, Nicole M; Carnaby, Giselle; Janelle, Christopher M
2016-03-01
Emotions motivate individuals to attain appetitive goals and avoid aversive consequences. Empirical investigations have detailed how broad approach and avoidance orientations are reflected in fundamental movement attributes such as the speed, accuracy, and variability of motor actions. Several theoretical perspectives propose explanations for how emotional states influence the speed with which goal-directed movements are initiated. These perspectives include biological predisposition, muscle activation, distance regulation, cognitive evaluation, and evaluative response coding accounts. A comprehensive review of the literature and a meta-analysis were undertaken to quantify empirical support for these theoretical perspectives. The systematic review yielded 34 studies that contained 53 independent experiments producing 128 effect sizes used to evaluate the predictions of existing theories. The central tenets of the biological predisposition (Hedges' g = -0.356), distance regulation (g = -0.293; g = 0.243), and cognitive evaluation (g = -0.249; g = -0.405; g = -0.174) accounts were supported. Partial support was also identified for the evaluative response coding (g = -0.255) framework. Our findings provide quantitative evidence that substantiates existing theoretical perspectives, and provide potential direction for conceptual integration of these independent perspectives. Recommendations for future empirical work in this area are discussed. (c) 2016 APA, all rights reserved.
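Because the meta-analytic results above are reported as Hedges' g, a small sketch of how that effect size is computed from two group summaries may be useful; the numbers in the example are invented.

```python
from math import sqrt

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with Hedges' small-sample correction J."""
    df = n1 + n2 - 2
    s_pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    d = (m1 - m2) / s_pooled
    J = 1 - 3 / (4 * df - 1)          # small-sample correction factor
    return J * d

# Example: faster initiation times (lower means) in an approach condition.
print(round(hedges_g(412.0, 35.0, 20, 428.0, 38.0, 20), 3))
```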
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei
2011-10-01
High energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their neutron-insensitive characteristic. Nowadays, many perturbation corrections for accurate dose estimation and many treatment planning systems are based on the Monte Carlo technique. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate energy-dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For the sake of validation, measurements were carefully performed in well-defined (a) primary M-100 X-ray calibration field, (b) primary 60Co calibration beam, (c) 6-MV, and (d) 10-MV therapeutic beams in hospital. In the energy region below 100 keV, MCNP5 and MCNPX both had lower responses than the other codes. For energies above 1 MeV, the MCNP ITS-mode results greatly resembled those of the other three codes and the differences were within 5%. Compared to the measured currents, MCNP5 and MCNPX using ITS-mode had perfect agreement with the 60Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work provides better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. Regarding applications to mixed field dosimetry such as BNCT, MCNP with ITS-mode is recognized by this work as the most suitable tool.
Utilizing GPUs to Accelerate Turbomachinery CFD Codes
NASA Technical Reports Server (NTRS)
MacCalla, Weylin; Kulkarni, Sameer
2016-01-01
GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.
Automated Classification of Pathology Reports.
Oleynik, Michel; Finger, Marcelo; Patrão, Diogo F C
2015-01-01
This work develops an automated classifier of pathology reports which infers the topography and the morphology classes of a tumor using codes from the International Classification of Diseases for Oncology (ICD-O). Data from 94,980 patients of the A.C. Camargo Cancer Center was used for training and validation of Naive Bayes classifiers, evaluated by the F1-score. Measures greater than 74% in the topographic group and 61% in the morphologic group are reported. Our work provides a successful baseline for future research for the classification of medical documents written in Portuguese and in other domains.
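A minimal text-classification pipeline in the spirit of the work above (bag-of-words features, a Naive Bayes classifier, F1 evaluation) can be sketched with scikit-learn. The toy report snippets and ICD-O-like labels are fabricated for illustration, and the real system's preprocessing and feature choices are not reproduced here.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.metrics import f1_score

# Toy pathology-report snippets with invented topography-style labels.
reports = [
    "infiltrating ductal carcinoma of the breast upper outer quadrant",
    "adenocarcinoma of the sigmoid colon with invasion of muscularis",
    "lobular carcinoma breast margin free of tumor",
    "moderately differentiated adenocarcinoma rectum",
]
labels = ["C50", "C18", "C50", "C20"]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(reports, labels)

predicted = model.predict(reports)
print(f1_score(labels, predicted, average="macro"))   # resubstitution score on toy data
```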
Path Toward a Unified Geometry for Radiation Transport
NASA Astrophysics Data System (ADS)
Lee, Kerry
The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The work-flow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.
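As background to the deterministic approach described above, the balance statement referred to can be written in one common steady-state form of the linear Boltzmann transport equation, shown below; phi is the angular flux, Sigma_t and Sigma_s the total and differential scattering cross sections, and S an external source. Charged-particle codes such as HZETRN work with specialized approximations of this general form, so the equation is included only as orientation, not as the exact formulation those codes solve.

```latex
\boldsymbol{\Omega}\cdot\nabla\,\phi(\mathbf{r},E,\boldsymbol{\Omega})
+ \Sigma_t(\mathbf{r},E)\,\phi(\mathbf{r},E,\boldsymbol{\Omega})
= \int_0^{\infty}\!\!\int_{4\pi}
\Sigma_s(\mathbf{r},E'\!\rightarrow\!E,\boldsymbol{\Omega}'\!\cdot\!\boldsymbol{\Omega})\,
\phi(\mathbf{r},E',\boldsymbol{\Omega}')\,d\Omega'\,dE'
+ S(\mathbf{r},E,\boldsymbol{\Omega}).
```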
Evolutionary Models of Cold, Magnetized, Interstellar Clouds
NASA Technical Reports Server (NTRS)
Gammie, Charles F.; Ostriker, Eve; Stone, James M.
2004-01-01
We modeled the long-term and small-scale evolution of molecular clouds using direct 2D and 3D magnetohydrodynamic (MHD) simulations. This work followed up on previous research by our group under the auspices of the ATP, in which we studied the energetics of turbulent, magnetized clouds and their internal structure on intermediate scales. Our new work focused on both global and small-scale aspects of the evolution of turbulent, magnetized clouds, and in particular studied the response of turbulent proto-cloud material to passage through the Galactic spiral potential, and the dynamical collapse of turbulent, magnetized (supercritical) clouds into fragments to initiate the formation of a stellar cluster. Technical advances under this program include developing an adaptive-mesh MHD code as a successor to ZEUS (ATHENA) in order to follow cloud fragmentation, developing a shearing-sheet MHD code which includes self-gravity and externally imposed gravity to follow the evolution of clouds in the Galactic potential, and developing radiative transfer models to evaluate the internal ionization of clumpy clouds exposed to external photoionizing UV and CR radiation. Gammie's work at UIUC focused on the radiative transfer aspects of this program.
NASA Technical Reports Server (NTRS)
Davis, S. J.; Egolf, T. A.
1980-01-01
Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs-Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free field conditions. Results of the correlation show that the Farrassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. It also suggests that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done in identifying and incorporating additional noise mechanisms into the code.
Some practical universal noiseless coding techniques, part 2
NASA Technical Reports Server (NTRS)
Rice, R. F.; Lee, J. J.
1983-01-01
This report is an extension of earlier work (Part 1) which provided practical adaptive techniques for the efficient noiseless coding of a broad class of data sources characterized by only partially known and varying statistics (JPL Publication 79-22). The results here, while still claiming such general applicability, focus primarily on the noiseless coding of image data. A fairly complete and self-contained treatment is provided. Particular emphasis is given to the requirements of the forthcoming Voyager II encounters of Uranus and Neptune. Performance evaluations are supported both graphically and pictorially. Expanded definitions of the algorithms in Part 1 yield a computationally improved set of options for applications requiring efficient performance at entropies above 4 bits/sample. These expanded definitions include, as an important subset, a somewhat less efficient but extremely simple "FAST" compressor which will be used at the Voyager Uranus encounter. Additionally, options are provided which enhance performance when atypical data spikes may be present.
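The adaptive coders described above are built around the Rice mapping of a non-negative integer into a unary-coded quotient plus k low-order bits; a minimal encoder for a single, fixed k is sketched below. Choosing k adaptively per block, as the flight algorithms do, is omitted, and the sample values are invented.

```python
def rice_encode(value, k):
    """Rice code of a non-negative integer: unary-coded quotient, a terminating
    zero, then the k least-significant bits of the value."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def rice_encode_block(samples, k):
    """Concatenate the codewords for a block of mapped residual samples."""
    return "".join(rice_encode(s, k) for s in samples)

# Small mapped residuals compress well when k matches their typical magnitude.
print(rice_encode_block([3, 0, 5, 2, 1], k=2))
```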
NASA Technical Reports Server (NTRS)
Mclennan, G. A.
1986-01-01
This report describes, and is a User's Manual for, a computer code (ANL/RBC) which calculates cycle performance for Rankine bottoming cycles extracting heat from a specified source gas stream. The code calculates cycle power and efficiency and the sizes for the heat exchangers, using tabular input of the properties of the cycle working fluid. An option is provided to calculate the costs of system components from user defined input cost functions. These cost functions may be defined in equation form or by numerical tabular data. A variety of functional forms have been included for these functions and they may be combined to create very general cost functions. An optional calculation mode can be used to determine the off-design performance of a system when operated away from the design-point, using the heat exchanger areas calculated for the design-point.
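The heat-exchanger sizing that a cycle code like ANL/RBC reports is, at its core, an area calculation from a duty, an overall heat-transfer coefficient, and a log-mean temperature difference. The sketch below shows that standard calculation with placeholder inputs; it is not the code's actual routine.

```python
from math import log

def lmtd(dt_hot_end, dt_cold_end):
    """Log-mean temperature difference between the two ends of the exchanger."""
    if abs(dt_hot_end - dt_cold_end) < 1e-9:
        return dt_hot_end
    return (dt_hot_end - dt_cold_end) / log(dt_hot_end / dt_cold_end)

def exchanger_area(duty_kw, u_kw_per_m2K, dt_hot_end, dt_cold_end):
    """Required heat-transfer area from Q = U * A * LMTD."""
    return duty_kw / (u_kw_per_m2K * lmtd(dt_hot_end, dt_cold_end))

# Placeholder boiler duty of 500 kW, U = 0.4 kW/m2-K, terminal differences 80 K and 25 K.
print(round(exchanger_area(500.0, 0.4, 80.0, 25.0), 1), "m^2")
```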
NASA Astrophysics Data System (ADS)
Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.
2009-12-01
We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.
Lorio, Morgan; Martinson, Melissa; Ferrara, Lisa
2016-01-01
Minimally invasive sacroiliac joint arthrodesis ("MI SIJ fusion") received a Category I CPT® code (27279) effective January 1, 2015 and was assigned a work relative value unit ("RVU") of 9.03. The International Society for the Advancement of Spine Surgery ("ISASS") conducted a study consisting of a Rasch analysis of two separate surveys of surgeons to assess the accuracy of the assigned work RVU. A survey was developed and sent to ninety-three ISASS surgeon committee members. Respondents were asked to compare CPT® 27279 to ten other comparator CPT® codes reflective of common spine surgeries. The survey presented each comparator CPT® code with its code descriptor as well as the description of CPT® 27279 and asked respondents to indicate whether CPT® 27279 was greater, equal, or less in terms of work effort than the comparator code. A second survey was sent to 557 U.S.-based spine surgeon members of ISASS and 241 spine surgeon members of the Society for Minimally Invasive Spine Surgery ("SMISS"). The design of the second survey mirrored that of the first survey except for the use of a broader set of comparator CPT® codes (27 vs. 10). Using the work RVUs of the comparator codes, a Rasch analysis was performed to estimate the relative difficulty of CPT® 27279, after which the work RVU of CPT® 27279 was estimated by regression analysis. Twenty surgeons responded to the first survey and thirty-four surgeons responded to the second survey. The results of the regression analysis of the first survey indicate a work RVU for CPT® 27279 of 14.36 and the results of the regression analysis of the second survey indicate a work RVU for CPT® 27279 of 14.1. The Rasch analysis indicates that the current work RVU assigned to CPT® 27279 is undervalued at 9.03. Averaging the results of the regression analyses of the two surveys indicates a work RVU for CPT® 27279 of 14.23.
Code of practice for food handler activities.
Smith, T A; Kanas, R P; McCoubrey, I A; Belton, M E
2005-08-01
The food industry regulates various aspects of food handler activities, according to legislation and customer expectations. The purpose of this paper is to provide a code of practice which delineates a set of working standards for food handler hygiene, handwashing, use of protective equipment, wearing of jewellery and body piercing. The code was developed by a working group of occupational physicians with expertise in both food manufacturing and retail, using a risk assessment approach. Views were also obtained from other occupational physicians working within the food industry and the relevant regulatory bodies. The final version of the code (available in full as Supplementary data in Occupational Medicine Online) therefore represents a broad consensus of opinion. The code of practice represents a set of minimum standards for food handler suitability and activities, based on a practical assessment of risk, for application in food businesses. It aims to provide useful working advice to food businesses of all sizes.
Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao
2017-01-01
Authorship attribution is the task of identifying the most likely author of a given sample among a set of candidate known authors. It can be applied not only to discovering the original author of plain text, such as novels, blogs, emails or posts, but also to identifying source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to settling authorship disputes or detecting software plagiarism. This paper aims to propose a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back-propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, totalling 19 dimensions. These metrics are then input to a neural network for supervised learning, the weights of which are produced by the hybrid PSO and BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset of 3,022 Java files belonging to 40 authors. Experiment results show that the proposed method achieves 91.060% accuracy. A comparison with previous work on authorship attribution of Java source code illustrates that the proposed method outperforms the others overall, with an acceptable overhead. PMID:29095934
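To make the feature-extraction step concrete, the sketch below computes a few lexical and layout style metrics from a Java source string. These are hypothetical examples in the spirit of the paper's 19-dimensional feature set, not its actual metric definitions, and the PSO-trained BP network that consumes them is omitted.

```python
import re

JAVA_KEYWORDS = {"public", "private", "static", "final", "if", "else",
                 "for", "while", "return", "new", "class", "void"}

def style_metrics(source: str):
    """A few illustrative lexical/layout metrics for one Java file."""
    lines = source.splitlines() or [""]
    tokens = re.findall(r"[A-Za-z_]\w*", source)
    return {
        "avg_line_length": sum(len(l) for l in lines) / len(lines),
        "blank_line_ratio": sum(1 for l in lines if not l.strip()) / len(lines),
        "tab_indent_ratio": sum(1 for l in lines if l.startswith("\t")) / len(lines),
        "comment_ratio": sum(1 for l in lines if l.lstrip().startswith("//")) / len(lines),
        "keyword_density": (sum(1 for t in tokens if t in JAVA_KEYWORDS) / len(tokens))
                           if tokens else 0.0,
        "brace_on_own_line": sum(1 for l in lines if l.strip() == "{") / len(lines),
    }

sample = "public class Demo {\n\t// entry point\n\tpublic static void main(String[] a) {\n\t\tSystem.out.println(\"hi\");\n\t}\n}\n"
print(style_metrics(sample))
```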
Progress of IRSN R&D on ITER Safety Assessment
NASA Astrophysics Data System (ADS)
Van Dorsselaere, J. P.; Perrault, D.; Barrachin, M.; Bentaib, A.; Gensdarmes, F.; Haeck, W.; Pouvreau, S.; Salat, E.; Seropian, C.; Vendel, J.
2012-08-01
The French "Institut de Radioprotection et de Sûreté Nucléaire" (IRSN), in support to the French "Autorité de Sûreté Nucléaire", is analysing the safety of ITER fusion installation on the basis of the ITER operator's safety file. IRSN set up a multi-year R&D program in 2007 to support this safety assessment process. Priority has been given to four technical issues and the main outcomes of the work done in 2010 and 2011 are summarized in this paper: for simulation of accident scenarios in the vacuum vessel, adaptation of the ASTEC system code; for risk of explosion of gas-dust mixtures in the vacuum vessel, adaptation of the TONUS-CFD code for gas distribution, development of DUST code for dust transport, and preparation of IRSN experiments on gas inerting, dust mobilization, and hydrogen-dust mixtures explosion; for evaluation of the efficiency of the detritiation systems, thermo-chemical calculations of tritium speciation during transport in the gas phase and preparation of future experiments to evaluate the most influent factors on detritiation; for material neutron activation, adaptation of the VESTA Monte Carlo depletion code. The first results of these tasks have been used in 2011 for the analysis of the ITER safety file. In the near future, this R&D global programme may be reoriented to account for the feedback of the latter analysis or for new knowledge.
An empirical analysis of journal policy effectiveness for computational reproducibility.
Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun
2018-03-13
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.
An empirical analysis of journal policy effectiveness for computational reproducibility
Seiler, Jennifer; Ma, Zhaokun
2018-01-01
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy—author remission of data and code postpublication upon request—an improvement over no policy, but currently insufficient for reproducibility. PMID:29531050
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.
NASA Astrophysics Data System (ADS)
Saha, Uttiyoarnab; Devan, K.; Bachchan, Abhitab; Pandikumar, G.; Ganesan, S.
2018-04-01
The radiation damage in the structural materials of a 500 MWe Indian prototype fast breeder reactor (PFBR) is re-assessed by computing the neutron displacement per atom (dpa) cross-sections from the recent nuclear data library evaluated by the USA, ENDF/B-VII.1, wherein revisions have taken place in the new evaluations of basic nuclear data owing to the use of state-of-the-art neutron cross-section experiments, nuclear model-based predictions and modern data evaluation techniques. An indigenous computer code, Computation of Radiation Damage (CRaD), is developed at our centre to compute primary knock-on atom (PKA) spectra and displacement cross-sections of materials, both point-wise and in any chosen group structure, from the evaluated nuclear data libraries. The new radiation damage model, athermal recombination-corrected displacement per atom (arc-dpa), developed based on molecular dynamics simulations, is also incorporated in our study. This work is the result of our earlier initiatives to overcome some of the limitations experienced while using codes like RECOIL, SPECTER and NJOY 2016 to estimate radiation damage. Agreement of CRaD results with other codes and the ASTM standard for the Fe dpa cross-section is found to be good. The present estimate of total dpa in D-9 steel of PFBR necessitates renormalisation of experimental correlations of dpa and radiation damage to ensure consistency of damage prediction with the ENDF/B-VII.1 library.
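The NRT and arc-dpa displacement counts referred to above follow standard closed-form expressions; the short sketch below evaluates both as a function of damage energy. The threshold energy and the arc-dpa constants b and c are illustrative round numbers (roughly those quoted for Fe in the literature), not values taken from CRaD or ENDF/B-VII.1.

```python
import numpy as np

def nrt_dpa(T_dam_eV, E_d_eV=40.0):
    """NRT displacement count as a function of damage energy (eV)."""
    T = np.asarray(T_dam_eV, dtype=float)
    return np.where(T < E_d_eV, 0.0,
                    np.where(T < 2.0 * E_d_eV / 0.8, 1.0, 0.8 * T / (2.0 * E_d_eV)))

def arc_dpa(T_dam_eV, E_d_eV=40.0, b=-0.568, c=0.286):
    """Athermal recombination corrected (arc-dpa) displacement count; b and c
    are material constants (defaults are illustrative, roughly Fe-like)."""
    T = np.asarray(T_dam_eV, dtype=float)
    xi = (1.0 - c) / (2.0 * E_d_eV / 0.8) ** b * T ** b + c   # efficiency, =1 at 2E_d/0.8
    return np.where(T < 2.0 * E_d_eV / 0.8, nrt_dpa(T, E_d_eV), nrt_dpa(T, E_d_eV) * xi)

for T in (30.0, 100.0, 1.0e3, 1.0e4, 1.0e5):
    print(f"T_dam = {T:9.1f} eV   NRT = {float(nrt_dpa(T)):8.1f}   arc-dpa = {float(arc_dpa(T)):8.1f}")
```

At high damage energies the arc-dpa count falls well below the NRT value, which is the qualitative effect the abstract attributes to the new model.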
Solar Thermal Concept Evaluation
NASA Technical Reports Server (NTRS)
Hawk, Clark W.; Bonometti, Joseph A.
1995-01-01
Concentrated solar thermal energy can be utilized in a variety of high temperature applications for both terrestrial and space environments. In each application, knowledge of the collector and absorber's heat exchange interaction is required. To understand this coupled mechanism, various concentrator types and geometries, as well as their relationship to the physical absorber mechanics, were investigated. To conduct experimental tests, various parts of a 5,000 watt thermal concentrator facility were made and evaluated, in anticipation of a larger NASA facility proposed for construction. Although much of the work centered on solar thermal propulsion for an upper stage (less than one pound thrust range), the information generated and the facility's capabilities are applicable to material processing, power generation and similar uses. The numerical calculations used to design the laboratory mirror and the procedure for evaluating other solar collectors are presented here. The mirror design is based on a hexagonal faceted system, which uses a spherical approximation to the parabolic surface. The work began with a few two-dimensional estimates and continued with a full three-dimensional numerical algorithm written in FORTRAN. This was compared to a full-geometry ray trace program, BEAM 4, which optimizes the curvatures based on purely optical considerations. From the numerical results, the characteristics of a faceted concentrator were determined. The numerical methodologies themselves were evaluated and categorized. As a result, the three-dimensional FORTRAN code was the method chosen to construct the mirrors, due to its overall accuracy and results superior to those of the ray trace program. This information is being used to fabricate and, subsequently, laser map the actual mirror surfaces. Evaluation of concentrator mirrors, thermal applications, and scaling the results of the 10 foot diameter mirror to a much larger concentrator were studied. Evaluations, recommendations and pitfalls regarding the structure, materials and facility design are presented.
Reactor Pressure Vessel Fracture Analysis Capabilities in Grizzly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Chakraborty, Pritam
2015-03-01
Efforts have been underway to develop fracture mechanics capabilities in the Grizzly code to enable it to be used to perform deterministic fracture assessments of degraded reactor pressure vessels (RPVs). Development in prior years has resulted in a capability to calculate J-integrals. For this application, these are used to calculate stress intensity factors for cracks to be used in deterministic linear elastic fracture mechanics (LEFM) assessments of fracture in degraded RPVs. The J-integral can only be used to evaluate stress intensity factors for axis-aligned flaws because it can only be used to obtain the stress intensity factor for pure Mode I loading. Off-axis flaws will be subjected to mixed-mode loading. For this reason, work has continued to expand the set of fracture mechanics capabilities to permit it to evaluate off-axis flaws. This report documents the following work to enhance Grizzly's engineering fracture mechanics capabilities for RPVs: • Interaction integral and T-stress: To obtain mixed-mode stress intensity factors, a capability to evaluate interaction integrals for 2D or 3D flaws has been developed. A T-stress evaluation capability has been developed to evaluate the constraint at crack tips in 2D or 3D. Initial verification testing of these capabilities is documented here. • Benchmarking for axis-aligned flaws: Grizzly's capabilities to evaluate stress intensity factors for axis-aligned flaws have been benchmarked against calculations for the same conditions in FAVOR. • Off-axis flaw demonstration: The newly developed interaction integral capabilities are demonstrated in an application to calculate the mixed-mode stress intensity factors for off-axis flaws. • Other code enhancements: Other enhancements to the thermomechanics capabilities that relate to the solution of the engineering RPV fracture problem are documented here.
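For the interaction-integral capability described above, the standard LEFM relations allow mixed-mode stress intensity factors to be recovered from interaction integrals evaluated with unit-K auxiliary fields. The sketch below applies that textbook relation; the material constants and interaction-integral values are hypothetical and are not Grizzly or FAVOR outputs.

```python
def extract_mixed_mode_sifs(I_aux_modeI, I_aux_modeII, E=200e9, nu=0.3, plane_strain=True):
    """Recover K_I and K_II from interaction-integral values computed with
    unit-K auxiliary Mode I and Mode II crack-tip fields, using the standard
    relation K = E'/2 * I for an auxiliary field with K_aux = 1."""
    E_prime = E / (1.0 - nu ** 2) if plane_strain else E   # plane-strain modulus
    K_I = 0.5 * E_prime * I_aux_modeI
    K_II = 0.5 * E_prime * I_aux_modeII
    return K_I, K_II

# Hypothetical interaction-integral values for an off-axis flaw:
K_I, K_II = extract_mixed_mode_sifs(I_aux_modeI=4.0e-4, I_aux_modeII=1.5e-4)
print(f"K_I  = {K_I / 1e6:.1f} MPa*sqrt(m)")
print(f"K_II = {K_II / 1e6:.1f} MPa*sqrt(m)")
```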
Managing Evaluation: A Community Arts Organisation Perspective.
Swan, Peter; Atkinson, Sarah
2012-09-01
Arts and health organisations must increasingly provide measurable evidence of impact to stakeholders, which can pose both logistical and ideological challenges. This paper examines the relationship between the ethos of an arts and health organisation with external demands for evaluation. Research involved an ethnographic engagement where the first author worked closely with the organisation for a year. In addition to informal discussions, twenty semi-structured interviews were conducted with core staff and participants. Transcribed interviews were coded and emerging themes were identified. Staff considered evaluation to be necessary and useful, yet also to be time consuming and a potential threat to their ethos. Nevertheless, they were able to negotiate the terms of evaluation to enable them to meet their own needs as well as those of funders and other stakeholders. While not completely resisting outside demands for evaluation, the organisation was seen to intentionally rework demands for evidence into processes they felt they could work with, thus enabling their ethos to be maintained.
Managing Evaluation: A Community Arts Organisation Perspective
Swan, Peter; Atkinson, Sarah
2014-01-01
Background Arts and health organisations must increasingly provide measurable evidence of impact to stakeholders, which can pose both logistical and ideological challenges. This paper examines the relationship between the ethos of an arts and health organisation with external demands for evaluation. Methods Research involved an ethnographic engagement where the first author worked closely with the organisation for a year. In addition to informal discussions, twenty semi-structured interviews were conducted with core staff and participants. Transcribed interviews were coded and emerging themes were identified. Results Staff considered evaluation to be necessary and useful, yet also to be time consuming and a potential threat to their ethos. Nevertheless, they were able to negotiate the terms of evaluation to enable them to meet their own needs as well as those of funders and other stakeholders. Conclusions While not completely resisting outside demands for evaluation, the organisation was seen to intentionally rework demands for evidence into processes they felt they could work with, thus enabling their ethos to be maintained. PMID:25429306
Software Process Assessment (SPA)
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.
1994-01-01
NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenquin, U.P.; Stewart, K.B.; Heeb, C.M.
1975-07-01
The principal aim of this neutron cross-section research is to provide the utility industry with a 'standard nuclear data base' that will perform satisfactorily when used for analysis of thermal power reactor systems. EPRI is coordinating its activities with those of the Cross Section Evaluation Working Group (CSEWG), responsible for the development of the Evaluated Nuclear Data File-B (ENDF/B) library, in order to improve the performance of the ENDF/B library in thermal reactors and other applications of interest to the utility industry. Battelle-Northwest (BNW) was commissioned to process the ENDF/B Version-4 data files into a group-constant form for use in the LASER and LEOPARD neutronics codes. Performance information on the library should provide the necessary feedback for improving the next version of the library, and a consistent data base is expected to be useful in intercomparing the versions of the LASER and LEOPARD codes presently being used by different utility groups. This report describes the BNW multi-group libraries and the procedures followed in their preparation and testing. (GRA)
Zhang, Yequn; Arabaci, Murat; Djordjevic, Ivan B
2012-04-09
Leveraging the advanced coherent optical communication technologies, this paper explores the feasibility of using four-dimensional (4D) nonbinary LDPC-coded modulation (4D-NB-LDPC-CM) schemes for long-haul transmission in future optical transport networks. In contrast to our previous works on 4D-NB-LDPC-CM which considered amplified spontaneous emission (ASE) noise as the dominant impairment, this paper undertakes transmission in a more realistic optical fiber transmission environment, taking into account impairments due to dispersion effects, nonlinear phase noise, Kerr nonlinearities, and stimulated Raman scattering in addition to ASE noise. We first reveal the advantages of using 4D modulation formats in LDPC-coded modulation instead of conventional two-dimensional (2D) modulation formats used with polarization-division multiplexing (PDM). Then we demonstrate that 4D LDPC-coded modulation schemes with nonbinary LDPC component codes significantly outperform not only their conventional PDM-2D counterparts but also the corresponding 4D bit-interleaved LDPC-coded modulation (4D-BI-LDPC-CM) schemes, which employ binary LDPC codes as component codes. We also show that the transmission reach improvement offered by the 4D-NB-LDPC-CM over 4D-BI-LDPC-CM increases as the underlying constellation size and hence the spectral efficiency of transmission increases. Our results suggest that 4D-NB-LDPC-CM can be an excellent candidate for long-haul transmission in next-generation optical networks.
Combined trellis coding with asymmetric MPSK modulation: An MSAT-X report
NASA Technical Reports Server (NTRS)
Simon, M. K.; Divsalar, D.
1985-01-01
Traditionally, symmetric multiple phase-shift-keyed (MPSK) signal constellations, i.e., those with uniformly spaced signal points around the circle, have been used for both uncoded and coded systems. Although symmetric MPSK signal constellations are optimum for systems with no coding, the same is not necessarily true for coded systems. It is shown that by designing the signal constellations to be asymmetric, one can, in many instances, obtain a significant performance improvement over the traditional symmetric MPSK constellations combined with trellis coding. The joint design of n/(n + 1) trellis codes and asymmetric 2^(n+1)-point MPSK is considered, which has a unity bandwidth expansion relative to uncoded 2^n-point symmetric MPSK. The asymptotic performance gains due to coding and asymmetry are evaluated in terms of the minimum free Euclidean distance, d_free, of the trellis code. A comparison of the maximum value of this performance measure with the minimum distance d_min of the uncoded system is an indication of the maximum reduction in required E_b/N_0 that can be achieved for arbitrarily small system bit-error rates. It is to be emphasized that the introduction of asymmetry into the signal set does not affect the bandwidth or power requirements of the system; hence, the above-mentioned improvements in performance come at little or no cost. Asymmetric MPSK signal sets in coded systems appear in the work of Divsalar.
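To make the role of asymmetry concrete, the sketch below computes intra- and inter-subset squared Euclidean distances when an 8-point set is viewed as two QPSK subsets rotated by an angle phi (phi = pi/4 recovers symmetric 8-PSK). The free distance of the coded system accumulates such distances along trellis error events, so the best phi generally differs from pi/4; this is only an illustrative distance calculation, not the code design of the paper.

```python
import numpy as np

def qpsk(offset):
    """Unit-energy QPSK subset rotated by `offset` radians."""
    return np.exp(1j * (2 * np.pi * np.arange(4) / 4 + offset))

def min_sq_dist(a, b):
    """Minimum squared Euclidean distance between points of a and b,
    ignoring zero distances (identical points)."""
    d2 = np.abs(a[:, None] - b[None, :]) ** 2
    return d2[d2 > 1e-12].min()

# An 8-point set viewed as two QPSK subsets rotated by phi; phi = pi/4 gives
# conventional symmetric 8-PSK, other values give asymmetric constellations.
for phi in (np.pi / 4, np.pi / 6, np.pi / 8):
    A, B = qpsk(0.0), qpsk(phi)
    intra = min_sq_dist(A, A)     # distance available within one subset
    inter = min_sq_dist(A, B)     # distance between the two subsets
    print(f"phi = {phi:.4f}: intra-subset d^2 = {intra:.3f}, inter-subset d^2 = {inter:.3f}")
```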
What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components
NASA Astrophysics Data System (ADS)
Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.
2013-12-01
Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns': problematic design characteristics of software that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as guiding the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis', in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified, removing unnecessary looping over independent units of work, is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to identify and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. By developing use cases and prototypes within these teams, we have undertaken formal evaluations of software components developed by programmers with widely varying levels of expertise, and have been able to discover and characterize a number of anti-patterns. Our evaluation methodology and testbed have also enabled us to assess the efficacy of strategies to address these anti-patterns according to scientifically relevant metrics, such as ability of algorithms to perform faster than the rate of data acquisition and the accuracy of workflow component output relative to ground truth. The set of anti-patterns and solutions we have identified augments the body of more well-known software engineering anti-patterns by addressing additional concerns that obtain when a software component has to function as part of a workflow assembled out of independently-developed codebases. Our experience shows that identifying and resolving these anti-patterns reduces development time and improves performance without reducing component reusability.
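The 'batch-itis' anti-pattern and its fix lend themselves to a minimal illustration. In the hedged sketch below, `analyze_image` is a hypothetical stand-in for a lab-developed routine; the point is that the preferred component processes exactly one unit of work and leaves iteration, retry, and parallelism to the workflow system.

```python
# "Batch-itis": the component loops over all inputs itself, so one failure can
# halt everything and the workflow engine cannot schedule or retry items
# independently.
def process_all(paths):
    results = []
    for p in paths:
        results.append(analyze_image(p))   # one bad file stops the whole batch
    return results

# Preferred: the component handles exactly one unit of work; looping, retry,
# and parallelism are delegated to the workflow system.
def process_one(path):
    return analyze_image(path)

def analyze_image(path):
    """Hypothetical per-image analysis; returns a trivial result here."""
    with open(path, "rb") as f:
        return {"path": path, "n_bytes": len(f.read())}

if __name__ == "__main__":
    import sys
    print(process_one(sys.argv[1]))        # the workflow engine invokes this per item
```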
NASA Astrophysics Data System (ADS)
Karriem, Veronica V.
Nuclear reactor design incorporates the study and application of nuclear physics, nuclear thermal hydraulics and nuclear safety. Theoretical models and numerical methods implemented in computer programs are utilized to analyze and design nuclear reactors. The focus of this PhD study is the development of an advanced high-fidelity multi-physics code system to perform reactor core analysis for design and safety evaluations of research TRIGA-type reactors. The fuel management and design code system TRIGSIMS was further developed to fulfill the function of a reactor design and analysis code system for the Pennsylvania State Breazeale Reactor (PSBR). TRIGSIMS, which is currently in use at the PSBR, is a fuel management tool which incorporates the depletion code ORIGEN-S (part of the SCALE system) and the Monte Carlo neutronics solver MCNP. The diffusion theory code ADMARC-H is used within TRIGSIMS to accelerate the MCNP calculations. It manages the data and fuel isotopic content and stores it for future burnup calculations. The contribution of this work is the development of an improved version of TRIGSIMS, named TRIGSIMS-TH. TRIGSIMS-TH incorporates a thermal hydraulic module based on the advanced sub-channel code COBRA-TF (CTF). CTF provides the temperature feedback needed in the multi-physics calculations as well as the thermal hydraulics modeling capability of the reactor core. The temperature feedback model uses the CTF-provided local moderator and fuel temperatures for the cross-section modeling for the ADMARC-H and MCNP calculations. To perform efficient critical control rod calculations, a methodology for applying a control rod position was implemented in TRIGSIMS-TH, making this code system a modeling and design tool for future core loadings. The new TRIGSIMS-TH is a computer program that interlinks various other functional reactor analysis tools. It consists of MCNP5, ADMARC-H, ORIGEN-S, and CTF. CTF was coupled with both MCNP and ADMARC-H to provide the heterogeneous temperature distribution throughout the core. Each of these codes is written in its own computer language, performing its function and outputting a set of data. TRIGSIMS-TH provides effective use, manipulation, and transfer of data between the different codes. With the implementation of feedback and control-rod-position modeling methodologies, the TRIGSIMS-TH calculations are more accurate and in better agreement with measured data. The PSBR is unique in many ways and there are no "off-the-shelf" codes that can model this design in its entirety. In particular, PSBR has an open core design, which is cooled by natural convection. Combining several codes into a unique system brings many challenges. It also requires substantial knowledge of both operation and core design of the PSBR. This reactor has been in operation for decades and there is a fair amount of study and development in both PSBR thermal hydraulics and neutronics. Measured data is also available for various core loadings and can be used for validation activities. The previous studies and developments in PSBR modeling also serve as a guide to assess the findings of the work herein. In order to incorporate new methods and codes into the existing TRIGSIMS, a re-evaluation of various components of the code was performed to assure the accuracy and efficiency of the existing CTF/MCNP5/ADMARC-H multi-physics coupling. A new set of ADMARC-H diffusion coefficients and cross sections was generated using the SERPENT code.
This was needed as the previous data was not generated with thermal hydraulic feedback and the ARO (all-rods-out) position was used as the critical rod position. The B4C was re-evaluated for this update. The data exchange between ADMARC-H and MCNP5 was modified. The basic core model is given the flexibility to allow for various changes within the core model, and this feature was implemented in TRIGSIMS-TH. The PSBR core in the new code model can be expanded and changed. This allows the new code to be used as a modeling tool for design and analyses of future core loadings.
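The coupling strategy described above (neutronics supplying power to the thermal hydraulics, which returns temperatures for cross-section feedback) is commonly implemented as a Picard fixed-point iteration. The sketch below shows that loop with crude stand-in models in place of MCNP/ADMARC-H and CTF; every model, number, and relaxation factor here is illustrative and not taken from TRIGSIMS-TH.

```python
import numpy as np

def neutronics_power(fuel_T, total_power=1.0e6):
    """Stand-in for the MCNP/ADMARC-H step: a normalized axial power shape that
    flattens slightly as fuel temperature rises (Doppler-like feedback)."""
    n = len(fuel_T)
    shape = np.sin(np.pi * (np.arange(n) + 0.5) / n)
    shape = shape / (1.0 + 1.0e-4 * (fuel_T - 300.0))      # crude feedback term
    return total_power * shape / shape.sum()

def thermal_hydraulics(power, T_inlet=300.0, mdot_cp=5.0e3, R_fuel=2.0e-4):
    """Stand-in for the CTF step: coolant heat-up along the channel plus a
    simple fuel-to-coolant thermal resistance."""
    T_cool = T_inlet + np.cumsum(power) / mdot_cp
    T_fuel = T_cool + R_fuel * power
    return T_fuel, T_cool

# Picard (fixed-point) iteration between the two single-physics solvers
T_fuel = np.full(20, 600.0)                  # initial fuel temperature guess (K)
for it in range(50):
    power = neutronics_power(T_fuel)
    T_new, T_cool = thermal_hydraulics(power)
    change = np.max(np.abs(T_new - T_fuel))
    T_fuel = 0.5 * T_fuel + 0.5 * T_new      # under-relaxation for stability
    if change < 0.01:                        # converged to 0.01 K
        break
print(f"converged after {it + 1} iterations, peak fuel T = {T_fuel.max():.1f} K")
```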
Self-Shielded Flux Cored Wire Evaluation
1980-12-01
Report documentation fragments only: performing organization Naval Surface Warfare Center CD, Code 2230 - Design Integration Tools; approved for public release. Tensile and yield strength, percent elongation, and percent reduction of area were reported; this testing was performed with a Satec 400 WHVP tensile machine.
NASA Astrophysics Data System (ADS)
Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen
2016-07-01
The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterization, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of the ensemble-based statistical consistency testing is to use a qualitative measurement of the variability of the ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in terms of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. Then the percentage of points that have scores greater than a specified threshold indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important. Our experiments indicate that the new POP ensemble consistency test (POP-ECT) tool is capable of distinguishing cases that should be statistically consistent with the ensemble and those that should not, as well as providing a simple, subjective and systematic way to detect errors in CESM-POP due to the hardware or software stack, positively contributing to quality assurance for the CESM-POP code.
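A minimal sketch of the kind of grid-point standard-score test described above is given below; the ensemble is synthetic and the z-score threshold and allowed failure fraction are illustrative placeholders, not the published POP-ECT settings.

```python
import numpy as np

def pop_ect_style_test(ensemble, new_run, z_threshold=2.0, max_fail_frac=0.10):
    """Flag a new simulation as statistically distinguishable from an ensemble.
    ensemble has shape (n_members, n_points); new_run has shape (n_points,).
    The threshold and allowed failure fraction are illustrative placeholders."""
    mean = ensemble.mean(axis=0)
    std = ensemble.std(axis=0, ddof=1)
    std = np.where(std > 0.0, std, np.inf)        # skip zero-variance points
    z = np.abs(new_run - mean) / std              # standard score per grid point
    fail_frac = np.mean(z > z_threshold)
    return fail_frac, fail_frac <= max_fail_frac

rng = np.random.default_rng(0)
ens = rng.normal(size=(30, 10_000))               # 30-member toy "ensemble"
consistent = rng.normal(size=10_000)              # statistically consistent run
biased = rng.normal(loc=1.0, size=10_000)         # run with a systematic bias
for name, run in (("consistent", consistent), ("biased", biased)):
    frac, passed = pop_ect_style_test(ens, run)
    print(f"{name}: {100 * frac:.1f}% of points exceed |z| > 2 -> pass = {passed}")
```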
NASA Technical Reports Server (NTRS)
Minow, Joseph I.
2011-01-01
Internal charging is a risk to spacecraft in energetic electron environments. The DICTAT and NUMIT computational codes are the most widely used engineering tools for evaluating internal charging of insulator materials exposed to these environments. Engineering tools are designed for rapid evaluation of ESD threats, but there is a need for more physics-based models for investigating the science of materials interactions with energetic electron environments. Current tools are limited by the physics included in the models and ease of user implementation; additional development work is needed to improve the models.
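For orientation, the simplest engineering estimate behind such internal-charging tools balances the deposited electron current against bulk conduction, giving a steady-state field E = J/sigma and a charging time constant tau = eps0*eps_r/sigma. The sketch below evaluates both; the numbers are illustrative only and are not drawn from DICTAT or NUMIT.

```python
EPS0 = 8.854e-12           # vacuum permittivity (F/m)

def steady_state_field(J_Apm2, sigma_Spm):
    """Equilibrium electric field in a planar dielectric carrying a deposited
    electron current density J against bulk conductivity sigma: E = J / sigma."""
    return J_Apm2 / sigma_Spm

def charging_time_constant(eps_r, sigma_Spm):
    """Dielectric relaxation time tau = eps0 * eps_r / sigma (seconds)."""
    return EPS0 * eps_r / sigma_Spm

# Illustrative numbers only (not from any specific code or handbook):
J = 1.0e-12 * 1.0e4        # 1 pA/cm^2 expressed in A/m^2
sigma = 1.0e-17            # S/m, a highly insulating polymer
eps_r = 3.0
E = steady_state_field(J, sigma)
tau = charging_time_constant(eps_r, sigma)
print(f"steady-state field ~ {E:.2e} V/m, relaxation time ~ {tau / 3600:.1f} h")
```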
Etiology of work-related electrical injuries: a narrative analysis of workers' compensation claims.
Lombardi, David A; Matz, Simon; Brennan, Melanye J; Smith, Gordon S; Courtney, Theodore K
2009-10-01
The purpose of this study was to provide new insight into the etiology of primarily nonfatal, work-related electrical injuries. We developed a multistage, case-selection algorithm to identify electrical-related injuries from workers' compensation claims and a customized coding taxonomy to identify pre-injury circumstances. Workers' compensation claims routinely collected over a 1-year period from a large U.S. insurance provider were used to identify electrical-related injuries using an algorithm that evaluated: coded injury cause information, nature of injury, "accident" description, and injury description narratives. Concurrently, a customized coding taxonomy for these narratives was developed to abstract the activity, source, initiating process, mechanism, vector, and voltage. Among the 586,567 reported claims during 2002, electrical-related injuries accounted for 1283 (0.22%) of nonfatal claims and 15 fatalities (1.2% of electrical). Most (72.3%) were male, average age of 36, working in services (33.4%), manufacturing (24.7%), retail trade (17.3%), and construction (7.2%). Body part(s) injured most often were the hands, fingers, or wrist (34.9%); multiple body parts/systems (25.0%); and the lower/upper arm, elbow, shoulder, and upper extremities (19.2%). The leading activities were conducting manual tasks (55.1%); working with machinery, appliances, or equipment; working with electrical wire; and operating powered or nonpowered hand tools. Primary injury sources were appliances and office equipment (24.4%); wires, cables/cords (18.0%); machines and other equipment (11.8%); fixtures, bulbs, and switches (10.4%); and lightning (4.3%). No vector was identified in 85% of cases, and the work process was initiated by others in less than 1% of cases. Injury narratives provide valuable information to overcome some of the limitations of precoded data, more specifically for identifying additional injury cases and in supplementing traditional epidemiologic data for further understanding the etiology of work-related electrical injuries, which may lead to further prevention opportunities.
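A toy version of the kind of multistage case-selection step described above is sketched below; the cause codes and keywords are invented for illustration and are not the study's actual algorithm or taxonomy.

```python
import re

# Hypothetical cause-of-injury codes and narrative keywords (placeholders only).
ELECTRICAL_CAUSE_CODES = {"05", "82"}
KEYWORDS = re.compile(r"\b(shock|electrocut\w*|electric\w*|voltage|live wire)\b", re.I)

def is_electrical_claim(claim):
    """Flag a claim as electrical-related if either the coded cause matches or
    the free-text narratives mention electricity."""
    if claim.get("cause_code") in ELECTRICAL_CAUSE_CODES:
        return True
    text = " ".join(claim.get(k, "") for k in ("accident_desc", "injury_desc"))
    return bool(KEYWORDS.search(text))

claims = [
    {"cause_code": "31", "accident_desc": "slipped on wet floor", "injury_desc": "sprained ankle"},
    {"cause_code": "17", "accident_desc": "received shock from extension cord",
     "injury_desc": "burn to right hand"},
]
print([is_electrical_claim(c) for c in claims])   # -> [False, True]
```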
Venepalli, Neeta K; Qamruzzaman, Yusuf; Li, Jianrong John; Lussier, Yves A; Boyd, Andrew D
2014-03-01
To quantify coding ambiguity in International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to ICD-10-CM mappings for hematology-oncology diagnoses within an Illinois Medicaid database and an academic cancer center database (University of Illinois Cancer Center [UICC]), with the goal of anticipating challenges during the ICD-10-CM transition. One data set of ICD-9-CM diagnosis codes came from the 2010 Illinois Department of Medicaid, filtered for diagnoses generated by hematology-oncology providers. The other data set of ICD-9-CM diagnosis codes came from UICC. Using a translational methodology via the Motif Web portal ICD-9-CM conversion tool, ICD-9-CM to ICD-10-CM code conversions were graphically mapped and evaluated for clinical loss of information. The transition to ICD-10-CM led to significant information loss, affecting 8% of total Medicaid codes and 1% of UICC codes; 39 ICD-9-CM codes with information loss accounted for 2.9% of total Medicaid reimbursements and 5.3% of UICC billing charges. Prior work stated hematology-oncology would be the least affected medical specialty. However, information loss affecting 5% of billing costs could evaporate the operating margin of a practice. By identifying codes at risk for complex transitions, the analytic tools described can be replicated for oncology practices to forecast areas requiring additional training and resource allocation. In summary, complex transitions and diagnosis codes associated with information loss within clinical oncology require additional attention during the transition to ICD-10-CM.
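The kind of mapping-ambiguity screen described above can be illustrated with a few placeholder entries; the codes, mappings, and dollar amounts below are invented and are not the GEM mappings or the study's data.

```python
# Illustrative placeholder forward mappings (ICD-9 code -> candidate ICD-10 codes);
# not the real General Equivalence Mappings.
forward_map = {
    "ICD9_A": ["ICD10_A1"],                            # one-to-one: low transition risk
    "ICD9_B": ["ICD10_B1", "ICD10_B2", "ICD10_B3"],    # one-to-many: review needed
    "ICD9_C": ["ICD10_C1", "ICD10_C2"],
}
reimbursement = {"ICD9_A": 120_000.0, "ICD9_B": 45_000.0, "ICD9_C": 80_000.0}

def ambiguous_share(mapping, dollars):
    """Fraction of billing attached to ICD-9 codes with one-to-many mappings."""
    at_risk = sum(dollars[c] for c, targets in mapping.items() if len(targets) > 1)
    return at_risk / sum(dollars.values())

print(f"{100 * ambiguous_share(forward_map, reimbursement):.1f}% of charges sit on ambiguous codes")
```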
Evaluation in industry of a draft code of practice for manual handling.
Ashby, Liz; Tappin, David; Bentley, Tim
2004-05-01
This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.
Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks.
Scarpetta, Silvia; Giacco, Ferdinando
2013-04-01
We study the collective dynamics of a Leaky Integrate and Fire network in which precise relative phase relationships of spikes among neurons are stored, as attractors of the dynamics, and selectively replayed at different time scales. Using an STDP-based learning process, we store in the connectivity several phase-coded spike patterns, and we find that, depending on the excitability of the network, different working regimes are possible, with transient or persistent replay activity induced by a brief signal. We introduce an order parameter to evaluate the similarity between stored and recalled phase-coded patterns, and measure the storage capacity. Modulation of spiking thresholds during replay changes the frequency of the collective oscillation or the number of spikes per cycle, while keeping the phase relationships preserved. This allows a coding scheme in which phase, rate and frequency are dissociable. Robustness with respect to noise and heterogeneity of neuron parameters is studied, showing that, since the dynamics is a retrieval process, neurons preserve stable precise phase relationships among units, keeping a unique frequency of oscillation, even in noisy conditions and with heterogeneity of the internal parameters of the units.
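One generic way to quantify the similarity between a stored and a recalled phase-coded pattern is the magnitude of the mean phase-difference vector, sketched below; this is a standard phase-coherence measure and not necessarily the exact order parameter defined in the paper.

```python
import numpy as np

def phase_overlap(stored_phases, recalled_phases):
    """Magnitude of the mean phase-difference vector: 1 for a perfect replay of
    the stored relative phases (up to a global shift), ~1/sqrt(N) for random
    phases."""
    d = np.asarray(recalled_phases) - np.asarray(stored_phases)
    return np.abs(np.mean(np.exp(1j * d)))

rng = np.random.default_rng(1)
N = 200
stored = rng.uniform(0, 2 * np.pi, N)
perfect = stored + 0.7                         # global phase shift only
noisy = stored + rng.normal(0, 0.4, N)         # jittered replay
random = rng.uniform(0, 2 * np.pi, N)          # unrelated activity
print(phase_overlap(stored, perfect), phase_overlap(stored, noisy), phase_overlap(stored, random))
```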
Error correcting coding-theory for structured light illumination systems
NASA Astrophysics Data System (ADS)
Porras-Aguilar, Rosario; Falaggis, Konstantinos; Ramos-Garcia, Ruben
2017-06-01
Intensity-discrete structured light illumination systems project a series of projection patterns for the estimation of the absolute fringe order using only the temporal grey-level sequence at each pixel. This work proposes the use of error-correcting codes for pixel-wise correction of measurement errors. The use of an error correcting code is advantageous in many ways: it allows reducing the effect of random intensity noise, it corrects outliers near the border of the fringe commonly present when using intensity-discrete patterns, and it provides robustness in case of severe measurement errors (even for burst errors where whole frames are lost). The latter aspect is particularly interesting in environments with varying ambient light as well as in safety-critical applications such as monitoring of deformations of components in nuclear power plants, where high reliability is ensured even in case of short measurement disruptions. A special form of burst error is the so-called salt-and-pepper noise, which can largely be removed with error correcting codes using only the information of a given pixel. The performance of this technique is evaluated using both simulations and experiments.
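As a concrete example of pixel-wise correction, the sketch below encodes a 4-bit fringe-order label with a Hamming(7,4) code and corrects a single misclassified frame at one pixel. The paper does not specify this particular code; Hamming(7,4) is used here only as a simple, well-known single-error-correcting choice.

```python
import numpy as np

H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])       # parity-check matrix, columns = 1..7 in binary

def hamming74_encode(d):
    """Encode 4 data bits (d1..d4) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p4 = d2 ^ d3 ^ d4
    return np.array([p1, p2, d1, p4, d2, d3, d4])

def hamming74_decode(r):
    """Correct at most one flipped bit per 7-bit word and return the 4 data bits."""
    r = r.copy()
    s = H @ r % 2
    pos = 4 * s[0] + 2 * s[1] + s[2]        # syndrome = error position, 0 = no error
    if pos:
        r[pos - 1] ^= 1
    return r[[2, 4, 5, 6]]

# Per-pixel fringe-order label 0..15 encoded as 4 bits across 7 temporal frames
data = np.array([1, 0, 1, 1])               # e.g. fringe order 11
tx = hamming74_encode(data)
rx = tx.copy(); rx[5] ^= 1                  # one frame misclassified at this pixel
print("decoded:", hamming74_decode(rx), "matches:", np.array_equal(hamming74_decode(rx), data))
```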
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Rosa, Felice
2006-07-01
In the ambit of the Severe Accident Network of Excellence Project (SARNET), funded by the European Union 6th FISA (Fission Safety) Programme, one of the main tasks is the development and validation of the European Accident Source Term Evaluation Code (ASTEC Code). One of the reference codes used to compare ASTEC results, coming from experimental and reactor plant applications, is MELCOR. ENEA is a SARNET member and also an ASTEC and MELCOR user. During the first 18 months of this project, we performed a series of MELCOR and ASTEC calculations referring to a French PWR 900 MWe and to the accident sequence of 'Loss of Steam Generator (SG) Feedwater' (known as the H2 sequence in the French classification). H2 is an accident sequence substantially equivalent to a Station Blackout scenario, like a TMLB accident, with the only difference being that in the H2 sequence the scram is forced to occur with a delay of 28 seconds. The main events during the accident sequence are a loss of normal and auxiliary SG feedwater (0 s), followed by a scram when the water level in the SG is equal to or less than 0.7 m (after 28 seconds). There is also a main coolant pumps trip when {delta}Tsat < 10 deg. C, a total opening of the three relief valves when Tric (core maximal outlet temperature) is above 603 K (330 deg. C), and accumulators isolation when the primary pressure goes below 1.5 MPa (15 bar). Among many other points, it is worth noting that this was the first time that a MELCOR 1.8.5 input deck was available for a French PWR 900. The main ENEA effort in this period was devoted to preparing the MELCOR input deck using the code version v.1.8.5 (build QZ Oct 2000 with the latest patch 185003 Oct 2001). The input deck, completely new, was prepared taking into account the structure, data and same conditions as those found inside the ASTEC input decks. The main goal of the work presented in this paper is to put in evidence where and when MELCOR provides good enough results and why, in some cases, mainly those involving its specific models (candling, corium pool behaviour, etc.), the results were less good. Future work will include the preparation of an input deck for the new MELCOR 1.8.6 and a code-to-code comparison with ASTEC v1.2 rev. 1. (author)
LGBTQ Women, Appearance Negotiations, and Workplace Dress Codes.
Reddy-Best, Kelly L
2018-01-01
The purpose of this study was to explore LGBTQ women's experiences with unwritten or formal dress codes at work. I asked: What are LGBTQ women's experiences in the workplace with appearance management, and what are LGBTQ women's experiences navigating the written and unwritten dress codes in the workplace? To answer the research question, interviews were conducted with 24 self-identifying LGBTQ women. Six key themes emerged from the data. Themes included (1) expressed sexual identity in appearance, (2) unwritten dress codes in work environments did not always allow for expression of sexual identity in appearance, (3) motivations for pressure or desire to conceal expression of sexual identity in appearance at work, (4) negotiations of revealing or concealing sexual identity in appearance in the workplace impacted levels of comfort and confidence, (5) verbal and nonverbal negative experiences related to appearance at work, and (6) received compliments about appearance at work.
Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...
2018-06-14
Historically, radiation transport codes have uncorrelated fission emissions. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes, and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
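One of the benchmark observables named above, the Feynman histogram, leads directly to the Feynman-Y (excess variance-to-mean) statistic. The sketch below computes Y from a list of detection times for a Poissonian and a toy correlated source; the event generator and gate widths are invented for illustration and are unrelated to the benchmark data or to any of the listed codes.

```python
import numpy as np

def feynman_y(event_times, gate_widths):
    """Excess variance-to-mean ratio of gated neutron counts as a function of
    gate width. Y = 0 for a Poisson (uncorrelated) source and grows with
    correlated (fission-chain-like) multiplicity."""
    t_max = event_times.max()
    ys = []
    for T in gate_widths:
        edges = np.arange(0.0, t_max, T)
        counts, _ = np.histogram(event_times, bins=edges)
        m = counts.mean()
        ys.append(counts.var(ddof=1) / m - 1.0 if m > 0 else 0.0)
    return np.array(ys)

rng = np.random.default_rng(2)
# Uncorrelated arrivals vs. a toy correlated source emitting bursts of 1-4 counts
poisson = np.cumsum(rng.exponential(1e-4, 200_000))
burst_t = np.cumsum(rng.exponential(3e-4, 60_000))
mult = rng.integers(1, 5, burst_t.size)
correlated = np.sort(np.repeat(burst_t, mult) + rng.exponential(5e-6, mult.sum()))

gates = np.array([1e-5, 1e-4, 1e-3, 1e-2])   # seconds
for name, ev in (("poisson", poisson), ("correlated", correlated)):
    print(name, np.round(feynman_y(ev, gates), 3))
```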
JPEG2000 still image coding quality.
Chen, Tzong-Jer; Lin, Sheng-Chieh; Lin, You-Chen; Cheng, Ren-Gui; Lin, Li-Hui; Wu, Wei
2013-10-01
This work compares the image quality delivered by two popular JPEG2000 programs. Two medical image compression algorithms are both coded using JPEG2000, but they differ regarding the interface, convenience, speed of computation, and their characteristic options influenced by the encoder, quantization, tiling, etc. The differences in image quality and compression ratio are also affected by the modality and the compression algorithm implementation. Do they provide the same quality? The qualities of compressed medical images from two image compression programs named Apollo and JJ2000 were evaluated extensively using objective metrics. These algorithms were applied to three medical image modalities at various compression ratios ranging from 10:1 to 100:1. Following that, the quality of the reconstructed images was evaluated using five objective metrics. The Spearman rank correlation coefficients were measured under every metric in the two programs. We found that JJ2000 and Apollo exhibited indistinguishable image quality for all images evaluated using the above five metrics (r > 0.98, p < 0.001). It can be concluded that the image quality of the JJ2000 and Apollo algorithms is statistically equivalent for medical image compression.
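A minimal version of the comparison workflow (an objective metric per image, then a Spearman rank correlation between the two programs) is sketched below with synthetic images and synthetic distortion standing in for the Apollo and JJ2000 outputs.

```python
import numpy as np
from scipy.stats import spearmanr

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio (dB) for 8-bit image data."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(3)
images = [rng.integers(0, 256, (128, 128)) for _ in range(8)]   # toy "modalities"

# Stand-ins for the two codecs: each adds a slightly different, image-dependent
# amount of distortion (purely synthetic, not actual Apollo/JJ2000 output).
q_apollo, q_jj2000 = [], []
for img in images:
    level = rng.uniform(2.0, 6.0)
    q_apollo.append(psnr(img, np.clip(img + rng.normal(0, level, img.shape), 0, 255)))
    q_jj2000.append(psnr(img, np.clip(img + rng.normal(0, level * 1.05, img.shape), 0, 255)))

rho, p = spearmanr(q_apollo, q_jj2000)
print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")
```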
Efficient molecular dynamics simulations with many-body potentials on graphics processing units
NASA Astrophysics Data System (ADS)
Fan, Zheyong; Chen, Wei; Vierimaa, Ville; Harju, Ari
2017-09-01
Graphics processing units have been extensively used to accelerate classical molecular dynamics simulations. However, there is much less progress on the acceleration of force evaluations for many-body potentials compared to pairwise ones. In the conventional force evaluation algorithm for many-body potentials, the force, virial stress, and heat current for a given atom are accumulated within different loops, which could result in write conflicts between different threads in a CUDA kernel. In this work, we provide a new force evaluation algorithm, which is based on an explicit pairwise force expression for many-body potentials derived recently (Fan et al., 2015). In our algorithm, the force, virial stress, and heat current for a given atom can be accumulated within a single thread, free of write conflicts. We discuss the formulations and algorithms and evaluate their performance. A new open-source code, GPUMD, is developed based on the proposed formulations. For the Tersoff many-body potential, the double precision performance of GPUMD using a Tesla K40 card is equivalent to that of the LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) molecular dynamics code running with about 100 CPU cores (Intel Xeon CPU X5670 @ 2.93 GHz).
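The write-conflict-free accumulation idea can be illustrated outside CUDA: when an explicit pairwise force on atom i from each neighbour j is available, each atom (one GPU thread in the code described above) accumulates only its own force and virial, and nothing is scattered to j. The NumPy sketch below uses a Lennard-Jones pair force as a stand-in for the paper's explicit pairwise expression for many-body potentials.

```python
import numpy as np

def forces_per_atom(pos, box, eps=1.0, sigma=1.0, rcut=2.5):
    """Accumulate force and a per-atom virial by writing only to atom i.
    Each iteration of the outer loop plays the role of one GPU thread: the
    pairwise force on i from every neighbour j is evaluated explicitly, so no
    atomic operations or write conflicts are needed."""
    n = len(pos)
    force = np.zeros((n, 3))
    virial = np.zeros(n)
    for i in range(n):                            # "thread" i
        rij = pos - pos[i]                        # vectors from i to all j
        rij -= box * np.round(rij / box)          # minimum-image convention
        r2 = np.einsum("ij,ij->i", rij, rij)
        mask = (r2 > 0.0) & (r2 < rcut ** 2)
        inv6 = (sigma ** 2 / r2[mask]) ** 3
        coef = 24.0 * eps * (2.0 * inv6 ** 2 - inv6) / r2[mask]
        fij = -coef[:, None] * rij[mask]          # force on i from each neighbour j
        force[i] = fij.sum(axis=0)                # single writer for atom i
        virial[i] = 0.5 * np.einsum("ij,ij->", rij[mask], fij)
    return force, virial

rng = np.random.default_rng(4)
g = np.arange(4) * 1.6
pos = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T    # 64 atoms on a cubic lattice
pos += rng.normal(0.0, 0.05, pos.shape)
f, w = forces_per_atom(pos, box=4 * 1.6)
print("max |sum of forces| (should be ~0):", np.abs(f.sum(axis=0)).max())
```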
Practice patterns of academic general thoracic and adult cardiac surgeons.
Ingram, Michael T; Wisner, David H; Cooke, David T
2014-10-01
We hypothesized that academic adult cardiac surgeons (CSs) and general thoracic surgeons (GTSs) would have distinct practice patterns of, not just case-mix, but also time devoted to outpatient care, involvement in critical care, and work relative value unit (wRVU) generation for the procedures they perform. We queried the University Health System Consortium-Association of American Medical Colleges Faculty Practice Solution Center database for fiscal years 2007-2008, 2008-2009, and 2009-2010 for the frequency of inpatient and outpatient current procedural terminology coding and wRVU data of academic GTSs and CSs. The Faculty Practice Solution Center database is a compilation of productivity and payer data from 86 academic institutions. The greatest wRVU generating current procedural terminology codes for CSs were, in order, coronary artery bypass grafting, aortic valve replacement, and mitral valve replacement. In contrast, open lobectomy, video-assisted thoracic surgery wedge, and video-assisted thoracic surgery lobectomy were greatest for GTSs. The 10 greatest wRVU-generating procedures for CSs generated more wRVUs than those for GTSs (P<.001). Although CSs generated significantly more hospital inpatient evaluation and management (E & M) wRVUs than did GTSs (P<.001), only 2.5% of the total wRVUs generated by CSs were from E & M codes versus 18.8% for GTSs. Critical care codes were 1.5% of total evaluation and management billing for both CSs and GTSs. Academic CSs and GTSs have distinct practice patterns. CSs receive greater reimbursement for services because of the greater wRVUs of the procedures performed compared with GTSs, and evaluation and management coding is a more important wRVU generator for GTSs. The results of our study could guide academic CS and GTS practice structure and time prioritization. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
World Breastfeeding Week 1994: making the Code work.
1994-01-01
WHO adopted the International Code of Marketing of Breastmilk Substitutes in 1981, with the US being the only member voting against it. The US abandoned its opposition and voted for the International Code at the World Health Assembly in May 1994. The US was also part of a unanimous vote to promote a resolution that clearly proclaims breast milk to be better than breast milk substitutes and the best food for infants. World Breastfeeding Week 1994 began more efforts to promote the International Code. In 1994, through its Making the Code Work campaign, the World Alliance for Breastfeeding Action (WABA) will work on increasing awareness about the mission and promise of the International Code, notify governments of the Innocenti target date, call for governments to introduce rules and regulations based on the International Code, and encourage public interest groups, professional organizations, and the general public to monitor enforcement of the Code. So far, 11 countries have passed legislation including all or almost all provisions of the International Code. Governments of 36 countries have passed legislation including only some provisions of the International Code. The International Baby Food Action Network (IBFAN), a coalition of more than 140 breastfeeding promotion groups, monitors implementation of the Code worldwide. IBFAN substantiates thousands of violations of the Code in its report, Breaking the Rules 1994. The violations consist of promoting breast milk substitutes to health workers, using labels describing a brand of formula in idealizing terms, or using labels that do not have warnings in the local language. We should familiarize ourselves with the provisions of the International Code and the status of the Code in our country. WABA provides an action folder which contains basic background information on the Code and action ideas.
Preliminary numerical investigation of bandwidth effects on CBET using the LPSE-CBET code
NASA Astrophysics Data System (ADS)
Bates, Jason; Myatt, Jason; Shaw, John; Weaver, James; Obenschain, Keith; Lehmberg, Robert; Obenschain, Steve
2016-10-01
Cross beam energy transfer (CBET) is a significant energy-loss mechanism for direct-drive implosions on the OMEGA laser facility. Recently, a working group that includes participants from the Laboratory for Laser Energetics (LLE) at the University of Rochester and the U.S. Naval Research Laboratory (NRL) was formed to investigate strategies for ameliorating the deleterious effects of CBET. As part of this collaboration, the wave-based code LPSE-CBET developed at LLE has been made available to researchers at NRL and is being used to study the feasibility of suppressing CBET through the enhancement of laser bandwidth by stimulated rotational Raman scattering (SRRS). In this poster, we present some preliminary results on this subject. In particular, we discuss initial efforts to evaluate mitigation levels of 4 discrete Stokes lines from SRRS in air and compare our findings with ray-based simulation results of wavelength-shifted (-6 Å, 0, +6 Å) driver lines on OMEGA. Work supported by DoE/NNSA.
The construction FACE database - Codifying the NIOSH FACE reports.
Dong, Xiuwen Sue; Largay, Julie A; Wang, Xuanwen; Cain, Chris Trahan; Romano, Nancy
2017-09-01
The National Institute for Occupational Safety and Health (NIOSH) has published reports detailing the results of investigations on selected work-related fatalities through the Fatality Assessment and Control Evaluation (FACE) program since 1982. Information from construction-related FACE reports was coded into the Construction FACE Database (CFD). Use of the CFD was illustrated by analyzing major CFD variables. A total of 768 construction fatalities were included in the CFD. Information on decedents, safety training, use of PPE, and FACE recommendations were coded. Analysis shows that one in five decedents in the CFD died within the first two months on the job; 75% and 43% of reports recommended having safety training or installing protection equipment, respectively. Comprehensive research using FACE reports may improve understanding of work-related fatalities and provide much-needed information on injury prevention. The CFD allows researchers to analyze the FACE reports quantitatively and efficiently. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
Performance of the OVERFLOW-MLP and LAURA-MLP CFD Codes on the NASA Ames 512 CPU Origin System
NASA Technical Reports Server (NTRS)
Taft, James R.
2000-01-01
The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames, has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256 CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16 CPU C90 system. All of this was achieved without any major modification to the original vector based code. The OVERFLOW-MLP code is now in production on the in-house Origin systems as well as being used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512 CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16 CPU C90. At this rate, expected workloads would require over 100 C90 CPU years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community. Dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles, and large simulations of highly resolved full aircraft are routinely undertaken. Typical large problems might require hundreds of Cray C90 CPU hours to complete. The dramatic performance gains with the 256 CPU steger system are exciting. Obtaining results in hours instead of months is revolutionizing the way in which aircraft manufacturers are looking at future aircraft simulation work. Figure 2 below is a current state-of-the-art plot of OVERFLOW-MLP performance on the 512 CPU Lomax system. As can be seen, the chart indicates that OVERFLOW-MLP continues to scale linearly with CPU count up to 512 CPUs on a large 35 million point full aircraft RANS simulation. At this point performance is such that a fully converged simulation of 2500 time steps is completed in less than 2 hours of elapsed time. Further work over the next few weeks will improve the performance of this code even further. The LAURA code has been converted to the MLP format as well. This code is currently being optimized for the 512 CPU system. Performance statistics indicate that the goal of 100 GFLOP/s will be achieved by year's end. This amounts to 20x the 16 CPU C90 result and strongly demonstrates the viability of the new parallel systems rapidly solving very large simulations in a production environment.
Froelich, John; Milbrandt, Joseph C; Allan, D Gordon
2009-01-01
This study examines the impact of the 80-hour workweek on the number of surgical cases performed by PGY-2 through PGY-5 orthopedic residents. We also evaluated orthopedic in-training examination (OITE) scores during the same time period. Data were collected from the Accreditation Council for Graduate Medical Education (ACGME) national database for 3 academic years before and 5 years after July 1, 2003. CPT surgical procedure codes logged by all residents 3 years before and 5 years after implementation of the 80-hour workweek were compared. The average raw OITE scores for each class obtained during the same time period were also evaluated. Data were reported as the mean +/- standard deviation (SD), and group means were compared using independent t-tests. No statistical difference was noted in the number of surgical procedure codes logged before or after the institution of the 80-hour week during any single year of training. However, an increase in the number of CPT codes logged in the PGY-3 years after 2003 did approach significance (457.7 vs 551.9, p = 0.057). Overall, the average number of cases performed per resident increased each year after implementation of the work-hour restriction (464.4 vs 515.5 cases). No statistically significant difference was noted in the raw OITE scores before or after work-hour restrictions for our residents or nationally. We found no statistical difference for each residency class in the average number of cases performed or OITE scores, although the total number of cases performed has increased after implementation of the work-hour restrictions. We also found no statistical difference in the national OITE scores. Our data suggest that the impact of the 80-hour workweek has not had a detrimental effect on these 2 resident training measurements.
Perception of and attitude toward ethical issues among Korean occupational physicians.
Choi, Junghye; Suh, Chunhui; Lee, Jong-Tae; Lee, Segyeong; Lee, Chae-Kwan; Lee, Gyeong-Jin; Kim, Taekjoong; Son, Byung-Chul; Kim, Jeong-Ho; Kim, Kunhyung; Kim, Dae Hwan; Ryu, Ji Young
2017-01-01
Occupational physicians (OPs) have complex relationships with employees, employers, and the general public. OPs may have simultaneous obligations towards third parties, which can lead to various conflicts of interest. Among the various studies of ethical issues related to OPs, few have focused on Korean OPs. The aim of the present survey was to investigate the ethical contexts, the practical resolutions, and the ethical principles for Korean OPs. An email with a self-administered questionnaire was sent to members of the Korean Society of Occupational and Environmental Medicine, comprising 150 specialists and 130 residents. The questionnaire was also distributed to 52 specialists and 46 residents who attended the annual meeting of the Korean Association of Occupational and Environmental Clinics in October 2015, and to 240 specialists by uploading the questionnaire to the online community 'oem-doctors' in February 2016. The responses to each question (perception of general ethical conflicts, recognition of various ethical codes for OPs, core professional values in the ethics of occupational medicine, and a mock case study) were compared between specialists and residents by the chi-squared test and Fisher's exact test. Responses were received from 80 specialists and 71 residents. Most participants had experienced ethical conflicts at work and felt the need for systematic education and training. OPs suffered the most ethical conflicts in decisions regarding occupational health examination and evaluation of work-relatedness. Over 60% of total participants were unaware of the ethical codes of other countries. Participants regarded 'consideration of workers' health and safety' (26.0%) and 'neutrality' (24.7%) as the prominent ethical values in the professionalism of occupational medicine. In mock cases, participants chose beneficence and justice for fitness for work and for confidential information acquired while on duty, and beneficence and respect for autonomy in pre-placement examinations. This study evaluated the current perception of and attitude toward ethical issues among Korean OPs. These findings will facilitate the development of a code of ethics and an ethical decision-making program for Korean OPs.
2013-01-01
Background The formulation and implementation of national ethical regulations to protect research participants are fundamental to the ethical conduct of research. Ethics education and capacity are inadequate in developing African countries. This study was designed to develop a module for online training in research ethics based on the Nigerian National Code of Health Research Ethics and assess its ease of use and reliability among biomedical researchers in Nigeria. Methodology This was a three-phased evaluation study. Phase one involved development of an online training module based on the Nigerian Code of Health Research Ethics (NCHRE) and uploading it to the Collaborative Institutional Training Initiative (CITI) website while the second phase entailed the evaluation of the module for comprehensibility, readability and ease of use by 45 Nigerian biomedical researchers. The third phase involved modification and re-evaluation of the module by 30 Nigerian biomedical researchers and determination of test-retest reliability of the module using Cronbach’s alpha. Results The online module was easily accessible and comprehensible to 95% of study participants. There were significant differences in the pretest and posttest scores of study participants during the evaluation of the online module (p = 0.001) with correlation coefficients of 0.9 and 0.8 for the pretest and posttest scores respectively. The module also demonstrated excellent test-retest reliability and internal consistency as shown by Cronbach’s alpha coefficients of 0.92 and 0.84 for the pretest and posttest respectively. Conclusion The module based on the Nigerian Code was developed, tested and made available online as a valuable tool for training in culturally and societally relevant ethical principles to orient national and international biomedical researchers working in Nigeria. It would complement other general research ethics and Good Clinical Practice modules. Participants suggested that awareness of the online module should be increased through seminars, advertisement on government websites and portals used by Nigerian biomedical researchers, and incorporation of the Code into the undergraduate medical training curriculum. PMID:23281968
Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)
NASA Astrophysics Data System (ADS)
Irondi, Iheanyi; Wang, Qi; Grecos, Christos
2014-05-01
Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality when compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, which is a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies in DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system by implementing a real-world test bed, which consists of an Apache HTTP Server with GPAC, an MP4Client (GPAC) with an openHEVC-based DASH client and a NETEM box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions such as packet loss, bandwidth and delay on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.
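The adaptation step the abstract describes (the client switching representation according to the available bandwidth) can be illustrated with a minimal rate-selection rule. The sketch below is a generic throughput-based heuristic, not the logic of the GPAC/MP4Client client used in the test bed; the representation list and safety margin are assumptions.

```python
# Minimal sketch of throughput-based DASH rate selection (illustrative only):
# pick the highest-bitrate representation that fits the measured bandwidth,
# scaled by a safety margin.
def select_representation(throughput_bps, representations, safety=0.8):
    budget = throughput_bps * safety
    feasible = [r for r in representations if r["bitrate"] <= budget]
    if not feasible:
        return min(representations, key=lambda r: r["bitrate"])  # fall back to lowest quality
    return max(feasible, key=lambda r: r["bitrate"])

representations = [  # hypothetical HEVC representations (bitrates in bit/s)
    {"name": "360p", "bitrate": 1_000_000},
    {"name": "720p", "bitrate": 3_000_000},
    {"name": "1080p", "bitrate": 6_000_000},
]
print(select_representation(4_500_000, representations)["name"])  # -> 720p
```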
Modeling of transitional flows
NASA Technical Reports Server (NTRS)
Lund, Thomas S.
1988-01-01
An effort directed at developing improved transitional models was initiated. The focus of this work was concentrated on the critical assessment of a popular existing transitional model developed by McDonald and Fish in 1972. The objective of this effort was to identify the shortcomings of the McDonald-Fish model and to use the insights gained to suggest modifications or alterations of the basic model. In order to evaluate the transitional model, a compressible boundary layer code was required. Accordingly, a two-dimensional compressible boundary layer code was developed. The program was based on a three-point fully implicit finite difference algorithm where the equations were solved in an uncoupled manner with second order extrapolation used to evaluate the non-linear coefficients. Iteration was offered as an option if the extrapolation error could not be tolerated. The differencing scheme was arranged to be second order in both spatial directions on an arbitrarily stretched mesh. A variety of boundary condition options were implemented including specification of an external pressure gradient, specification of a wall temperature distribution, and specification of an external temperature distribution. Overall the results of the initial phase of this work indicate that the McDonald-Fish model does a poor job at predicting the details of the turbulent flow structure during the transition region.
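As a rough illustration of the kind of three-point fully implicit update such a marching code performs, the sketch below advances a 1-D diffusion model problem with a backward-Euler step and a tridiagonal (banded) solve. It is a generic model problem with assumed Dirichlet boundary conditions, not the compressible boundary-layer equations or the McDonald-Fish model.

```python
# Three-point fully implicit (backward Euler) step for u_t = nu * u_xx,
# solved with a tridiagonal banded solver; illustrative only.
import numpy as np
from scipy.linalg import solve_banded

def implicit_step(u, dt, dx, nu):
    n = u.size
    r = nu * dt / dx**2
    ab = np.zeros((3, n))            # banded storage of the tridiagonal matrix
    ab[0, 1:] = -r                   # super-diagonal
    ab[1, :] = 1.0 + 2.0 * r         # main diagonal
    ab[2, :-1] = -r                  # sub-diagonal
    ab[1, 0] = ab[1, -1] = 1.0       # Dirichlet boundaries: keep end values fixed
    ab[0, 1] = ab[2, -2] = 0.0
    return solve_banded((1, 1), ab, u)

u = np.zeros(51); u[25] = 1.0        # initial spike
for _ in range(100):
    u = implicit_step(u, dt=1e-3, dx=0.02, nu=1.0)
print(u.max())
```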
Multiphysics Code Demonstrated for Propulsion Applications
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Melis, Matthew E.
1998-01-01
The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.
Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.
1990-01-01
Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.
Production Level CFD Code Acceleration for Hybrid Many-Core Architectures
NASA Technical Reports Server (NTRS)
Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.
2012-01-01
In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of the time.
Measuring diagnoses: ICD code accuracy.
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-10-01
To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.
McKenzie, Kirsten; Mitchell, Rebecca; Scott, Deborah Anne; Harrison, James Edward; McClure, Roderick John
2009-08-01
To examine the reliability of work-related activity coding for injury-related hospitalisations in Australia. A random sample of 4,373 injury-related hospital separations from 1 July 2002 to 30 June 2004 was obtained from a stratified random sample of 50 hospitals across four states in Australia. From this sample, cases were identified as work-related if they contained an ICD-10-AM work-related activity code (U73) allocated by either: (i) the original coder; (ii) an independent auditor, blinded to the original code; or (iii) a research assistant, blinded to both the original and auditor codes, who reviewed narrative text extracted from the medical record. The concordance of activity coding and number of cases identified as work-related using each method were compared. Of the 4,373 cases sampled, 318 cases were identified as being work-related using any of the three methods for identification. The original coder identified 217 and the auditor identified 266 work-related cases (68.2% and 83.6% of the total cases identified, respectively). Around 10% of cases were only identified through the text description review. The original coder and auditor agreed on the assignment of work-relatedness for 68.9% of cases. The best estimates of the frequency of hospital admissions for occupational injury underestimate the burden by around 32%. This is a substantial underestimate that has major implications for public policy, and highlights the need for further work on improving the quality and completeness of routine, administrative data sources for a more complete identification of work-related injuries.
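The comparison of work-relatedness flags between the original coder and the auditor is a coder-agreement problem; alongside the raw percentage agreement reported above, a chance-corrected statistic such as Cohen's kappa is often quoted. The counts below are invented purely to show the calculation, not the study's data.

```python
# Hypothetical illustration of chance-corrected coder agreement (Cohen's kappa)
# for work-relatedness flags; the labels below are made up.
from sklearn.metrics import cohen_kappa_score

original_coder = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]   # 1 = coded as work-related
auditor        = [1, 0, 0, 1, 0, 0, 1, 1, 0, 1]

raw = sum(a == b for a, b in zip(original_coder, auditor)) / len(auditor)
print("raw agreement:", raw)
print("Cohen's kappa:", cohen_kappa_score(original_coder, auditor))
```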
Bhattacharya, Moumita; Jurkovitz, Claudine; Shatkay, Hagit
2018-04-12
Patients associated with multiple co-occurring health conditions often face aggravated complications and less favorable outcomes. Co-occurring conditions are especially prevalent among individuals suffering from kidney disease, an increasingly widespread condition affecting 13% of the general population in the US. This study aims to identify and characterize patterns of co-occurring medical conditions in patients, employing a probabilistic framework. Specifically, we apply topic modeling in a non-traditional way to find associations across SNOMED-CT codes assigned and recorded in the EHRs of >13,000 patients diagnosed with kidney disease. Unlike most prior work on topic modeling, we apply the method to codes rather than to natural language. Moreover, we quantitatively evaluate the topics, assessing their tightness and distinctiveness, and also assess the medical validity of our results. Our experiments show that each topic is succinctly characterized by a few highly probable and unique disease codes, indicating that the topics are tight. Furthermore, inter-topic distance between each pair of topics is typically high, illustrating distinctiveness. Last, most coded conditions grouped together within a topic are indeed reported to co-occur in the medical literature. Notably, our results uncover a few indirect associations among conditions that have hitherto not been reported as correlated in the medical literature. Copyright © 2018. Published by Elsevier Inc.
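A minimal sketch of the general idea of applying topic modeling to diagnosis codes rather than free text is shown below, using gensim's LDA on per-patient bags of codes. The code strings and model settings are placeholders and assumptions; this is not the study's actual pipeline or its SNOMED-CT data.

```python
# Topic modelling over diagnosis codes: each "document" is one patient's bag
# of codes.  The code names below are invented placeholders.
from gensim import corpora, models

patients = [
    ["ckd_stage3", "hypertension", "anemia"],
    ["ckd_stage3", "diabetes_t2", "hypertension"],
    ["heart_failure", "hypertension", "ckd_stage4"],
    ["diabetes_t2", "neuropathy", "ckd_stage3"],
]

dictionary = corpora.Dictionary(patients)
corpus = [dictionary.doc2bow(codes) for codes in patients]
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=20, random_state=0)

for topic_id, terms in lda.show_topics(num_topics=2, num_words=3, formatted=False):
    print(topic_id, [(code, round(p, 2)) for code, p in terms])
```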
An evaluation of the effect of JPEG, JPEG2000, and H.264/AVC on CQR codes decoding process
NASA Astrophysics Data System (ADS)
Vizcarra Melgar, Max E.; Farias, Mylène C. Q.; Zaghetto, Alexandre
2015-02-01
This paper presents a binary matrix code based on the QR Code (Quick Response Code), denoted as CQR Code (Colored Quick Response Code), and evaluates the effect of JPEG, JPEG2000 and H.264/AVC compression on the decoding process. The proposed CQR Code has three additional colors (red, green and blue), which enables twice as much storage capacity when compared to the traditional black and white QR Code. Using the Reed-Solomon error-correcting code, the CQR Code model has a theoretical correction capability of 38.41%. The goal of this paper is to evaluate the effect that degradations inserted by common image compression algorithms have on the decoding process. Results show that a successful decoding process can be achieved for compression rates up to 0.3877 bits/pixel, 0.1093 bits/pixel and 0.3808 bits/pixel for the JPEG, JPEG2000 and H.264/AVC formats, respectively. The algorithm that presents the best performance is H.264/AVC, followed by JPEG2000 and JPEG.
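For context on the Reed-Solomon correction capability mentioned above, an RS(n, k) code can correct up to t = floor((n - k) / 2) symbol errors. How that generic bound maps onto the quoted 38.41% depends on the specific code parameters and on what the percentage is taken over, which the abstract does not state; the parameters below are illustrative only.

```python
# Generic Reed-Solomon half-distance bound with hypothetical (n, k) choices;
# not the CQR Code's actual configuration.
def rs_correction(n, k):
    t = (n - k) // 2          # correctable symbol errors
    return t, t / n

for n, k in [(255, 223), (255, 191)]:
    t, frac = rs_correction(n, k)
    print(f"RS({n},{k}): corrects {t} symbols = {frac:.2%} of the codeword")
```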
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, D.E.
1977-01-12
A code, SIGMA1, has been designed to Doppler broaden evaluated cross sections in the ENDF/B format. The code can only be applied to tabulated data that vary linearly in energy and cross section between tabulated points. This report describes the methods used in the code and serves as a user's guide to the code.
MHD thrust vectoring of a rocket engine
NASA Astrophysics Data System (ADS)
Labaune, Julien; Packan, Denis; Tholin, Fabien; Chemartin, Laurent; Stillace, Thierry; Masson, Frederic
2016-09-01
In this work, the possibility to use MagnetoHydroDynamics (MHD) to vectorize the thrust of a solid propellant rocket engine exhaust is investigated. Using a magnetic field for vectoring offers a mass gain and a reusability advantage compared to standard gimbaled, elastomer-joint systems. Analytical and numerical models were used to evaluate the flow deviation with a 1 Tesla magnetic field inside the nozzle. The fluid flow in the resistive MHD approximation is calculated using the KRONOS code from ONERA, coupling the hypersonic CFD platform CEDRE and the electrical code SATURNE from EDF. A critical parameter of these simulations is the electrical conductivity, which was evaluated using a set of equilibrium calculations with 25 species. Two models were used: local thermodynamic equilibrium and frozen flow. In both cases, chlorine captures a large fraction of free electrons, limiting the electrical conductivity to a value inadequate for thrust vectoring applications. However, when using chlorine-free propellants with 1% alkali by mass, an MHD thrust vectoring of several degrees was obtained.
Spread Spectrum Visual Sensor Network Resource Management Using an End-to-End Cross-Layer Design
2011-02-01
Coding: In this work, we use rate-compatible punctured convolutional (RCPC) codes for channel coding [11]. Using RCPC codes allows us to utilize Viterbi's... [11] J. Hagenauer, “Rate-compatible punctured convolutional codes (RCPC codes) and their applications,” IEEE Trans. Commun., vol. 36, no. 4, pp. 389... source coding rate, a channel coding rate, and a power level to all nodes in the
Le Moual, Nicole; Zock, Jan-Paul; Dumas, Orianne; Lytras, Theodore; Andersson, Eva; Lillienberg, Linnéa; Schlünssen, Vivi; Benke, Geza; Kromhout, Hans
2018-07-01
We aimed to update an asthmagen job exposure matrix (JEM) developed in the late 1990s. Main reasons were: the number of suspected and recognised asthmagens has since tripled; understanding of the aetiological role of irritants in asthma and methodological insights in application of JEMs have emerged in the period. For each agent of the new occupational asthma-specific JEM (OAsJEM), a working group of three experts out of eight evaluated exposure for each International Standard Classification of Occupations, 1988 (ISCO-88) job code into three categories: 'high' (high probability of exposure and moderate-to-high intensity), 'medium' (low-to-moderate probability or low intensity) and 'unexposed'. Within a working group, experts evaluated exposures independently from each other. If expert assessments were inconsistent the final decision was taken by consensus. Specificity was favoured over sensitivity, that is, jobs were classified with high exposure only if the probability of exposure was high and the intensity moderate-to-high. In the final review, all experts checked assigned exposures and proposed/improved recommendations for expert re-evaluation after default application of the JEM. The OAsJEM covers exposures to 30 sensitisers/irritants, including 12 newly recognised, classified into seven broad groups. Initial agreement between the three experts was mostly fair to moderate (κ values 0.2-0.5). Out of 506 ISCO-88 codes, the majority was classified as unexposed (from 82.6% (organic solvents) to 99.8% (persulfates)) and a minority as 'high-exposed' (0.2% (persulfates) to 2.6% (organic solvents)). The OAsJEM developed to improve occupational exposure assessment may improve evaluations of associations with asthma in epidemiological studies and contribute to assessment of the burden of work-related asthma. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
The Contract Management Body of Knowledge: A Comparison of Contracting Competencies
2013-12-01
SME: subject matter expert; SOW: statement of work; TINA: Truth in Negotiations Act; UCC: Uniform Commercial Code; WBS: work breakdown structure. ... documents whose terms and conditions are legally enforceable. Sources of law and guidance covered include the Uniform Commercial Code (UCC), Federal... contracting including the Uniform Commercial Code (UCC), Federal Acquisition Regulation (FAR), as well as various other laws pertaining to both
DOT National Transportation Integrated Search
2001-02-01
Problems, solutions and recommendations for implementation have been contributed by 16 of the 27 CODES states and organized as appropriate under the administrative, linkage and application requirements for a Crash Outcome Data Evaluation System (CODES).
The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions
NASA Astrophysics Data System (ADS)
Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.
2016-01-01
A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates various nuclear reaction models needed to describe reactions induced by nucleons, light charged nuclei up to alpha particles, and photons. The code is written in the C++ programming language using object-oriented technology. At first, it was applied to neutron-induced reaction data on actinides, which were compiled into JENDL Actinide File 2008 and JENDL-4.0. It has been extensively used in various nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded to nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions. It was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system, indicating the concept and design of the code and its inputs. Details of the formulation for modelling the direct, pre-equilibrium and compound reactions are presented. Applications to nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reactions and β-delayed neutron emission are mentioned.
The content of the work of clinical nurse specialists described by use of daily activity diaries.
Oddsdóttir, Elín Jakobína; Sveinsdóttir, Herdís
2011-05-01
Evaluate the usefulness of the role of clinical nurse specialists and the content of their work by mapping their activities. The clinical work of advanced practice nursing differs in different countries, and a clear picture is lacking on what exactly advanced practice nurses do. Prospective exploratory study. The setting of the study was the largest hospital in Iceland where over half of the country's active nursing workforce are employed, including the only clinical nurse specialists. Of 19 clinical nurse specialists working at the hospital, 15 participated. Data were collected over seven days with a structured activity diary that lists 65 activities, classified into six roles and three domains. In 17 instances, the 'role activities' and 'domain activities' overlap and form 17 categories of practice. The clinical nurse specialists coded their activities at 15-minute intervals and could code up to four activities simultaneously. Daily, the clinical nurse specialists evaluated their clinical nurse specialist background. The roles that occupied the greatest proportion of the clinical nurse specialists' time were education, expert practice and 'other' activities, while the smallest proportions were in counselling, research and practice development. The domain they worked in most was the institutional domain, followed by the client/family domain and the clinical outcome management domain. All of the clinical nurse specialists reported working on two activities simultaneously, 11 of them on three activities and six on four activities. They self-assessed their background as clinical nurse specialists as being very useful. The activity diary is a useful tool for assessing the content of practice. Clinical nurse specialists spend too much time on activities related to the institution. Nurse managers are advised to provide clinical nurse specialists with ample time to develop the direct practice role in the client/family domain. The development of advanced nursing practice requires that clinical nurse specialists take an active and visible part in direct patient care. © 2011 Blackwell Publishing Ltd.
Sears, Jeanne M; Blanar, Laura; Bowman, Stephen M
2014-01-01
Acute work-related trauma is a leading cause of death and disability among U.S. workers. Occupational health services researchers have described the pressing need to identify valid injury severity measures for purposes such as case-mix adjustment and the construction of appropriate comparison groups in programme evaluation, intervention, quality improvement, and outcome studies. The objective of this study was to compare the performance of several injury severity scores and scoring methods in the context of predicting work-related disability and medical cost outcomes. Washington State Trauma Registry (WTR) records for injuries treated from 1998 to 2008 were linked with workers' compensation claims. Several Abbreviated Injury Scale (AIS)-based injury severity measures (ISS, New ISS, maximum AIS) were estimated directly from ICD-9-CM codes using two software packages: (1) ICDMAP-90, and (2) Stata's user-written ICDPIC programme (ICDPIC). ICDMAP-90 and ICDPIC scores were compared with existing WTR scores using the Akaike Information Criterion, amount of variance explained, and estimated effects on outcomes. Competing risks survival analysis was used to evaluate work disability outcomes. Adjusted total medical costs were modelled using linear regression. The linked sample contained 6052 work-related injury events. There was substantial agreement between WTR scores and those estimated by ICDMAP-90 (kappa=0.73), and between WTR scores and those estimated by ICDPIC (kappa=0.68). Work disability and medical costs increased monotonically with injury severity, and injury severity was a significant predictor of work disability and medical cost outcomes in all models. WTR and ICDMAP-90 scores performed better with regard to predicting outcomes than did ICDPIC scores, but effect estimates were similar. Of the three severity measures, maxAIS was usually weakest, except when predicting total permanent disability. Injury severity was significantly associated with work disability and medical cost outcomes for work-related injuries. Injury severity can be estimated using either ICDMAP-90 or ICDPIC when ICD-9-CM codes are available. We observed little practical difference between severity measures or scoring methods. This study demonstrated that using existing software to estimate injury severity may be useful to enhance occupational injury surveillance and research. Copyright © 2013 Elsevier Ltd. All rights reserved.
Application of IPAD to missile design
NASA Technical Reports Server (NTRS)
Santa, J. E.; Whiting, T. R.
1974-01-01
The application of an integrated program for aerospace-vehicle design (IPAD) to the design of a tactical missile is examined. The feasibility of modifying a proposed IPAD system for aircraft design work for use in missile design is evaluated. The tasks, cost, and schedule for the modification are presented. The basic engineering design process is described, explaining how missile design is achieved through iteration of six logical problem solving functions throughout the system studies, preliminary design, and detailed design phases of a new product. Existing computer codes used in various engineering disciplines are evaluated for their applicability to IPAD in missile design.
An Examination of the Reliability of the Organizational Assessment Package (OAP).
1981-07-01
reactivity or pretest sensitization (Bracht and Glass, 1968) may occur. In this case, the change from pretest to posttest can be caused just by the... content items. The blocks for supervisor's code were left blank, work group code was coded as all ones, and each person's seminar number was coded in... [residue of a reliability table for OAP scales, including Work Group Effectiveness, Job Related Satisfaction and Job Related...]
Cosmological parameters from a re-analysis of the WMAP 7 year low-resolution maps
NASA Astrophysics Data System (ADS)
Finelli, F.; De Rosa, A.; Gruppuso, A.; Paoletti, D.
2013-06-01
Cosmological parameters from Wilkinson Microwave Anisotropy Probe (WMAP) 7 year data are re-analysed by substituting a pixel-based likelihood estimator for the one delivered publicly by the WMAP team. Our pixel-based estimator handles intensity and polarization exactly and in a joint manner, allowing us to use low-resolution maps and noise covariance matrices in T, Q, U at the same resolution, which in this work is 3.6°. We describe the features and the performances of the code implementing our pixel-based likelihood estimator. We perform a battery of tests on the application of our pixel-based likelihood routine to WMAP publicly available low-resolution foreground-cleaned products, in combination with the WMAP high-ℓ likelihood, reporting the differences on cosmological parameters evaluated by the full WMAP likelihood public package. The differences are not only due to the treatment of polarization, but also to the marginalization over monopole and dipole uncertainties present in the WMAP pixel likelihood code for temperature. The credible central values for the cosmological parameters change by less than 1σ with respect to the evaluation by the full WMAP 7 year likelihood code, with the largest difference being a shift to smaller values of the scalar spectral index nS.
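The generic form of a pixel-based Gaussian likelihood for low-resolution maps is -2 ln L = d^T C^{-1} d + ln det C (up to a constant), with d the stacked T, Q, U pixel vector and C the signal-plus-noise covariance. The sketch below evaluates that expression with a Cholesky factorization; it is only the textbook form, not the authors' code or its monopole/dipole marginalization.

```python
# Generic pixel-based Gaussian likelihood evaluation (illustrative only).
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def minus_two_lnL(data, cov):
    """data: stacked (T,Q,U) pixel vector; cov: signal+noise covariance."""
    cf = cho_factor(cov)
    chi2 = data @ cho_solve(cf, data)
    logdet = 2.0 * np.sum(np.log(np.diag(cf[0])))
    return chi2 + logdet

rng = np.random.default_rng(0)
npix = 50
cov = np.eye(npix) + 0.001        # toy covariance, positive definite
data = rng.multivariate_normal(np.zeros(npix), cov)
print(minus_two_lnL(data, cov))
```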
Efficient space-time sampling with pixel-wise coded exposure for high-speed imaging.
Liu, Dengyu; Gu, Jinwei; Hitomi, Yasunobu; Gupta, Mohit; Mitsunaga, Tomoo; Nayar, Shree K
2014-02-01
Cameras face a fundamental trade-off between spatial and temporal resolution. Digital still cameras can capture images with high spatial resolution, but most high-speed video cameras have relatively low spatial resolution. It is hard to overcome this trade-off without incurring a significant increase in hardware costs. In this paper, we propose techniques for sampling, representing, and reconstructing the space-time volume to overcome this trade-off. Our approach has two important distinctions compared to previous works: 1) We achieve sparse representation of videos by learning an overcomplete dictionary on video patches, and 2) we adhere to practical hardware constraints on sampling schemes imposed by architectures of current image sensors, which means that our sampling function can be implemented on CMOS image sensors with modified control units in the future. We evaluate components of our approach, sampling function and sparse representation, by comparing them to several existing approaches. We also implement a prototype imaging system with pixel-wise coded exposure control using a liquid crystal on silicon device. System characteristics such as field of view and modulation transfer function are evaluated for our imaging system. Both simulations and experiments on a wide range of scenes show that our method can effectively reconstruct a video from a single coded image while maintaining high spatial resolution.
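The sampling stage described above can be mimicked with a toy per-pixel binary exposure code applied to a space-time volume, as in the sketch below. The random code and volume are stand-ins; the dictionary learning and sparse reconstruction steps of the paper are not shown.

```python
# Toy per-pixel coded exposure: each pixel integrates the space-time volume
# only where its binary exposure code is 1, yielding one coded image.
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 16, 32, 32                        # frames, height, width
video = rng.random((T, H, W))               # stand-in for the space-time volume
code = rng.integers(0, 2, size=(T, H, W))   # per-pixel binary exposure pattern

coded_image = (video * code).sum(axis=0) / np.maximum(code.sum(axis=0), 1)
print(coded_image.shape)                    # (32, 32): one coded frame
```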
Application of Aeroelastic Solvers Based on Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Srivastava, Rakesh
1998-01-01
A pre-release version of the Navier-Stokes solver (TURBO) was obtained from MSU. Along with Dr. Milind Bakhle of the University of Toledo, subroutines for aeroelastic analysis were developed and added to the TURBO code to develop versions 1 and 2 of the TURBO-AE code. For specified mode shape, frequency and inter-blade phase angle the code calculates the work done by the fluid on the rotor for a prescribed sinusoidal motion. Positive work on the rotor indicates instability of the rotor. Version 1 of the code calculates the work for in-phase blade motions only. In version 2 of the code, the capability for analyzing all possible inter-blade phase angles was added. Version 2 of the TURBO-AE code was validated and delivered to NASA and the industry partners of the AST project. The capabilities and the features of the code are summarized in Refs. [1] & [2]. To release version 2 of TURBO-AE, a workshop was organized at NASA Lewis, by Dr. Srivastava and Dr. M. A. Bakhle, both of the University of Toledo, in October of 1996 for the industry partners of NASA Lewis. The workshop provided the potential users of TURBO-AE all the relevant information required in preparing the input data, executing the code, interpreting the results and benchmarking the code on their computer systems. After the code was delivered to the industry partners, user support was also provided. A new version of the Navier-Stokes solver (TURBO) was later released by MSU. This version had significant changes and upgrades over the previous version. This new version was merged with the TURBO-AE code. Also, new boundary conditions for 3-D unsteady non-reflecting boundaries were developed by researchers from UTRC, Ref. [3]. Time was spent on understanding, familiarizing, executing and implementing the new boundary conditions into the TURBO-AE code. Work was started on the phase-lagged (time-shifted) boundary condition version (version 4) of the code. This will allow the users to calculate non-zero interblade phase angles using only one blade passage for analysis.
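The stability criterion used here (positive work done by the fluid on the blade over a vibration cycle implies instability) can be illustrated with a single-degree-of-freedom stand-in: integrate force times blade velocity over one period of the prescribed sinusoidal motion. The force amplitude and phase below are made up, not TURBO-AE output.

```python
# Work-per-cycle sign check for a prescribed sinusoidal blade motion
# (illustrative single-degree-of-freedom stand-in).
import numpy as np

omega = 2 * np.pi * 100.0                       # vibration frequency [rad/s]
t = np.linspace(0.0, 2 * np.pi / omega, 2000, endpoint=False)
dt = t[1] - t[0]
h0 = 1e-3                                       # blade displacement amplitude [m]
v = h0 * omega * np.cos(omega * t)              # blade velocity for h = h0 sin(omega t)
phase = np.deg2rad(30.0)                        # assumed lag of the unsteady force
force = 50.0 * np.sin(omega * t + phase)        # illustrative unsteady aerodynamic force [N]

work_per_cycle = float(np.sum(force * v) * dt)
print("work per cycle [J]:", work_per_cycle,
      "->", "unstable" if work_per_cycle > 0 else "stable")
```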
NASA Astrophysics Data System (ADS)
Jafari, Mehdi; Kasaei, Shohreh
2012-01-01
Automatic brain tissue segmentation is a crucial task in diagnosis and treatment of medical images. This paper presents a new algorithm to segment different brain tissues, such as white matter (WM), gray matter (GM), cerebral spinal fluid (CSF), background (BKG), and tumor tissues. The proposed technique uses modified intraframe coding derived from H.264/AVC for feature extraction. Extracted features are then fed to an artificial back-propagation neural network (BPN) classifier to assign each block to its appropriate class. Since the newest coding standard, H.264/AVC, has the highest compression ratio, it decreases the dimension of extracted features and thus yields a more accurate classifier with low computational complexity. The performance of the BPN classifier is evaluated using the classification accuracy and computational complexity terms. The results show that the proposed technique is more robust and effective with low computational complexity compared to other recent works.
Large eddy simulation of fine water sprays: comparative analysis of two models and computer codes
NASA Astrophysics Data System (ADS)
Tsoy, A. S.; Snegirev, A. Yu.
2015-09-01
The model and the computer code FDS, albeit widely used in engineering practice to predict fire development, are not sufficiently validated for fire suppression by fine water sprays. In this work, the effect of the numerical resolution of the large-scale turbulent pulsations on the accuracy of predicted time-averaged spray parameters is evaluated. Comparison of the simulation results obtained with the two versions of the model and code, as well as of the predicted and measured radial distributions of the liquid flow rate, revealed the need to apply monotonic and yet sufficiently accurate discrete approximations of the convective terms. Failure to do so delays jet break-up, otherwise induced by large turbulent eddies, thereby excessively focusing the predicted flow around its axis. The effect of the pressure drop in the spray nozzle is also examined, and its increase has been shown to cause only a weak increase of the evaporated fraction and vapor concentration despite the significant increase of flow velocity.
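The point about monotonic approximations of the convective terms can be seen on a one-dimensional advection test: a first-order upwind (monotone) update keeps a step profile bounded, while a central-difference update develops spurious oscillations. This generic demonstration is not part of FDS or the authors' code.

```python
# 1-D advection of a step profile: monotone upwind vs. oscillatory central scheme.
import numpy as np

n, c = 200, 0.4                                     # grid points, Courant number
u = np.where(np.arange(n) < n // 2, 1.0, 0.0)       # step profile
u_up, u_ce = u.copy(), u.copy()

for _ in range(100):
    u_up[1:] = u_up[1:] - c * (u_up[1:] - u_up[:-1])            # first-order upwind
    u_ce[1:-1] = u_ce[1:-1] - 0.5 * c * (u_ce[2:] - u_ce[:-2])  # central difference

print("upwind  min/max:", u_up.min(), u_up.max())   # stays within [0, 1]
print("central min/max:", u_ce.min(), u_ce.max())   # overshoots and oscillates
```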
Photon Throughput Calculations for a Spherical Crystal Spectrometer
NASA Astrophysics Data System (ADS)
Gilman, C. J.; Bitter, M.; Delgado-Aparicio, L.; Efthimion, P. C.; Hill, K.; Kraus, B.; Gao, L.; Pablant, N.
2017-10-01
X-ray imaging crystal spectrometers of the type described in Refs. have become a standard diagnostic for Doppler measurements of profiles of the ion temperature and the plasma flow velocities in magnetically confined, hot fusion plasmas. These instruments have by now been implemented on major tokamak and stellarator experiments in Korea, China, Japan, and Germany and are currently also being designed by PPPL for ITER. A still missing part in the present data analysis is an efficient code for photon throughput calculations to evaluate the chord-integrated spectral data. The existing ray tracing codes cannot be used for a data analysis between shots, since they require extensive and time consuming numerical calculations. Here, we present a detailed analysis of the geometrical properties of the ray pattern. This method allows us to minimize the extent of numerical calculations and to create a more efficient code. This work was performed under the auspices of the U.S. Department of Energy by Princeton Plasma Physics Laboratory under contract DE-AC02-09CH11466.
ERIC Educational Resources Information Center
Meadows, William C.
2011-01-01
Interest in North American Indian code talkers continues to increase. In addition to numerous works about the Navajo code talkers, several publications on other groups of Native American code talkers--including the Choctaw, Comanche, Hopi, Meskwaki, Canadian Cree--and about code talkers in general have appeared. This article chronicles recent…
Programming (Tips) for Physicists & Engineers
Ozcan, Erkcan
2018-02-19
Programming for today's physicists and engineers. Work environment: today's astroparticle, accelerator experiments and information industry rely on large collaborations. Need more than ever: code sharing/reuse, code building--framework integration, documentation and good visualization, working remotely, not reinventing the wheel.
Programming (Tips) for Physicists & Engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozcan, Erkcan
2010-07-13
Programming for today's physicists and engineers. Work environment: today's astroparticle, accelerator experiments and information industry rely on large collaborations. Need more than ever: code sharing/reuse, code building--framework integration, documentation and good visualization, working remotely, not reinventing the wheel.
Measuring Diagnoses: ICD Code Accuracy
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-01-01
Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principle Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999
Evaluation of cross sections for neutron-induced reactions in sodium. [10⁻⁵ eV to 20 MeV]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, D.C.
1980-09-01
An evaluation of the neutron-induced cross sections of ²³Na has been done for the energy range from 10⁻⁵ eV to 20 MeV. All significant cross sections are given, including differential cross sections for production of gamma rays. The recommended values are based on experimental data where available, and use results of a consistent model code analysis of available data to predict cross sections where there are no experimental data. This report describes the evaluation that was submitted to the Cross Section Evaluation Working Group (CSEWG) for consideration as a part of the Evaluated Nuclear Data File, Version V, and subsequently issued as MAT 1311. 126 references, 130 figures, 14 tables.
Tam, Vivian; Edge, Jennifer S; Hoffman, Steven J
2016-10-12
Shortages of health workers in low-income countries are exacerbated by the international migration of health workers to more affluent countries. This problem is compounded by the active recruitment of health workers by destination countries, particularly Australia, Canada, UK and USA. The World Health Organization (WHO) adopted a voluntary Code of Practice in May 2010 to mitigate tensions between health workers' right to migrate and the shortage of health workers in source countries. The first empirical impact evaluation of this Code was conducted 11 months after its adoption and demonstrated a lack of impact on health workforce recruitment policy and practice in the short-term. This second empirical impact evaluation was conducted 4 years post-adoption using the same methodology to determine whether there have been any changes in the perceived utility, applicability, and implementation of the Code in the medium-term. Forty-four respondents representing government, civil society and the private sector from Australia, Canada, UK and USA completed an email-based survey evaluating their awareness of the Code, perceived impact, changes to policy or recruitment practices resulting from the Code, and the effectiveness of non-binding Codes generally. The same survey instrument from the original study was used to facilitate direct comparability of responses. Key lessons were identified through thematic analysis. The main findings of the current impact evaluation are unchanged from those of the initial one. Both sets of key informants reported no significant policy or regulatory changes to health worker recruitment in their countries as a direct result of the Code due to its lack of incentives, institutional mechanisms and interest mobilizers. Participants emphasized the existence of previous bilateral and regional Codes, the WHO Code's non-binding nature, and the primacy of competing domestic healthcare priorities in explaining this perceived lack of impact. The Code has probably still not produced the tangible improvements in health worker flows it aspired to achieve. Several actions, including a focus on developing bilateral codes, linking the Code to topical global priorities, and reframing the Code's purpose to emphasize health system sustainability, are proposed to improve the Code's uptake and impact.
Coding, Organization and Feedback Variables in Motor Skills.
1982-04-01
teachers) as anyone else--has been its nondirectional and incompletely conceptualized nature. Those involved in research now are being urged to avoid... functional evaluations. It constitutes more than simply a methodology; it is an ideology for studying 'how things work' and by its nature draws on many... not necessarily dependent on the physical nature of the system. It furnishes a superstructure for interpreting and comparing input from a multitude of
Enhancing programming logic thinking using analogy mapping
NASA Astrophysics Data System (ADS)
Sukamto, R. A.; Megasari, R.
2018-05-01
Programming logic thinking is the most important competence for computer science students. However, programming is one of the difficult subjects in the computer science program. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping in a basic programming subject. Analogy Mapping is a computer application which converts source code into analogy images. This research used a time-series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.
1994-04-01
Rather, it should provide, whenever possible, information on the location and/or quantity of work. Examples of good descriptions are as follows: Major... [OCR residue of a data-dictionary table: a reason-code (reascode) lookup table giving the text for each integer reason code]
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-03-09
This work represents a first-of-its-kind successful application to employ advanced numerical methods in solving realistic two-phase flow problems with the two-fluid six-equation two-phase flow model. These advanced numerical methods include a high-resolution spatial discretization scheme with staggered grids, high-order fully implicit time integration schemes, and the Jacobian-free Newton–Krylov (JFNK) method as the nonlinear solver. The computer code developed in this work has been extensively validated with existing experimental flow boiling data in vertical pipes and rod bundles, which cover wide ranges of experimental conditions, such as pressure, inlet mass flux, wall heat flux and exit void fraction. An additional code-to-code benchmark with the RELAP5-3D code further verifies the correct code implementation. The combined methods employed in this work exhibit strong robustness in solving two-phase flow problems even when phase appearance (boiling) and realistic discrete flow regimes are considered. Transitional flow regimes used in existing system analysis codes, normally introduced to overcome numerical difficulty, were completely removed in this work. As a result, this in turn provides the possibility to utilize more sophisticated flow regime maps in the future to further improve simulation accuracy.
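The Jacobian-free Newton-Krylov idea, where the Krylov solver only needs Jacobian-vector products approximated from residual differences so no Jacobian matrix is ever assembled, can be illustrated on a small nonlinear system with SciPy's generic newton_krylov solver. The toy residual below is an assumption for illustration and has nothing to do with the two-fluid six-equation model.

```python
# JFNK illustration on a toy nonlinear boundary-value problem.
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    # discrete Laplacian plus a weak quadratic source, with fixed end values
    r = np.empty_like(u)
    r[0] = u[0] - 1.0
    r[-1] = u[-1]
    r[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2] - 0.1 * u[1:-1] ** 2
    return r

u0 = np.linspace(1.0, 0.0, 30)               # initial guess
sol = newton_krylov(residual, u0, f_tol=1e-8)
print("max residual:", np.abs(residual(sol)).max())
```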
Assessing resident's knowledge and communication skills using four different evaluation tools.
Nuovo, Jim; Bertakis, Klea D; Azari, Rahman
2006-07-01
This study assesses the relationship between 4 Accreditation Council for Graduate Medical Education (ACGME) outcome project measures for interpersonal and communication skills and medical knowledge; specifically, monthly performance evaluations, objective structured clinical examinations (OSCEs), the American Board of Family Practice in-training examination (ABFP-ITE) and the Davis observation code (DOC) practice style profiles. Based on previous work, we have DOC scoring for 29 residents from the University of California, Davis Department of Family and Community Medicine. For all these residents we also had the results of monthly performance evaluations, 2 required OSCE exercises, and the results of 3 American Board of Family Medicine (ABFM) ITEs. Data for each of these measures were abstracted for each resident. The Pearson correlation coefficient was used to assess the presence or lack of correlation between each of these evaluation methods. There is little correlation between various evaluation methods used to assess medical knowledge, and there is also little correlation between various evaluation methods used to assess communication skills. The outcome project remains a 'work in progress', with the need for larger studies to assess the value of different assessment measures of resident competence. It is unlikely that DOC will become a useful evaluation tool.
BASiNET-BiologicAl Sequences NETwork: a case study on coding and non-coding RNAs identification.
Ito, Eric Augusto; Katahira, Isaque; Vicente, Fábio Fernandes da Rocha; Pereira, Luiz Filipe Protasio; Lopes, Fabrício Martins
2018-06-05
With the emergence of Next Generation Sequencing (NGS) technologies, a large volume of sequence data, in particular from de novo sequencing, was rapidly produced at relatively low costs. In this context, computational tools are increasingly important to assist in the identification of relevant information to understand the functioning of organisms. This work introduces BASiNET, an alignment-free tool for classifying biological sequences based on feature extraction from complex network measurements. The method initially transforms the sequences and represents them as complex networks. Then it extracts topological measures and constructs a feature vector that is used to classify the sequences. The method was evaluated in the classification of coding and non-coding RNAs of 13 species and compared to the CNCI, PLEK and CPC2 methods. BASiNET outperformed all compared methods in all adopted organisms and datasets. BASiNET classified sequences in all organisms with high accuracy and low standard deviation, showing that the method is robust and not biased by the organism. The proposed methodology is implemented in open source in the R language and is freely available for download at https://cran.r-project.org/package=BASiNET.
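A rough sketch of the sequence-to-network idea is shown below: overlapping k-mers become nodes, consecutive k-mers are joined by edges, and a handful of topological measures form the feature vector. This is an illustrative Python/networkx approximation of the concept, not the BASiNET R implementation or its exact set of measures.

```python
# Map a sequence to a k-mer graph and summarise it with topological features.
import networkx as nx

def sequence_to_graph(seq, k=3):
    G = nx.Graph()
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    for a, b in zip(kmers, kmers[1:]):
        G.add_edge(a, b)          # consecutive k-mers share an edge
    return G

def network_features(G):
    degrees = [d for _, d in G.degree()]
    return {
        "nodes": G.number_of_nodes(),
        "edges": G.number_of_edges(),
        "mean_degree": sum(degrees) / len(degrees),
        "clustering": nx.average_clustering(G),
        "density": nx.density(G),
    }

print(network_features(sequence_to_graph("AUGGCUACGGAUCGAUGGCUACG")))
```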
Muharam, Yuswan; Warnatz, Jürgen
2007-08-21
A mechanism generator code to automatically generate mechanisms for the oxidation of large hydrocarbons has been successfully modified and considerably expanded in this work. The modification was through (1) improvement of the existing rules such as cyclic-ether reactions and aldehyde reactions, (2) inclusion of some additional rules in the code, such as ketone reactions, hydroperoxy cyclic-ether formations and additional reactions of alkenes, and (3) inclusion of small oxygenates, produced by the code but not yet included in the handwritten C(1)-C(4) sub-mechanism, in the handwritten C(1)-C(4) sub-mechanism. In order to evaluate mechanisms generated by the code, simulations of observed results in different experimental environments have been carried out. Experimentally derived and numerically predicted ignition delays of n-heptane-air and n-decane-air mixtures in high-pressure shock tubes over a wide range of temperatures, pressures and equivalence ratios agree very well. Concentration profiles of the main products and intermediates of n-heptane and n-decane oxidation in jet-stirred reactors at a wide range of temperatures and equivalence ratios are generally well reproduced. In addition, the ignition delay times of different normal alkanes were numerically studied.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, D.E.
1978-07-04
The code SIGMA1 Doppler broadens evaluated cross sections in the ENDF/B format. The code can be applied only to data that vary as a linear function of energy and cross section between tabulated points. This report describes the methods used in the code and serves as a user's guide to the code. 6 figures, 2 tables.
Cracking the code: the accuracy of coding shoulder procedures and the repercussions.
Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M
2013-05-01
Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address the reasons behind coding errors of shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that only 54 patients (54%) had an entirely correct primary diagnosis assigned, and only 7 patients (7%) had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2, p < 0.0001) and the correct procedure code (odds ratio 310.0, p < 0.0001). Using the proforma resulted in a £28,562 increase in revenue for the 100 patients evaluated relative to the income generated from the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.
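The odds ratios reported above compare the odds of correct coding with and without the proforma. The 2x2 calculation is shown below with invented counts; they are not the study's data.

```python
# Odds ratio from a hypothetical 2x2 table (proforma vs routine coding).
def odds_ratio(a, b, c, d):
    """a, b = correct/incorrect with proforma; c, d = correct/incorrect without."""
    return (a / b) / (c / d)

print(odds_ratio(a=95, b=5, c=54, d=46))   # ~16.2 with these made-up counts
```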
Maakip, Ismail; Oakman, Jodi; Stuckey, Rwth
2017-06-01
Purpose Workers with musculoskeletal pain (MSP) often continue to work despite their condition. Understanding the factors that enable them to remain at work provides insights into the development of appropriate workplace accommodations. This qualitative study aims to explore the strategies utilised by female Malaysian office workers with MSP to maintain productive employment. Methods A qualitative approach using thematic analysis was used. Individual semi-structured interviews were conducted with 13 female Malaysian office workers with MSP. Initial codes were identified and refined through iterative discussion to further develop the emerging codes and modify the coding framework. A further stage of coding was undertaken to eliminate redundant codes and establish analytic connections between distinct themes. Results Two major themes were identified: managing the demands of work and maintaining employment with persistent musculoskeletal pain. Participants reported developing strategies to assist them to remain at work, but most focused on individually initiated adaptations or peer support, rather than systemic changes to work systems or practices. A combination of the patriarchal and hierarchical cultural occupational context emerged as a critical factor in the finding of individual or peer based adaptations rather than organizational accommodations. Conclusions It is recommended that supervisors be educated in the benefits of maintaining and retaining employees with MSP, and encouraged to challenge cultural norms and develop appropriate flexible workplace accommodations through consultation and negotiation with these workers.
Prediction of Business Jet Airloads Using The Overflow Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Bounajem, Elias; Buning, Pieter G.
2001-01-01
The objective of this work is to evaluate the application of Navier-Stokes computational fluid dynamics technology for the purpose of predicting off-design-condition airloads on a business jet configuration in the transonic regime. The NASA Navier-Stokes flow solver OVERFLOW, with its Chimera overset grid capability, choice of several numerical schemes, and convergence acceleration techniques, was selected for this work. A set of scripts compiled to reduce the time required for the grid generation process is described. Several turbulence models are evaluated in the presence of separated flow regions on the wing. Computed results are compared to available wind tunnel data for two Mach numbers and a range of angles of attack. Comparisons of wing surface pressure from numerical simulation and wind tunnel measurements show good agreement up to fairly high angles of attack.
Hoare, Karen J; Mills, Jane; Francis, Karen
2012-12-01
The terminology used to analyse data in a grounded theory study can be confusing. Different grounded theorists use a variety of terms which all have similar meanings. In the following study, we use terms adopted by Charmaz, including initial, focused and axial coding. Initial codes are used to analyse data with an emphasis on identifying gerunds (verbs acting as nouns). If initial codes are relevant to the developing theory, they are grouped with similar codes into categories. Categories become saturated when there are no new codes identified in the data. Axial codes are used to link categories together into a grounded theory process. Memo writing accompanies this data sifting and sorting. The following article explains how one initial code became a category, providing a worked example of the grounded theory method of constant comparative analysis. The interplay between coding and categorization is facilitated by the constant comparative method. © 2012 Wiley Publishing Asia Pty Ltd.
Overview of the ArbiTER edge plasma eigenvalue code
NASA Astrophysics Data System (ADS)
Baver, Derek; Myra, James; Umansky, Maxim
2011-10-01
The Arbitrary Topology Equation Reader, or ArbiTER, is a flexible eigenvalue solver that is currently under development for plasma physics applications. The ArbiTER code builds on the equation parser framework of the existing 2DX code, extending it to include a topology parser. This will give the code the capability to model problems with complicated geometries (such as multiple X-points and scrape-off layers) or model equations with arbitrary numbers of dimensions (e.g. for kinetic analysis). In the equation parser framework, model equations are not included in the program's source code. Instead, an input file contains instructions for building a matrix from profile functions and elementary differential operators. The program then executes these instructions in a sequential manner. These instructions may also be translated into analytic form, thus giving the code transparency as well as flexibility. We will present an overview of how the ArbiTER code is to work, as well as preliminary results from early versions of this code. Work supported by the U.S. DOE.
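To make the equation-parser idea concrete, here is a toy analogue: a list of "instructions" assembles a matrix from profile functions and elementary differential operators, and the assembled operator is handed to an eigenvalue solver. The operators, profiles, and instruction format below are invented for illustration and are not ArbiTER's actual input language.

```python
# Toy analogue of an equation-parser framework: instructions (read sequentially)
# build a matrix from profile functions and elementary operators.
import numpy as np

N, L = 64, 1.0
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

def identity_op():
    return np.eye(N)

def d2dx2_op():
    # Second-derivative operator with simple (implicit Dirichlet) boundaries.
    return (np.diag(np.full(N - 1, 1.0), -1)
            - 2.0 * np.eye(N)
            + np.diag(np.full(N - 1, 1.0), 1)) / dx**2

def profile_op(values):
    return np.diag(values)

# "Instructions": add coefficient * profile * elementary operator to the matrix.
instructions = [
    (1.0, np.ones(N), d2dx2_op()),                 # diffusion-like term
    (1.0, 5.0 * np.exp(-x / 0.3), identity_op()),  # hypothetical profile term
]

A = np.zeros((N, N))
for coeff, profile, op in instructions:            # executed sequentially
    A += coeff * profile_op(profile) @ op

eigvals = np.linalg.eigvals(A)
print(sorted(eigvals.real)[:3])                    # a few eigenvalues of the assembled operator
```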
NASA Technical Reports Server (NTRS)
Ni, Jianjun David
2011-01-01
This presentation briefly discusses a research effort on mitigation techniques for pulsed radio frequency interference (RFI) on a low-density parity-check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to a space vehicle which might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), and it has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code for the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision-data reshuffle scheme. This study reveals that an error floor of LDPC decoding performance appears around CWER = 1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown; further investigation is necessary.
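The general flavor of conditioning soft decisions ahead of the decoder can be sketched as follows; the paper's specific reshuffle scheme is not reproduced here, and the pulse timing, LLR statistics, and erasure treatment below are illustrative assumptions only.

```python
# Hedged sketch: neutralize (erase) the log-likelihood ratios of bits known to be
# hit by a pulsed interferer before handing them to an LDPC decoder. This is one
# simple way to condition soft decisions, not the paper's reshuffle scheme.
import numpy as np

rng = np.random.default_rng(0)
n = 2048                                        # codeword length (AR4JA(2048,1024) size)
llr = rng.normal(loc=4.0, scale=2.0, size=n)    # soft decisions from the demodulator (synthetic)

pulse_mask = np.zeros(n, dtype=bool)            # hypothetical pulsed-RFI timing
pulse_mask[500:620] = True                      # bits corrupted during the pulse

llr_conditioned = llr.copy()
llr_conditioned[pulse_mask] = 0.0               # treat corrupted bits as erasures

# llr_conditioned would then be passed to the LDPC decoder in place of llr.
print(np.count_nonzero(pulse_mask), "soft decisions neutralized")
```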
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, D.G.; Watkins, J.C.
This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.
Clinical application of ICF key codes to evaluate patients with dysphagia following stroke
Dong, Yi; Zhang, Chang-Jie; Shi, Jie; Deng, Jinggui; Lan, Chun-Na
2016-01-01
Abstract This study was aimed to identify and evaluate the International Classification of Functioning (ICF) key codes for dysphagia in stroke patients. Thirty patients with dysphagia after stroke were enrolled in our study. To evaluate the ICF dysphagia scale, 6 scales were used as comparisons, namely the Barthel Index (BI), Repetitive Saliva Swallowing Test (RSST), Kubota Water Swallowing Test (KWST), Frenchay Dysarthria Assessment, Mini-Mental State Examination (MMSE), and the Montreal Cognitive Assessment (MoCA). Multiple regression analysis was performed to quantitate the relationship between the ICF scale and the other 7 scales. In addition, 60 ICF scales were analyzed by the least absolute shrinkage and selection operator (LASSO) method. A total of 21 ICF codes were identified, which were closely related with the other scales. These included 13 codes from Body Function, 1 from Body Structure, 3 from Activities and Participation, and 4 from Environmental Factors. A topographic network map with 30 ICF key codes was also generated to visualize their relationships. The number of ICF codes identified is in line with other well-established evaluation methods. The network topographic map generated here could be used as an instruction tool in future evaluations. We also found that attention functions and biting were critical codes of these scales, and could be used as treatment targets. PMID:27661012
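As an illustration of the LASSO-based selection step, the hedged sketch below fits a LASSO model to synthetic qualifier scores and reports which "codes" it retains; the data, penalty value, and outcome variable are invented and do not reflect the study's dataset.

```python
# Hedged, synthetic-data sketch of using LASSO to select a subset of ICF item
# scores that predict a reference scale total. Not the study's actual model.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_patients, n_codes = 30, 60
X = rng.integers(0, 5, size=(n_patients, n_codes)).astype(float)  # ICF qualifier scores (synthetic)
true_weights = np.zeros(n_codes)
true_weights[[3, 17, 42]] = [2.0, -1.5, 1.0]                       # hypothetical "key codes"
y = X @ true_weights + rng.normal(scale=0.5, size=n_patients)      # synthetic reference scale

model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_ != 0.0)
print("codes retained by LASSO:", selected)
```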
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and - finally - how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
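One schematic way to turn per-observable comparisons into a single number is sketched below: each observable contributes a discrepancy normalized by the combined uncertainties, and the global metric is a precision-weighted average. The formulas and numbers are illustrative assumptions, not the metric actually defined in the paper.

```python
# Schematic composite-agreement sketch (illustrative formulas and numbers only).
import numpy as np

# (simulation value, simulation uncertainty, experiment value, experiment uncertainty)
observables = [
    (1.10, 0.05, 1.00, 0.08),   # e.g., fluctuation amplitude (hypothetical)
    (0.42, 0.04, 0.50, 0.05),   # e.g., radial correlation length (hypothetical)
    (3.0,  0.5,  2.6,  0.4),    # e.g., blob velocity (hypothetical)
]

def discrepancy(sim, dsim, exp, dexp):
    return abs(sim - exp) / np.sqrt(dsim**2 + dexp**2)

d = np.array([discrepancy(*obs) for obs in observables])
w = np.array([1.0 / np.sqrt(obs[1]**2 + obs[3]**2) for obs in observables])  # precision weights

global_metric = np.sum(w * d) / np.sum(w)
print("per-observable discrepancies:", np.round(d, 2))
print("global agreement metric:", round(global_metric, 2))
```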
NASA Astrophysics Data System (ADS)
White, Justin; Olson, Britton; Morgan, Brandon; McFarland, Jacob; Lawrence Livermore National Laboratory Team; University of Missouri-Columbia Team
2015-11-01
This work presents results from a large eddy simulation of a high Reynolds number Rayleigh-Taylor instability and Richtmyer-Meshkov instability. A tenth-order compact differencing scheme on a fixed Eulerian mesh is utilized within the Ares code developed at Lawrence Livermore National Laboratory (LLNL). We explore the self-similar limit of the mixing layer growth in order to evaluate the k-L-a Reynolds-Averaged Navier-Stokes (RANS) model (Morgan and Wickett, Phys. Rev. E, 2015). Furthermore, profiles of turbulent kinetic energy, turbulent length scale, mass flux velocity, and density-specific-volume correlation are extracted in order to aid the creation of a high-fidelity LES data set for RANS modeling. Prepared by LLNL under Contract DE-AC52-07NA27344.
Evaluation of Aeroelastically Tailored Small Wind Turbine Blades Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, Dayton A.
2005-09-29
Evaluation of Aeroelastically Tailored Small Wind Turbine Blades Final Report. Global Energy Concepts, LLC (GEC) has performed a conceptual design study concerning aeroelastic tailoring of small wind turbine blades. The primary objectives were to evaluate ways that blade/rotor geometry could be used to enable cost-of-energy reductions by enhancing energy capture while constraining or mitigating blade costs, system loads, and related component costs. This work builds on insights developed in ongoing adaptive-blade programs but with a focus on application to small turbine systems with isotropic blade material properties and with combined blade sweep and pre-bending/pre-curving to achieve the desired twist coupling. Specific goals of this project are to: (A) Evaluate and quantify the extent to which rotor geometry can be used to realize load-mitigating small wind turbine rotors. Primary aspects of the load mitigation are: (1) Improved overspeed safety effected by blades twisting toward stall in response to speed increases. (2) Reduced fatigue loading effected by blades twisting toward feather in response to turbulent gusts. (B) Illustrate trade-offs and design sensitivities for this concept. (C) Provide the technical basis for small wind turbine manufacturers to evaluate this concept and commercialize it if the technology appears favorable. The SolidWorks code was used to rapidly develop solid models of blades with varying shapes and material properties. Finite element analyses (FEA) were performed using the COSMOS code, with tip loads and centripetal accelerations modeled. This tool set was used to investigate the potential for aeroelastic tailoring with combined planform sweep and pre-curve. An extensive matrix of design variables was investigated, including aerodynamic design, magnitude and shape of planform sweep, magnitude and shape of blade pre-curve, material stiffness, and rotor diameter. The FEA simulations resulted in substantial insights into the structural response of these blades. The trends were used to identify geometries and rotor configurations that showed the greatest promise for achieving beneficial aeroelastic response. The ADAMS code was used to perform complete aeroelastic simulations of selected rotor configurations; however, the results of these simulations were not satisfactory. This report documents the challenges encountered with the ADAMS simulations and presents recommendations for further development of this concept for aeroelastically tailored small wind turbine blades.
Reduction of PAPR in coded OFDM using fast Reed-Solomon codes over prime Galois fields
NASA Astrophysics Data System (ADS)
Motazedi, Mohammad Reza; Dianat, Reza
2017-02-01
In this work, two new techniques using Reed-Solomon (RS) codes over GF(257) and GF(65,537) are proposed for peak-to-average power ratio (PAPR) reduction in coded orthogonal frequency division multiplexing (OFDM) systems. The lengths of these codes are well-matched to the length of OFDM frames. Over these fields, the block lengths of codes are powers of two and we fully exploit the radix-2 fast Fourier transform algorithms. Multiplications and additions are simple modulus operations. These codes provide desirable randomness with a small perturbation in information symbols that is essential for generation of different statistically independent candidates. Our simulations show that the PAPR reduction ability of RS codes is the same as that of conventional selected mapping (SLM), but contrary to SLM, we can get error correction capability. Also for the second proposed technique, the transmission of side information is not needed. To the best of our knowledge, this is the first work using RS codes for PAPR reduction in single-input single-output systems.
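For orientation, the sketch below computes the PAPR of an OFDM frame and applies conventional SLM with random phase candidates; in the proposed techniques the statistically independent candidates would instead come from RS encodings over GF(257) or GF(65,537), which is not reproduced here.

```python
# Hedged sketch of the PAPR problem and conventional selected mapping (SLM):
# several phase-rotated candidates of the same frame are generated and the one
# with the lowest PAPR is kept. Frame size and modulation are illustrative.
import numpy as np

rng = np.random.default_rng(2)
N = 256                                       # subcarriers per OFDM frame

def papr_db(freq_symbols):
    x = np.fft.ifft(freq_symbols)             # time-domain OFDM signal
    p = np.abs(x) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

data = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), size=N)   # QPSK symbols
print("original PAPR (dB):", round(papr_db(data), 2))

candidates = 16
best = min(
    (data * np.exp(1j * rng.uniform(0, 2 * np.pi, size=N)) for _ in range(candidates)),
    key=papr_db,
)
print("best SLM candidate PAPR (dB):", round(papr_db(best), 2))
```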
Fundamental differences between optimization code test problems in engineering applications
NASA Technical Reports Server (NTRS)
Eason, E. D.
1984-01-01
The purpose here is to suggest that there is at least one fundamental difference between the problems used for testing optimization codes and the problems that engineers often need to solve; in particular, the level of precision that can be practically achieved in the numerical evaluation of the objective function, derivatives, and constraints. This difference affects the performance of optimization codes, as illustrated by two examples. Two classes of optimization problem were defined. Class One functions and constraints can be evaluated to a high precision that depends primarily on the word length of the computer. Class Two functions and/or constraints can only be evaluated to a moderate or a low level of precision for economic or modeling reasons, regardless of the computer word length. Optimization codes have not been adequately tested on Class Two problems. There are very few Class Two test problems in the literature, while there are literally hundreds of Class One test problems. The relative performance of two codes may be markedly different for Class One and Class Two problems. Less sophisticated direct search type codes may be less likely to be confused or to waste many function evaluations on Class Two problems. The analysis accuracy and minimization performance are related in a complex way that probably varies from code to code. On a problem where the analysis precision was varied over a range, the simple Hooke and Jeeves code was more efficient at low precision while the Powell code was more efficient at high precision.
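A Class Two situation can be emulated by limiting the precision of an otherwise smooth objective, as in the hedged sketch below; the objective, rounding level, and optimizer choices are illustrative only. On such a rounded objective, the finite-difference gradients used by a quasi-Newton method typically become unreliable, while a direct-search method tends to degrade more gracefully, which is the behavior the paper attributes to Class Two problems.

```python
# Hedged sketch: emulate a "Class Two" problem by rounding an otherwise smooth
# objective, then compare a gradient-based optimizer with a direct-search method.
import numpy as np
from scipy.optimize import minimize

def smooth_objective(x):
    return (x[0] - 1.2) ** 2 + 10.0 * (x[1] + 0.7) ** 2

def low_precision_objective(x, digits=3):
    # Rounding emulates limited analysis precision (e.g., an iterative solver
    # run with a loose convergence tolerance).
    return round(smooth_objective(x), digits)

x0 = np.array([5.0, 5.0])
for method in ("BFGS", "Nelder-Mead"):
    res = minimize(low_precision_objective, x0, method=method)
    print(method, res.x, res.fun, res.nfev)
```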
NASA Astrophysics Data System (ADS)
Ortega, Jesus Daniel
This work focuses on the development of a solar thermal receiver for a supercritical carbon dioxide (sCO2) Brayton power cycle producing ~1 MWe. Closed-loop sCO2 Brayton cycles are being evaluated in combination with concentrating solar power to provide higher thermal-to-electric conversion efficiencies relative to conventional steam Rankine cycles. High temperatures (923-973 K) and pressures (20-25 MPa) are required in the solar receiver to achieve thermal efficiencies of ~50%, making concentrating solar power (CSP) technologies a competitive alternative to current power generation methods. In this study, the CSP receiver is required to achieve an outlet temperature of 923 K at 25 MPa or 973 K at 20 MPa to meet the operating needs. To select a compatible receiver tube material, an extensive material review was performed based on the ASME Boiler and Pressure Vessel Code and the ASME B31.1 and B31.3 piping codes. Subsequently, a thermal-structural model was developed using commercial computational fluid dynamics (CFD) and structural mechanics software for designing and analyzing the tubular receiver that could provide the heat input for a ~2 MWth plant. These results were used to perform an analytical cumulative-damage creep-fatigue analysis to estimate the working life of the tubes. In sequence, an optical-thermal-fluid model was developed to evaluate the resulting thermal efficiency of the tubular receiver with the NSTTF heliostat field. The ray-tracing tool SolTrace was used to obtain the heat-flux distribution on the surfaces of the receiver. The k-ω SST turbulence model and P-1 radiation model used in Fluent were coupled with SolTrace to provide the heat flux distribution on the receiver surface. The creep-fatigue analysis shows the damage accumulated due to the cycling and the permanent deformation of the tubes; nonetheless, the tubes are able to support the required lifetime. The receiver surface temperatures were found to be within the safe operational limit while exhibiting a receiver thermal efficiency of ~85%. Future work includes the completion of a cyclic loading analysis to be performed using the Larson-Miller creep model in nCode DesignLife to corroborate the structural integrity of the receiver over the desired lifetime of ~10,000 cycles.
Constitutive modeling for isotropic materials (HOST)
NASA Technical Reports Server (NTRS)
Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.; Cassenti, B. N.
1985-01-01
This report presents the results of the second year of work on a problem which is part of the NASA HOST Program. Its goals are: (1) to develop and validate unified constitutive models for isotropic materials, and (2) to demonstrate their usefulness for structural analyses of hot section components of gas turbine engines. The unified models selected for development and evaluation are those of Bodner-Partom and Walker. For model evaluation purposes, a large constitutive data base is generated for a B1900 + Hf alloy by performing uniaxial tensile, creep, cyclic, stress relaxation, and thermomechanical fatigue (TMF) tests as well as biaxial (tension/torsion) tests under proportional and nonproportional loading over a wide range of strain rates and temperatures. Systematic approaches for evaluating material constants from a small subset of the data base are developed. Correlations of the uniaxial and biaxial test data with the theories of Bodner-Partom and Walker are performed to establish the accuracy, range of applicability, and integrability of the models. Both models are implemented in the MARC finite element computer code and used for TMF analyses. Benchmark notched-round experiments are conducted and the results compared with finite-element analyses using the MARC code and the Walker model.
2014-01-01
Background Behavioral interventions such as psychotherapy are leading, evidence-based practices for a variety of problems (e.g., substance abuse), but the evaluation of provider fidelity to behavioral interventions is limited by the need for human judgment. The current study evaluated the accuracy of statistical text classification in replicating human-based judgments of provider fidelity in one specific psychotherapy—motivational interviewing (MI). Method Participants (n = 148) came from five previously conducted randomized trials and were either primary care patients at a safety-net hospital or university students. To be eligible for the original studies, participants met criteria for either problematic drug or alcohol use. All participants received a type of brief motivational interview, an evidence-based intervention for alcohol and substance use disorders. The Motivational Interviewing Skills Code is a standard measure of MI provider fidelity based on human ratings that was used to evaluate all therapy sessions. A text classification approach called a labeled topic model was used to learn associations between human-based fidelity ratings and MI session transcripts. It was then used to generate codes for new sessions. The primary comparison was the accuracy of model-based codes with human-based codes. Results Receiver operating characteristic (ROC) analyses of model-based codes showed reasonably strong sensitivity and specificity with those from human raters (range of area under ROC curve (AUC) scores: 0.62 – 0.81; average AUC: 0.72). Agreement with human raters was evaluated based on talk turns as well as code tallies for an entire session. Generated codes had higher reliability with human codes for session tallies and also varied strongly by individual code. Conclusion To scale up the evaluation of behavioral interventions, technological solutions will be required. The current study demonstrated preliminary, encouraging findings regarding the utility of statistical text classification in bridging this methodological gap. PMID:24758152
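The agreement summary can be reproduced in miniature as below, where synthetic model scores are compared against binary human codes with the area under the ROC curve; the labeled topic model itself and the real ratings are not reproduced.

```python
# Hedged sketch of the evaluation step: agreement between model-generated code
# probabilities and binary human-rater codes, summarized with ROC AUC.
# All scores below are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n_talk_turns = 500
human_code = rng.integers(0, 2, size=n_talk_turns)             # 1 = code assigned by human rater
model_score = np.clip(human_code * 0.4 + rng.normal(0.3, 0.25, n_talk_turns), 0, 1)

print("AUC of model-based codes vs. human codes:",
      round(roc_auc_score(human_code, model_score), 2))
```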
A Working Model for the System Alumina-Magnesia.
1983-05-01
Several regions in the resulting diagram appear rather uncertain, including the liquidus; thermochemical data are taken from the JANAF Thermochemical Tables (D. R. Stull et al., National Bureau of Standards).
New technologies for advanced three-dimensional optimum shape design in aeronautics
NASA Astrophysics Data System (ADS)
Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno
1999-05-01
The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce best analysis codes in a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are contributing to make such an ambitious project, of including a state-of-the-art flow analysis code into an optimisation loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
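The automated-differentiation idea can be illustrated with forward-mode dual numbers, which propagate exact derivatives alongside values; the cited tools operate on the flow solver's source code rather than on Python, so the sketch below is only a conceptual stand-in.

```python
# Minimal forward-mode automatic differentiation via dual numbers: derivatives
# are produced mechanically rather than by hand-deriving a sensitivity code.
# The "objective" is a hypothetical stand-in for a solver output.
import math

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)
    __rmul__ = __mul__

def dual_sin(x):
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def objective(x):
    # Stand-in for a flow-solver output depending on a shape parameter x.
    return 3.0 * x * x + dual_sin(x)

x = Dual(1.5, 1.0)                 # seed derivative dx/dx = 1
out = objective(x)
print("value:", out.value, "d(objective)/dx:", out.deriv)   # derivative = 6x + cos(x)
```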
Code Switching and Code Superimposition in Music. Working Papers in Sociolinguistics, No. 63.
ERIC Educational Resources Information Center
Slobin, Mark
This paper illustrates how the sociolinguistic concept of code switching applies to the use of different styles of music. The two bases for the analogy are Labov's definition of code-switching as "moving from one consistent set of co-occurring rules to another," and the finding of sociolinguistics that code switching tends to be part of…
Audit of Clinical Coding of Major Head and Neck Operations
Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean
2009-01-01
INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period were assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised codes generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to £15,300 loss of payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration. PMID:19220944
An improved algorithm for evaluating trellis phase codes
NASA Technical Reports Server (NTRS)
Mulligan, M. G.; Wilson, S. G.
1982-01-01
A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and lesser memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.
An improved algorithm for evaluating trellis phase codes
NASA Technical Reports Server (NTRS)
Mulligan, M. G.; Wilson, S. G.
1984-01-01
A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and lesser memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.
MX Siting Investigation Geotechnical Evaluation. Volume IA. Nevada-Utah Verification Studies, FY 79.
1979-08-24
The investigation covers the White River North, Garden-Coal, Reveille-Railroad, and Big Smoky CDPs; Section 11.0 briefly explains previous work performed in Dry Lake and Ralston ... Companion volumes include: ... Hamlin CDP; Volume V - White River North CDP; Volume VI - Garden-Coal CDP; Volume VII - Reveille-Railroad CDP; Volume VIII - Big Smoky CDP; Volume IX - Dry ...
NASA Technical Reports Server (NTRS)
Kelly, Michael P.
2011-01-01
Topics covered include: developing Program/Project Quality Assurance Surveillance Plans. The work activities performed by the developer and/or his suppliers are subject to evaluation and audit by government-designated representatives. CSO supports the project by selecting on-site supplier representatives by one of several methods: (1) a Defense Contract Management Agency (DCMA) person via a Letter of Delegation (LOD), or (2) an independent assurance contractor (IAC) via an Audits, Assessments, and Assurance (A3) contract, Code 300 Mission Assurance Support Contract (MASC).
BNL program in support of LWR degraded-core accident analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ginsberg, T.; Greene, G.A.
1982-01-01
Two major sources of loading on dry water reactor containments are steam generation from core debris-water thermal interactions and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described in this paper. 8 figures.
Separation of Prior-Service Reentrants in the U.S. (United States) Navy: A Preliminary Analysis.
1983-04-01
[Design of flat field holographic concave grating for near-infrared spectrophotometer].
Xiang, Xian-Yi; Wen, Zhi-Yu
2008-07-01
Near-infrared spectrum analysis can be used to determine the nature of, or quantitatively test, some chemical compositions by detecting molecular double-frequency and multiple-frequency absorption. It has been used in agriculture, biology, petrochemicals, foodstuffs, medicine, textiles, and other fields. The near-infrared spectrophotometer is the main apparatus for near-infrared spectrum analysis, and the grating is its most important component. Based on holographic concave grating theory and the optical design software CODE V, a flat-field holographic concave grating for a near-infrared spectrophotometer was designed from a primary structure, relying on the global optimization of the software. The contradiction between a wide spectral range and limited spectrum extension was resolved, aberrations were reduced successfully, spectral information was utilized fully, and the optical structure of the spectrometer is highly efficient. Using the CODE V software, complex high-order aberration equations need not be solved, the result can be evaluated quickly, flat field and resolving power can be kept in balance, and work efficiency is also enhanced. A design example of a flat-field holographic concave grating is given; it works between 900 and 1700 nm, the diameter of the concave grating is 25 mm, and the F/# is 1.5. The design result was analyzed and evaluated. It was shown that if a slit source of 50 μm width is used for reconstruction, the theoretical resolution is better than 6.3 nm.
Radioactivity evaluation for the KSTAR tokamak.
Kim, Hyunduk; Lee, Hee-Seock; Hong, Sukmo; Kim, Minho; Chung, Chinwha; Kim, Changsuk
2005-01-01
The deuterium-deuterium (D-D) reaction in the KSTAR (Korea Superconducting Tokamak Advanced Research) tokamak generates neutrons with a peak yield of 2.5 x 10(16) s(-1) through a pulse operation of 300 s. Since the structure material of the tokamak is irradiated with neutrons, this environment will restrict work around and inside the tokamak from a radiation protection physics point of view after shutdown. Identification of neutron-produced radionuclides and evaluation of absorbed dose in the structure material are needed to develop a guiding principle for radiation protection. The activation level was evaluated by MCNP4C2 and an inventory code, FISPACT. The absorbed dose in the working area decreased by 4.26 x 10(-4) mrem h(-1) in the inner vessel 1.5 d after shutdown. Furthermore, tritium strongly contributes to the contamination in the graphite tile. The amount of tritium produced by neutrons was 3.03 x 10(6) Bq kg(-1) in the carbon graphite of a plasma-facing wall.
Evaluation of the DRAGON code for VHTR design analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division
2006-01-12
This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) DRAGON workshop held to discuss the code capabilities for modeling the VHTR.
Hydrodynamic Analyses and Evaluation of New Fluid Film Bearing Concepts
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Dimofte, Florin
1998-01-01
Over the past several years, numerical and experimental investigations have been performed on a waved journal bearing. The research work was undertaken by Dr. Florin Dimofte, a Senior Research Associate in the Mechanical Engineering Department at the University of Toledo. Dr. Theo Keith, Distinguished University Professor in the Mechanical Engineering Department, was the Technical Coordinator of the project. The wave journal bearing is a bearing with a slight but precise variation in its circular profile such that a waved profile is circumscribed on the inner bearing diameter. The profile has a wave amplitude that is equal to a fraction of the bearing clearance. Prior to this period of research on the wave bearing, computer codes were written and an experimental facility was established. During this period of research, considerable effort was directed towards the study of the bearing's stability. The previously developed computer codes and the experimental facility were of critical importance in performing this stability research. A collection of papers and reports was written to describe the results of this work. The attached collection captures that effort and represents the research output during the grant period.
The value of psychosocial group activity in nursing education: A qualitative analysis.
Choi, Yun-Jung
2018-05-01
Nursing faculty often struggle to find effective teaching strategies for nursing students that integrate group work into nursing students' learning activities. This study was conducted to evaluate students' experiences in a psychiatric and mental health nursing course using psychosocial group activities to develop therapeutic communication and interpersonal relationship skills, as well as to introduce psychosocial nursing interventions. A qualitative research design was used. The study explored nursing students' experiences of the course in accordance with the inductive, interpretative, and constructive approaches via focus group interviews. Participants were 17 undergraduate nursing students who registered for a psychiatric and mental health nursing course. The collected data were analyzed by qualitative content analysis. The analysis resulted in 28 codes, 14 interpretive codes, 4 themes (developing interpersonal relationships, learning problem-solving skills, practicing cooperation and altruism, and getting insight and healing), and a core theme (interdependent growth in self-confidence). The psychosocial group activity provided constructive opportunities for the students to work independently and interdependently as healthcare team members through reflective learning experiences. Copyright © 2018 Elsevier Ltd. All rights reserved.
System Design Description for the TMAD Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finfrock, S.H.
This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.
Great Expectations: Is there Evidence for Predictive Coding in Auditory Cortex?
Heilbron, Micha; Chait, Maria
2017-08-04
Predictive coding is possibly one of the most influential, comprehensive, and controversial theories of neural function. While proponents praise its explanatory potential, critics object that key tenets of the theory are untested or even untestable. The present article critically examines existing evidence for predictive coding in the auditory modality. Specifically, we identify five key assumptions of the theory and evaluate each in the light of animal, human and modeling studies of auditory pattern processing. For the first two assumptions - that neural responses are shaped by expectations and that these expectations are hierarchically organized - animal and human studies provide compelling evidence. The anticipatory, predictive nature of these expectations also enjoys empirical support, especially from studies on unexpected stimulus omission. However, for the existence of separate error and prediction neurons, a key assumption of the theory, evidence is lacking. More work exists on the proposed oscillatory signatures of predictive coding, and on the relation between attention and precision. However, results on these latter two assumptions are mixed or contradictory. Looking to the future, more collaboration between human and animal studies, aided by model-based analyses will be needed to test specific assumptions and implementations of predictive coding - and, as such, help determine whether this popular grand theory can fulfill its expectations. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
24 CFR 983.155 - Completion of housing.
Code of Federal Regulations, 2012 CFR
2012-04-01
... with local requirements (such as code and zoning requirements); and (ii) An architect's certification that the housing complies with: (A) HUD housing quality standards; (B) State, local, or other building codes; (C) Zoning; (D) The rehabilitation work write-up (for rehabilitated housing) or the work...
24 CFR 983.155 - Completion of housing.
Code of Federal Regulations, 2014 CFR
2014-04-01
... with local requirements (such as code and zoning requirements); and (ii) An architect's certification that the housing complies with: (A) HUD housing quality standards; (B) State, local, or other building codes; (C) Zoning; (D) The rehabilitation work write-up (for rehabilitated housing) or the work...
24 CFR 983.155 - Completion of housing.
Code of Federal Regulations, 2011 CFR
2011-04-01
... with local requirements (such as code and zoning requirements); and (ii) An architect's certification that the housing complies with: (A) HUD housing quality standards; (B) State, local, or other building codes; (C) Zoning; (D) The rehabilitation work write-up (for rehabilitated housing) or the work...
24 CFR 983.155 - Completion of housing.
Code of Federal Regulations, 2013 CFR
2013-04-01
... with local requirements (such as code and zoning requirements); and (ii) An architect's certification that the housing complies with: (A) HUD housing quality standards; (B) State, local, or other building codes; (C) Zoning; (D) The rehabilitation work write-up (for rehabilitated housing) or the work...
24 CFR 983.155 - Completion of housing.
Code of Federal Regulations, 2010 CFR
2010-04-01
... with local requirements (such as code and zoning requirements); and (ii) An architect's certification that the housing complies with: (A) HUD housing quality standards; (B) State, local, or other building codes; (C) Zoning; (D) The rehabilitation work write-up (for rehabilitated housing) or the work...
Trellis phase codes for power-bandwidth efficient satellite communications
NASA Technical Reports Server (NTRS)
Wilson, S. G.; Highfill, J. H.; Hsu, C. D.; Harkness, R.
1981-01-01
Support work on improved power and spectrum utilization on digital satellite channels was performed. Specific attention is given to the class of signalling schemes known as continuous phase modulation (CPM). The specific work described in this report addresses: analytical bounds on error probability for multi-h phase codes, power and bandwidth characterization of 4-ary multi-h codes, and initial results of channel simulation to assess the impact of band limiting filters and nonlinear amplifiers on CPM performance.
Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.
Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile
2016-01-01
This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until a match indicates the code in question as a match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute on the user's browser, and two popular open-source relational database management systems.
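A toy analogue of the evaluation loop is sketched below: for a target code, word combinations of increasing size are tried until a simple overlap score ranks the target first. The four codes, their descriptions, and the scoring function are invented stand-ins, not the real ICD-10 list or any engine's ranking function.

```python
# Toy analogue of the evaluation procedure: find the smallest word combination
# from a code's description for which the target code ranks first.
from itertools import combinations

codes = {                                  # invented mini "code list"
    "J45":   "asthma",
    "J45.0": "predominantly allergic asthma",
    "J45.1": "nonallergic asthma",
    "J45.9": "asthma unspecified",
}

def score(query_words, text):
    words = text.split()
    return sum(w in words for w in query_words)

def min_words_to_match(target):
    desc_words = codes[target].split()
    for k in range(1, len(desc_words) + 1):
        for combo in combinations(desc_words, k):
            ranked = max(codes, key=lambda c: score(combo, codes[c]))
            if ranked == target and score(combo, codes[target]) > 0:
                return k, combo
    return None

print(min_words_to_match("J45.1"))         # smallest query that uniquely matches the code
```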
Subjective evaluation of next-generation video compression algorithms: a case study
NASA Astrophysics Data System (ADS)
De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio
2010-08-01
This paper describes the details and the results of the subjective quality evaluation performed at EPFL, as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) for the definition of the next-generation video coding standard. The performance of 27 coding technologies have been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.
Kayenta Township Building & Safety Department, Tribal Green Building Code Summit Presentation
Tribal Green Building Code Summit Presentation by Kayenta Township Building & Safety Department showing how they established the building department, developed a code adoption and enforcement process, and hired staff to carry out the work.
Exclusively visual analysis of classroom group interactions
NASA Astrophysics Data System (ADS)
Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric
2016-12-01
Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.
Warthog: Coupling Status Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Shane W. D.; Reardon, Bradley T.
The Warthog code was developed to couple codes that are developed in both the Multi-Physics Object-Oriented Simulation Environment (MOOSE) from Idaho National Laboratory (INL) and SHARP from Argonne National Laboratory (ANL). The initial phase of this work, focused on coupling the neutronics code PROTEUS with the fuel performance code BISON. The main technical challenge involves mapping the power density solution determined by PROTEUS to the fuel in BISON. This presents a challenge since PROTEUS uses the MOAB mesh format, but BISON, like all other MOOSE codes, uses the libMesh format. When coupling the different codes, one must consider that Warthog is a light-weight MOOSE-based program that uses the Data Transfer Kit (DTK) to transfer data between the various mesh types. Users set up inputs for the codes they want to run, and then Warthog transfers the data between them. Currently Warthog supports XSProc from SCALE or the Sub-Group Application Programming Interface (SGAPI) in PROTEUS for generating cross sections. It supports arbitrary geometries using PROTEUS and BISON. DTK will transfer power densities and temperatures between the codes where the domains overlap. In the past fiscal year (FY), much work has gone into demonstrating two-way coupling for simple pin cells of various materials. XSProc was used to calculate the cross sections, which were then passed to PROTEUS in an external file. PROTEUS calculates the fission/power density, and Warthog uses DTK to pass this information to BISON, where it is used as the heat source. BISON then calculates the temperature profile of the pin cell and sends it back to XSProc to obtain the temperature corrected cross sections. This process is repeated until the convergence criteria (tolerance on BISON solve, or number of time steps) is reached. Models have been constructed and run for both uranium oxide and uranium silicide fuels. These models demonstrate a clear difference in power shape that is not accounted for in a stand-alone BISON run. Future work involves improving the user interface (UI), likely through integration with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Workbench. Furthermore, automating the input creation would ease the user experience. The next priority is to continue coupling the work with other codes in the SHARP package. Efforts on other projects include work to couple the Nek5000 thermo-hydraulics code to MOOSE, but this is in the preliminary stages.
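The coupling loop itself can be summarized as a fixed-point (Picard-style) iteration between a neutronics solve and a fuel-performance solve, as in the hedged sketch below; the feedback coefficients and convergence tolerance are invented, and the real XSProc/PROTEUS, BISON, and DTK transfer steps are represented only by stand-in functions.

```python
# Schematic sketch of the two-way coupling loop: power density from neutronics
# at the current fuel temperature, updated temperature from fuel performance,
# repeated until converged. The stand-in physics below is invented.
def neutronics_power(fuel_temperature_K):
    # Hypothetical Doppler-like feedback: power drops slightly as fuel heats up.
    return 2.5e8 * (1.0 - 1.0e-4 * (fuel_temperature_K - 600.0))   # W/m^3

def fuel_temperature(power_density_W_m3):
    # Hypothetical linear thermal response of the fuel pin.
    return 600.0 + 1.6e-6 * power_density_W_m3                     # K

T = 600.0                        # initial fuel temperature guess (K)
for iteration in range(50):
    q = neutronics_power(T)      # "PROTEUS" step (power density)
    T_new = fuel_temperature(q)  # "BISON" step (temperature)
    if abs(T_new - T) < 1.0e-3:  # convergence criterion on temperature change
        break
    T = T_new

print(f"converged after {iteration + 1} iterations: q = {q:.3e} W/m^3, T = {T_new:.1f} K")
```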
Ethics and the Early Childhood Educator: Using the NAEYC Code. 2005 Code Edition
ERIC Educational Resources Information Center
Freeman, Nancy; Feeney, Stephanie
2005-01-01
With updated language and references to the 2005 revision of the Code of Ethical Conduct, this book, like the NAEYC Code of Ethical Conduct, seeks to inform, not prescribe, answers to tough questions that teachers face as they work with children, families, and colleagues. To help everyone become well acquainted with the Code and use it in one's…
Welfare Entitlements: Addressing New Realities.
ERIC Educational Resources Information Center
Belcher, Jon R.; Fandetti, Donald V.
1995-01-01
Because welfare entitlements are increasingly unpopular, social work advocates need to place greater emphasis on job growth and alternate mechanisms for wealth redistribution, including refundable tax credits for working poor people. The Internal Revenue Code can be an effective weapon in combating poverty if antipoverty approaches in the code are…
SOC-DS computer code provides tool for design evaluation of homogeneous two-material nuclear shield
NASA Technical Reports Server (NTRS)
Disney, R. K.; Ricks, L. O.
1967-01-01
SOC-DS Code (Shield Optimization Code - Direct Search) selects a nuclear shield material of optimum volume, weight, or cost to meet the requirements of a given radiation dose rate or energy transmission constraint. It is applicable to evaluating neutron and gamma ray shields for all nuclear reactors.
ERIC Educational Resources Information Center
DeLyser, Dydia; Potter, Amy E.
2013-01-01
This article describes experiential-learning approaches to conveying the work and rewards involved in qualitative research. Seminar students interviewed one another, transcribed or took notes on those interviews, shared those materials to create a set of empirical materials for coding, developed coding schemes, and coded the materials using those…
Does the Holland Code Predict Job Satisfaction and Productivity in Clothing Factory Workers?
ERIC Educational Resources Information Center
Heesacker, Martin; And Others
1988-01-01
Administered Self-Directed Search to sewing machine operators to determine Holland code, and assessed work productivity, job satisfaction, absenteeism, and insurance claims. Most workers were of the Social code. Social subjects were the most satisfied, Conventional and Realistic subjects next, and subjects of other codes less so. Productivity of…
Detailed model for practical pulverized coal furnaces and gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, P.J.; Smoot, L.D.
1989-08-01
This study has been supported by a consortium of nine industrial and governmental sponsors. Work was initiated on May 1, 1985 and completed August 31, 1989. The central objective of this work was to develop, evaluate and apply a practical combustion model for utility boilers, industrial furnaces and gasifiers. Key accomplishments have included: development of an advanced first-generation computer model for combustion in three-dimensional furnaces; development of a new first-generation fouling and slagging submodel; detailed evaluation of an existing NOx submodel; development and evaluation of an improved radiation submodel; preparation and distribution of a three-volume final report: (a) Volume 1: General Technical Report; (b) Volume 2: PCGC-3 User's Manual; (c) Volume 3: Data Book for Evaluation of Three-Dimensional Combustion Models; and organization of a user's workshop on the three-dimensional code. The furnace computer model developed under this study requires further development before it can be applied generally to all applications; however, it can be used now by specialists for many specific applications, including non-combusting systems and combusting gaseous systems. A new combustion center was organized and work was initiated to continue the important research effort initiated by this study. 212 refs., 72 figs., 38 tabs.
Accounting for the professional work of pathologists performing autopsies.
Sinard, John H
2013-02-01
With an increasing trend toward fee-code-based methods of measuring the clinical professional productivity of pathologists, those pathologists whose clinical activities include the performance of autopsies have been disadvantaged by the lack of generally accepted workload equivalents for autopsy performance and supervision. To develop recommended benchmarks to account for this important and often overlooked professional activity. Based on the professional experience of members of the Autopsy Committee of the College of American Pathologists, a survey of autopsy pathologists, and the limited additional material available in the literature, we developed recommended workload equivalents for the professional work associated with performing an autopsy, which we elected to express as multiples of established Current Procedural Terminology codes. As represented in Table 3, we recommend that the professional work associated with a full adult autopsy be equivalent to 5.5 × 88309-26. Additional professional credit of 1.5 × 88309-26 should be added for evaluation of the brain and for a detailed clinical-pathologic discussion. The corresponding value for a fetal/neonatal autopsy is 4.0 × 88309-26. Although we recognize that autopsy practices vary significantly from institution to institution, it is hoped that our proposed guidelines will be a valuable starting point that individual practices can then adapt, taking into account the specifics of their practice environment.
Evaluation of large girth LDPC codes for PMD compensation by turbo equalization.
Minkov, Lyubomir L; Djordjevic, Ivan B; Xu, Lei; Wang, Ting; Kueppers, Franko
2008-08-18
Large-girth quasi-cyclic LDPC codes have been experimentally evaluated for use in PMD compensation by turbo equalization for a 10 Gb/s NRZ optical transmission system, and observing one sample per bit. Net effective coding gain improvement for girth-10, rate 0.906 code of length 11936 over maximum a posteriori probability (MAP) detector for differential group delay of 125 ps is 6.25 dB at BER of 10(-6). Girth-10 LDPC code of rate 0.8 outperforms the girth-10 code of rate 0.906 by 2.75 dB, and provides the net effective coding gain improvement of 9 dB at the same BER. It is experimentally determined that girth-10 LDPC codes of length around 15000 approach channel capacity limit within 1.25 dB.
Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso
2018-04-22
This study provides current evidence about cross-section production processes from theoretical and experimental results for neutron-induced reactions on the uranium isotope 235U over the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In such fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to help meet worldwide energy needs without the polluting processes associated with other sources. The main objective of this work is to convey the relevant knowledge of neutron-induced fission reactions on 235U by describing, analyzing, and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with experimental data obtained from EXFOR. The cross-section values of 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U, and 235U(n,f) were obtained using the computer code COMPLET, and the corresponding experimental values were retrieved from the EXFOR database (IAEA). The theoretical results are compared with the experimental data taken from the EXFOR Data Bank. The code COMPLET was run with the same set of input parameters for all reactions, and the graphs were plotted with the help of spreadsheet and Origin-8 software. The quantification of uncertainties stemming from both the experimental data and the computer-code calculation plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data taken from EXFOR, and good agreement was found between the experimental and theoretical data. This comparison was analyzed and interpreted with tabular and graphical descriptions, and the results are briefly discussed within the text of this research work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Chunwei; Zhao, Hong; Zhu, Qian; Zhou, Changquan; Qiao, Jiacheng; Zhang, Lu
2018-06-01
Phase-shifting fringe projection profilometry (PSFPP) is a three-dimensional (3D) measurement technique widely adopted in industry measurement. It recovers the 3D profile of measured objects with the aid of the fringe phase. The phase accuracy is among the dominant factors that determine the 3D measurement accuracy. Evaluation of the phase accuracy helps refine adjustable measurement parameters, contributes to evaluating the 3D measurement accuracy, and facilitates improvement of the measurement accuracy. Although PSFPP has been deeply researched, an effective, easy-to-use phase accuracy evaluation method remains to be explored. In this paper, methods based on the uniform-phase coded image (UCI) are presented to accomplish phase accuracy evaluation for PSFPP. These methods work on the principle that the phase value of a UCI can be manually set to be any value, and once the phase value of a UCI pixel is the same as that of a pixel of a corresponding sinusoidal fringe pattern, their phase accuracy values are approximate. The proposed methods provide feasible approaches to evaluating the phase accuracy for PSFPP. Furthermore, they can be used to experimentally research the property of the random and gamma phase errors in PSFPP without the aid of a mathematical model to express random phase error or a large-step phase-shifting algorithm. In this paper, some novel and interesting phenomena are experimentally uncovered with the aid of the proposed methods.
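For context, the sketch below retrieves the wrapped phase with the standard N-step estimator and compares the result for a synthetic uniform-phase coded image against the phase value written into it; the fringe parameters and noise level are invented, and the paper's specific evaluation procedure is not reproduced.

```python
# Hedged sketch of N-step phase-shifting retrieval applied to a synthetic
# uniform-phase coded image (UCI): every pixel carries the same, manually set
# phase, so the retrieved phase can be compared against that known constant.
import numpy as np

N = 4                                        # number of phase shifts
deltas = 2.0 * np.pi * np.arange(N) / N      # phase-shift amounts

def retrieve_phase(intensities):
    """Standard N-step estimator: phi = atan2(-sum I_n sin(d_n), sum I_n cos(d_n))."""
    s = sum(I * np.sin(d) for I, d in zip(intensities, deltas))
    c = sum(I * np.cos(d) for I, d in zip(intensities, deltas))
    return np.arctan2(-s, c)

phi_set = 1.2                                # phase value written into the UCI (rad)
shape = (8, 8)
rng = np.random.default_rng(4)
uci_frames = [0.5 + 0.4 * np.cos(phi_set + d) + rng.normal(0, 0.005, shape) for d in deltas]

phi_retrieved = retrieve_phase(uci_frames)
print("mean retrieved phase:", phi_retrieved.mean())          # close to 1.2
print("phase error estimate:", np.abs(phi_retrieved - phi_set).mean())
```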
Can a senior house officer's time be used more effectively?
Mitchell, J; Hayhurst, C; Robinson, S M
2004-09-01
To determine the amount of time senior house officers (SHO) spent performing tasks that could be delegated to a technician or administrative assistant and therefore to quantify the expected benefit that could be obtained by employing such physicians' assistants (PA). SHOs working in the emergency department were observed for one week by pre-clinical students who had been trained to code and time each task performed by SHOs. Activity was grouped into four categories (clinical, technical, administrative, and other). Those activities in the technical and administrative categories were those we believed could be performed by a PA. The SHOs worked 430 hours in total, of which only 25 hours were not coded due to lack of an observer. Of the 405 hours observed 86.2% of time was accounted for by the various codes. The process of taking a history and examining patients accounted for an average of 22% of coded time. Writing the patient's notes accounted for an average of 20% of coded time. Discussion with relatives and patients accounted for 4.7% of coded time and performing procedures accounted for 5.2% of coded time. On average across all shifts, 15% of coded time was spent doing either technical or administrative tasks. In this department an average of 15% of coded SHOs working time was spent performing administrative and technical tasks, rising to 17% of coded time during a night shift. This is equivalent to an average time of 78 minutes per 10 hour shift/SHO. Most tasks included in these categories could be performed by PAs thus potentially decreasing patient waiting times, improving risk management, allowing doctors to spend more time with their patients, and possibly improving doctors' training.
Monte Carlo simulations for angular and spatial distributions in therapeutic-energy proton beams
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Pan, C. Y.; Chiang, K. J.; Yuan, M. C.; Chu, C. H.; Tsai, Y. W.; Teng, P. K.; Lin, C. H.; Chao, T. C.; Lee, C. C.; Tung, C. J.; Chen, A. E.
2017-11-01
The purpose of this study is to compare the angular and spatial distributions of therapeutic-energy proton beams obtained from the FLUKA, GEANT4 and MCNP6 Monte Carlo codes. The Monte Carlo simulations of proton beams passing through two thin targets and a water phantom were investigated to compare the primary and secondary proton fluence distributions and dosimetric differences among these codes. The angular fluence distributions, central axis depth-dose profiles, and lateral distributions of the Bragg peak cross-field were calculated to compare the proton angular and spatial distributions and energy deposition. Benchmark verifications from three different Monte Carlo simulations could be used to evaluate the residual proton fluence for the mean range and to estimate the depth and lateral dose distributions and the characteristic depths and lengths along the central axis as the physical indices corresponding to the evaluation of treatment effectiveness. The results showed a general agreement among codes, except that some deviations were found in the penumbra region. These calculated results are also particularly helpful for understanding primary and secondary proton components for stray radiation calculation and reference proton standard determination, as well as for determining lateral dose distribution performance in proton small-field dosimetry. By demonstrating these calculations, this work could serve as a guide to the recent field of Monte Carlo methods for therapeutic-energy protons.
NASA Astrophysics Data System (ADS)
Dimech, C.
2013-12-01
In this contribution, I present a critical evaluation of my experience as a research student conducting an interdisciplinary project that bridges the world of geoscience with that of astronomy. The major challenge consists of studying and modifying existing geophysical software to work both with synthetic solar data, not obtained by direct measurement but useful for testing and evaluation, and with data released by the HINODE satellite and the Solar Dynamics Observatory. I have been fortunate to collaborate closely with multiple geoscientists keen to share their software codes and to help me understand their implementations so that I can extend the methodology to solve problems in solar physics. Moreover, two additional experiences have helped me develop my research and collaborative skills: first, an opportunity to involve an undergraduate student, and second, my participation at the GNU Hackers Meeting in Paris. Three aspects that need particular attention to enhance the collective productivity of any group of individuals keen to extend existing codes toward further interdisciplinary goals have been identified: (1) the production of easily reusable code that users can study and modify even when large sets of computations are involved; (2) the transformation of solutions into tools that are 100% free software; and (3) the harmonisation of collaborative interactions that effectively tackle the two aforementioned tasks. Each one will be discussed in detail during this session based on my experience as a research student.
Initial verification and validation of RAZORBACK - A research reactor transient analysis code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2015-09-01
This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
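For orientation, the point reactor kinetics equations that a code like Razorback couples to the thermal-hydraulic equations can be integrated in a few lines. The sketch below uses a single delayed-neutron group with assumed parameter values and an assumed step reactivity insertion; it is not derived from the Razorback code or ACRR data.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, Lambda, lam = 0.0073, 4.0e-5, 0.08   # delayed fraction, generation time [s], decay const [1/s]
rho = 0.5 * beta                           # step reactivity insertion (assumed)

def kinetics(t, y):
    n, C = y                               # relative power, precursor concentration
    dn = (rho - beta) / Lambda * n + lam * C
    dC = beta / Lambda * n - lam * C
    return [dn, dC]

y0 = [1.0, beta / (Lambda * lam)]          # equilibrium initial condition at n = 1
sol = solve_ivp(kinetics, (0.0, 5.0), y0, method="Radau", max_step=1e-3)
print(f"relative power after 5 s: {sol.y[0, -1]:.2f}")
```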
Valdez-Martinez, E; Garduño-Espinosa, J; Martinez-Salgado, H; Porter, J D H
2004-07-01
To identify the structure, composition and work of the local research ethics committees (LRECs) of the Mexican Institute of Social Security (IMSS) in Mexico. A descriptive cross-sectional study was performed that included all LRECs of the IMSS. A total of 335 precoded questionnaires were posted, one to each LREC secretary. The requested information covered January to December 2001. The response rate was 100%. Two hundred and thirty-eight (71%) LRECs were reported as 'active' during the evaluation period. Although almost all LRECs were composed of diverse professionals, physicians dominated the LRECs' membership. The rejection rate for research projects was lower than 1 per 1000, and fewer than half of the LRECs held meetings to issue reports of project evaluations. LRECs need to foster good ethical research; implementation of an audit system to examine their work might help improve LRECs' performance and accountability.
Fuel burnup analysis for IRIS reactor using MCNPX and WIMS-D5 codes
NASA Astrophysics Data System (ADS)
Amin, E. A.; Bashter, I. I.; Hassan, Nabil M.; Mustafa, S. S.
2017-02-01
The International Reactor Innovative and Secure (IRIS) reactor is a compact power reactor designed with special features. It contains Integral Fuel Burnable Absorber (IFBA). The core is heterogeneous both axially and radially. This work provides a full-core burnup analysis for the IRIS reactor using the MCNPX and WIMS-D5 codes. Criticality calculations, radial and axial power distributions, and the nuclear peaking factor at different stages of burnup were studied. Effective multiplication factor values for the core were estimated by coupling the MCNPX code with the WIMS-D5 code and compared with SAS2H/KENO-V values at different stages of burnup. The two calculation routes show good agreement and correlation. The radial and axial power values for the full core were also compared with published results from the SAS2H/KENO-V code (at the beginning and end of reactor operation). The behavior of both the radial and axial power distributions is quite similar to the published SAS2H/KENO-V data. The peaking factor values estimated in the present work are close to those calculated with the SAS2H/KENO-V code.
FY17 Status Report on the Initial Development of a Constitutive Model for Grade 91 Steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messner, M. C.; Phan, V. -T.; Sham, T. -L.
Grade 91 is a candidate structural material for high temperature advanced reactor applications. Existing ASME Section III, Subsection HB, Subpart B simplified design rules based on elastic analysis are set up as conservative screening tools, with the intent to supplement these screening rules with full inelastic analysis when required. The Code provides general guidelines for suitable inelastic models but does not provide constitutive model implementations. This report describes the development of an inelastic constitutive model for Gr. 91 steel aimed at fulfilling the ASME Code requirements and being included in a new Section III Code appendix, HBB-Z. A large database of over 300 experiments on Gr. 91 was collected and converted to a standard XML form. Five families of Gr. 91 material models were identified in the literature. Of these five, two are potentially suitable for use in the ASME Code. These two models were implemented and evaluated against the experimental database. Both models have deficiencies, so the report develops a framework for developing and calibrating an improved model. This required creating a new modeling method for representing changes in material rate sensitivity across the full ASME allowable temperature range for Gr. 91 structural components: room temperature to 650 °C. On top of this framework for rate sensitivity, the report describes calibrating a model for work hardening and softening in the material using genetic algorithm optimization. Future work will focus on improving this trial model by including the tension/compression asymmetry observed in experiments, which is necessary to capture material ratcheting under zero mean stress, and by improving the optimization and analysis framework.
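The genetic-algorithm calibration mentioned above can be illustrated with a toy example. The sketch below fits an assumed Voce hardening law to synthetic stress-strain data using SciPy's differential evolution (a GA-style optimizer); the functional form, the data, and the bounds are placeholders, not the Gr. 91 model developed in the report.

```python
import numpy as np
from scipy.optimize import differential_evolution

def voce(eps_p, s0, sat, b):
    """Voce hardening: flow stress as a function of plastic strain."""
    return s0 + (sat - s0) * (1.0 - np.exp(-b * eps_p))

eps = np.linspace(0.0, 0.05, 50)
sigma_exp = voce(eps, 300.0, 480.0, 60.0) + np.random.normal(0, 3.0, eps.size)  # synthetic "data" [MPa]

def misfit(p):
    # Sum of squared residuals between model and (synthetic) experimental curve.
    return np.sum((voce(eps, *p) - sigma_exp) ** 2)

bounds = [(100.0, 500.0), (300.0, 800.0), (1.0, 200.0)]   # assumed parameter bounds
result = differential_evolution(misfit, bounds, seed=1)
print("calibrated (s0, sat, b):", np.round(result.x, 1))
```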
Development of code evaluation criteria for assessing predictive capability and performance
NASA Technical Reports Server (NTRS)
Lin, Shyi-Jang; Barson, S. L.; Sindir, M. M.; Prueger, G. H.
1993-01-01
Computational Fluid Dynamics (CFD), because of its unique ability to predict complex three-dimensional flows, is being applied with increasing frequency in the aerospace industry. Currently, no consistent code validation procedure is applied within the industry. Such a procedure is needed to increase confidence in CFD and reduce risk in the use of these codes as a design and analysis tool. This final contract report defines classifications for three levels of code validation, directly relating the use of CFD codes to the engineering design cycle. Evaluation criteria by which codes are measured and classified are recommended and discussed. Criteria for selecting experimental data against which CFD results can be compared are outlined. A four phase CFD code validation procedure is described in detail. Finally, the code validation procedure is demonstrated through application of the REACT CFD code to a series of cases culminating in a code to data comparison on the Space Shuttle Main Engine High Pressure Fuel Turbopump Impeller.
NASA Astrophysics Data System (ADS)
lwin, Maung Tin Moe; Kassim, Hassan Abu; Amin, Yusoff Mohd.
2008-05-01
The working procedures in RESRAD for specific evaluations of environmental pollutants are briefly described. The human health risk associated with Naturally Occurring Radioactive Materials (NORM) for workers in the Malaysian oil and gas industry is analyzed. The sources of NORM and Technologically Enhanced NORM (TENORM) in the oil and gas industry are described. Measurements of the external and internal effective dose equivalent to the workers are described. These data are entered into the RESRAD software program and the output reports are generated. Long-term effects of TENORM on the industrial workers are also discussed with graphical illustrations. The results are compared with previous research work in the same field for validation and verification.
Performance evaluation of MPEG internet video coding
NASA Astrophysics Data System (ADS)
Luo, Jiajia; Wang, Ronggang; Fan, Kui; Wang, Zhenyu; Li, Ge; Wang, Wenmin
2016-09-01
Internet Video Coding (IVC) has been developed in MPEG by combining well-known existing technology elements and new coding tools with royalty-free declarations. In June 2015, the IVC project was approved as ISO/IEC 14496-33 (MPEG-4 Internet Video Coding). It is believed that this standard can be highly beneficial for video services in the Internet domain. This paper evaluates the objective and subjective performances of IVC by comparing it against Web Video Coding (WVC), Video Coding for Browsers (VCB) and AVC High Profile. Experimental results show that IVC's compression performance is approximately equal to that of the AVC High Profile for typical operational settings, both for streaming and low-delay applications, and is better than WVC and VCB.
Ethical Challenges in the Teaching of Multicultural Course Work
ERIC Educational Resources Information Center
Fier, Elizabeth Boyer; Ramsey, MaryLou
2005-01-01
The authors explore the ethical issues and challenges frequently encountered by counselor educators of multicultural course work. Existing ethics codes are examined, and the need for greater specificity with regard to teaching courses of multicultural content is addressed. Options for revising existing codes to better address the challenges of…
Computer aided design of extrusion forming tools for complex geometry profiles
NASA Astrophysics Data System (ADS)
Goncalves, Nelson Daniel Ferreira
In profile extrusion, the experience of the die designer is crucial for obtaining good results. In industry, several experimental trials are usually needed for a specific extrusion die before a balanced flow distribution is obtained. This experiment-based trial-and-error procedure costs time and money, but it works, and most profile extrusion companies rely on it. However, competition is forcing the industry to look for more effective procedures, and the design of profile extrusion dies is no exception. For this purpose, computer-aided design is a promising route. The computational rheology codes available today allow the simulation of complex fluid flows, which lets the die designer evaluate and optimize the flow channel without building a physical die and performing real extrusion trials. In this work, a finite-volume-based numerical code was developed for the simulation of non-Newtonian (inelastic) and non-isothermal flows using unstructured meshes. The developed code models the forming and cooling stages of profile extrusion and can be used to aid the design of forming tools for the production of complex profiles. For code verification, three benchmark problems were tested: flow between parallel plates, flow around a cylinder, and the lid-driven cavity flow. The code was employed to design two extrusion dies for complex cross-section profiles: a medical catheter die and a wood-plastic composite profile for decking applications. The latter was experimentally validated. Simple extrusion dies used to produce L- and T-shaped profiles were studied in detail, allowing a better understanding of the effect of the main geometry parameters on the flow distribution. To model the cooling stage, a new implicit formulation was devised, which achieved better convergence rates and thus reduced computation times. With the solution of large problems in mind, the code was parallelized using graphics processing units (GPUs); speedups of ten times were obtained, drastically decreasing the time required to obtain results.
Cyclotron production of 48V via natTi(d,x)48V nuclear reaction; a promising radionuclide
NASA Astrophysics Data System (ADS)
Usman, A. R.; Khandaker, M. U.; Haba, H.
2017-06-01
In this experimental work, we studied the excitation function of the natTi(d,x)48V nuclear reaction from 24 MeV down to the threshold energy. Natural titanium foils were arranged using the well-established stacked-foil method and activated with a deuteron beam generated by an AVF cyclotron at RIKEN, Wako, Japan. The γ activities emitted from the activated foils were measured using offline γ-ray spectrometry. The present results were analyzed and compared with earlier published experimental data and with the evaluated data of the TALYS code. Our new measured data agree with some of the earlier reported experimental data, while only partial agreement is found with the evaluated theoretical data. In addition to the use of 48V as a beam intensity monitor, recent studies indicate its potential as a calibration source for PET cameras and as a radioactive label for medical applications. The results are also expected to further enrich the experimental database and to play an important role in the design of nuclear reaction model codes.
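Cross sections in stacked-foil activation work of this kind are obtained from the standard activation equation relating photopeak counts to the reaction cross section. The sketch below applies that relation with placeholder beam, foil, and counting values (only the 48V half-life is a physical constant); it is not a reconstruction of the authors' analysis.

```python
import numpy as np

counts   = 1.2e4      # net counts in the photopeak (placeholder)
eff      = 0.02       # detector efficiency at the gamma energy (placeholder)
I_gamma  = 0.49       # gamma emission probability (placeholder)
half_life = 15.97 * 86400            # 48V half-life [s]
lam      = np.log(2) / half_life     # decay constant [1/s]
N_target = 5.0e20     # number of target atoms in the foil (placeholder)
flux     = 1.0e11     # deuteron flux on the foil [1/cm^2/s] (placeholder)
t_irr, t_cool, t_meas = 3600.0, 86400.0, 7200.0   # irradiation, cooling, counting times [s]

# Standard activation equation: counts = N*sigma*flux*(1-e^-lam*t_irr)*e^-lam*t_cool
#                                         *(1-e^-lam*t_meas)*eff*I_gamma/lam
sigma = counts * lam / (
    N_target * flux * eff * I_gamma
    * (1 - np.exp(-lam * t_irr))
    * np.exp(-lam * t_cool)
    * (1 - np.exp(-lam * t_meas))
)
print(f"cross section ~ {sigma * 1e24:.3f} barn")   # 1 barn = 1e-24 cm^2
```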
Facilitating cancer research using natural language processing of pathology reports.
Xu, Hua; Anderson, Kristin; Grann, Victor R; Friedman, Carol
2004-01-01
Many ongoing clinical research projects, such as projects involving studies associated with cancer, involve manual capture of information in surgical pathology reports so that the information can be used to determine the eligibility of recruited patients for the study and to provide other information, such as cancer prognosis. Natural language processing (NLP) systems offer an automated alternative to manual coding, but pathology reports have certain features that are difficult for NLP systems. This paper describes how a preprocessor was integrated with an existing NLP system (MedLEE) in order to reduce modification to the NLP system and to improve performance. The work was done in conjunction with an ongoing clinical research project that assesses disparities and risks of developing breast cancer for minority women. An evaluation of the system was performed using manually coded data from the research project's database as a gold standard. The evaluation outcome showed that the extended NLP system had a sensitivity of 90.6% and a precision of 91.6%. Results indicated that this system performed satisfactorily for capturing information for the cancer research project.
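The evaluation metrics quoted above are the usual ones computed against a manually coded gold standard. A minimal sketch of that arithmetic, with placeholder counts rather than the study's actual true/false-positive tallies, is:

```python
def sensitivity_precision(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Sensitivity (recall) = TP / (TP + FN); precision = TP / (TP + FP)."""
    return tp / (tp + fn), tp / (tp + fp)

# Placeholder counts, chosen only to illustrate the calculation.
sens, prec = sensitivity_precision(tp=90, fp=8, fn=9)
print(f"sensitivity = {sens:.1%}, precision = {prec:.1%}")
```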
Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution
NASA Astrophysics Data System (ADS)
Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin
2018-04-01
Computer programming skill is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. Checking and evaluating the results of student lab work one by one takes hard work and a long time, especially when the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and thoroughly check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT is implemented with an MVC architecture using open-source software: the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
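The core compile-and-test loop behind a grader of this kind can be sketched briefly. The following is an illustrative Python script, not the Laravel-based AGT itself; it assumes g++ is available and uses a made-up "sum of two integers" submission and test cases.

```python
import os
import subprocess
import tempfile
import textwrap

def grade_submission(src_path, tests):
    """Compile a C++ submission with g++ and return the fraction of passed test cases."""
    exe = os.path.join(tempfile.mkdtemp(), "a.out")
    build = subprocess.run(["g++", "-O2", "-o", exe, src_path],
                           capture_output=True, text=True)
    if build.returncode != 0:
        return 0.0                                    # compilation error -> no credit
    passed = 0
    for stdin_data, expected in tests:
        try:
            run = subprocess.run([exe], input=stdin_data, capture_output=True,
                                 text=True, timeout=2)
            passed += run.stdout.strip() == expected.strip()
        except subprocess.TimeoutExpired:
            pass                                      # treat timeouts as failures
    return passed / len(tests)

# Hypothetical submission: echo the sum of two integers read from stdin.
src = os.path.join(tempfile.mkdtemp(), "sum.cpp")
with open(src, "w") as f:
    f.write(textwrap.dedent("""
        #include <iostream>
        int main() { long a, b; std::cin >> a >> b; std::cout << a + b << std::endl; }
    """))

print(f"score: {grade_submission(src, [('1 2', '3'), ('10 -4', '6')]):.0%}")
```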
Optimizing the performance and structure of the D0 Collie confidence limit evaluator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fishchler, Mark; /Fermilab
2010-07-01
D0 Collie is a program used to perform limit calculations based on ensembles of pseudo-experiments ('PEs'). Since the application of this program to the crucial Higgs mass limit is quite CPU intensive, it has been deemed important to carefully review this program, with an eye toward identifying and implementing potential performance improvements. At the same time, we identify any coding errors or opportunities for potential structural (or algorithm) improvement discovered in the course of gaining sufficient understanding of the workings of Collie to sensibly explore for optimizations. Based on a careful analysis of the program, a series of code changes with potential for improving performance has been identified. The implementation and evaluation of the most important parts of this series has been done, with gratifying speedup results. The bottom line: We have identified and implemented changes leading to a factor of 2.19 speedup in the example program provided, and expected to translate to a factor of roughly 4 speedup in typical realistic usage.
Evans, Meredydd; Yu, Sha; Staniszewski, Aaron; ...
2018-04-17
Building energy efficiency is an important strategy for reducing greenhouse gas emissions globally. In fact, 55 countries have included building energy efficiency in their Nationally Determined Contributions (NDCs) under the Paris Agreement. This research uses building energy code implementation in six cities across different continents as case studies to assess what it may take for countries to implement the ambitions of their energy efficiency goals. Specifically, we look at the cases of Bogota, Colombia; Da Nang, Vietnam; Eskisehir, Turkey; Mexico City, Mexico; Rajkot, India; and Tshwane, South Africa, all of which are “deep dive” cities under the Sustainable Energy for All's Building Efficiency Accelerator. The research focuses on understanding the baseline with existing gaps in implementation and coordination. The methodology used a combination of surveys on code status and interviews with stakeholders at the local and national level, as well as review of published documents. We looked at code development, implementation, and evaluation. The cities are all working to improve implementation, however, the challenges they currently face include gaps in resources, capacity, tools, and institutions to check for compliance. Better coordination between national and local governments could help improve implementation, but that coordination is not yet well established. For example, all six of the cities reported that there was little to no involvement of local stakeholders in development of the national code; only one city reported that it had access to national funding to support code implementation. More robust coordination could better link cities with capacity building and funding for compliance, and ensure that the code reflects local priorities. By understanding gaps in implementation, it can also help in designing more targeted interventions to scale up energy savings.
Covariance Matrix Evaluations for Independent Mass Fission Yields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terranova, N., E-mail: nicholas.terranova@unibo.it; Serot, O.; Archier, P.
2015-01-15
Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility of generating more reliable and complete uncertainty information on independent mass fission yields. Mass yield covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describes the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squares method through the CONRAD code. Preliminary results on the mass yield variance-covariance matrix are presented and discussed on physical grounds for the 235U(nth,f) and 239Pu(nth,f) reactions.
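The idea of propagating model-parameter uncertainties into a mass-yield covariance matrix can be illustrated with a highly simplified sampling exercise. The sketch below uses a two-Gaussian symmetric yield curve with assumed parameter uncertainties, ignores the prompt-neutron convolution, and is not the Brosa-mode/CONRAD treatment used in the work.

```python
import numpy as np

A = np.arange(70, 171)                          # fragment mass grid
rng = np.random.default_rng(0)

def mass_yields(mu_heavy, width):
    """Very simplified symmetric two-mode mass-yield curve, normalized to 200%."""
    g = lambda m: np.exp(-0.5 * ((A - m) / width) ** 2)
    y = g(mu_heavy) + g(236 - mu_heavy)         # heavy mode mirrored around the compound mass
    return 200.0 * y / y.sum()

# Sample assumed parameter uncertainties and collect the resulting yield curves.
samples = np.array([mass_yields(rng.normal(140.0, 0.5), rng.normal(6.5, 0.3))
                    for _ in range(2000)])

cov = np.cov(samples, rowvar=False)             # covariance between mass bins
print("covariance matrix shape:", cov.shape)
print(f"relative std. dev. at A=140: {np.sqrt(cov[70, 70]) / samples[:, 70].mean():.2%}")
```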
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benitz, M. A.; Schmidt, D. P.; Lackner, M. A.
Hydrodynamic loads on the platforms of floating offshore wind turbines are often predicted with computer-aided engineering tools that employ Morison's equation and/or potential-flow theory. This work compares results from one such tool, FAST, NREL's wind turbine computer-aided engineering tool, and the computational fluid dynamics package, OpenFOAM, for the OC4-DeepCwind semi-submersible analyzed in the International Energy Agency Wind Task 30 project. Load predictions from HydroDyn, the offshore hydrodynamics module of FAST, are compared with high-fidelity results from OpenFOAM. HydroDyn uses a combination of Morison's equations and potential flow to predict the hydrodynamic forces on the structure. The implications of the assumptions in HydroDyn are evaluated based on this code-to-code comparison.
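For reference, the Morison-equation part of such load models is a one-line balance of drag and inertia terms. The sketch below evaluates it for an assumed cylindrical member with assumed wave kinematics; the coefficients and dimensions are illustrative, not the OC4-DeepCwind values.

```python
import numpy as np

rho, D, L = 1025.0, 6.5, 20.0          # water density [kg/m^3], member diameter [m], length [m]
Cd, Cm = 0.6, 2.0                      # drag and inertia coefficients (assumed)
A_proj = D * L                         # projected area used by the drag term
V_disp = np.pi * D**2 / 4 * L          # displaced volume used by the inertia term

def morison_force(u, u_dot):
    """Inline hydrodynamic force [N] for fluid velocity u [m/s] and acceleration u_dot [m/s^2]."""
    drag = 0.5 * rho * Cd * A_proj * u * np.abs(u)
    inertia = rho * Cm * V_disp * u_dot
    return drag + inertia

# Example: regular-wave kinematics at one instant.
print(f"F = {morison_force(u=1.2, u_dot=0.8) / 1e3:.1f} kN")
```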
Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, the interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.
Noise Measurements of the VAIIPR Fan
NASA Technical Reports Server (NTRS)
Mendoza, Jeff; Weir, Don
2012-01-01
This final report has been prepared by Honeywell Aerospace, Phoenix, Arizona, a unit of Honeywell International, Inc., documenting work performed during the period September 2004 through November 2005 for the National Aeronautics and Space Administration (NASA) Glenn Research Center, Cleveland, Ohio, under the Revolutionary Aero-Space Engine Research (RASER) Program, Contract No. NAS3- 01136, Task Order 6, Noise Measurements of the VAIIPR Fan. The NASA Task Manager was Dr. Joe Grady, NASA Glenn Research Center, Mail Code 60-6, Cleveland, Ohio 44135. The NASA Contract Officer was Mr. Albert Spence, NASA Glenn Research Center, Mail Code 60-6, Cleveland, Ohio 44135. This report focuses on the evaluation of internal fan noise as generated from various inflow disturbances based on measurements made from a circumferential array of sensors located near the fan and sensors upstream of a serpentine inlet.
NASA Technical Reports Server (NTRS)
Dunham, R. S.
1976-01-01
FORTRAN-coded out-of-core equation solvers that use direct methods to solve symmetric banded systems of simultaneous algebraic equations are presented. Banded, frontal, and column (skyline) solvers were studied, as well as solvers that can partition the working area and thus fit into any available core. Comparison timings are presented for several typical two-dimensional and three-dimensional continuum-type grids of elements with and without midside nodes. Extensive conclusions are also given.
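An in-core analogue of the banded direct solves described above is readily available today; the sketch below factors and solves a small symmetric banded system with SciPy's Cholesky-based banded solver. The matrix and right-hand side are placeholders.

```python
import numpy as np
from scipy.linalg import solveh_banded

n, half_bw = 8, 2                      # system size and half bandwidth
# Upper-banded storage (SciPy 'upper' form): last row is the main diagonal,
# the row above it is the first superdiagonal, and so on.
ab = np.zeros((half_bw + 1, n))
ab[-1] = 4.0                           # main diagonal
ab[-2, 1:] = -1.0                      # first superdiagonal
ab[-3, 2:] = -0.5                      # second superdiagonal

b = np.ones(n)
x = solveh_banded(ab, b)               # direct banded Cholesky solve
print(np.round(x, 4))
```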
BNL severe-accident sequence experiments and analysis program. [PWR; BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, G.A.; Ginsberg, T.; Tutu, N.K.
1983-01-01
In the analysis of degraded core accidents, the two major sources of pressure loading on light water reactor containments are: steam generation from core debris-water thermal interactions; and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described.
Nuclear Data Uncertainty Propagation to Reactivity Coefficients of a Sodium Fast Reactor
NASA Astrophysics Data System (ADS)
Herrero, J. J.; Ochoa, R.; Martínez, J. S.; Díez, C. J.; García-Herranz, N.; Cabellos, O.
2014-04-01
The assessment of the uncertainty levels on the design and safety parameters of the innovative European Sodium Fast Reactor (ESFR) is mandatory. Some of the relevant safety quantities are the Doppler and void reactivity coefficients, whose uncertainties are quantified here. In addition, the nuclear reaction data for which improved evaluations would most benefit the design accuracy are identified. This work was performed with the SCALE 6.1 code suite and its multigroup cross-section library based on the ENDF/B-VII.0 evaluation.
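Uncertainty propagation of this kind is commonly summarized by the first-order "sandwich rule", var(R) = S^T C S. The sketch below applies it to a hypothetical response with assumed sensitivities and an assumed nuclear-data covariance matrix; the numbers are not SCALE or ENDF/B-VII.0 values.

```python
import numpy as np

# Relative sensitivities of the response to three nuclear data parameters (assumed).
S = np.array([0.8, -0.3, 0.1])

# Relative covariance matrix of those parameters (assumed, symmetric positive semi-definite).
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

var_R = S @ C @ S                      # sandwich rule: var(R) = S^T C S
print(f"relative uncertainty on the response: {np.sqrt(var_R):.2%}")
```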
NASA Technical Reports Server (NTRS)
Gardner, Kevin D.; Liu, Jong-Shang; Murthy, Durbha V.; Kruse, Marlin J.; James, Darrell
1999-01-01
AlliedSignal Engines, in cooperation with NASA GRC (National Aeronautics and Space Administration Glenn Research Center), completed an evaluation of recently-developed aeroelastic computer codes using test cases from the AlliedSignal Engines fan blisk and turbine databases. Test data included strain gage, performance, and steady-state pressure information obtained for conditions where synchronous or flutter vibratory conditions were found to occur. Aeroelastic codes evaluated included quasi 3-D UNSFLO (MIT Developed/AE Modified, Quasi 3-D Aeroelastic Computer Code), 2-D FREPS (NASA-Developed Forced Response Prediction System Aeroelastic Computer Code), and 3-D TURBO-AE (NASA/Mississippi State University Developed 3-D Aeroelastic Computer Code). Unsteady pressure predictions for the turbine test case were used to evaluate the forced response prediction capabilities of each of the three aeroelastic codes. Additionally, one of the fan flutter cases was evaluated using TURBO-AE. The UNSFLO and FREPS evaluation predictions showed good agreement with the experimental test data trends, but quantitative improvements are needed. UNSFLO over-predicted turbine blade response reductions, while FREPS under-predicted them. The inviscid TURBO-AE turbine analysis predicted no discernible blade response reduction, indicating the necessity of including viscous effects for this test case. For the TURBO-AE fan blisk test case, significant effort was expended getting the viscous version of the code to give converged steady flow solutions for the transonic flow conditions. Once converged, the steady solutions provided an excellent match with test data and the calibrated DAWES (AlliedSignal 3-D Viscous Steady Flow CFD Solver). However, efforts expended establishing quality steady-state solutions prevented exercising the unsteady portion of the TURBO-AE code during the present program. AlliedSignal recommends that unsteady pressure measurement data be obtained for both test cases examined for use in aeroelastic code validation.
Kozlik, Julia; Neumann, Roland; Lozo, Ljubica
2015-01-01
Several emotion theorists suggest that valenced stimuli automatically trigger motivational orientations and thereby facilitate corresponding behavior. Positive stimuli were thought to activate approach motivational circuits which in turn primed approach-related behavioral tendencies whereas negative stimuli were supposed to activate avoidance motivational circuits so that avoidance-related behavioral tendencies were primed (motivational orientation account). However, recent research suggests that typically observed affective stimulus–response compatibility phenomena might be entirely explained in terms of theories accounting for mechanisms of general action control instead of assuming motivational orientations to mediate the effects (evaluative coding account). In what follows, we explore to what extent this notion is applicable. We present literature suggesting that evaluative coding mechanisms indeed influence a wide variety of affective stimulus–response compatibility phenomena. However, the evaluative coding account does not seem to be sufficient to explain affective S–R compatibility effects. Instead, several studies provide clear evidence in favor of the motivational orientation account that seems to operate independently of evaluative coding mechanisms. Implications for theoretical developments and future research designs are discussed. PMID:25983718
ERIC Educational Resources Information Center
Uehara, Suwako; Noriega, Edgar Josafat Martinez
2016-01-01
The availability of user-friendly coding software is increasing, yet teachers might hesitate to use this technology to develop for educational needs. This paper discusses studies related to technology for educational uses and introduces an evaluation application being developed. Through questionnaires by student users and open-ended discussion by…
NASA Technical Reports Server (NTRS)
Hicks, Raymond M.; Cliff, Susan E.
1991-01-01
Full-potential, Euler, and Navier-Stokes computational fluid dynamics (CFD) codes were evaluated for use in analyzing the flow field about airfoil sections operating at Mach numbers from 0.20 to 0.60 and Reynolds numbers from 500,000 to 2,000,000. The potential code (LBAUER) includes weakly coupled integral boundary layer equations for laminar and turbulent flow with simple transition and separation models. The Navier-Stokes code (ARC2D) uses the thin-layer formulation of the Reynolds-averaged equations with an algebraic turbulence model. The Euler code (ISES) includes strongly coupled integral boundary layer equations and advanced transition and separation calculations with the capability to model laminar separation bubbles and limited zones of turbulent separation. The best experiment/CFD correlation was obtained with the Euler code because its boundary layer equations model the physics of the flow better than the other two codes. An unusual reversal of boundary layer separation with increasing angle of attack, following initial shock formation on the upper surface of the airfoil, was found in the experimental data. This phenomenon was not predicted by the CFD codes evaluated.
Modelling of aircrew radiation exposure from galactic cosmic rays and solar particle events.
Takada, M; Lewis, B J; Boudreau, M; Al Anid, H; Bennett, L G I
2007-01-01
Correlations have been developed for implementation into the semi-empirical Predictive Code for Aircrew Radiation Exposure (PCAIRE) to account for effects of extremum conditions of solar modulation and low altitude based on transport code calculations. An improved solar modulation model, as proposed by NASA, has been further adopted to interpolate between the bounding correlations for solar modulation. The conversion ratio of effective dose to ambient dose equivalent, as applied to the PCAIRE calculation (based on measurements) for the legal regulation of aircrew exposure, was re-evaluated in this work to take into consideration new ICRP-92 radiation-weighting factors and different possible irradiation geometries of the source cosmic-radiation field. A computational analysis with Monte Carlo N-Particle eXtended Code was further used to estimate additional aircrew exposure that may result from sporadic solar energetic particle events considering real-time monitoring by the Geosynchronous Operational Environmental Satellite. These predictions were compared with the ambient dose equivalent rates measured on-board an aircraft and to count rate data observed at various ground-level neutron monitors.
The Upper Midwest Health Study: industry and occupation of glioma cases and controls.
Ruder, Avima M; Waters, Martha A; Carreón, Tania; Butler, Mary A; Calvert, Geoffrey M; Davis-King, Karen E; Waters, Kathleen M; Schulte, Paul A; Mandel, Jack S; Morton, Roscoe F; Reding, Douglas J; Rosenman, Kenneth D
2012-09-01
Understanding glioma etiology requires determining which environmental factors are associated with glioma. Upper Midwest Health Study case-control participant work histories collected 1995-1998 were evaluated for occupational associations with glioma. "Exposures of interest" from our study protocol comprise our a priori hypotheses. Year-long or longer jobs for 1,973 participants were assigned Standard Occupational Classifications (SOC) and Standard Industrial Classifications (SIC). The analysis file includes 8,078 SIC- and SOC-coded jobs. For each individual, SAS 9.2 programs collated employment with identical SIC-SOC coding. Distributions of longest "total employment duration" (total years worked in jobs with identical industry and occupation codes, including multiple jobs, and non-consecutive jobs) were compared between cases and controls, using an industrial hygiene algorithm to group occupations. Longest employment duration was calculated for 780 cases and 1,156 controls. More case than control longest total employment duration was in the "engineer, architect" occupational group [16 cases, 10 controls, odds ratio (OR) 2.50, adjusted for age group, sex, age and education, 95% confidence interval (CI) 1.12-5.60]. Employment as a food processing worker [mostly butchers and meat cutters] was of borderline significance (27 cases, 21 controls, adjusted OR: 1.78, CI: 0.99-3.18). Among our exposures of interest work as engineers or as butchers and meat cutters was associated with increased glioma risk. Significant associations could be due to chance, because of multiple comparisons, but similar findings have been reported for other glioma studies. Our results suggest some possible associations but by themselves could not provide conclusive evidence. Copyright © 2012 Wiley Periodicals, Inc.
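For readers unfamiliar with the reported measures, a crude (unadjusted) odds ratio with a Woolf-type confidence interval can be computed directly from the counts quoted above; the published values are adjusted for age group, sex, age, and education, so this illustrative calculation differs slightly.

```python
import math

def crude_or(exp_cases, exp_controls, total_cases, total_controls):
    """Crude odds ratio and Woolf 95% CI from exposed/unexposed counts."""
    a, b = exp_cases, total_cases - exp_cases
    c, d = exp_controls, total_controls - exp_controls
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + z * se_log) for z in (-1.96, 1.96))
    return or_, lo, hi

# "Engineer, architect" group: 16 exposed cases of 780, 10 exposed controls of 1,156.
print("crude OR = %.2f (95%% CI %.2f-%.2f)" % crude_or(16, 10, 780, 1156))
```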
NASA Astrophysics Data System (ADS)
Bird, Robert; Nystrom, David; Albright, Brian
2017-10-01
The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
ERIC Educational Resources Information Center
Baker, Opal Ruth
Research on Spanish/English code switching is reviewed and the definitions and categories set up by the investigators are examined. Their methods of locating, limiting, and classifying true code switches, and the terms used and results obtained, are compared. It is found that in these studies, conversational (intra-discourse) code switching is…
Low-delay predictive audio coding for the HIVITS HDTV codec
NASA Astrophysics Data System (ADS)
McParland, A. K.; Gilchrist, N. H. C.
1995-01-01
The status of work relating to predictive audio coding, as part of the European project on High Quality Video Telephone and HD(TV) Systems (HIVITS), is reported. The predictive coding algorithm is developed, along with six-channel audio coding and decoding hardware. Demonstrations of the audio codec operating in conjunction with the video codec, are given.
Code of Fair Testing Practices in Education (Revised)
ERIC Educational Resources Information Center
Educational Measurement: Issues and Practice, 2005
2005-01-01
A note from the Working Group of the Joint Committee on Testing Practices: The "Code of Fair Testing Practices in Education (Code)" prepared by the Joint Committee on Testing Practices (JCTP) has just been revised for the first time since its initial introduction in 1988. The revision of the Code was inspired primarily by the revision of…
Critical Care Coding for Neurologists.
Nuwer, Marc R; Vespa, Paul M
2015-10-01
Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.
Coding of Neuroinfectious Diseases.
Barkley, Gregory L
2015-12-01
Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.
Diagnostic Coding for Epilepsy.
Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R
2016-02-01
Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.
Evaluation of Persons of Varying Ages.
ERIC Educational Resources Information Center
Stolte, John F.
1996-01-01
Reviews two experiments that strongly support dual coding theory. Dual coding theory holds that communicating concretely (tactile, auditory, or visual stimuli) affects evaluative thinking stronger than communicating abstractly through words and numbers. The experiments applied this theory to the realm of age and evaluation. (MJP)
Impact of the level of state tax code progressivity on children's health outcomes.
Granruth, Laura Brierton; Shields, Joseph J
2011-08-01
This research study examines the impact of the level of state tax code progressivity on selected children's health outcomes. Specifically, it examines the degree to which a state's tax code ranking along the progressive-regressive continuum relates to percentage of low birthweight babies, infant and child mortality rates, and percentage of uninsured children. Using data merged from a number of public data sets, the authors find that the level of state tax code progressivity is a factor in state rates of infant and child mortality. States with lower median incomes and regressive tax policies have the highest rates of infant and child mortality. With regard to the percentage of children 17 years of age and below who lack health insurance, it is found that larger states with regressive tax policies have the largest percentage of uninsured children. In general, more heavily populated states with more progressive tax codes have healthier children. The implications of these findings are discussed in terms of tax policy and the well-being of children as well as for social work education, social work practice, and social work research.
Wilkinson, Karl A; Hine, Nicholas D M; Skylaris, Chris-Kriton
2014-11-11
We present a hybrid MPI-OpenMP implementation of Linear-Scaling Density Functional Theory within the ONETEP code. We illustrate its performance on a range of high performance computing (HPC) platforms comprising shared-memory nodes with fast interconnect. Our work has focused on applying OpenMP parallelism to the routines which dominate the computational load, attempting where possible to parallelize different loops from those already parallelized within MPI. This includes 3D FFT box operations, sparse matrix algebra operations, calculation of integrals, and Ewald summation. While the underlying numerical methods are unchanged, these developments represent significant changes to the algorithms used within ONETEP to distribute the workload across CPU cores. The new hybrid code exhibits much-improved strong scaling relative to the MPI-only code and permits calculations with a much higher ratio of cores to atoms. These developments result in a significantly shorter time to solution than was possible using MPI alone and facilitate the application of the ONETEP code to systems larger than previously feasible. We illustrate this with benchmark calculations from an amyloid fibril trimer containing 41,907 atoms. We use the code to study the mechanism of delamination of cellulose nanofibrils when undergoing sonication, a process which is controlled by a large number of interactions that collectively determine the structural properties of the fibrils. Many energy evaluations were needed for these simulations, and as these systems comprise up to 21,276 atoms this would not have been feasible without the developments described here.
Mass transfer effects in a gasification riser
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breault, Ronald W.; Li, Tingwen; Nicoletti, Phillip
2013-07-01
In the development of multiphase reacting computational fluid dynamics (CFD) codes, a number of simplifications were incorporated into the codes and models. One of these simplifications was the use of a simplistic mass transfer correlation for the faster reactions and the complete omission of mass transfer effects for the moderate-speed and slow-speed reactions such as those in a fluidized bed gasifier. Another problem that has propagated is that the mass transfer correlation used in the codes is not universal and is being used far from its developed bubbling fluidized bed regime when applied to circulating fluidized bed (CFB) riser reactors. These problems are true for the major CFD codes. To alleviate this problem, a mechanistically based mass transfer coefficient algorithm has been developed based upon earlier work by Breault et al. This fundamental approach uses the local hydrodynamics to predict a local, time-varying mass transfer coefficient. The predicted mass transfer coefficients and the corresponding Sherwood numbers agree well with literature data and are typically about an order of magnitude lower than the correlation noted above. The incorporation of the new mass transfer model gives the expected behavior for all the gasification reactions evaluated in the paper. At the expected and typical design values of solids flow rate in a CFB riser gasifier, an ANOVA analysis has shown the predictions from the new code to be significantly different from the original code predictions. The new algorithm should be used so that the conversions are not over-predicted. Additionally, its behavior with changes in solids flow rate is consistent with the changes in the hydrodynamics.
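A generic example of turning local hydrodynamics into a local mass transfer coefficient is sketched below using the textbook Ranz-Marshall Sherwood-number correlation; this is not the mechanistic Breault et al. model implemented in the paper, and the gas properties, particle size, voidage, and slip velocity are placeholders.

```python
import numpy as np

d_p    = 200e-6      # particle diameter [m]
D_ab   = 2.0e-5      # gas-phase diffusivity [m^2/s]
rho_g  = 0.4         # gas density [kg/m^3]
mu_g   = 3.0e-5      # gas viscosity [Pa s]
u_slip = 1.5         # local gas-solid slip velocity [m/s]
eps_g  = 0.95        # local voidage

Re = eps_g * rho_g * u_slip * d_p / mu_g        # particle Reynolds number
Sc = mu_g / (rho_g * D_ab)                      # Schmidt number
Sh = 2.0 + 0.6 * Re**0.5 * Sc**(1.0 / 3.0)      # Ranz-Marshall correlation (generic, not Breault's)
k_m = Sh * D_ab / d_p                           # mass transfer coefficient [m/s]
print(f"Re = {Re:.2f}, Sh = {Sh:.2f}, k_m = {k_m:.3f} m/s")
```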
First-principles study of structural and electronic properties of Be0.25Zn0.75S mixed compound
NASA Astrophysics Data System (ADS)
Paliwal, U.; Joshi, K. B.
2018-05-01
In this work, a first-principles study of the structural and electronic properties of the Be0.25Zn0.75S mixed compound is presented. The calculations are performed with the QUANTUM ESPRESSO code using the Perdew-Burke-Ernzerhof generalized gradient approximation in the framework of density functional theory. Adopting a standard optimization strategy, the ground-state equilibrium lattice constant and bulk modulus are calculated. With the structure settled, the electronic band structure, bandgap, and static dielectric constant are evaluated. In the absence of experimental work on this system, our findings are compared with the available theoretical calculations and are found to follow the anticipated general trends.
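The "standard optimization strategy" for the equilibrium lattice constant and bulk modulus typically amounts to fitting an equation of state to total energies computed at several volumes. The sketch below fits a third-order Birch-Murnaghan form to synthetic energy-volume points; the data are placeholders, not QUANTUM ESPRESSO output for Be0.25Zn0.75S.

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, B0p):
    """Third-order Birch-Murnaghan energy-volume relation."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9 * V0 * B0 / 16 * ((eta - 1) ** 3 * B0p + (eta - 1) ** 2 * (6 - 4 * eta))

V = np.linspace(150.0, 210.0, 9)                              # cell volumes [bohr^3] (synthetic)
E = birch_murnaghan(V, -250.0, 180.0, 0.005, 4.5) \
    + np.random.normal(0, 1e-4, V.size)                       # synthetic total energies [Ry]

popt, _ = curve_fit(birch_murnaghan, V, E, p0=(-250.0, 180.0, 0.004, 4.0))
E0, V0, B0, B0p = popt
print(f"V0 = {V0:.1f} bohr^3, B0 = {B0 * 14710.5:.0f} GPa")   # 1 Ry/bohr^3 ~ 14710.5 GPa
```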
Moral codes of mothering and the introduction of welfare-to-work in Ontario.
Gazso, Amber
2012-02-01
In this paper, I trace how the reform of social assistance in Ontario, especially the post-1990s enforcement of lone mothers' employability via welfare-to-work programs, parallels shifts in dominant moral codes of mothering, from "mother-carer" to "mother-worker." Additionally, I use this case as an entry point to consider the implications of public and policy allegiance to these moral codes for all mothers. The central argument I make is that the introduction of welfare-to-work programs in Ontario did not occur in a neoliberal state-sanctioned vacuum but also involved the circulation of ideas about moral mothering outside of policy into policy.
Overview of FAR-TECH's magnetic fusion energy research
NASA Astrophysics Data System (ADS)
Kim, Jin-Soo; Bogatu, I. N.; Galkin, S. A.; Spencer, J. Andrew; Svidzinski, V. A.; Zhao, L.
2017-10-01
FAR-TECH, Inc. has been working on magnetic fusion energy research for over two decades. Over the years, we have developed unique approaches to help understand the physics and resolve issues in magnetic fusion energy. The specific areas of work have been modeling RF waves in plasmas, MHD modeling and mode identification, and nano-particle plasma jets and their application to disruption mitigation. Our research highlights of recent years will be presented with examples, specifically the development of FullWave (Full Wave RF code), PMARS (Parallelized MARS code), and HEM (Hybrid ElectroMagnetic code). In addition, the nano-particle plasma jet (NPPJ) and its application to disruption mitigation will be presented. Work is supported by the U.S. DOE SBIR program.
Path Toward a Unified Geometry for Radiation Transport
NASA Technical Reports Server (NTRS)
Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann
2014-01-01
The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.
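The DAGMC/HZETRN workflow itself cannot be reproduced from the abstract; purely as a toy illustration of the two transport strategies contrasted above, the sketch below compares the analytic (deterministic) uncollided transmission through a 1-D slab with a Monte Carlo estimate obtained by sampling free paths. The attenuation coefficient is a made-up constant, unrelated to any NASA material model.

```python
# Toy 1-D comparison of deterministic attenuation vs. Monte Carlo free-path sampling.
# Not related to HZETRN, DAGMC, or any NASA geometry; mu is a made-up constant.
import math
import random

mu = 0.25          # hypothetical macroscopic removal cross-section (1/cm)
thickness = 10.0   # slab thickness (cm)
n_particles = 200_000

analytic = math.exp(-mu * thickness)   # uncollided transmission (Beer-Lambert)

random.seed(1)
transmitted = 0
for _ in range(n_particles):
    # Sample the distance to first interaction from an exponential distribution.
    path = -math.log(1.0 - random.random()) / mu
    if path > thickness:               # particle crosses the slab uncollided
        transmitted += 1
mc_estimate = transmitted / n_particles

print(f"deterministic: {analytic:.4f}  Monte Carlo: {mc_estimate:.4f}")
```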
CARES/LIFE Software Commercialization
NASA Technical Reports Server (NTRS)
1995-01-01
The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.
Meyer, John D; Nichols, Ginger H; Warren, Nicholas; Reisine, Susan
2008-03-01
To determine the effects of employment on low birth weight (LBW) in a service-based economy, we evaluated the association of LBW delivery with occupational data collected in a state birth registry. Occupational data in the 2000 Connecticut birth registry were coded for 41,009 singleton births. Associations between employment and LBW delivery were analyzed using logistic regression controlling for covariates in the registry data set. Evidence for improved LBW outcomes in working mothers did not persist when adjusted for maternal covariates. Among working mothers, elevated risk of LBW was seen in textile, food service, personal appearance, material dispatching or distributing, and retail sales workers. Improved overall birth outcomes seen in working mothers may arise from favorable demographic and health attributes. Higher LBW risk was seen in several types of service sector jobs and in textile work.
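The registry variables are not enumerated in the abstract; the sketch below shows, with invented column names (lbw, occ_group, mat_age, parity, smoker, education) and a hypothetical data file, how an occupation versus low-birth-weight association adjusted for maternal covariates might be estimated with logistic regression, assuming pandas and statsmodels are available.

```python
# Hypothetical sketch of the kind of adjusted logistic regression described above.
# Column names and the file name are invented, not taken from the Connecticut registry.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

births = pd.read_csv("birth_registry_2000.csv")   # hypothetical extract

model = smf.logit("lbw ~ C(occ_group) + mat_age + parity + smoker + C(education)",
                  data=births).fit()

print(model.summary())
print(np.exp(model.params).round(2))   # adjusted odds ratios per occupation group and covariate
```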
CTF (Subchannel) Calculations and Validation L3:VVI.H2L.P15.01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, Natalie
The goal of the Verification and Validation Implementation (VVI) High to Low (Hi2Lo) process is to use a validated model in a high-resolution code to generate synthetic data for improving the same model in a lower-resolution code. This process is useful in circumstances where experimental data do not exist or are not sufficient in quantity or resolution. Data from the high-fidelity code are treated as calibration data (with appropriate uncertainties and error bounds) which can be used to train parameters that affect solution accuracy in the lower-fidelity code model, thereby reducing uncertainty. This milestone presents a demonstration of the Hi2Lo process derived in the VVI focus area. The majority of the work performed herein describes the steps of the low-fidelity code used in the process, with references to the work detailed in the companion high-fidelity code milestone (Reference 1). The CASL low-fidelity code used to perform this work was Cobra Thermal Fluid (CTF) and the high-fidelity code was STAR-CCM+ (STAR). The master branch version of CTF (pulled May 5, 2017 – Reference 2) was utilized for all CTF analyses performed as part of this milestone. The statistical and VVUQ components of the Hi2Lo framework were performed using Dakota version 6.6 (release date May 15, 2017 – Reference 3). Experimental data from Westinghouse Electric Company (WEC – Reference 4) were used throughout the demonstrated process to compare with the high-fidelity STAR results. A CTF parameter called Beta was chosen as the calibration parameter for this work. By default, Beta is defined as a constant mixing coefficient in CTF and is essentially a tuning parameter for mixing between subchannels. Since CTF does not have turbulence models like STAR, Beta is the parameter that performs the function most similar to the turbulence models in STAR. The purpose of the work performed in this milestone is to tune Beta to an optimal value that brings the CTF results closer to those measured in the WEC experiments.
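The milestone's actual Dakota/CTF workflow is not reproduced here; as a minimal stand-in for the Hi2Lo idea, the sketch below tunes a scalar mixing coefficient in an invented low-fidelity model wrapper (run_low_fidelity) so that its prediction matches synthetic "high-fidelity" data in a least-squares sense. Neither CTF nor STAR-CCM+ is actually called.

```python
# Minimal stand-in for Hi2Lo calibration: tune a mixing coefficient "beta" so a
# low-fidelity prediction matches high-fidelity (synthetic) data. The model is invented.
import numpy as np
from scipy.optimize import minimize_scalar

z = np.linspace(0.0, 3.0, 30)                          # axial positions (m)
hi_fi_temp = 560.0 + 25.0 * (1.0 - np.exp(-1.8 * z))   # synthetic calibration data

def run_low_fidelity(beta):
    """Hypothetical subchannel-like response whose mixing rate depends on beta."""
    return 560.0 + 25.0 * (1.0 - np.exp(-beta * z))

def misfit(beta):
    return np.sum((run_low_fidelity(beta) - hi_fi_temp) ** 2)

result = minimize_scalar(misfit, bounds=(0.1, 5.0), method="bounded")
print(f"calibrated beta = {result.x:.3f}")   # should land near the 'true' value of 1.8
```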
Goode, N; Salmon, P M; Taylor, N Z; Lenné, M G; Finch, C F
2017-10-01
One factor potentially limiting the uptake of Rasmussen's (1997) Accimap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of Accimap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable for both coding attempts at the system level (M T1 = 68.8%; M T2 = 73.9%), and were poor at the descriptor level (M T1 = 58.5%; M T2 = 64.1%). Mean criterion referenced validity scores at the system level were acceptable (M T1 = 73.9%; M T2 = 75.3%). However, they were not consistently acceptable at the descriptor level (M T1 = 67.6%; M T2 = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factors classification schemes are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
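The article reports reliability as percentage agreement; for readers who want to reproduce that style of analysis, here is a small sketch on made-up ratings that computes percent agreement between two coding occasions and, as a common chance-corrected complement not used in the paper, Cohen's kappa.

```python
# Toy reliability calculation on made-up contributing-factor codes; not the study's data.
from collections import Counter

ratings_t1 = ["equipment", "supervision", "weather", "training", "supervision", "weather"]
ratings_t2 = ["equipment", "supervision", "training", "training", "supervision", "weather"]

matches = sum(a == b for a, b in zip(ratings_t1, ratings_t2))
percent_agreement = 100.0 * matches / len(ratings_t1)

# Cohen's kappa corrects the observed agreement for agreement expected by chance.
n = len(ratings_t1)
p_o = matches / n
c1, c2 = Counter(ratings_t1), Counter(ratings_t2)
p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
kappa = (p_o - p_e) / (1.0 - p_e)

print(f"agreement = {percent_agreement:.1f}%, kappa = {kappa:.2f}")
```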
NASA Astrophysics Data System (ADS)
Kalashnikova, Olga; Garay, Michael; Xu, Feng; Diner, David; Seidel, Felix
2016-07-01
Multiangle spectro-polarimetric measurements have been advocated as an additional tool for better understanding and quantifying the aerosol properties needed for atmospheric correction for ocean color retrievals. The central concern of this work is the assessment of the effects of absorbing aerosol properties on remote sensing reflectance measurement uncertainty caused by neglecting UV-enhanced absorption of carbonaceous particles and by not accounting for dust nonsphericity. In addition, we evaluate the polarimetric sensitivity of absorbing aerosol properties in light of measurement uncertainties achievable for the next generation of multi-angle polarimetric imaging instruments, and demonstrate advantages and disadvantages of wavelength selection in the UV/VNIR range. In this work a vector Markov Chain radiative transfer code including bio-optical models was used to quantitatively evaluate differences in water-leaving radiances between atmospheres containing realistic UV-enhanced and non-spherical aerosols and the SEADAS carbonaceous and dust-like aerosol models. The phase matrices for the spherical smoke particles were calculated using a standard Mie code, while those for non-spherical dust particles were calculated using the numerical approach developed for modeling dust for the AERONET network of ground-based sunphotometers. As a next step, we have developed a retrieval code that employs a coupled Markov Chain (MC) and adding/doubling radiative transfer method for joint retrieval of aerosol properties and water leaving radiance from Airborne Multiangle SpectroPolarimetric Imager-1 (AirMSPI-1) polarimetric observations. The AirMSPI-1 instrument has been flying aboard the NASA ER-2 high altitude aircraft since October 2010. AirMSPI typically acquires observations of a target area at 9 view angles between ±67° at 10 m resolution. AirMSPI spectral channels are centered at 355, 380, 445, 470, 555, 660, and 865 nm, with the 470, 660, and 865 nm channels reporting linear polarization. We tested prototype retrievals by comparing the retrieved aerosol concentration, size distribution, water-leaving radiance, and chlorophyll concentrations from Airborne Multiangle SpectroPolarimetric Imager-1 (AirMSPI-1) observations to values reported by the USC SeaPRISM AERONET-OC site off the coast of California. The retrieval was then applied to a variety of coastal regions in California to evaluate variability in the water-leaving radiance under different atmospheric conditions. We will present results and discuss algorithm sensitivity and potential applications for future space-borne coastal monitoring.
Code qualification of structural materials for AFCI advanced recycling reactors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Natesan, K.; Li, M.; Majumdar, S.
2012-05-31
This report summarizes the further findings from the assessments of current status and future needs in code qualification and licensing of reference structural materials and new advanced alloys for advanced recycling reactors (ARRs) in support of the Advanced Fuel Cycle Initiative (AFCI). The work is a combined effort between Argonne National Laboratory (ANL) and Oak Ridge National Laboratory (ORNL) with ANL as the technical lead, as part of the Advanced Structural Materials Program for the AFCI Reactor Campaign. The report is the second deliverable in FY08 (M505011401) under the work package 'Advanced Materials Code Qualification'. The overall objective of the Advanced Materials Code Qualification project is to evaluate key requirements for the ASME Code qualification and the Nuclear Regulatory Commission (NRC) approval of structural materials in support of the design and licensing of the ARR. Advanced materials are a critical element in the development of sodium reactor technologies. Enhanced materials performance not only improves safety margins and provides design flexibility, but also is essential for the economics of future advanced sodium reactors. Code qualification and licensing of advanced materials are prominent needs for developing and implementing advanced sodium reactor technologies. Nuclear structural component design in the U.S. must comply with the ASME Boiler and Pressure Vessel Code Section III (Rules for Construction of Nuclear Facility Components) and the NRC grants the operational license. As the ARR will operate at higher temperatures than the current light water reactors (LWRs), the design of elevated-temperature components must comply with ASME Subsection NH (Class 1 Components in Elevated Temperature Service). However, the NRC has not approved the use of Subsection NH for reactor components, and this puts additional burdens on materials qualification of the ARR. In the past licensing review for the Clinch River Breeder Reactor Project (CRBRP) and the Power Reactor Innovative Small Module (PRISM), the NRC/Advisory Committee on Reactor Safeguards (ACRS) raised numerous safety-related issues regarding elevated-temperature structural integrity criteria. Most of these issues remain unresolved today. These critical licensing reviews provide a basis for the evaluation of underlying technical issues for future advanced sodium-cooled reactors. Major materials performance issues and high temperature design methodology issues pertinent to the ARR are addressed in the report. The report is organized as follows: the ARR reference design concepts proposed by the Argonne National Laboratory and four industrial consortia were reviewed first, followed by a summary of the major code qualification and licensing issues for the ARR structural materials. The available database is presented for the ASME Code-qualified structural alloys (e.g. 304, 316 stainless steels, 2.25Cr-1Mo, and mod.9Cr-1Mo), including physical properties, tensile properties, impact properties and fracture toughness, creep, fatigue, creep-fatigue interaction, microstructural stability during long-term thermal aging, material degradation in sodium environments and effects of neutron irradiation for both base metals and weld metals. An assessment of modified versions of Type 316 SS, i.e. Type 316LN and its Japanese version, 316FR, was conducted to provide a perspective for codification of 316LN or 316FR in Subsection NH. Current status and data availability of four new advanced alloys, i.e.
NF616, NF616+TMT, NF709, and HT-UPS, are also addressed to identify the R&D needs for their code qualification for ARR applications. For both conventional and new alloys, issues related to high temperature design methodology are described to address the needs for improvements for the ARR design and licensing. Assessments have shown that there are significant data gaps for the full qualification and licensing of the ARR structural materials. Development and evaluation of structural materials require a variety of experimental facilities that have been seriously degraded in the past. The availability and additional needs for the key experimental facilities are summarized at the end of the report. Detailed information covered in each Chapter is given.
The structure of affective action representations: temporal binding of affective response codes.
Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard
2012-01-01
Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.
Ethical Behavior in Early Childhood Education.
ERIC Educational Resources Information Center
Katz, Lilian G.; Ward, Evangeline H.
This booklet contains two essays on ethics for early childhood educators. The first essay discusses the meaning of a code of ethics, the importance of a code of ethics for working with preschool children, ethical conflicts in day care and preschool work, and steps which may be taken to help early childhood workers resolve these conflicts. Ethical…
1987-09-15
MAC; CODE NUMBER: NONE AND REPAIR PARTS AND SPECIAL TOOLS LIST (RPSTL). RESPONSIBILITY: ROY & ILS DURATION: 32.00 WORK DAYS PRE PPPL SCHEDULE...ILS DURATION: 22.00 WORK DAYS R/V PPPL SCHEDULE: DVPMARPS REVIEW AND VALIDATE PRELIMINARY PROVISIONING PARTS LIST. CODE NUMBER: NONE RESPONSIBILITY
NASA Astrophysics Data System (ADS)
Brogt, Erik; Foster, Tom; Dokter, Erin; Buxner, Sanlyn; Antonellis, Jessie
We present an argument for, and a suggested implementation of, a code of ethics for the astronomy education research community. This code of ethics is based on legal and ethical considerations set forth by U.S. federal regulations and the existing code of conduct of the American Educational Research Association. We also provide a fictitious research study as an example for working through the suggested code of ethics.
CFD code evaluation for internal flow modeling
NASA Technical Reports Server (NTRS)
Chung, T. J.
1990-01-01
Research on computational fluid dynamics (CFD) code evaluation, with emphasis on supercomputing in reacting flows, is discussed. Advantages of unstructured grids, multigrids, adaptive methods, improved flow solvers, vector processing, parallel processing, and reduction of memory requirements are discussed. Examples include applications of supercomputing to the reacting-flow Navier-Stokes equations, including shock waves and turbulence, and to combustion instability problems associated with solid and liquid propellants. Evaluation of codes developed by other organizations is not included. Instead, the basic criteria for accuracy and efficiency have been established, and some applications to rocket combustion have been made. Research toward an ultimate goal, the most accurate and efficient CFD code, is in progress and will continue for years to come.
An object-based visual attention model for robotic applications.
Yu, Yuanlong; Mann, George K I; Gosine, Raymond G
2010-10-01
By extending the integrated competition hypothesis, this paper presents an object-based visual attention model which selects one object of interest using low-dimensional features, so that visual perception starts with a fast attentional selection procedure. The proposed attention model involves seven modules: learning of object representations stored in a long-term memory (LTM), preattentive processing, top-down biasing, bottom-up competition, mediation between top-down and bottom-up pathways, generation of saliency maps, and perceptual completion processing. It works in two phases: a learning phase and an attending phase. In the learning phase, the corresponding object representation is trained statistically when one object is attended. A dual-coding object representation consisting of local and global codings is proposed. Intensity, color, and orientation features are used to build the local coding, and a contour feature is employed to constitute the global coding. In the attending phase, the model first preattentively segments the visual field into discrete proto-objects using Gestalt rules. If a task-specific object is given, the model recalls the corresponding representation from LTM and deduces the task-relevant feature(s) to evaluate top-down biases. The mediation between automatic bottom-up competition and conscious top-down biasing is then performed to yield a location-based saliency map. By combining location-based saliency within each proto-object, the proto-object-based saliency is evaluated. The most salient proto-object is selected for attention and is finally passed to the perceptual completion processing module to yield a complete object region. This model has been applied to distinct robotic tasks: detection of task-specific stationary and moving objects. Experimental results under different conditions are shown to validate this model.
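As a schematic companion to the bottom-up stage described above, and not the authors' model, the sketch below normalizes invented intensity, color, and orientation feature maps, sums them into a location-based saliency map, and selects the most salient of a set of toy proto-object regions.

```python
# Toy bottom-up saliency combination over proto-object regions (not the authors' model).
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64
intensity, color, orientation = (rng.random((h, w)) for _ in range(3))

def normalize(m):
    return (m - m.min()) / (m.max() - m.min() + 1e-12)

# Location-based saliency: equal-weight sum of normalized feature maps.
saliency = normalize(intensity) + normalize(color) + normalize(orientation)

# Proto-object labels would come from preattentive segmentation; here a fake 4-region split.
labels = np.zeros((h, w), dtype=int)
labels[:, w // 2:] += 1
labels[h // 2:, :] += 2

region_saliency = {r: saliency[labels == r].mean() for r in np.unique(labels)}
attended = max(region_saliency, key=region_saliency.get)
print(f"attended proto-object: region {attended}")
```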
NASA Astrophysics Data System (ADS)
Cunha, Diego M.; Tomal, Alessandra; Poletti, Martin E.
2013-04-01
In this work, the Monte Carlo (MC) code PENELOPE was employed for simulation of x-ray spectra in mammography and contrast-enhanced digital mammography (CEDM). Spectra for Mo, Rh and W anodes were obtained for tube potentials between 24-36 kV, for mammography, and between 45-49 kV, for CEDM. The spectra obtained from the simulations were analytically filtered to correspond to the anode/filter combinations usually employed in each technique (Mo/Mo, Rh/Rh and W/Rh for mammography and Mo/Cu, Rh/Cu and W/Cu for CEDM). For the Mo/Mo combination, the simulated spectra were compared with those obtained experimentally, and for spectra for the W anode, with experimental data from the literature, through comparison of distribution shape, average energies, half-value layers (HVL) and transmission curves. For all combinations evaluated, the simulated spectra were also compared with those provided by different models from the literature. Results showed that the code PENELOPE provides mammographic x-ray spectra in good agreement with those experimentally measured and those from the literature. The differences in the values of HVL ranged between 2-7%, for anode/filter combinations and tube potentials employed in mammography, and they were less than 5% for those employed in CEDM. The transmission curves for the spectra obtained also showed good agreement compared to those computed from reference spectra, with average relative differences less than 12% for mammography and CEDM. These results show that the code PENELOPE can be a useful tool to generate x-ray spectra for studies in mammography and CEDM, and also for evaluation of new x-ray tube designs and new anode materials.
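The abstract compares spectra partly through half-value layers; the sketch below shows one common way an HVL can be computed from a tabulated spectrum, using invented fluence values and invented aluminium attenuation coefficients rather than the PENELOPE output discussed in the paper.

```python
# Compute a half-value layer (HVL) in Al from a tabulated spectrum.
# The spectrum and attenuation coefficients below are invented placeholders.
import numpy as np

energy_keV = np.array([10, 14, 18, 22, 26, 30])
fluence   = np.array([0.0, 0.6, 1.0, 0.8, 0.4, 0.1])      # relative photon fluence
mu_al     = np.array([26.2, 10.1, 4.8, 2.7, 1.7, 1.1])    # placeholder mu (1/cm) for Al

def transmission(t_cm):
    """Fraction of the (crudely energy-weighted) beam transmitted through t_cm of Al."""
    weights = fluence * energy_keV
    return np.sum(weights * np.exp(-mu_al * t_cm)) / np.sum(weights)

# Bisection for the thickness giving 50% transmission.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if transmission(mid) > 0.5 else (lo, mid)
print(f"HVL ~ {0.5 * (lo + hi) * 10:.2f} mm Al")
```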
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.
Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project
ERIC Educational Resources Information Center
Bolstad, Rachel
2016-01-01
This report evaluates a game coding workshop offered to young people and adults in seven public libraries round New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…
Swinburn, Boyd; Vandevijvere, Stefanie; Woodward, Alistair; Hornblow, Andrew; Richardson, Ann; Burlingame, Barbara; Borman, Barry; Taylor, Barry; Breier, Bernhard; Arroll, Bruce; Drummond, Bernadette; Grant, Cameron; Bullen, Chris; Wall, Clare; Mhurchu, Cliona Ni; Cameron-Smith, David; Menkes, David; Murdoch, David; Mangin, Dee; Lennon, Diana; Sarfati, Diana; Sellman, Doug; Rush, Elaine; Sopoaga, Faafetai; Thomson, George; Devlin, Gerry; Abel, Gillian; White, Harvey; Coad, Jane; Hoek, Janet; Connor, Jennie; Krebs, Jeremy; Douwes, Jeroen; Mann, Jim; McCall, John; Broughton, John; Potter, John D; Toop, Les; McCowan, Lesley; Signal, Louise; Beckert, Lutz; Elwood, Mark; Kruger, Marlena; Farella, Mauro; Baker, Michael; Keall, Michael; Skeaff, Murray; Thomson, Murray; Wilson, Nick; Chandler, Nicholas; Reid, Papaarangi; Priest, Patricia; Brunton, Paul; Crampton, Peter; Davis, Peter; Gendall, Philip; Howden-Chapman, Philippa; Taylor, Rachael; Edwards, Richard; Beaglehole, Robert; Doughty, Robert; Scragg, Robert; Gauld, Robin; McGee, Robert; Jackson, Rod; Hughes, Roger; Mulder, Roger; Bonita, Ruth; Kruger, Rozanne; Casswell, Sally; Derrett, Sarah; Ameratunga, Shanthi; Denny, Simon; Hales, Simon; Pullon, Sue; Wells, Susan; Cundy, Tim; Blakely, Tony
2017-02-17
Reducing the exposure of children and young people to the marketing of unhealthy foods is a core strategy for reducing the high overweight and obesity prevalence in this population. The Advertising Standards Authority (ASA) has recently reviewed its self-regulatory codes and proposed a revised single code on advertising to children. This article evaluates the proposed code against eight criteria for an effective code, which were included in a submission to the ASA review process from over 70 New Zealand health professors. The evaluation found that the proposed code largely represents no change or uncertain change from the existing codes, and cannot be expected to provide substantial protection for children and young people from the marketing of unhealthy foods. Government regulations will be needed to achieve this important outcome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicare & Medicaid Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Ownership in pxAlpha, LLC, a medical device start up company.
MO-A-213-01: 2015 Economics Update Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dirksen, B.
2015-06-15
The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicare & Medicaid Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Ownership in pxAlpha, LLC, a medical device start up company.
MO-A-213-02: 2015 Economics Update Part 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fontenot, J.
2015-06-15
The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicare & Medicaid Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Ownership in pxAlpha, LLC, a medical device start up company.
A Degree Distribution Optimization Algorithm for Image Transmission
NASA Astrophysics Data System (ADS)
Jiang, Wei; Yang, Junjie
2016-09-01
The Luby Transform (LT) code is the first practical implementation of a digital fountain code. The coding behavior of an LT code is mainly determined by the degree distribution, which governs the relationship between source data and codewords. Two degree distributions were suggested by Luby. They work well in typical situations but not optimally in the case of a finite number of encoding symbols. In this work, a degree distribution optimization algorithm is proposed to explore the potential of LT codes. First, a selection scheme of sparse degrees for LT codes is introduced. Then the probability distribution is optimized according to the selected degrees. In image transmission, the bit stream is sensitive to channel noise, and even a single bit error may cause loss of synchronization between the encoder and the decoder. Therefore the proposed algorithm is designed for the image transmission setting. Moreover, an optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising. Compared with an LT code using the robust soliton distribution, the proposed algorithm noticeably improves the final quality of recovered images at the same overhead.
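For readers unfamiliar with the baseline the paper compares against, here is a short sketch of Luby's robust soliton degree distribution; the parameters c and delta are illustrative choices, and the paper's optimized distributions are not reproduced.

```python
# Robust soliton degree distribution for an LT code over k source symbols.
import math

def robust_soliton(k, c=0.05, delta=0.05):
    """Return probabilities mu[0..k]; mu[d] is the chance of encoding degree d."""
    s = c * math.log(k / delta) * math.sqrt(k)
    # Ideal soliton part rho(d).
    rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    # Correction term tau(d) concentrated below and at the spike k/s.
    tau = [0.0] * (k + 1)
    pivot = int(round(k / s))
    for d in range(1, pivot):
        tau[d] = s / (k * d)
    if 1 <= pivot <= k:
        tau[pivot] = s * math.log(s / delta) / k
    z = sum(rho) + sum(tau)
    return [(rho[d] + tau[d]) / z for d in range(k + 1)]

mu = robust_soliton(1000)
print("P(degree=1) =", round(mu[1], 4), " P(degree=2) =", round(mu[2], 4))
```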
The Effect of Cold Work on Properties of Alloy 617
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, Richard
2014-08-01
Alloy 617 is approved for non-nuclear construction in the ASME Boiler and Pressure Vessel Code Section I and Section VIII, but is not currently qualified for nuclear use in ASME Code Section III. A draft Code Case was submitted in 1992 to qualify the alloy for nuclear service but efforts were stopped before the approval process was completed.1 Renewed interest in high temperature nuclear reactors has resulted in a new effort to qualify Alloy 617 for use in nuclear pressure vessels. The mechanical and physical properties of Alloy 617 were extensively characterized for the VHTR programs in the 1980’s and incorporated into the 1992 draft Code Case. Recently, the properties of modern heats of the alloy that incorporate an additional processing step, electro-slag re-melting, have been characterized both to confirm that the properties of contemporary material are consistent with those in the historical record and to increase the available database. A number of potential issues that were identified as requiring further consideration prior to the withdrawal of the 1992 Code Case are also being re-examined in the current R&D program. Code Cases are again being developed to allow use of Alloy 617 for nuclear design within the rules of the ASME Boiler and Pressure Vessel Code. In general the Code defines two temperature ranges for nuclear design with austenitic and nickel based alloys. Below 427°C (800°F) time dependent behavior is not considered, while above this temperature creep and creep-fatigue are considered to be the dominant life-limiting deformation modes. There is a corresponding differentiation in the treatment of the potential for effects associated with cold work. Below 427°C the principal issue is the relationship between the level of cold work and the propensity for stress corrosion cracking and above that temperature the primary concern is the impact of cold work on creep-rupture behavior.
Qing, Yu; Shuai, Han; Qiang, Wang; Jing-Bo, Xue
2017-06-08
To report progress on the integrated hydatid disease information management system, and to provide a reference for further system improvements through analysis of simulation test feedback. Institutional code matching was carried out by collecting fundamental and integrated information on the system in hydatid disease endemic areas, and professional control agencies were selected to carry out a simulation test. The agency code matching results at this stage indicated average completion rates of 94.30% for administrative agencies, 69.94% for registered professional agencies and 56.40% for professional institutions involved in hydatid disease prevention and control in seven provinces (autonomous regions) and the Xinjiang Production and Construction Corps. Meanwhile, the response rate for open-ended proposals was 93.33% across fifteen feedback submissions; of the respondents, 21.43% considered the system to have low fluency, 64.29% found data input inconvenient and 42.86% felt the statistics functions needed improvement, with 27.78% of respondents being provincial users, 22.22% city users and 50.00% county users. The hydatid disease prevention information management system meets the fundamental needs of the majority of agencies in areas where echinococcosis is hyperendemic; further testing with more participating agencies should be carried out after institutional code matching is completed and system services are improved in the next stage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Womeldorff, Geoffrey Alan; Payne, Joshua Estes; Bergen, Benjamin Karl
These are slides for a presentation on PARTISN Research and FleCSI Updates. The following topics are covered: SNAP vs PARTISN, Background Research, Production Code (structural design and changes, kernel design and implementation, lessons learned), NuT IMC Proxy, FleCSI Update (design and lessons learned). It can all be summarized in the following manner: Kokkos was shown to be effective in FY15 in implementing a C++ version of SNAP's kernel. This same methodology was applied to a production IC code, PARTISN. This was a much more complex endeavour than in FY15 for many reasons: a C++ kernel embedded in Fortran, overloading Fortran memory allocations, general language interoperability, and a fully fleshed out production code versus a simplified proxy code. Lessons learned are Legion. In no particular order: Interoperability between Fortran and C++ was really not that hard, and a useful engineering effort. Tracking down all necessary memory allocations for a kernel in a production code is pretty hard. Modifying a production code to work for more than a handful of use cases is also pretty hard. Figuring out the toolchain that will allow a successful implementation of design decisions is quite hard, if making use of "bleeding edge" design choices. In terms of performance, production code concurrency architecture can be a virtual showstopper; being too complex to easily rewrite and test in a short period of time, or depending on tool features which do not exist yet. Ultimately, while the tools used in this work were not successful in speeding up the production code, they helped to identify how work would be done, and provide requirements to tools.
ERIC Educational Resources Information Center
Swank, Linda K.
1994-01-01
Relationships between phonological coding abilities and reading outcomes have implications for differential diagnosis of language-based reading problems. The theoretical construct of specific phonological coding ability is explained, including phonological encoding, phonological awareness and metaphonology, lexical access, working memory, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, H.; Fullwood, R.; Glancy, J.
This is the second volume of a two-volume report on the VISA method for evaluating safeguards at fixed-site facilities. This volume contains appendices that support the description of the VISA concept and the initial working version of the method, VISA-1, presented in Volume I. The information is separated into four appendices, each describing details of one of the four analysis modules that comprise the analysis sections of the method. The first appendix discusses Path Analysis methodology, applies it to a Model Fuel Facility, and describes the computer codes that are being used. Introductory material on Path Analysis is given in Chapter 3.2.1 and Chapter 4.2.1 of Volume I. The second appendix deals with Detection Analysis, specifically the schemes used in VISA-1 for classifying adversaries and the methods proposed for evaluating individual detection mechanisms in order to build the data base required for detection analysis. Examples of evaluations on identity-access systems, SNM portal monitors, and intrusion devices are provided. The third appendix describes the Containment Analysis overt-segment path ranking, the Monte Carlo engagement model, the network simulation code, the delay mechanism data base, and the results of a sensitivity analysis. The last appendix presents general equations used in Interruption Analysis for combining covert-overt segments and compares them with equations given in Volume I, Chapter 3.
Code-modulated interferometric imaging system using phased arrays
NASA Astrophysics Data System (ADS)
Chauhan, Vikas; Greene, Kevin; Floyd, Brian
2016-05-01
Millimeter-wave (mm-wave) imaging provides compelling capabilities for security screening, navigation, and biomedical applications. Traditional scanned or focal-plane mm-wave imagers are bulky and costly. In contrast, phased-array hardware developed for mass-market wireless communications and automotive radar promises to be extremely low cost. In this work, we present techniques which can allow low-cost phased-array receivers to be reconfigured or re-purposed as interferometric imagers, removing the need for custom hardware and thereby reducing cost. Since traditional phased arrays power combine incoming signals prior to digitization, orthogonal code-modulation is applied to each incoming signal using phase shifters within each front-end and two-bit codes. These code-modulated signals can then be combined and processed coherently through a shared hardware path. Once digitized, visibility functions can be recovered through squaring and code-demultiplexing operations. Provided that codes are selected such that the product of two orthogonal codes is a third unique and orthogonal code, it is possible to demultiplex complex visibility functions directly. As such, the proposed system modulates incoming signals but demodulates desired correlations. In this work, we present the operation of the system, a validation of its operation using behavioral models of a traditional phased array, and a benchmarking of the code-modulated interferometer against traditional interferometer and focal-plane arrays.
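To make the "product of two orthogonal codes is a third orthogonal code" property concrete, the following baseband toy (not the authors' hardware) uses Walsh-Hadamard codes: two antenna signals are code-modulated, summed into one shared channel, square-law detected, and the correlation (visibility-like) term is recovered by demodulating with the product code.

```python
# Toy demonstration of code-modulated correlation recovery with Walsh codes.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(8)                    # rows are mutually orthogonal +/-1 Walsh codes
c1, c2 = H[1], H[2]
c3 = c1 * c2                       # product of two Walsh codes is another Walsh code (H[3])

chips = np.repeat(np.arange(8), 64)         # code index per time sample
s1, s2 = 0.7, 0.4                           # two slowly varying antenna signals (constants here)

combined = s1 * c1[chips] + s2 * c2[chips]  # power combining after per-element code modulation
detected = combined ** 2                    # square-law detection of the shared channel

# The cross term 2*s1*s2*c1*c2 rides on code c3; demultiplex by correlating with c3.
correlation = np.mean(detected * c3[chips]) / 2.0
print(f"recovered s1*s2 = {correlation:.3f} (expected {s1 * s2:.3f})")
```

The self terms land on the all-ones (DC) code and average to zero against c3, which is why only the desired correlation survives the demultiplexing step.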
Work motivation in health care: a scoping literature review.
Perreira, Tyrone A; Innis, Jennifer; Berta, Whitney
2016-12-01
The aim of this scoping literature review was to examine and summarize the factors, context, and processes that influence work motivation of health care workers. A scoping literature review was done to answer the question: What is known from the existing empirical literature about factors, context, and processes that influence work motivation of health care workers? This scoping review used the Arksey and O'Malley framework to describe and summarize findings. Inclusion and exclusion criteria were developed to screen studies. Relevant studies published between January 2005 and May 2016 were identified using five electronic databases. Study abstracts were screened for eligibility by two reviewers. Following this screening process, full-text articles were reviewed to determine the eligibility of the studies. Eligible studies were then evaluated by coding findings with descriptive labels to distinguish elements that appeared pertinent to this review. Coding was used to form groups, and these groups led to the development of themes. Twenty-five studies met the eligibility criteria for this literature review. The themes identified were work performance, organizational justice, pay, status, personal characteristics, work relationships (including bullying), autonomy, organizational identification, training, and meaningfulness of work. Most of the research involved the use of surveys. There is a need for more qualitative research and for the use of case studies to examine work motivation in health care organizations. All of the studies were cross-sectional. Longitudinal research would provide insight into how work motivation changes, and how it can be influenced and shaped. Several implications for practice were identified. There is a need to ensure that health care workers have access to training opportunities, and that autonomy is optimized. To improve work motivation, there is a need to address bullying and hostile behaviours in the workplace. Addressing the factors that influence work motivation in health care settings has the potential to influence the care that patients receive.
Overview of Recent Radiation Transport Code Comparisons for Space Applications
NASA Astrophysics Data System (ADS)
Townsend, Lawrence
Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by vari-ous groups and collaborations, including comparisons involving, but not limited to HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphases on those areas of agreement and disagreement among the various code predictions and published data.
Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal
NASA Astrophysics Data System (ADS)
Zamudio, Gabriel S.; José, Marco V.
2018-03-01
In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
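The phenotypic graphs of the SGC are not given in the abstract; to show the quantity being minimized, the sketch below builds a small invented graph with networkx and computes its algebraic connectivity, i.e. the second-smallest eigenvalue of the graph Laplacian.

```python
# Algebraic connectivity of a toy graph (stand-in for a phenotypic code graph).
import networkx as nx
import numpy as np

# Hypothetical graph: nodes are amino-acid "phenotypes", edges connect phenotypes
# reachable by a single-base codon change (toy adjacency, not the real SGC).
edges = [("Phe", "Leu"), ("Leu", "Ile"), ("Ile", "Val"),
         ("Val", "Ala"), ("Ala", "Gly"), ("Gly", "Ser"), ("Ser", "Phe")]
G = nx.Graph(edges)

lap = nx.laplacian_matrix(G).toarray().astype(float)
lap_eigs = np.sort(np.linalg.eigvalsh(lap))
print(f"algebraic connectivity = {lap_eigs[1]:.4f}")
# networkx also provides this directly:
print(f"nx.algebraic_connectivity = {nx.algebraic_connectivity(G):.4f}")
```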
Unsteady Analysis of Inlet-Compressor Acoustic Interactions Using Coupled 3-D and 1-D CFD Codes
NASA Technical Reports Server (NTRS)
Suresh, A.; Cole, G. L.
2000-01-01
It is well known that the dynamic response of a mixed compression supersonic inlet is very sensitive to the boundary condition imposed at the subsonic exit (engine face) of the inlet. In previous work, a 3-D computational fluid dynamics (CFD) inlet code (NPARC) was coupled at the engine face to a 3-D turbomachinery code (ADPAC) simulating an isolated rotor, and the coupled simulation was used to study the unsteady response of the inlet. The main problem with this approach is that the high fidelity turbomachinery simulation becomes prohibitively expensive as more stages are included in the simulation. In this paper, an alternative approach is explored, wherein the inlet code is coupled to a lower-fidelity 1-D transient compressor code (DYNTECC) which simulates the whole compressor. The specific application chosen for this evaluation is the collapsing bump experiment performed at the University of Cincinnati, wherein reflections of a large-amplitude acoustic pulse from a compressor were measured. The metrics for comparison are the pulse strength (time integral of the pulse amplitude) and wave form (shape). When the compressor is modeled by stage characteristics, the computed strength is about ten percent greater than that for the experiment, but the wave shapes are in poor agreement. An alternate approach that uses a fixed rise in duct total pressure and temperature (so-called 'lossy' duct) to simulate a compressor gives good pulse shapes but the strength is about 30 percent low.
CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, III, F. G.
2016-07-29
One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface using the GoldSim software to the STADIUM® code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy Research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results; the code developers have provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.
Pan, Yi-Ling; Hwang, Ai-Wen; Simeonsson, Rune J; Lu, Lu; Liao, Hua-Fang
2015-01-01
Comprehensive description of functioning is important in providing early intervention services for infants with developmental delay/disabilities (DD). A code set of the International Classification of Functioning, Disability and Health: Children and Youth Version (ICF-CY) could facilitate the practical use of the ICF-CY in team evaluation. The purpose of this study was to derive an ICF-CY code set for infants under three years of age with early delay and disabilities (EDD Code Set) for initial team evaluation. The EDD Code Set based on the ICF-CY was developed on the basis of a Delphi survey of international professionals experienced in implementing the ICF-CY and professionals in early intervention service system in Taiwan. Twenty-five professionals completed the Delphi survey. A total of 82 ICF-CY second-level categories were identified for the EDD Code Set, including 28 categories from the domain Activities and Participation, 29 from body functions, 10 from body structures and 15 from environmental factors. The EDD Code Set of 82 ICF-CY categories could be useful in multidisciplinary team evaluations to describe functioning of infants younger than three years of age with DD, in a holistic manner. Future validation of the EDD Code Set and examination of its clinical utility are needed. The EDD Code Set with 82 essential ICF-CY categories could be useful in the initial team evaluation as a common language to describe functioning of infants less than three years of age with developmental delay/disabilities, with a more holistic view. The EDD Code Set including essential categories in activities and participation, body functions, body structures and environmental factors could be used to create a functional profile for each infant with special needs and to clarify the interaction of child and environment accounting for the child's functioning.
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, that is, finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
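The paper's formal definitions of impacted coverage are not reproduced in the abstract; as a loose illustration of the underlying idea, the sketch below scores a test run only against the program elements estimated to be impacted by a change, rather than against all elements. The branch identifiers are invented.

```python
# Loose illustration of an "impacted coverage" style metric (not the paper's definition):
# coverage is measured only over elements in the change-impact set.
def impacted_coverage(impacted, covered):
    """Fraction of impacted program elements exercised by the test suite."""
    impacted, covered = set(impacted), set(covered)
    if not impacted:
        return 1.0   # nothing impacted by the change, so the suite trivially suffices
    return len(impacted & covered) / len(impacted)

impacted_branches = {"parse:12T", "parse:12F", "eval:40T", "eval:44F"}   # from impact analysis
covered_branches  = {"parse:12T", "eval:40T", "eval:40F", "main:3T"}     # from the test run

print(f"impacted coverage = {impacted_coverage(impacted_branches, covered_branches):.2f}")
```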
Radiation Protection Studies for Medical Particle Accelerators using Fluka Monte Carlo Code.
Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario
2017-04-01
Radiation protection (RP) in the use of medical cyclotrons involves many aspects, both in routine use and in the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods for calculating shielding and material activation in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for the transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment- and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at the S. Orsola-Malpighi University Hospital was used to evaluate the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; and the activation of the ambient air, in particular the production of 41Ar. The simulations were validated, in terms of the physical and transport parameters to be used in the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility.
Evaluation of nonlinear structural dynamic responses using a fast-running spring-mass formulation
NASA Astrophysics Data System (ADS)
Benjamin, A. S.; Altman, B. S.; Gruda, J. D.
In today's world, accurate finite-element simulations of large nonlinear systems may require meshes composed of hundreds of thousands of degrees of freedom. Even with today's fast computers and the promise of ever-faster ones in the future, central processing unit (CPU) expenditures for such problems could be measured in days. Many contemporary engineering problems, such as those found in risk assessment, probabilistic structural analysis, and structural design optimization, cannot tolerate the cost or turnaround time for such CPU-intensive analyses, because these applications require a large number of cases to be run with different inputs. For many risk assessment applications, analysts would prefer running times to be measurable in minutes. There is therefore a need for approximation methods which can solve such problems far more efficiently than the very detailed methods and yet maintain an acceptable degree of accuracy. For this purpose, we have been working on two methods of approximation: neural networks and spring-mass models. This paper presents our work and results to date for spring-mass modeling and analysis, since we are further along in this area than in the neural network formulation. It describes the physical and numerical models contained in a code we developed called STRESS, which stands for 'Spring-mass Transient Response Evaluation for structural Systems'. The paper also presents results for a demonstration problem, and compares these with results obtained for the same problem using PRONTO3D, a state-of-the-art finite element code which was also developed at Sandia.
Assessment of the prevailing physics codes: LEOPARD, LASER, and EPRI-CELL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lan, J.S.
1981-01-01
In order to analyze core performance and fuel management, it is necessary to verify reactor physics codes in great detail. This kind of work not only serves the purpose of understanding and controlling the characteristics of each code, but also ensures their reliability as the codes continually change due to constant modifications and machine transfers. This paper presents the results of a comprehensive verification of three code packages: LEOPARD, LASER, and EPRI-CELL.
NASA Technical Reports Server (NTRS)
1992-01-01
The work performed in the previous six months can be divided into three main cases: (1) transmission of images over local area networks (LAN's); (2) coding of color mapped (pseudo-color) images; and (3) low rate video coding. A brief overview of the work done in the first two areas is presented. The third item is reported in somewhat more detail.
Minimal-memory realization of pearl-necklace encoders of general quantum convolutional codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houshmand, Monireh; Hosseini-Khayat, Saied
2011-02-15
Quantum convolutional codes, like their classical counterparts, promise to offer higher error correction performance than block codes of equivalent encoding complexity, and are expected to find important applications in reliable quantum communication where a continuous stream of qubits is transmitted. Grassl and Roetteler devised an algorithm to encode a quantum convolutional code with a "pearl-necklace" encoder. Despite their algorithm's theoretical significance as a neat way of representing quantum convolutional codes, it is not well suited to practical realization. In fact, there is no straightforward way to implement any given pearl-necklace structure. This paper closes the gap between theoretical representation and practical implementation. In our previous work, we presented an efficient algorithm to find a minimal-memory realization of a pearl-necklace encoder for Calderbank-Shor-Steane (CSS) convolutional codes. This work is an extension of our previous work and presents an algorithm for turning a pearl-necklace encoder for a general (non-CSS) quantum convolutional code into a realizable quantum convolutional encoder. We show that a minimal-memory realization depends on the commutativity relations between the gate strings in the pearl-necklace encoder. We find a realization by means of a weighted graph which details the noncommutative paths through the pearl necklace. The weight of the longest path in this graph is equal to the minimal amount of memory needed to implement the encoder. The algorithm has a polynomial-time complexity in the number of gate strings in the pearl-necklace encoder.
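The key quantity in the abstract above is the weight of the longest path in a weighted graph built from the non-commuting gate strings. As a rough illustration of that final step only, the sketch below computes the longest-path weight for a small hypothetical graph (assumed to be acyclic); constructing the graph from the commutativity relations of an actual pearl-necklace encoder is not shown.

```python
# Minimal sketch: weight of the longest path in a weighted DAG, the quantity the paper
# equates with the minimal encoder memory. The edge list below is hypothetical and is
# not derived from a real pearl-necklace encoder.
from functools import lru_cache

def longest_path_weight(edges):
    """edges: iterable of (u, v, w) in an acyclic graph; returns the maximum path weight."""
    succs = {}
    for u, v, w in edges:
        succs.setdefault(u, []).append((v, w))

    @lru_cache(maxsize=None)
    def best_from(node):
        return max((w + best_from(v) for v, w in succs.get(node, [])), default=0)

    nodes = set(succs) | {v for _, v, _ in edges}
    return max(best_from(n) for n in nodes)

# Hypothetical non-commutativity relations between gate strings g1..g4:
edges = [("g1", "g2", 1), ("g2", "g3", 2), ("g1", "g3", 1), ("g3", "g4", 1)]
print(longest_path_weight(edges))  # -> 4, i.e. four memory qubits in this toy case
```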
The Crash Outcome Data Evaluation System (CODES)
DOT National Transportation Integrated Search
1996-01-01
The CODES Technical Report presents state-specific results from the Crash Outcome Data Evaluation System project. These results confirm previous NHTSA studies and show that safety belts and motorcycle helmets are effective in reducing fatalitie...
Medical decision making: guide to improved CPT coding.
Holt, Jim; Warsy, Ambreen; Wright, Paula
2010-04-01
The Current Procedural Terminology (CPT) coding system for office visits, which has been in use since 1995, has not been well studied, but it is generally agreed that the system contains much room for error. In fact, the available literature suggests that only slightly more than half of physicians will agree on the same CPT code for a given visit, and only 60% of professional coders will agree on the same code for a particular visit. In addition, the criteria used to assign a code are often related to the amount of written documentation. The goal of this study was to evaluate two novel methods of assessing whether the most appropriate CPT code is used: the level of medical decision making, and the sum of all problems mentioned by the patient during the visit. The authors (a professional coder, a residency faculty member, and a PGY-3 family medicine resident) reviewed 351 randomly selected visit notes from two residency programs in the Northeast Tennessee region for the level of documentation, the level of medical decision making, and the total number of problems addressed. The authors assigned appropriate CPT codes at each of those three levels. Substantial undercoding occurred at each of the three levels: approximately 33% of visits were undercoded based on the written documentation, approximately 50% based on the level of documented medical decision making, and approximately 80% based on the total number of problems that the patient presented during the visit. Interrater agreement was fair, and similar to that noted in other coding studies. Undercoding is not only common in a family medicine residency program, but it also occurs at levels that would not be evident from a simple audit of the documentation on the visit note. Undercoding also occurs from not exploring problems mentioned by the patient and not documenting additional work that was performed. Family physicians may benefit from minor alterations in their documentation of office visit notes.
Levesque, Eric; Hoti, Emir; de La Serna, Sofia; Habouchi, Houssam; Ichai, Philippe; Saliba, Faouzi; Samuel, Didier; Azoulay, Daniel
2013-03-01
In the French healthcare system, the intensive care budget allocated is directly dependent on the activity level of the center. To evaluate this activity level, it is necessary to code the medical diagnoses and procedures performed on Intensive Care Unit (ICU) patients. The aim of this study was to evaluate the effects of using an Intensive Care Information System (ICIS) on the incidence of coding errors and its impact on the ICU budget allocated. Since 2005, the documentation on and monitoring of every patient admitted to our ICU has been carried out using an ICIS; however, the coding process was performed manually until 2008. This study focused on two periods, the period of manual coding (year 2007) and the period of computerized coding (year 2008), which together covered a total of 1403 ICU patients. The time spent on the coding process, the rate of coding errors (defined as patients missed/not coded or wrongly identified as undergoing major procedures) and the financial impact were evaluated for these two periods. With computerized coding, the time per admission decreased significantly (from 6.8 ± 2.8 min in 2007 to 3.6 ± 1.9 min in 2008, p<0.001). Similarly, a reduction in coding errors was observed (7.9% vs. 2.2%, p<0.001). This decrease in coding errors resulted in a reduced difference between the potential and real ICU financial supplements obtained in the respective years (a €194,139 loss in 2007 vs. a €1628 loss in 2008). Using specific computer programs improves the intensive process of manual coding by shortening the time required as well as reducing errors, which in turn positively impacts the ICU budget allocation.
Improvements of the particle-in-cell code EUTERPE for petascaling machines
NASA Astrophysics Data System (ADS)
Sáez, Xavier; Soba, Alejandro; Sánchez, Edilberto; Kleiber, Ralf; Castejón, Francisco; Cela, José M.
2011-09-01
In the present work we report some performance measures and computational improvements recently carried out using the gyrokinetic code EUTERPE (Jost, 2000 [1] and Jost et al., 1999 [2]), which is based on the general particle-in-cell (PIC) method. The scalability of the code has been studied for up to sixty thousand processing elements and some steps towards a complete hybridization of the code were made. As a numerical example, non-linear simulations of Ion Temperature Gradient (ITG) instabilities have been carried out in screw-pinch geometry and the results are compared with earlier works. A parametric study of the influence of variables (step size of the time integrator, number of markers, grid size) on the quality of the simulation is presented.
The queueing perspective of asynchronous network coding in two-way relay network
NASA Astrophysics Data System (ADS)
Liang, Yaping; Chang, Qing; Li, Xianxu
2018-04-01
Asynchronous network coding (NC) has the potential to improve wireless network performance compared with routing or synchronous network coding. Recent research concentrates on the trade-off between throughput/energy consumption and delay for a pair of independent input flows. However, the implementation of NC requires a thorough investigation of its impact on the relevant queueing systems, which few works address. Moreover, few works study the probability density function (pdf) of the traffic in a network coding scenario. In this paper, the scenario with two independent Poisson input flows and one output flow is considered. The asynchronous NC-based strategy is that a new arrival evicts the head packet held in its own queue while that packet is waiting for a packet from the other flow to encode with. The pdf of the output flow, which contains both coded and uncoded packets, is derived. In addition, the statistical characteristics of this strategy are analyzed. These results are verified by numerical simulations.
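As a rough, non-authoritative illustration of the eviction strategy described above (my reading of the abstract, not the authors' model or code), the sketch below runs a small Monte Carlo simulation of two Poisson flows feeding a relay that either encodes a waiting packet with an arrival from the other flow or evicts it when an arrival from the same flow appears; the arrival rates and time horizon are arbitrary. The inter-departure times collected in `gaps` could be histogrammed and compared against an analytical pdf.

```python
# Monte Carlo sketch of an eviction-based asynchronous NC strategy (one possible reading
# of the abstract, not the authors' code): a waiting head packet is either encoded with
# an arrival from the other flow or evicted (sent uncoded) by an arrival from its own
# flow. Rates and horizon are arbitrary illustration values.
import random

def simulate(rate_a=1.0, rate_b=0.8, horizon=200000.0, seed=1):
    rng = random.Random(seed)
    t_a = rng.expovariate(rate_a)
    t_b = rng.expovariate(rate_b)
    waiting = None            # None, "a", or "b": flow of the packet held at the relay
    departures = []           # (time, "coded" | "uncoded")
    while min(t_a, t_b) < horizon:
        if t_a < t_b:
            now, flow, t_a = t_a, "a", t_a + rng.expovariate(rate_a)
        else:
            now, flow, t_b = t_b, "b", t_b + rng.expovariate(rate_b)
        if waiting is None:
            waiting = flow
        elif waiting != flow:            # partner found: encode and transmit
            departures.append((now, "coded"))
            waiting = None
        else:                            # same flow: evict the held packet uncoded
            departures.append((now, "uncoded"))
            waiting = flow
    gaps = [b[0] - a[0] for a, b in zip(departures, departures[1:])]
    coded = sum(1 for _, kind in departures if kind == "coded")
    return coded / len(departures), sum(gaps) / len(gaps)

coded_fraction, mean_interdeparture = simulate()
print(f"coded fraction ~ {coded_fraction:.3f}, mean inter-departure ~ {mean_interdeparture:.3f}")
```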
Mobile Code: The Future of the Internet
1999-01-01
... code (mobile agents) to multiple proxies or servers; "customization" (e.g., re-formatting, filtering, metasearch); information overload; diversified ... Mobile code is necessary, rather than client-side code, since many customization features (such as information monitoring) do not work if the ... economic foundation for Web sites: many Web sites earn money solely from advertisements. If these sites allow mobile agents to easily access the content ...
Impacts of Model Building Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athalye, Rahul A.; Sivaraman, Deepak; Elliott, Douglas B.
The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code's requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.
Initial Kernel Timing Using a Simple PIM Performance Model
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Block, Gary L.; Springer, Paul L.; Sterling, Thomas; Brockman, Jay B.; Callahan, David
2005-01-01
This presentation will describe some initial results of paper-and-pencil studies of 4 or 5 application kernels applied to a processor-in-memory (PIM) system roughly similar to the Cascade Lightweight Processor (LWP). The application kernels are: linked list traversal, sum of leaf nodes on a tree, bitonic sort, vector sum, and Gaussian elimination. The intent of this work is to guide and validate work on the Cascade project in the areas of compilers, simulators, and languages. We will first discuss the generic PIM structure. Then, we will explain the concepts needed to program a parallel PIM system (locality, threads, parcels). Next, we will present a simple PIM performance model that will be used in the remainder of the presentation. For each kernel, we will then present a set of codes, including codes for a single PIM node, and codes for multiple PIM nodes that move data to threads and move threads to data. These codes are written at a fairly low level, between assembly and C, but much closer to C than to assembly. For each code, we will present some hand-drafted timing forecasts, based on the simple PIM performance model. Finally, we will conclude by discussing what we have learned from this work, including what programming styles seem to work best, from the point of view of both expressiveness and performance.
Decay heat uncertainty quantification of MYRRHA
NASA Astrophysics Data System (ADS)
Fiorito, Luca; Buss, Oliver; Hoefer, Axel; Stankovskiy, Alexey; Eynde, Gert Van den
2017-09-01
MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.
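The uncertainty propagation described above rests on repeated sampling of perturbed nuclear data followed by re-evaluation of the decay heat. The sketch below illustrates that generic idea only; the three-nuclide inventory, decay constants, mean decay energies, and the flat 5% relative uncertainties are invented for the example and have nothing to do with MYRRHA, ALEPH-2, NUDUNA, or SANDY.

```python
# Generic illustration of decay-heat uncertainty propagation by random sampling of
# nuclear data. The nuclide inventory, decay energies, half-lives, and 5% relative
# uncertainties are invented for the sketch and bear no relation to MYRRHA.
import math, random

NUCLIDES = {  # name: (atoms, decay constant [1/s], mean decay energy [MeV])
    "Nuc1": (1.0e20, math.log(2) / 3.0e3, 0.50),
    "Nuc2": (5.0e19, math.log(2) / 8.0e4, 1.20),
    "Nuc3": (2.0e19, math.log(2) / 2.0e6, 0.30),
}
MEV_TO_W = 1.602e-13  # 1 MeV/s expressed in watts

def decay_heat(cooling_time, data):
    return sum(n0 * lam * math.exp(-lam * cooling_time) * q * MEV_TO_W
               for n0, lam, q in data.values())

def sampled_heat(cooling_time, rel_unc=0.05, samples=2000, seed=0):
    rng = random.Random(seed)
    results = []
    for _ in range(samples):
        perturbed = {k: (n0, lam * rng.gauss(1.0, rel_unc), q * rng.gauss(1.0, rel_unc))
                     for k, (n0, lam, q) in NUCLIDES.items()}
        results.append(decay_heat(cooling_time, perturbed))
    mean = sum(results) / samples
    std = (sum((x - mean) ** 2 for x in results) / (samples - 1)) ** 0.5
    return mean, std

mean, std = sampled_heat(cooling_time=3600.0)
print(f"decay heat ~ {mean:.3e} W, 1-sigma uncertainty ~ {100 * std / mean:.1f} %")
```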
COR V2: teaching observational research with multimedia courseware.
Blasko, Dawn G; Kazmerski, Victoria A; Torgerson, Carla N
2004-05-01
Courseware for Observational Research (COR Version 2) is an interactive multimedia program designed to teach the foundation of the scientific method: systematic observation. COR uses digital video with interactive coding to teach basic concepts, such as creating precise operational definitions; using frequency, interval, and duration coding; developing sampling strategies; and analyzing and interpreting data. Through lessons, a case study, and laboratory exercises, it gradually scaffolds students from teacher-directed learning into self-directed learning. The newest addition to COR is a case study in which students work collaboratively, using their own observations to make recommendations about a child's disruptive behavior in an after-school program. Evaluations of the lessons showed that classes using COR received better grades on their field observations than did those using more traditional methods. Students' confidence and knowledge increased as they moved through each section of the program.
Real-space processing of helical filaments in SPARX
Behrmann, Elmar; Tao, Guozhi; Stokes, David L.; Egelman, Edward H.; Raunser, Stefan; Penczek, Pawel A.
2012-01-01
We present a major revision of the iterative helical real-space refinement (IHRSR) procedure and its implementation in the SPARX single particle image processing environment. We built on over a decade of experience with IHRSR helical structure determination and took advantage of the flexible SPARX infrastructure to arrive at an implementation that offers ease of use, flexibility in designing helical structure determination strategies, and high computational efficiency. We introduced 3D projection matching code that can now work with non-cubic volumes, a geometry better suited for long helical filaments; we enhanced procedures for establishing helical symmetry parameters; and we parallelized the code using the distributed memory paradigm. Additional features include a graphical user interface that facilitates entering and editing the parameters controlling the structure determination strategy of the program. In addition, we present a novel approach to detect and evaluate structural heterogeneity due to conformer mixtures that takes advantage of helical structure redundancy. PMID:22248449
Use of Spacecraft Command Language for Advanced Command and Control Applications
NASA Technical Reports Server (NTRS)
Mims, Tikiela L.
2008-01-01
The purpose of this work is to evaluate the use of SCL in building and monitoring command and control applications in order to determine its fitness for space operations. Approximately 24,325 lines of PCG2 code were converted to SCL, yielding a 90% reduction in the number of lines of code, as many of the functions and scripts utilized in SCL could be ported and reused. Automated standalone testing, simulating the actual production environment, was performed in order to generalize and gauge the relative time it takes for SCL to update and write a given display. The use of SCL rules, functions, and scripts allowed the creation of several test cases permitting measurement of the time it takes to update a given set of measurements when a globally existing CUI changes. It took the SCL system an average of 926.09 ticks to update the entire display of 323 measurements.
Bayesian Models for Astrophysical Data Using R, JAGS, Python, and Stan
NASA Astrophysics Data System (ADS)
Hilbe, Joseph M.; de Souza, Rafael S.; Ishida, Emille E. O.
2017-05-01
This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.
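As a flavour of the hands-on approach the book promotes (this sketch is not taken from the book), the following code fits the simplest case mentioned above, a normal model with unknown mean and known standard deviation, using a random-walk Metropolis sampler on synthetic data; the prior width, step size, and burn-in fraction are arbitrary choices.

```python
# Minimal sketch (not code from the book): a random-walk Metropolis sampler for the
# Bayesian normal model with unknown mean, known sigma, and a Normal(0, 10) prior.
# The data are synthetic and used purely for illustration.
import math, random

random.seed(42)
sigma = 1.0
data = [random.gauss(2.0, sigma) for _ in range(50)]   # synthetic observations

def log_posterior(mu):
    log_prior = -0.5 * (mu / 10.0) ** 2                 # Normal(0, 10) prior, up to a constant
    log_like = sum(-0.5 * ((x - mu) / sigma) ** 2 for x in data)
    return log_prior + log_like

def metropolis(n_iter=20000, step=0.2, start=0.0):
    mu, samples = start, []
    lp = log_posterior(mu)
    for _ in range(n_iter):
        proposal = mu + random.gauss(0.0, step)
        lp_prop = log_posterior(proposal)
        if math.log(random.random()) < lp_prop - lp:    # Metropolis accept/reject
            mu, lp = proposal, lp_prop
        samples.append(mu)
    return samples[n_iter // 2:]                        # discard burn-in

draws = metropolis()
print(f"posterior mean of mu ~ {sum(draws) / len(draws):.3f}")
```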
Developing Information Power Grid Based Algorithms and Software
NASA Technical Reports Server (NTRS)
Dongarra, Jack
1998-01-01
This exploratory study initiated our effort to understand performance modeling on parallel systems. The basic goal of performance modeling is to understand and predict the performance of a computer program or set of programs on a computer system. Performance modeling has numerous applications, including evaluation of algorithms, optimization of code implementations, parallel library development, comparison of system architectures, parallel system design, and procurement of new systems. Our work lays the basis for the construction of parallel libraries that allow for the reconstruction of application codes on several distinct architectures so as to assure performance portability. Following our strategy, once the requirements of applications are well understood, one can then construct a library in a layered fashion. The top level of this library will consist of architecture-independent geometric, numerical, and symbolic algorithms that are needed by the sample of applications. These routines should be written in a language that is portable across the targeted architectures.
NASA Technical Reports Server (NTRS)
McManus, Hugh L.; Chamis, Christos C.
1996-01-01
This report describes analytical methods for calculating stresses and damage caused by degradation of the matrix constituent in polymer matrix composite materials. Laminate geometry, material properties, and matrix degradation states are specified as functions of position and time. Matrix shrinkage and property changes are modeled as functions of the degradation states. The model is incorporated into an existing composite mechanics computer code. Stresses, strains, and deformations at the laminate, ply, and micro levels are calculated, and from these calculations it is determined if there is failure of any kind. The rationale for the model (based on published experimental work) is presented, its integration into the laminate analysis code is outlined, and example results are given, with comparisons to existing material and structural data. The mechanisms behind the changes in properties and in surface cracking during long-term aging of polyimide matrix composites are clarified. High-temperature-material test methods are also evaluated.
Quantifying consumption rates of dissolved oxygen along bed forms
NASA Astrophysics Data System (ADS)
Boano, Fulvio; De Falco, Natalie; Arnon, Shai
2016-04-01
Streambed interfaces represent hotspots for nutrient transformations because they host different microbial species, and the evaluation of these reaction rates is important to assess the fate of nutrients in riverine environments. In this work we analyze a series of flume experiments on oxygen demand in dune-shaped hyporheic sediments under losing and gaining flow conditions. We employ a new modeling code to quantify oxygen consumption rates from observed vertical profiles of oxygen concentration. The code accounts for transport by molecular diffusion and water advection, and automatically determines the reaction rates that provide the best fit between observed and modeled concentration values. The results show that reaction rates are not uniformly distributed across the streambed, in agreement with the expected behavior predicted by hyporheic exchange theory. Oxygen consumption was found to be highly influenced by the presence of gaining or losing flow conditions, which controlled the delivery of labile DOC to streambed microorganisms.
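The abstract describes an inverse procedure: solve a transport model for a trial consumption rate and adjust the rate until the modelled profile matches the measured one. The sketch below mimics that idea under strong simplifying assumptions (steady state, one spatial dimension, a single zero-order consumption rate, fixed concentration at the interface and zero gradient at depth); the parameter values and the "observed" profile are synthetic and are not the flume data or the authors' code.

```python
# Illustrative inverse fit of an oxygen consumption rate R from a vertical profile,
# assuming steady-state 1-D diffusion-advection with zero-order consumption:
#   D*C'' - v*C' - R = 0,  C(0) = c0,  dC/dz = 0 at the bottom.
# All parameter values and the "observed" profile are synthetic.
import numpy as np
from scipy.optimize import minimize_scalar

D = 1.5e-9      # molecular diffusion coefficient [m^2/s] (assumed)
v = 1.0e-6      # downward advection velocity [m/s] (assumed losing conditions)
c0 = 8.0        # oxygen concentration at the interface [mg/L]
z = np.linspace(0.0, 0.05, 51)   # depth grid [m]
dz = z[1] - z[0]

def model_profile(rate):
    """Finite-difference solution of D*C'' - v*C' = rate with the stated boundaries."""
    n = z.size
    A = np.zeros((n, n)); b = np.full(n, rate)
    A[0, 0], b[0] = 1.0, c0                       # fixed concentration at the interface
    for i in range(1, n - 1):
        A[i, i - 1] = D / dz**2 + v / (2 * dz)
        A[i, i]     = -2 * D / dz**2
        A[i, i + 1] = D / dz**2 - v / (2 * dz)
    A[-1, -1], A[-1, -2], b[-1] = 1.0, -1.0, 0.0  # zero-gradient bottom boundary
    return np.linalg.solve(A, b)

true_rate = 5.0e-6                                # synthetic "true" rate [mg/L/s]
observed = model_profile(true_rate) + np.random.default_rng(0).normal(0, 0.05, z.size)

fit = minimize_scalar(lambda r: np.sum((model_profile(r) - observed) ** 2),
                      bounds=(1e-7, 1e-4), method="bounded")
print(f"fitted consumption rate ~ {fit.x:.2e} mg/L/s (true {true_rate:.2e})")
```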
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Continuing the tradition established in prior years, this panel encompasses one of the broadest ranges of topics and issues of any panel at the Summer Study. It includes papers addressing all sectors, low-income residential to industrial, and views energy efficiency from many perspectives including programmatic, evaluation, codes, standards, legislation, technical transfer, economic development, and least-cost planning. The papers represent work being performed in most geographic regions of the United States and in the international arena, specifically Thailand, China, Europe, and Scandinavia. This delightful smorgasbord has been organized, based on general content area, into the following eight sessions: (1) new directions for low-income weatherization; (2) pursuing efficiency through legislation and standards; (3) international perspectives on energy efficiency; (4) technical transfer strategies; (5) government energy policy; (6) commercial codes and standards; (7) innovative programs; and, (8) state-of-the-art review. For these conference proceedings, individual papers are processed separately for the Energy Data Base.
Electron Thermalization in the Solar Wind and Planetary Plasma Boundaries
NASA Technical Reports Server (NTRS)
Krauss-Varban, Dietmar
1998-01-01
The work carried out under this contract attempts a better understanding of whistler wave generation and associated scattering of electrons in the solar wind. This task is accomplished through simulations using a particle-in-cell code and a Vlasov code. In addition, the work is supported by the utilization of a linear kinetic dispersion solver. Previously, we have concentrated on gaining a better understanding of the linear mode properties, and have tested the simulation codes within a known parameter regime. We are now in a new phase in which we implement, execute, and analyze production simulations. This phase is projected to last over several reporting periods, with this being the second cycle. In addition, we have started to research to what extent the evolution of the pertinent instabilities is two-dimensional. We are also continuing our work on the visualization aspects of the simulation results, and on a code version that runs on single-user Alpha-processor based workstations.
Error Reduction Program. [combustor performance evaluation codes
NASA Technical Reports Server (NTRS)
Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.
1985-01-01
The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two dimensional and three dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.
NASA Technical Reports Server (NTRS)
Schuster, David M.; Scott, Robert C.; Bartels, Robert E.; Edwards, John W.; Bennett, Robert M.
2000-01-01
As computational fluid dynamics methods mature, code development is rapidly transitioning from prediction of steady flowfields to unsteady flows. This change in emphasis offers a number of new challenges to the research community, not the least of which is obtaining detailed, accurate unsteady experimental data with which to evaluate new methods. Researchers at NASA Langley Research Center (LaRC) have been actively measuring unsteady pressure distributions for nearly 40 years. Over the last 20 years, these measurements have focused on developing high-quality datasets for use in code evaluation. This paper provides a sample of unsteady pressure measurements obtained by LaRC and available for government, university, and industry researchers to evaluate new and existing unsteady aerodynamic analysis methods. A number of cases are highlighted and discussed with attention focused on the unique character of the individual datasets and their perceived usefulness for code evaluation. Ongoing LaRC research in this area is also presented.
Hanford business structure for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
The Hanford Business Structure integrates the project's technical, schedule, and cost baselines; implements the use of a standard code of accounts; and streamlines performance reporting and cost collection. Technical requirements drive the technical functions and come from the RDD 100 database. The functions will be identified in the P3 scheduling system and also in the PeopleSoft system. Projects will break their work down from the technical requirements in the P3 schedules. When the level at which they want to track cost via the code of accounts is reached, a Project ID will be generated in the PeopleSoft system. P3 may carry more detailed schedules below the Project ID level. The standard code of accounts will identify discrete work activities done across the site and various projects. They will include direct and overhead type work scopes. Activities in P3 will roll up to this standard code of accounts. The field that will be used to record this in PeopleSoft is "Activity". In Passport it is a user-defined field. It will have to be added to other feeder systems. Project ID and code of accounts are required fields on all cost records.
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
29 CFR 1915.90 - Safety color code for marking physical hazards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 7 2013-07-01 2013-07-01 false Safety color code for marking physical hazards. 1915.90 Section 1915.90 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... General Working Conditions § 1915.90 Safety color code for marking physical hazards. The requirements...
29 CFR 1915.90 - Safety color code for marking physical hazards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 7 2014-07-01 2014-07-01 false Safety color code for marking physical hazards. 1915.90 Section 1915.90 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... General Working Conditions § 1915.90 Safety color code for marking physical hazards. The requirements...
29 CFR 1915.90 - Safety color code for marking physical hazards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 7 2012-07-01 2012-07-01 false Safety color code for marking physical hazards. 1915.90 Section 1915.90 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... General Working Conditions § 1915.90 Safety color code for marking physical hazards. The requirements...
Design and evaluation of sparse quantization index modulation watermarking schemes
NASA Astrophysics Data System (ADS)
Cornelis, Bruno; Barbarien, Joeri; Dooms, Ann; Munteanu, Adrian; Cornelis, Jan; Schelkens, Peter
2008-08-01
In the past decade the use of digital data has increased significantly. The advantages of digital data are, amongst others, easy editing, fast, cheap and cross-platform distribution and compact storage. The most crucial disadvantages are the unauthorized copying and copyright issues, by which authors and license holders can suffer considerable financial losses. Many inexpensive methods are readily available for editing digital data and, unlike analog information, the reproduction in the digital case is simple and robust. Hence, there is great interest in developing technology that helps to protect the integrity of a digital work and the copyrights of its owners. Watermarking, which is the embedding of a signal (known as the watermark) into the original digital data, is one method that has been proposed for the protection of digital media elements such as audio, video and images. In this article, we examine watermarking schemes for still images, based on selective quantization of the coefficients of a wavelet transformed image, i.e. sparse quantization-index modulation (QIM) watermarking. Different grouping schemes for the wavelet coefficients are evaluated and experimentally verified for robustness against several attacks. Wavelet tree-based grouping schemes yield a slightly improved performance over block-based grouping schemes. Additionally, the impact of the deployment of error correction codes on the most promising configurations is examined. The utilization of BCH-codes (Bose, Ray-Chaudhuri, Hocquenghem) results in an improved robustness as long as the capacity of the error codes is not exceeded (cliff-effect).
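To make the embedding principle behind quantization-index modulation concrete, the sketch below implements plain scalar QIM on a vector of coefficients: each selected coefficient is snapped to one of two interleaved quantization lattices chosen by the watermark bit, and extraction picks the nearer lattice. This is only the basic mechanism; the paper's sparse, wavelet-tree-based grouping and the BCH error-correction layer are not reproduced, and the step size and noise level are arbitrary.

```python
# Minimal sketch of scalar quantization-index modulation (QIM): one bit per coefficient,
# embedded by quantizing to the lattice selected by the bit. This illustrates only the
# basic embedding/extraction principle, not the paper's sparse wavelet-tree scheme.
import numpy as np

def qim_embed(coeffs, bits, delta=4.0):
    """Embed one bit per coefficient by quantizing to the lattice selected by the bit."""
    offsets = np.asarray(bits) * (delta / 2.0)
    return delta * np.round((coeffs - offsets) / delta) + offsets

def qim_extract(coeffs, delta=4.0):
    """Recover bits by choosing the nearer of the two lattices for each coefficient."""
    d0 = np.abs(coeffs - delta * np.round(coeffs / delta))
    d1 = np.abs(coeffs - (delta * np.round((coeffs - delta / 2) / delta) + delta / 2))
    return (d1 < d0).astype(int)

rng = np.random.default_rng(0)
coeffs = rng.normal(0.0, 10.0, 64)                  # stand-in for selected wavelet coefficients
bits = rng.integers(0, 2, 64)
watermarked = qim_embed(coeffs, bits, delta=4.0)
attacked = watermarked + rng.normal(0.0, 0.8, 64)   # mild additive-noise "attack"
recovered = qim_extract(attacked, delta=4.0)
print("bit error rate:", np.mean(recovered != bits))
```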
Simulation of the ELMs triggering by lithium pellet on EAST tokamak using BOUT++
NASA Astrophysics Data System (ADS)
Wang, Y. M.; Xu, X. Q.; Wang, Z.; Sun, Z.; Hu, J. S.; Gao, X.
2017-10-01
A new lithium granule injector (LGI) was developed on EAST. Using the LGI, lithium granules can be efficiently injected into the EAST tokamak with granule radii of 0.2-1 mm and granule velocities of 30-110 m/s. ELM pacing was realized during EAST shot #70123 in the time window 4.4-4.7 s; the average velocity of the pellets was 75 m/s and the average injection rate was 99 Hz. The BOUT++ 6-field electromagnetic turbulence code has been used to simulate the ELM pacing process. A neutral gas shielding (NGS) model has been implemented for the pellet ablation process. The neutral transport code is used to evaluate the ionized electron and Li ion densities, with charge exchange as a dominant factor in the neutral cloud diffusion process. The snapshot plasma profiles during the pellet ablation and toroidal symmetrization process are used in the 6-field turbulence code to evaluate the impact of the pellets on ELMs. Destabilizing effects on the peeling-ballooning modes are found with lithium pellet injection, which is consistent with the experimental results. A scan of the pellet size, shape and injection velocity will be conducted, which will benefit pellet injection design in both present and future devices. Prepared by LLNL under Contract DE-AC52-07NA27344; this work is supported by the National Natural Science Foundation of China (Grant No. 11505221) and the China Scholarship Council (Grant No. 201504910132).
An Accessible User Interface for Geoscience and Programming
NASA Astrophysics Data System (ADS)
Sevre, E. O.; Lee, S.
2012-12-01
The goal of this research is to develop an interface that will simplify user interaction with software for scientists. The motivating factor of the research is to develop tools that assist scientists with limited motor skills in the efficient generation and use of software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program to help geophysicists write programs using a simple interface that requires only single mouse clicks to input code. The goal is to minimize the amount of typing necessary to create simple programs and scripts, in order to increase accessibility for people with disabilities limiting fine motor skills. This interface can be adapted for various programming and scripting languages. Using this interface will simplify development of code for C/C++, Java, and GMT, and it can be expanded to support any other text-based programming language. The interface is designed around the concept of maximizing the amount of code that can be written using a minimum of clicks and typing. The screen is split into two sections: a list of click-commands on the left hand side, and a text area on the right hand side. When the user clicks on a command on the left hand side, the applicable code is automatically inserted at the insertion point in the text area. Currently, in the C/C++ interface, there are commands for common code segments that are often used, such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that will work across many devices for developing code. A simple prototype has been developed for the iPad. Due to the limited number of devices that an iOS application can be used with, the code has been re-written in Java to run on a wider range of devices. Currently, the software works in prototype mode, and our goal is to develop it further into software that can benefit a wide range of people working in geosciences, making code development practical and accessible for a wider audience of scientists. By using an interface like this, the potential for errors is reduced by reusing known working code.
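As a toy illustration of the click-to-insert idea described above (this is not the author's prototype, which targets the iPad and a Java rewrite), the sketch below builds a minimal desktop version with Python's standard Tkinter: a command list on the left inserts snippets at the caret of a text area on the right, so short programs can be assembled with single mouse clicks. The snippet labels and contents are invented for the example.

```python
# Toy sketch of a click-to-insert coding interface: clicking a command label on the
# left inserts the corresponding code snippet at the caret of the text area on the
# right. Snippets are illustrative only.
import tkinter as tk

SNIPPETS = {  # command label -> snippet inserted at the cursor
    "for loop": "for i in range(10):\n    pass\n",
    "print": "print()\n",
    "comment": "# comment\n",
    "function": "def my_function():\n    pass\n",
}

def insert_snippet(event):
    selection = command_list.curselection()
    if selection:
        label = command_list.get(selection[0])
        text_area.insert(tk.INSERT, SNIPPETS[label])   # insert at the caret position

root = tk.Tk()
root.title("Click-to-code sketch")
command_list = tk.Listbox(root, width=20)
for label in SNIPPETS:
    command_list.insert(tk.END, label)
command_list.bind("<<ListboxSelect>>", insert_snippet)
command_list.pack(side=tk.LEFT, fill=tk.Y)
text_area = tk.Text(root, width=60, height=20)
text_area.pack(side=tk.RIGHT, fill=tk.BOTH, expand=True)
root.mainloop()
```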
Farzandipour, Mehrdad; Sheikhtaheri, Abbas
2009-01-01
To evaluate the accuracy of procedural coding and the factors that influence it, 246 records were randomly selected from four teaching hospitals in Kashan, Iran. “Recodes” were assigned blindly and then compared to the original codes. Furthermore, the coders' professional behaviors were carefully observed during the coding process. Coding errors were classified as major or minor. The relations between coding accuracy and possible effective factors were analyzed by χ2 or Fisher exact tests as well as the odds ratio (OR) and the 95 percent confidence interval for the OR. The results showed that using a tabular index for rechecking codes reduces errors (83 percent vs. 72 percent accuracy). Further, more thorough documentation by the clinician positively affected coding accuracy, though this relation was not significant. Readability of records decreased errors overall (p = .003), including major ones (p = .012). Moreover, records with no abbreviations had fewer major errors (p = .021). In conclusion, not using abbreviations, ensuring more readable documentation, and paying more attention to available information increased coding accuracy and the quality of procedure databases. PMID:19471647
Sanders, Elizabeth A.; Berninger, Virginia W.; Abbott, Robert D.
2017-01-01
Sequential regression was used to evaluate whether language-related working memory components uniquely predict reading and writing achievement beyond cognitive-linguistic translation for students in grades 4–9 (N=103) with specific learning disabilities (SLDs) in subword handwriting (dysgraphia, n=25), word reading and spelling (dyslexia, n=60), or oral and written language (OWL LD, n=18). That is, SLDs are defined on the basis of the cascading level of language impairment (subword, word, and syntax/text). A 5-block regression model sequentially predicted literacy achievement from cognitive-linguistic translation (Block 1); working memory components for word form coding (Block 2), phonological and orthographic loops (Block 3), and supervisory focused or switching attention (Block 4); and SLD groups (Block 5). Results showed that cognitive-linguistic translation explained an average of 27% and 15% of the variance in reading and writing achievement, respectively, but working memory components explained an additional 39% and 27% of the variance. Orthographic word form coding uniquely predicted nearly every measure, whereas attention switching only uniquely predicted reading. Finally, differences in reading and writing persisted between dyslexia and dysgraphia, with dysgraphia higher, even after controlling for Block 1 to 4 predictors. Differences in literacy achievement between students with dyslexia and OWL LD were largely explained by the Block 1 predictors. Applications to identifying and teaching students with these SLDs are discussed. PMID:28199175
Coded Statutory Data Sets for Evaluation of Public Health Law
ERIC Educational Resources Information Center
Costich, Julia Field
2012-01-01
Background and objectives: The evaluation of public health law requires reliable accounts of underlying statutes and regulations. States often enact public health-related statutes with nonuniform provisions, and variation in the structure of state legal codes can foster inaccuracy in evaluating the impact of specific categories of law. The optimal…
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-05-17
PelePhysics is a suite of physics packages that provides functionality of use to reacting hydrodynamics CFD codes. The initial release includes an interface to reaction rate mechanism evaluation, transport coefficient evaluation, and a generalized equation of state (EOS) facility. Both generic evaluators and interfaces to code from externally available tools (Fuego for chemical rates, EGLib for transport coefficients) are provided.
Terahertz wave manipulation based on multi-bit coding artificial electromagnetic surfaces
NASA Astrophysics Data System (ADS)
Li, Jiu-Sheng; Zhao, Ze-Jiang; Yao, Jian-Quan
2018-05-01
A polarization-insensitive multi-bit coding artificial electromagnetic surface is proposed for terahertz wave manipulation. The coding artificial electromagnetic surfaces, composed of four-arrow-shaped particles arranged in certain coding sequences, can realize multi-bit coding at terahertz frequencies and steer the reflected terahertz waves into numerous directions through different coding distributions. Furthermore, we demonstrate that our coding artificial electromagnetic surfaces can strongly reduce the radar cross section, insensitively to polarization, for TE and TM incident terahertz waves as well as for linearly and circularly polarized terahertz waves. This work offers an effective strategy to realize more powerful manipulation of terahertz waves.
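To give a feel for how a coding distribution steers the reflected beam, the sketch below evaluates the array factor of a simple 1-bit coding pattern (stripes of "0011" along one axis), which for half-wavelength elements is expected to scatter normal incidence into lobes near 30 degrees. The operating frequency, element period, aperture size, and coding sequence are assumptions for illustration and do not correspond to the four-arrow-shaped design in the paper.

```python
# Illustrative far-field sketch for a 1-bit coding surface (not the paper's design):
# elements reflect with phase 0 or pi according to a coding matrix and the scattered
# field is approximated by the array factor. Frequency, period, and the 0011-striped
# coding sequence are assumptions.
import numpy as np

c = 3.0e8
freq = 1.0e12                       # assumed 1 THz design frequency
lam = c / freq
k = 2.0 * np.pi / lam
d = lam / 2.0                       # assumed element period
N = 32                              # N x N elements

# "00110011..." stripes along x -> supercell period 4d, so anomalous reflection is
# expected near asin(lam / (4 d)) = 30 degrees for normal incidence.
column_bits = (np.arange(N) // 2) % 2
phase = np.pi * np.tile(column_bits[:, None], (1, N))   # 1-bit coding: 0 or pi

theta = np.radians(np.linspace(-89, 89, 1781))          # phi = 0 cut
af = np.array([np.abs(np.sum(np.exp(1j * (phase + k * d * np.sin(t) * np.arange(N)[:, None]))))
               for t in theta])
print(f"predicted lobe: {np.degrees(np.arcsin(lam / (4 * d))):.1f} deg, "
      f"computed peak: {abs(np.degrees(theta[np.argmax(af)])):.1f} deg")
```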
Hetzel, C; Flach, T; Schmidt, C
2012-08-01
This paper is aimed at identifying labour market factors that impact vocational retraining centre participants' return to work at the Employment Agency level, and at comparing the results to the return to work of unemployed people (Social Code Book III). The databases are regional return-to-work rates of 2006 graduates, selected labour market indicators for 2007, and the 2007 labour market classification of the Institute for Employment Research (IAB). The n = 75 Employment Agency districts in which 74.5% of the participants followed up lived were analyzed using analyses of variance and multiple loglinear regression. Compared to the unemployment context (Social Code Book III), the impact of the labour market is much lower and less complex. In the multiple model, the regional unemployment rate and the regional tertiarization rate (size of the service sector) are found to be significant and superior to the IAB classification. Hence, participants' return to work is less dependent on labour market conditions than unemployed people's return to work (Social Code Book III).
Guida, Alessandro; van Dijck, Jean-Philippe; Abrahamse, Elger
2017-05-01
In a recent study, Kreitz et al. (Psychological Research 79:1034-1041, 2015) reported on a relationship between verbal working memory capacity and visuo-spatial attentional breadth. The authors hinted at attentional control as the major link underlying this relationship. We put forward an alternative explanation by framing it within the context of a recent theory on serial order in memory: verbal item sequences entering working memory are coded by adding a spatial context that can be derived from reading/writing habits. The observation by Kreitz et al. (Psychological Research 79:1034-1041, 2015) enriches this framework by suggesting that a larger visuo-spatial attentional breadth allows for internal coding of the verbal items in a more (spatially) distinct manner, thereby increasing working memory performance. As such, Kreitz et al. (Psychological Research 79:1034-1041, 2015) is the first study revealing a functional link between visuo-spatial attentional breadth and verbal working memory size, which strengthens spatial accounts of serial order coding in working memory.
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
Kapeller, Christoph; Kamada, Kyousuke; Ogawa, Hiroshi; Prueckl, Robert; Scharinger, Josef; Guger, Christoph
2014-01-01
A brain-computer interface (BCI) allows the user to control a device or software with brain activity. Many BCIs rely on visual stimuli with constant stimulation cycles that elicit steady-state visual evoked potentials (SSVEP) in the electroencephalogram (EEG). This EEG response can be generated with an LED or a computer screen flashing at a constant frequency, and similar EEG activity can be elicited with pseudo-random stimulation sequences on a screen (code-based BCI). Using electrocorticography (ECoG) instead of EEG promises higher spatial and temporal resolution and leads to more dominant evoked potentials due to visual stimulation. This work is focused on BCIs based on visual evoked potentials (VEP) and their capability as a continuous control interface for augmentation of video applications. One 35-year-old female subject with implanted subdural grids participated in the study. The task was to select one out of four visual targets, each flickering with a code sequence. After a calibration run including 200 code sequences, a linear classifier was used during an evaluation run to identify the selected visual target based on the generated code-based VEPs over 20 trials. Multiple ECoG buffer lengths were tested, and the subject reached a mean online classification accuracy of 99.21% for a window length of 3.15 s. Finally, the subject performed an unsupervised free run in combination with visual feedback of the current selection. Additionally, an algorithm was implemented to suppress false positive selections, which allowed the subject to start and stop the BCI at any time. The code-based BCI system attained very high online accuracy, which makes this approach very promising for control applications where a continuous control signal is needed. PMID:25147509
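A common way to decode code-based VEPs, sketched below on purely synthetic data, is template matching: average the calibration epochs of each target into a template and assign a new epoch to the target whose template correlates best with it. This is only a generic illustration of the approach; the abstract's system uses ECoG data and a linear classifier whose exact form is not specified, and the epoch length, trial counts, and noise level below are arbitrary.

```python
# Minimal template-matching sketch for a code-based VEP BCI (a common approach, not
# necessarily the authors' exact classifier). All data here are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_targets, n_cal, n_samples = 4, 50, 315     # e.g. ~3.15 s of samples per epoch

true_responses = rng.normal(0, 1, (n_targets, n_samples))      # latent code-VEP waveforms
calibration = true_responses[:, None, :] + rng.normal(0, 1.0, (n_targets, n_cal, n_samples))
templates = calibration.mean(axis=1)                            # one template per target

def classify(epoch, templates):
    """Return the index of the template with the highest Pearson correlation."""
    corrs = [np.corrcoef(epoch, t)[0, 1] for t in templates]
    return int(np.argmax(corrs))

# Evaluate on fresh synthetic trials.
n_test, correct = 20, 0
for _ in range(n_test):
    target = rng.integers(n_targets)
    epoch = true_responses[target] + rng.normal(0, 1.0, n_samples)
    correct += classify(epoch, templates) == target
print(f"offline accuracy on synthetic trials: {correct}/{n_test}")
```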
NASA Astrophysics Data System (ADS)
Rueda, Antonio J.; Noguera, José M.; Luque, Adrián
2016-02-01
In recent years GPU computing has gained wide acceptance as a simple low-cost solution for speeding up computationally expensive processing in many scientific and engineering applications. However, in most cases accelerating a traditional CPU implementation on a GPU is a non-trivial task that requires a thorough refactorization of the code and specific optimizations that depend on the architecture of the device. OpenACC is a promising technology that aims at reducing the effort required to accelerate C/C++/Fortran code on an attached multicore device. In principle, with this technology the CPU code only has to be augmented with a few compiler directives to identify the areas to be accelerated and the way in which data have to be moved between the CPU and GPU. Its potential benefits are multiple: better code readability, less development time, lower risk of errors and less dependency on the underlying architecture and future evolution of the GPU technology. Our aim in this work is to evaluate the pros and cons of using OpenACC against native GPU implementations in computationally expensive hydrological applications, using the classic D8 algorithm of O'Callaghan and Mark for river network extraction as a case study. We implemented the flow accumulation step of this algorithm on the CPU, with OpenACC, and in two different CUDA versions, comparing the length and complexity of the code and its performance with different datasets. We find that although OpenACC cannot match the performance of a CUDA-optimized implementation (×3.5 slower on average), it provides a significant performance improvement over a CPU implementation (×2-6) with a far simpler code and less implementation effort.
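For readers unfamiliar with the case study, the sketch below shows the flow accumulation step being accelerated, written here in plain Python for clarity rather than in the C, OpenACC, or CUDA versions the paper compares: every cell drains to its steepest downslope D8 neighbour and contributions are accumulated in order of decreasing elevation. Pit filling and flat resolution, which real D8 implementations need, are ignored, and the small tilted-plane DEM is invented for the example.

```python
# Toy D8 flow accumulation (Python for clarity; the paper works with C, OpenACC and
# CUDA): each cell drains to its steepest downslope neighbour and contributions are
# propagated from high to low elevation. Pits and flats are not handled here.
import numpy as np

def d8_flow_accumulation(dem):
    rows, cols = dem.shape
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    receiver = -np.ones((rows, cols, 2), dtype=int)       # downstream cell of each cell
    for r in range(rows):
        for c in range(cols):
            best_drop, best = 0.0, None
            for dr, dc in neighbours:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best_drop, best = drop, (rr, cc)
            if best is not None:
                receiver[r, c] = best
    acc = np.ones((rows, cols))                            # each cell contributes itself
    for idx in np.argsort(dem, axis=None)[::-1]:           # high to low elevation
        r, c = np.unravel_index(idx, dem.shape)
        rr, cc = receiver[r, c]
        if rr >= 0:
            acc[rr, cc] += acc[r, c]
    return acc

dem = np.add.outer(np.arange(6, 0, -1), np.arange(6, 0, -1)).astype(float)  # tilted plane
print(d8_flow_accumulation(dem))
```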
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barani, T.; Bruschi, E.; Pizzocri, D.
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 Light Water Reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
Eccher, Claudio; Eccher, Lorenzo; Izzo, Umberto
2005-01-01
In this poster we describe the security solutions implemented in a web-based cooperative work framework for managing heart failure patients among the different health care professionals involved in the care process. The solution, developed in close collaboration with the Law Department of the University of Trento, is compliant with the new Italian Personal Data Protection Code, issued in 2003, which also regulates the storing and processing of health data.
Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J
1997-01-01
To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56, UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from reduced clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record.
Brief Overlook on the Occupational Accidents Occurring During the Geotechnical Site Works
NASA Astrophysics Data System (ADS)
Akboğa Kale, Özge; Eskişar, Tuğba
2017-10-01
The aim of this paper is to evaluate occupational accidents reported in geotechnical site works. Variables of the accidents are categorized as the year and month of the accident, the technical codes used for defining the scope of work trades, end use, project type and cost, nature and cause of the accident, occupation of the victim, and, finally, the cause of fatality. The majority of victims were construction laborers or special trade contractors who were working on a new project or on additions to an existing project. The geotechnical phase of the projects involved excavation, landfill, sewer and water treatment, pipeline construction, commercial building, or road construction. The study finds that excavation, trenching, and installing pipe or pile driving were the main causes of accidents, while trench collapse, being struck by a falling object or projectile, and wall collapse were the main causes of fatality. Moreover, more than half of the fatalities were due to asphyxia, followed by fracture. These findings show that accidents occurring in geotechnical works have not only high frequency but also high severity. The study emphasizes that project-specific countermeasures should be taken regarding the nature, cost, and importance of the project and the occupations working on the project.
Uniform emergency codes: will they improve safety?
2005-01-01
There are pros and cons to uniform code systems, according to emergency medicine experts. Uniformity can be a benefit when ED nurses and other staff work at several facilities. It's critical that your staff understand not only what the codes stand for, but what they must do when codes are called. If your state institutes a new system, be sure to hold regular drills to familiarize your ED staff.
Turbulence and modeling in transonic flow
NASA Technical Reports Server (NTRS)
Rubesin, Morris W.; Viegas, John R.
1989-01-01
A review is made of the performance of a variety of turbulence models in the evaluation of a particular well documented transonic flow. This is done to supplement a previous attempt to calibrate and verify transonic airfoil codes by including many more turbulence models than used in the earlier work and applying the calculations to an experiment that did not suffer from uncertainties in angle of attack and was free of wind tunnel interference. It is found from this work, as well as in the earlier study, that the Johnson-King turbulence model is superior for transonic flows over simple aerodynamic surfaces, including moderate separation. It is also shown that some field equation models with wall function boundary conditions can be competitive with it.
Fungible weights in logistic regression.
Jones, Jeff A; Waller, Niels G
2016-06-01
In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
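A numerical flavour of this idea can be sketched as follows: after fitting a logistic regression by maximum likelihood, alternative ("fungible-like") coefficient vectors are found whose in-sample log-likelihood sits a fixed small amount below the maximum. This generic root-finding sketch on simulated data is only a stand-in for the closed-form equations referenced above, and the tolerance delta is an arbitrary choice.

# Hedged numerical sketch of "fungible-like" logistic regression weights: find
# alternative coefficient vectors whose log-likelihood is a fixed small amount
# below the MLE's. Generic illustration on simulated data, not Waller's equations.
import numpy as np
from scipy.optimize import minimize, brentq

rng = np.random.default_rng(0)
n, p = 500, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([-0.5, 1.0, -0.8, 0.4])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

def loglik(b):
    eta = X @ b
    return np.sum(y * eta - np.log1p(np.exp(eta)))

# maximum-likelihood fit
res = minimize(lambda b: -loglik(b), np.zeros(p + 1), method="BFGS")
b_mle, ll_max = res.x, loglik(res.x)
delta = 1.0  # allowed drop in log-likelihood (illustrative)

def fungible_weights(n_sets=5):
    out = []
    for _ in range(n_sets):
        u = rng.normal(size=p + 1)
        u /= np.linalg.norm(u)
        # step along u until the log-likelihood has dropped by exactly delta
        f = lambda t: loglik(b_mle + t * u) - (ll_max - delta)
        t_hi = 1.0
        while f(t_hi) > 0:
            t_hi *= 2.0
        t_star = brentq(f, 0.0, t_hi)
        out.append(b_mle + t_star * u)
    return np.array(out)

print("MLE:", np.round(b_mle, 3))
print("fungible-like sets:\n", np.round(fungible_weights(), 3))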
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb(TM) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TM) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
Wake Management Strategies for Reduction of Turbomachinery Fan Noise
NASA Technical Reports Server (NTRS)
Waitz, Ian A.
1998-01-01
The primary objective of our work was to evaluate and test several wake management schemes for the reduction of turbomachinery fan noise. Throughout the course of this work we relied on several tools. These include 1) Two-dimensional steady boundary-layer and wake analyses using MISES (a thin-shear layer Navier-Stokes code), 2) Two-dimensional unsteady wake-stator interaction simulations using UNSFLO, 3) Three-dimensional, steady Navier-Stokes rotor simulations using NEWT, 4) Internal blade passage design using quasi-one-dimensional passage flow models developed at MIT, 5) Acoustic modeling using LINSUB, 6) Acoustic modeling using VO72, 7) Experiments in a low-speed cascade wind-tunnel, and 8) ADP fan rig tests in the MIT Blowdown Compressor.
The ZPIC educational code suite
NASA Astrophysics Data System (ADS)
Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.
2017-10-01
Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing, among many other areas. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.
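For readers unfamiliar with the method, the core of any electrostatic PIC code is a short loop of charge deposit, field solve, and particle push. The toy 1D sketch below (normalized units, two-stream initial condition) illustrates that loop; it is a self-contained stand-in, not part of the ZPIC suite, and all numerical parameters are arbitrary.

# Minimal 1D electrostatic PIC loop (normalized units), illustrating the
# deposit / field-solve / push cycle. A toy example, not ZPIC itself.
import numpy as np

ng, L, npart, dt, steps = 64, 2 * np.pi, 20000, 0.1, 200
dx = L / ng
x = np.random.default_rng(1).uniform(0, L, npart)
v = np.where(np.arange(npart) % 2 == 0, 1.0, -1.0) \
    + 0.01 * np.random.default_rng(2).normal(size=npart)
q_over_m, weight = -1.0, L / npart   # electrons on a neutralizing ion background

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)
for step in range(steps):
    # 1) charge deposit (linear / CIC weighting)
    rho = np.zeros(ng)
    gi = (x / dx).astype(int) % ng
    frac = x / dx - (x / dx).astype(int)
    np.add.at(rho, gi, (1 - frac) * weight / dx)
    np.add.at(rho, (gi + 1) % ng, frac * weight / dx)
    rho = 1.0 - rho   # uniform ion background minus electron density

    # 2) field solve: dE/dx = rho via FFT (k = 0 mode set to zero)
    rho_k = np.fft.rfft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = rho_k[1:] / (1j * k[1:])
    E = np.fft.irfft(E_k, n=ng)

    # 3) gather field at particle positions and push
    Ep = (1 - frac) * E[gi] + frac * E[(gi + 1) % ng]
    v += q_over_m * Ep * dt
    x = (x + v * dt) % L

print("final field energy:", 0.5 * np.sum(E**2) * dx)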
Code of Federal Regulations, 2011 CFR
2011-07-01
... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Ratings The Cardiovascular System § 4.100 Application of the evaluation criteria for diagnostic codes 7000... medical information does not sufficiently reflect the severity of the veteran's cardiovascular disability...
Why Data Linkage? The Importance of CODES (Crash Outcome Data Evaluation System)
DOT National Transportation Integrated Search
1996-06-01
This report briefly explains the computerized linked data system, the Crash Outcome Data Evaluation System (CODES), that provides greater depth in accident data analysis. The linking of data helps researchers to understand the nature of traffic acciden...
Jones, Lyell K; Ney, John P
2016-12-01
Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.
Assessment and Application of the ROSE Code for Reactor Outage Thermal-Hydraulic and Safety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Thomas K.S.; Ko, F.-K.; Dai, L.-C
The currently available tools, such as RELAP5, RETRAN, and others, cannot easily and correctly perform the task of analyzing the system behavior during plant outages. Therefore, a medium-sized program aiming at reactor outage simulation and evaluation, such as midloop operation (MLO) with loss of residual heat removal (RHR), has been developed. Important thermal-hydraulic processes involved during MLO with loss of RHR can be properly simulated by the newly developed reactor outage simulation and evaluation (ROSE) code. The two-region approach with a modified two-fluid model has been adopted to be the theoretical basis of the ROSE code. To verify the analytical model in the first step, posttest calculations against the integral midloop experiments with loss of RHR have been performed. The excellent simulation capacity of the ROSE code against the Institute of Nuclear Energy Research Integral System Test Facility test data is demonstrated. To further mature the ROSE code in simulating a full-sized pressurized water reactor, assessment against the WGOTHIC code and the Maanshan momentary-loss-of-RHR event has been undertaken. The successfully assessed ROSE code is then applied to evaluate the abnormal operation procedure (AOP) with loss of RHR during MLO (AOP 537.4) for the Maanshan plant. The ROSE code also has been successfully transplanted into the Maanshan training simulator to support operator training. How the simulator was upgraded by the ROSE code for MLO will be presented in the future.
Working research codes into fluid dynamics education: a science gateway approach
NASA Astrophysics Data System (ADS)
Mason, Lachlan; Hetherington, James; O'Reilly, Martin; Yong, May; Jersakova, Radka; Grieve, Stuart; Perez-Suarez, David; Klapaukh, Roman; Craster, Richard V.; Matar, Omar K.
2017-11-01
Research codes are effective for illustrating complex concepts in educational fluid dynamics courses: compared to textbook examples, an interactive three-dimensional visualisation can bring a problem to life! Various barriers, however, prevent the adoption of research codes in teaching: codes are typically created for highly-specific `once-off' calculations and, as such, have no user interface and a steep learning curve. Moreover, a code may require access to high-performance computing resources that are not readily available in the classroom. This project allows academics to rapidly work research codes into their teaching via a minimalist `science gateway' framework. The gateway is a simple, yet flexible, web interface allowing students to construct and run simulations, as well as view and share their output. Behind the scenes, the common operations of job configuration, submission, monitoring and post-processing are customisable at the level of shell scripting. In this talk, we demonstrate the creation of an example teaching gateway connected to the Code BLUE fluid dynamics software. Student simulations can be run via a third-party cloud computing provider or a local high-performance cluster. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).
DOT National Transportation Integrated Search
2000-06-01
The purpose of the Revised Catalog of Types of CODES Applications Implemented Using Linked State Data (CODES) is to inspire the development of new applications for linked data that support efforts to reduce death, disability, severity, and health...
Semantic and Phonological Coding in Poor and Normal Readers.
ERIC Educational Resources Information Center
Vellutino, Frank R.; And Others
1995-01-01
Using poor and normal readers, three studies evaluated semantic coding and phonological coding deficits as explanations for reading disability. It was concluded that semantic coding deficits are unlikely causes of difficulties in poor readers in early stages but accrue with prolonged reading difficulties in older readers. Phonological coding…
Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1990-01-01
The primary tasks during January 1990 to June 1990 have been the development and evaluation of various electron and electron-electronic energy equation models, the continued development of improved nonequilibrium radiation models for molecules and atoms, and the continued development and investigation of precursor models and their effects. In addition, work was initiated to develop a vibrational model for the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code. Also, an effort was started associated with the effects of including carbon species, say from an ablator, in the flowfield.
Evaluation of Savings in Energy-Efficient Public Housing in the Pacific Northwest
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, A.; Lubliner, M.; Howard, L.
2013-10-01
This report presents the results of an energy performance and cost-effectiveness analysis. The Salishan phase 7 and demonstration homes were compared to Salishan phase 6 homes built to 2006 Washington State Energy Code specifications. Predicted annual energy savings (over Salishan phase 6) were 19% for Salishan phase 7 and between 19% and 24% for the demonstration homes (depending on ventilation strategy). Approximately two-thirds of the savings are attributable to the DHP. Working with the electric utility provider, Tacoma Public Utilities, researchers conducted a billing analysis for Salishan phase 7.
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
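As an illustration of view-factor computation of the kind mentioned above, the sketch below estimates the view factor between two directly opposed unit squares by Monte Carlo sampling and checks it against the standard closed-form expression. This is a generic example, not the numerical methods implemented in CHAR.

# Monte Carlo view-factor sketch: F(1->2) between two parallel, directly opposed
# unit squares separated by distance h, checked against the analytic formula.
import numpy as np

def view_factor_mc(h=1.0, samples=200000, seed=0):
    rng = np.random.default_rng(seed)
    # random points on surface 1 (z = 0) and surface 2 (z = h)
    p1 = np.column_stack([rng.uniform(0, 1, samples), rng.uniform(0, 1, samples),
                          np.zeros(samples)])
    p2 = np.column_stack([rng.uniform(0, 1, samples), rng.uniform(0, 1, samples),
                          np.full(samples, h)])
    r = p2 - p1
    dist2 = np.sum(r * r, axis=1)
    cos1 = r[:, 2] / np.sqrt(dist2)   # angle with the +z normal of surface 1
    cos2 = cos1                        # surfaces are parallel and opposed
    # F(1->2) = (1/A1) * integral of cos1*cos2/(pi*d^2) dA1 dA2, with A1 = A2 = 1
    return np.mean(cos1 * cos2 / (np.pi * dist2))

def view_factor_analytic(h=1.0):
    # standard closed form for identical, directly opposed parallel rectangles
    x = y = 1.0 / h
    a = np.sqrt(1 + x * x)
    b = np.sqrt(1 + y * y)
    term = (0.5 * np.log((1 + x * x) * (1 + y * y) / (1 + x * x + y * y))
            + x * b * np.arctan(x / b) + y * a * np.arctan(y / a)
            - x * np.arctan(x) - y * np.arctan(y))
    return 2.0 / (np.pi * x * y) * term

print("Monte Carlo :", view_factor_mc())
print("Analytic    :", view_factor_analytic())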
The Plasma Simulation Code: A modern particle-in-cell code with patch-based load-balancing
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Fox, William; Abbott, Stephen; Ahmadi, Narges; Maynard, Kristofor; Wang, Liang; Ruhl, Hartmut; Bhattacharjee, Amitava
2016-08-01
This work describes the Plasma Simulation Code (PSC), an explicit, electromagnetic particle-in-cell code with support for different order particle shape functions. We review the basic components of the particle-in-cell method as well as the computational architecture of the PSC code that allows support for modular algorithms and data structure in the code. We then describe and analyze in detail a distinguishing feature of PSC: patch-based load balancing using space-filling curves which is shown to lead to major efficiency gains over unbalanced methods and a previously used simpler balancing method.
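The following sketch illustrates the general idea of space-filling-curve load balancing described above: patches are ordered by their Morton (Z-order) index and the resulting 1D sequence is cut into contiguous chunks of roughly equal load. It is a schematic illustration under assumed per-patch loads, not code from PSC.

# Sketch of patch-based load balancing with a Morton (Z-order) space-filling curve.
import numpy as np

def morton_index(ix, iy, bits=16):
    # interleave the bits of the 2D patch coordinates
    m = 0
    for b in range(bits):
        m |= ((ix >> b) & 1) << (2 * b) | ((iy >> b) & 1) << (2 * b + 1)
    return m

def balance_patches(loads_2d, n_ranks):
    ny, nx = loads_2d.shape
    patches = [(morton_index(ix, iy), iy, ix) for iy in range(ny) for ix in range(nx)]
    patches.sort()   # order patches along the Z-order curve
    loads = np.array([loads_2d[iy, ix] for _, iy, ix in patches], dtype=float)
    target = loads.sum() / n_ranks
    assignment, rank, acc = {}, 0, 0.0
    for (_, iy, ix), w in zip(patches, loads):
        if acc >= target and rank < n_ranks - 1:   # start a new rank's chunk
            rank, acc = rank + 1, 0.0
        assignment[(iy, ix)] = rank
        acc += w
    return assignment

# toy example: an 8x8 patch grid with a particle hot spot in one corner
loads = np.ones((8, 8))
loads[:3, :3] = 20.0
assign = balance_patches(loads, n_ranks=4)
per_rank = np.zeros(4)
for (iy, ix), r in assign.items():
    per_rank[r] += loads[iy, ix]
print("load per rank:", per_rank)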
Theoretical Thermal Evaluation of Energy Recovery Incinerators
1985-12-01
Fuel-Air Mixing and Combustion in Scramjets
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Diskin, Glenn S.; Cutler, A. D.
2002-01-01
Activities in the area of scramjet fuel-air mixing and combustion associated with the Research and Technology Organization Working Group on Technologies for Propelled Hypersonic Flight are described. Work discussed in this paper has centered on the design of two basic experiments for studying the mixing and combustion of fuel and air in a scramjet. Simulations were conducted to aid in the design of these experiments. The experimental models were then constructed, and data were collected in the laboratory. Comparison of the data from a coaxial jet mixing experiment and a supersonic combustor experiment with a combustor code were then made and described. This work was conducted by NATO to validate combustion codes currently employed in scramjet design and to aid in the development of improved turbulence and combustion models employed by the codes.
Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Derek Elswick
This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
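The processing chain summarized above (apodization, Fourier transform, Mertz phase correction) can be illustrated with the short Python sketch below, applied to a synthetic double-sided interferogram. It is a generic illustration that omits the calibration step; it is not the Helios MATLAB code, and the window sizes and test signal are arbitrary assumptions.

# Generic interferogram-to-spectrum sketch: triangular apodization, FFT, and a
# simple Mertz-style phase correction on a synthetic double-sided interferogram.
import numpy as np

def interferogram_to_spectrum(igram, zpd, phase_halfwidth=64):
    # zpd: index of the zero-path-difference (centerburst) sample; in practice it
    # is usually located from the maximum of the measured interferogram.
    n = len(igram)
    centered = np.roll(igram, -zpd)   # rotate so the ZPD sits at index 0

    # Mertz phase estimate from a short double-sided region around the ZPD
    window = np.zeros(n)
    window[:phase_halfwidth] = 1.0
    window[-phase_halfwidth:] = 1.0
    phase = np.angle(np.fft.rfft(centered * window))

    # triangular apodization of the full interferogram, then Fourier transform
    apod = 1.0 - np.abs(np.arange(n) - zpd) / max(zpd, n - zpd)
    spec_complex = np.fft.rfft(np.roll(igram * apod, -zpd))

    # phase correction: rotate by the low-resolution phase and keep the real part
    return np.real(spec_complex * np.exp(-1j * phase))

# synthetic test: two lines at bins 200 and 480 with a common 0.3 rad phase error
n = 4096
x = np.arange(n) - n // 2
igram = (np.cos(2 * np.pi * 200 / n * x + 0.3)
         + 0.5 * np.cos(2 * np.pi * 480 / n * x + 0.3))
spectrum = interferogram_to_spectrum(igram, zpd=n // 2)
print("two largest bins:", np.sort(np.argsort(spectrum)[-2:]))   # expect 200 and 480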
Franklin, Rodney C G; Jacobs, Jeffrey Phillip; Krogmann, Otto N; Béland, Marie J; Aiello, Vera D; Colan, Steven D; Elliott, Martin J; William Gaynor, J; Kurosawa, Hiromi; Maruszewski, Bohdan; Stellin, Giovanni; Tchervenkov, Christo I; Walters III, Henry L; Weinberg, Paul; Anderson, Robert H
2008-12-01
Clinicians working in the field of congenital and paediatric cardiology have long felt the need for a common diagnostic and therapeutic nomenclature and coding system with which to classify patients of all ages with congenital and acquired cardiac disease. A cohesive and comprehensive system of nomenclature, suitable for setting a global standard for multicentric analysis of outcomes and stratification of risk, has only recently emerged, namely, The International Paediatric and Congenital Cardiac Code. This review, will give an historical perspective on the development of systems of nomenclature in general, and specifically with respect to the diagnosis and treatment of patients with paediatric and congenital cardiac disease. Finally, current and future efforts to merge such systems into the paperless environment of the electronic health or patient record on a global scale are briefly explored. On October 6, 2000, The International Nomenclature Committee for Pediatric and Congenital Heart Disease was established. In January, 2005, the International Nomenclature Committee was constituted in Canada as The International Society for Nomenclature of Paediatric and Congenital Heart Disease. This International Society now has three working groups. The Nomenclature Working Group developed The International Paediatric and Congenital Cardiac Code and will continue to maintain, expand, update, and preserve this International Code. It will also provide ready access to the International Code for the global paediatric and congenital cardiology and cardiac surgery communities, related disciplines, the healthcare industry, and governmental agencies, both electronically and in published form. The Definitions Working Group will write definitions for the terms in the International Paediatric and Congenital Cardiac Code, building on the previously published definitions from the Nomenclature Working Group. The Archiving Working Group, also known as The Congenital Heart Archiving Research Team, will link images and videos to the International Paediatric and Congenital Cardiac Code. The images and videos will be acquired from cardiac morphologic specimens and imaging modalities such as echocardiography, angiography, computerized axial tomography and magnetic resonance imaging, as well as intraoperative images and videos. Efforts are ongoing to expand the usage of The International Paediatric and Congenital Cardiac Code to other areas of global healthcare. Collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the representatives of the steering group responsible for the creation of the 11th revision of the International Classification of Diseases, administered by the World Health Organisation. Similar collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the International Health Terminology Standards Development Organisation, who are the owners of the Systematized Nomenclature of Medicine or "SNOMED". The International Paediatric and Congenital Cardiac Code was created by specialists in the field to name and classify paediatric and congenital cardiac disease and its treatment. It is a comprehensive code that can be freely downloaded from the internet (http://www.IPCCC.net) and is already in use worldwide, particularly for international comparisons of outcomes. 
The goal of this effort is to create strategies for stratification of risk and to improve healthcare for the individual patient. The collaboration with the World Health Organization, the International Health Terminology Standards Development Organisation, and the healthcare industry will lead to further enhancement of the International Code, and to its more universal use.
Keith, Rosalind E; Crosson, Jesse C; O'Malley, Ann S; Cromp, DeAnn; Taylor, Erin Fries
2017-02-10
Much research does not address the practical needs of stakeholders responsible for introducing health care delivery interventions into organizations working to achieve better outcomes. In this article, we present an approach to using the Consolidated Framework for Implementation Research (CFIR) to guide systematic research that supports rapid-cycle evaluation of the implementation of health care delivery interventions and produces actionable evaluation findings intended to improve implementation in a timely manner. To present our approach, we describe a formative cross-case qualitative investigation of 21 primary care practices participating in the Comprehensive Primary Care (CPC) initiative, a multi-payer supported primary care practice transformation intervention led by the Centers for Medicare and Medicaid Services. Qualitative data include observational field notes and semi-structured interviews with primary care practice leadership, clinicians, and administrative and medical support staff. We use intervention-specific codes, and CFIR constructs to reduce and organize the data to support cross-case analysis of patterns of barriers and facilitators relating to different CPC components. Using the CFIR to guide data collection, coding, analysis, and reporting of findings supported a systematic, comprehensive, and timely understanding of barriers and facilitators to practice transformation. Our approach to using the CFIR produced actionable findings for improving implementation effectiveness during this initiative and for identifying improvements to implementation strategies for future practice transformation efforts. The CFIR is a useful tool for guiding rapid-cycle evaluation of the implementation of practice transformation initiatives. Using the approach described here, we systematically identified where adjustments and refinements to the intervention could be made in the second year of the 4-year intervention. We think the approach we describe has broad application and encourage others to use the CFIR, along with intervention-specific codes, to guide the efficient and rigorous analysis of rich qualitative data. NCT02318108.
A high-fidelity Monte Carlo evaluation of CANDU-6 safety parameters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Y.; Hartanto, D.
2012-07-01
Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANDU-6 (CANada Deuterium Uranium) reactor have been evaluated by using a modified MCNPX code. For accurate analysis of the parameters, the DBRC (Doppler Broadening Rejection Correction) scheme was implemented in MCNPX in order to account for the thermal motion of the heavy uranium nucleus in neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled, the fuel is depleted by using MCNPX, and the FTC value is evaluated for several burnup points including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated by using several cross section libraries such as ENDF/B-VI, ENDF/B-VII, JEFF, and JENDL. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of the equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, a huge number of neutron histories is considered in this work, and the standard deviation of the k-inf values is only 0.5~1 pcm. It has been found that the FTC is significantly enhanced by accounting for the Doppler broadening of the scattering resonances, and the PCR is clearly improved. (authors)
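As a worked example of how such a coefficient is typically obtained from Monte Carlo results, the sketch below converts two k-infinity values computed at different fuel temperatures into an FTC in pcm/K and propagates a ~1 pcm statistical uncertainty. The k values and temperatures are illustrative, not results from this paper.

# Worked example: fuel temperature coefficient from two Monte Carlo k-inf values,
# with first-order propagation of the statistical uncertainty. Values are made up.
import numpy as np

def ftc_pcm_per_K(k1, sig1, T1, k2, sig2, T2):
    # reactivity in pcm: rho = (1 - 1/k) * 1e5, so drho = (1/k1 - 1/k2) * 1e5
    drho = (1.0 / k1 - 1.0 / k2) * 1e5
    ftc = drho / (T2 - T1)
    # d(rho)/dk = 1e5 / k^2
    sig_drho = 1e5 * np.sqrt((sig1 / k1**2) ** 2 + (sig2 / k2**2) ** 2)
    return ftc, sig_drho / abs(T2 - T1)

# e.g. k-inf at 600 K and 900 K fuel temperature, each with ~1 pcm (1e-5) std dev
ftc, sig = ftc_pcm_per_K(1.12345, 1e-5, 600.0, 1.12010, 1e-5, 900.0)
print(f"FTC = {ftc:.3f} +/- {sig:.3f} pcm/K")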
DiClemente, Carlo C; Crouch, Taylor Berens; Norwood, Amber E Q; Delahanty, Janine; Welsh, Christopher
2015-03-01
Screening, brief intervention, and referral to treatment (SBIRT) has become an empirically supported and widely implemented approach in primary and specialty care for addressing substance misuse. Accordingly, training of providers in SBIRT has increased exponentially in recent years. However, the quality and fidelity of training programs and subsequent interventions are largely unknown because of the lack of SBIRT-specific evaluation tools. The purpose of this study was to create a coding scale to assess quality and fidelity of SBIRT interactions addressing alcohol, tobacco, illicit drugs, and prescription medication misuse. The scale was developed to evaluate performance in an SBIRT residency training program. Scale development was based on training protocol and competencies with consultation from Motivational Interviewing coding experts. Trained medical residents practiced SBIRT with standardized patients during 10- to 15-min videotaped interactions. This study included 25 tapes from the Family Medicine program coded by 3 unique coder pairs with varying levels of coding experience. Interrater reliability was assessed for overall scale components and individual items via intraclass correlation coefficients. Coder pair-specific reliability was also assessed. Interrater reliability was excellent overall for the scale components (>.85) and nearly all items. Reliability was higher for more experienced coders, though still adequate for the trained coder pair. Descriptive data demonstrated a broad range of adherence and skills. Subscale correlations supported concurrent and discriminant validity. Data provide evidence that the MD3 SBIRT Coding Scale is a psychometrically reliable coding system for evaluating SBIRT interactions and can be used to evaluate implementation skills for fidelity, training, assessment, and research. Recommendations for refinement and further testing of the measure are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Grois, Dan; Marpe, Detlev; Nguyen, Tung; Hadar, Ofer
2014-09-01
The popularity of low-delay video applications has dramatically increased over the last years due to a rising demand for real-time video content (such as video conferencing or video surveillance), and also due to the increasing availability of relatively inexpensive heterogeneous devices (such as smartphones and tablets). To this end, this work presents a comparative assessment of the two latest video coding standards, H.265/MPEG-HEVC (High-Efficiency Video Coding) and H.264/MPEG-AVC (Advanced Video Coding), and also of the VP9 proprietary video coding scheme. For evaluating H.264/MPEG-AVC, an open-source x264 encoder was selected, which has a multi-pass encoding mode, similarly to VP9. According to the experimental results, which were obtained by using similar low-delay configurations for all three examined representative encoders, it was observed that H.265/MPEG-HEVC provides significant average bit-rate savings of 32.5% and 40.8% relative to VP9 and x264 for 1-pass encoding, and average bit-rate savings of 32.6% and 42.2% for 2-pass encoding, respectively. On the other hand, compared to the x264 encoder, typical low-delay encoding times of the VP9 encoder are about 2,000 times higher for 1-pass encoding and about 400 times higher for 2-pass encoding.
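Average bit-rate savings of this kind are commonly reported as Bjontegaard-delta (BD) rates. The sketch below shows one standard way to compute a BD-rate from rate-distortion points (cubic fit of log-rate versus PSNR, integrated over the overlapping quality range); the RD points are invented for illustration and do not reproduce the study's configurations.

# Classic BD-rate calculation from rate-distortion points (illustrative data only).
import numpy as np

def bd_rate(rate_ref, psnr_ref, rate_test, psnr_test):
    # fit log-rate as a cubic polynomial of PSNR for each encoder
    lr_ref, lr_test = np.log(rate_ref), np.log(rate_test)
    p_ref = np.polyfit(psnr_ref, lr_ref, 3)
    p_test = np.polyfit(psnr_test, lr_test, 3)
    # integrate over the overlapping PSNR range
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    # average log-rate difference -> percentage bit-rate change of "test" vs "ref"
    avg_diff = (int_test - int_ref) / (hi - lo)
    return (np.exp(avg_diff) - 1.0) * 100.0

# illustrative RD points (kbps, dB) for a reference and a test encoder
rate_ref = np.array([1000, 2000, 4000, 8000]); psnr_ref = np.array([34.0, 36.5, 39.0, 41.5])
rate_test = np.array([700, 1400, 2800, 5600]); psnr_test = np.array([34.2, 36.7, 39.2, 41.6])
print(f"BD-rate of test vs reference: "
      f"{bd_rate(rate_ref, psnr_ref, rate_test, psnr_test):.1f}%")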
NASA Astrophysics Data System (ADS)
Chen, Zhenzhong; Han, Junwei; Ngan, King Ngi
2005-10-01
MPEG-4 treats a scene as a composition of several objects, or so-called video object planes (VOPs), that are separately encoded and decoded. Such a flexible video coding framework makes it possible to code different video objects with different distortion scales. It is necessary to analyze the priority of the video objects according to their semantic importance, intrinsic properties, and psycho-visual characteristics such that the bit budget can be distributed properly among video objects to improve the perceptual quality of the compressed video. This paper aims to provide an automatic video object priority definition method based on an object-level visual attention model and further proposes an optimization framework for video object bit allocation. One significant contribution of this work is that human visual system characteristics are incorporated into the video coding optimization process. Another advantage is that the priority of a video object can be obtained automatically instead of fixing weighting factors before encoding or relying on user interactivity. To evaluate the performance of the proposed approach, we compare it with the traditional verification model bit allocation and the optimal multiple video object bit allocation algorithms. Compared with traditional bit allocation algorithms, the objective quality of the object with higher priority is significantly improved under this framework. These results demonstrate the usefulness of this unsupervised subjective quality lifting framework.
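A much-simplified illustration of priority-driven bit allocation is sketched below: a frame budget is split across video objects in proportion to attention-based priority weights, with a per-object floor. This proportional scheme is only a stand-in for the optimization framework proposed in the paper, and all numbers are assumptions.

# Simple priority-proportional bit allocation across video objects (illustrative).
def allocate_bits(priorities, total_bits, min_bits=500):
    n = len(priorities)
    if total_bits < n * min_bits:
        raise ValueError("budget too small for the per-object minimum")
    remaining = total_bits - n * min_bits
    weight_sum = float(sum(priorities))
    alloc = [min_bits + remaining * p / weight_sum for p in priorities]
    return [round(a) for a in alloc]

# e.g. three VOPs: a high-priority face, a medium-priority moving object, background
priorities = [0.6, 0.3, 0.1]
print(allocate_bits(priorities, total_bits=24000))   # -> [14000, 7250, 2750]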
Interframe vector wavelet coding technique
NASA Astrophysics Data System (ADS)
Wus, John P.; Li, Weiping
1997-01-01
Wavelet coding is often used to divide an image into multi-resolution wavelet coefficients which are quantized and coded. By 'vectorizing' scalar wavelet coding and combining this with vector quantization (VQ), vector wavelet coding (VWC) can be implemented. Using a finite number of states, finite-state vector quantization (FSVQ) takes advantage of the similarity between frames by incorporating memory into the video coding system. Lattice VQ eliminates the potential mismatch that could occur using pre-trained VQ codebooks. It also eliminates the need for codebook storage in the VQ process, thereby creating a more robust coding system. Therefore, by using the VWC method in conjunction with the FSVQ system and lattice VQ, the formulation of a high-quality, very low bit rate coding system is proposed. A coding system is developed using a simple FSVQ scheme in which the current state is determined only by the previous channel symbol. To achieve a higher degree of compression, a tree-like FSVQ system is implemented. The groupings are done in this tree-like structure from the lower subbands to the higher subbands in order to exploit the nature of subband analysis in terms of the parent-child relationship. Class A and Class B video sequences from the MPEG-IV testing evaluations are used in the evaluation of this coding method.
Coding for effective denial management.
Miller, Jackie; Lineberry, Joe
2004-01-01
Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of ABNs and the compliance risks associated with improper use. Finally, training programs should include routine audits to monitor coders for competence and precision. Constantly changing codes and guidelines mean that a coder's skills can quickly become obsolete if not reinforced by ongoing training and monitoring. Comprehensive reporting and routine analysis of claim denials is without a doubt one of the greatest assets to a practice that is suffering from excessive claim denials and should be considered an investment capable of providing both short and long term ROIs. Some radiologists may lack the funding or human resources needed to implement truly effective coding programs for their staff members. In these circumstances, radiology business managers should consider outsourcing their coding.
NASA Astrophysics Data System (ADS)
Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.
2017-12-01
Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainties: a critical issue is the treatment verification. During the treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used to perform the treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate the radiation transport and interaction with matter. In this work, FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered on phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads, and designed to be installed along the beam line to acquire data also during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors response for activity range verification.
Pinoleville Pomo Nation Tribal Green Building Code
The Pinoleville Pomo Nation (PPN) worked with the U.S. Environmental Protection Agency (EPA) and the Development Center for Appropriate Technology (DCAT) to create this framework for tribal building codes.
AN EXACT SOLUTION FOR THE ASSESSMENT OF NONEQUILIBRIUM SORPTION OF RADIONUCLIDES IN THE VADOSE ZONE
In a report on model evaluation, the authors ran the HYDRUS Code, among other transport codes, to evaluate the impacts of nonequilibrium sorption sites on the time-evolution of 99Tc and 90Sr through the vadose zone. Since our evaluation was based on a rather low, annual recharge...
Combinatorial neural codes from a mathematical coding theory perspective.
Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L
2013-07-01
Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
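A toy version of this kind of analysis can be run in a few lines: build binary codewords from overlapping "receptive fields", corrupt them with random bit flips, and decode by nearest codeword in Hamming distance. The construction and parameters below are illustrative assumptions, not the codes analysed in the paper.

# Toy receptive-field-like binary code with nearest-codeword (Hamming) decoding.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_stimuli, flip_p, trials = 30, 20, 0.05, 2000

# each neuron responds to a random contiguous window of stimulus positions
codebook = np.zeros((n_stimuli, n_neurons), dtype=int)
for j in range(n_neurons):
    start = rng.integers(0, n_stimuli)
    width = rng.integers(3, 8)
    idx = np.arange(start, start + width) % n_stimuli   # circular receptive field
    codebook[idx, j] = 1

def decode(word):
    # for a symmetric channel, maximum-likelihood decoding = nearest codeword
    dists = np.sum(codebook != word, axis=1)
    return int(np.argmin(dists))

correct = 0
for _ in range(trials):
    s = rng.integers(0, n_stimuli)
    noisy = codebook[s] ^ (rng.random(n_neurons) < flip_p).astype(int)
    correct += (decode(noisy) == s)
print("fraction of corrupted codewords decoded correctly:", correct / trials)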
Jiao, Shuming; Jin, Zhi; Zhou, Changyuan; Zou, Wenbin; Li, Xia
2018-01-01
Quick response (QR) code has been employed as a data carrier for optical cryptosystems in many recent research works, and the error-correction coding mechanism allows the decrypted result to be noise free. However, in this paper, we point out for the first time that the Reed-Solomon coding algorithm in QR code is not a very suitable option for the nonlocally distributed speckle noise in optical cryptosystems from an information coding perspective. The average channel capacity is proposed to measure the data storage capacity and noise-resistant capability of different encoding schemes. We design an alternative 2D barcode scheme based on Bose-Chaudhuri-Hocquenghem (BCH) coding, which demonstrates substantially better average channel capacity than QR code in numerical simulated optical cryptosystems.
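The "average channel capacity" notion can be illustrated with the textbook binary symmetric channel: for bit-flip probability p the capacity per cell is C = 1 - H(p), so the effective storage of a barcode scales with its data-cell count times C. The sketch below computes this for a few assumed noise levels; it is a generic illustration, not the paper's measurement procedure.

# Average channel capacity of a binary symmetric channel applied to a barcode.
import numpy as np

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def average_capacity_bits(n_data_cells, flip_prob):
    return n_data_cells * (1.0 - binary_entropy(flip_prob))

for p in [0.01, 0.05, 0.10, 0.20]:
    print(f"p = {p:.2f}: capacity of a 1000-cell barcode ~ "
          f"{average_capacity_bits(1000, p):.0f} bits")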
Behaviour of Reinforced Concrete Columns of Various Cross-Sections Subjected to Fire
NASA Astrophysics Data System (ADS)
Balaji, Aneesha; Muhamed Luquman, K.; Nagarajan, Praveen; Madhavan Pillai, T. M.
2016-09-01
Fire resistance is one of the crucial design requirements that are now mandatory in most design codes. Therefore, a thorough knowledge of the behaviour of structures exposed to fire is required. Columns are the structural members most vulnerable to fire, as they can be exposed to fire from all sides. However, the data available for the fire-resistant design of columns are limited. Hence, the present work is focused on the effect of the cross-sectional shape of a column on fire-resistance design. The cross-sections considered are Square, Ell (L), Tee (T), and Plus (`+') shapes. The effect of size, shape, and distribution of steel reinforcement on the fire resistance of columns is also studied. As the procedure for determining fire resistance is not given in the Indian Standard code IS 456 (2000), the simplified method (500 °C isotherm method) recommended in EN 1992-1-2:2004 (E) (Eurocode 2) is adopted. Temperature profiles for the various cross-sections are developed using the finite element method, and these profiles are used to predict the fire-resistance capability of compression members. The fire resistance obtained from both numerical and code-based methods is evaluated and compared for the various types of cross-section.
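To give a flavour of the 500 °C isotherm idea, the toy sketch below takes an assumed temperature field over a square section, discards concrete hotter than 500 °C, and forms a crude squash-load estimate on the remaining area. The temperature field and material values are assumptions, and the full Eurocode procedure (reinforcement strength reduction, slenderness, eccentricity) is deliberately omitted.

# Toy 500 degC isotherm illustration: keep only concrete below 500 degC and form a
# simple squash-load estimate on the reduced section. All values are assumptions.
import numpy as np

def reduced_section_capacity(side_mm=300.0, n=200, exposure_temp=800.0,
                             fcd_mpa=20.0):
    # crude symmetric temperature field: hottest at the faces, coolest at the core
    x = np.linspace(-side_mm / 2, side_mm / 2, n)
    X, Y = np.meshgrid(x, x)
    edge = np.maximum(np.abs(X), np.abs(Y)) / (side_mm / 2)   # 0 at core, 1 at face
    T = 20.0 + (exposure_temp - 20.0) * edge**2

    cell_area = (side_mm / n) ** 2                 # mm^2 per grid cell
    cold_area = np.sum(T < 500.0) * cell_area      # concrete kept by the isotherm rule
    capacity_kn = fcd_mpa * cold_area / 1000.0     # N/mm^2 * mm^2 -> kN
    return cold_area, capacity_kn

area, capacity = reduced_section_capacity()
print(f"reduced concrete area: {area:.0f} mm^2, squash load ~ {capacity:.0f} kN")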
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flacco, A.; Fairchild, M.; Reiche, S.
2004-12-07
The coherent radiation emitted by electrons in high brightness beam-based experiments is important from the viewpoints of both radiation source development and the understanding and diagnosing of the basic physical processes important in beam manipulations at high intensity. While much theoretical work has been developed to aid in calculating aspects of this class of radiation, these methods do not often produce accurate information concerning the experimentally relevant aspects of the radiation. At UCLA, we are particularly interested in coherent synchrotron radiation and the related phenomenon of coherent edge radiation, in the context of a fs-beam chicane compression experiment at the BNL ATF. To analyze this and related problems, we have developed a program that acts as an extension to the Lienard-Wiechert-based 3D simulation code TREDI, termed FieldEye. This program allows the evaluation of electromagnetic fields in the time and frequency domain in an arbitrary 2D planar detector area. We discuss here the implementation of the FieldEye code, and give examples of results relevant to the case of the ATF chicane compressor experiment.