DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirayama, Hideo; Namito, Yoshihito (KEK, Tsukuba)
2005-12-20
In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application brings new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial intervention than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary. With the release of the EGS4 version, a deliberate attempt was made to present example problems in order to help the user "get started", and we follow that spirit in this report. A series of elementary tutorial user codes are presented in Chapter 3, with more sophisticated sample user codes described in Chapter 4. Novice EGS users will find it helpful to read through the initial sections of the EGS5 User Manual (provided in Appendix B of this report), proceeding then to work through the tutorials in Chapter 3. The User Manuals and other materials found in the appendices contain detailed flow charts, variable lists, and subprogram descriptions of EGS5 and PEGS. Included are step-by-step instructions for developing basic EGS5 user codes and for accessing all of the physics options available in EGS5 and PEGS. Once acquainted with the basic structure of EGS5, users should find the appendices the most frequently consulted sections of this report.
Diagnostic x-ray dosimetry using Monte Carlo simulation.
Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E
2002-05-21
An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured, and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10^7) than the calculation of dose profiles (1 × 10^9). The EGS4 code was able to predict absorbed dose satisfactorily, and thereby provides an instrument for reducing the effective dose imparted to patients and staff during radiological investigations.
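The history counts quoted above reflect the usual 1/sqrt(N) convergence of Monte Carlo tallies: integral quantities such as effective dose converge with far fewer histories than finely binned dose profiles. A minimal sketch of that scaling (illustrative only; the deposit spectrum is arbitrary and this is not the authors' EGS4 user code):

import numpy as np

rng = np.random.default_rng(42)

def mc_dose_estimate(n_histories):
    # Toy tally: each "history" deposits a random energy; the exponential
    # spectrum is an arbitrary stand-in, not physics.
    deposits = rng.exponential(scale=1.0, size=n_histories)
    mean = deposits.mean()
    sem = deposits.std(ddof=1) / np.sqrt(n_histories)  # falls off as 1/sqrt(N)
    return mean, sem

for n in (10**3, 10**5, 10**7):
    mean, sem = mc_dose_estimate(n)
    print(f"N={n:>10,d}  dose={mean:.4f}  rel. uncertainty={sem / mean:.2%}")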
VISAGE: Interactive Visual Graph Querying.
Pienta, Robert; Navathe, Shamkant; Tamersoy, Acar; Tong, Hanghang; Endert, Alex; Chau, Duen Horng
2016-06-01
Extracting useful patterns from large network datasets has become a fundamental challenge in many domains. We present VISAGE, an interactive visual graph querying approach that empowers users to construct expressive queries, without writing complex code (e.g., finding money laundering rings of bankers and business owners). Our contributions are as follows: (1) we introduce graph autocomplete, an interactive approach that guides users to construct and refine queries, preventing over-specification; (2) VISAGE guides the construction of graph queries using a data-driven approach, enabling users to specify queries with varying levels of specificity, from concrete and detailed (e.g., query by example), to abstract (e.g., with "wildcard" nodes of any types), to purely structural matching; (3) a twelve-participant, within-subject user study demonstrates VISAGE's ease of use and the ability to construct graph queries significantly faster than using a conventional query language; (4) VISAGE works on real graphs with over 468K edges, achieving sub-second response times for common queries.
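The query semantics described here — typed nodes, wildcard nodes, and purely structural matching — map onto attributed subgraph matching. A minimal sketch with networkx (a generic illustration; VISAGE's own engine and data model are not shown):

import networkx as nx
from networkx.algorithms import isomorphism

# Toy "financial" graph; the roles echo the banker/owner example above.
G = nx.Graph()
G.add_nodes_from([
    (1, {"role": "banker"}), (2, {"role": "owner"}),
    (3, {"role": "banker"}), (4, {"role": "owner"}),
    (5, {"role": "accountant"}),
])
G.add_edges_from([(1, 2), (2, 3), (3, 1), (3, 4), (4, 5)])

# Query: banker -- owner -- wildcard, closing back to the banker.
Q = nx.Graph()
Q.add_nodes_from([
    ("a", {"role": "banker"}), ("b", {"role": "owner"}),
    ("c", {"role": None}),  # wildcard node: matches any type
])
Q.add_edges_from([("a", "b"), ("b", "c"), ("c", "a")])

def node_match(data_attrs, query_attrs):
    # A query node with role=None is a wildcard and matches anything.
    return query_attrs["role"] is None or data_attrs["role"] == query_attrs["role"]

matcher = isomorphism.GraphMatcher(G, Q, node_match=node_match)
for mapping in matcher.subgraph_monomorphisms_iter():
    print(mapping)  # data-node -> query-node assignments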
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2011-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
Malataras, G; Kappas, C; Lovelock, D M; Mohan, R
1997-01-01
This article presents a comparison between two implementations of an EGS4 Monte Carlo simulation of a radiation therapy machine. The first implementation was run on a high performance RISC workstation, and the second was run on an inexpensive PC. The simulation was performed using the MCRAD user code. The photon energy spectra, as measured at a plane transverse to the beam direction and containing the isocenter, were compared. The photons were also binned radially in order to compare the variation of the spectra with radius. With 500,000 photons recorded in each of the two simulations, the running times were 48 h and 116 h for the workstation and the PC, respectively. No significant statistical differences between the two implementations were found.
Make Movies out of Your Dynamical Simulations with OGRE!
NASA Astrophysics Data System (ADS)
Tamayo, Daniel; Douglas, R. W.; Ge, H. W.; Burns, J. A.
2013-10-01
We have developed OGRE, the Orbital GRaphics Environment, an open-source project comprising a graphical user interface that allows the user to view the output from several dynamical integrators (e.g., SWIFT) that are commonly used for academic work. One can interactively vary the display speed, rotate the view and zoom the camera. This makes OGRE a great tool for students or the general public to explore accurate orbital histories that may display interesting dynamical features, e.g. the destabilization of Solar System orbits under the Nice model, or interacting pairs of exoplanets. Furthermore, OGRE allows the user to choreograph sequences of transformations as the simulation is played to generate movies for use in public talks or professional presentations. The graphical user interface is coded using Qt to ensure portability across different operating systems. OGRE will run on Linux and Mac OS X. The program is available as a self-contained executable, or as source code that the user can compile. We are continually updating the code, and hope that people who find it useful will contribute to the development of new features.
Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4.
Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A
2004-02-07
The expanding clinical use of low-energy photon-emitting 125I and 103Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers have pointed out that higher accuracy can be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user friendly and incorporates photon cross-section data from EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either the DLC-146 or DLC-200 cross-section libraries, assuming a point source located at the centre of a cylinder 30 cm in diameter and 20 cm in length. Throughout the energy range of simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst ±5%) with MCNP/DLC-146 in the entire region of 1-10 cm and with published EGS4 data up to 5 cm. The dose at 1 cm (or dose rate constant) of PENELOPE agreed with MCNP/DLC-146 and EGS4 data within approximately ±2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV.
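The benchmark boils down to point-by-point percent differences between radial dose curves, judged against the quoted tolerances. A sketch of that bookkeeping (the dose values below are placeholders, not the paper's data):

import numpy as np

r = np.arange(1, 11)  # radial distance, cm
# Hypothetical radial dose curves (arbitrary units): inverse-square
# falloff with attenuation, plus a small synthetic discrepancy.
dose_penelope = np.exp(-0.15 * r) / r**2
dose_mcnp = dose_penelope * (1 + np.random.default_rng(0).normal(0, 0.02, r.size))

pct_diff = 100 * (dose_mcnp - dose_penelope) / dose_penelope
for ri, d in zip(r, pct_diff):
    flag = "within tolerance" if abs(d) <= 5 else "disagrees"  # worst case ±5%
    print(f"r = {ri:2d} cm   diff = {d:+.2f}%   {flag}")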
Color coding of control room displays: the psychocartography of visual layering effects.
Van Laar, Darren; Deshe, Ofer
2007-06-01
To evaluate which of three color coding methods (monochrome, maximally discriminable, and visual layering) used to code four types of control room display format (bars, tables, trend, mimic) was superior in two classes of task (search, compare). It has recently been shown that color coding of visual layers, as used in cartography, may be used to color code any type of information display, but this has yet to be fully evaluated. Twenty-four people took part in a 2 (task) × 3 (coding method) × 4 (format) wholly repeated-measures design. The dependent variables assessed were target location reaction time, error rates, workload, and subjective feedback. Overall, the visual layers coding method produced significantly faster reaction times than did the maximally discriminable and the monochrome methods for both the search and compare tasks. No significant difference in errors was observed between conditions for either task type. Significantly less perceived workload was experienced with the visual layers coding method, which was also rated more highly than the other coding methods on a 14-item visual display quality questionnaire. The visual layers coding method is superior to other color coding methods for control room displays when the method supports the user's task. The visual layers color coding method has wide applicability to the design of all complex information displays utilizing color coding, from the most maplike (e.g., air traffic control) to the most abstract (e.g., abstracted ecological display).
The Environment-Power System Analysis Tool development program [for spacecraft power supplies]
NASA Technical Reports Server (NTRS)
Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.
1989-01-01
The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.
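The architecture described — a data dictionary supervising displays that refresh automatically when a parameter changes — is essentially the observer pattern. A minimal sketch of that idea (inferred from the abstract; not EPSAT source code):

class DataDictionary:
    # Central supervisor: parameter changes are pushed to all registered views.
    def __init__(self):
        self._params = {}
        self._observers = []

    def register(self, observer):
        self._observers.append(observer)

    def set(self, name, value):
        self._params[name] = value
        for obs in self._observers:
            obs.refresh(name, value)

class TableView:
    def refresh(self, name, value):
        print(f"[table] {name} -> {value}")

class PlotView:
    def refresh(self, name, value):
        print(f"[plot ] re-plotting with {name} = {value}")

dd = DataDictionary()
dd.register(TableView())
dd.register(PlotView())
dd.set("neutral_density", 1.2e14)  # hypothetical environment-model parameter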
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.
This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, two-phase flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.
Please Like Me: Facebook and Public Health Communication.
Kite, James; Foley, Bridget C; Grunseit, Anne C; Freeman, Becky
2016-01-01
Facebook, the most widely used social media platform, has been adopted by public health organisations for health promotion and behaviour change campaigns and activities. However, limited information is available on the most effective and efficient use of Facebook for this purpose. This study sought to identify the features of Facebook posts that are associated with higher user engagement on Australian public health organisations' Facebook pages. We selected 20 eligible pages through a systematic search and coded 360 days of posts for each page. Posts were coded by post type (e.g., photo, text only), communication technique employed (e.g., testimonial, informative) and use of marketing elements (e.g., branding, use of mascots). A series of negative binomial regressions were used to assess associations between post characteristics and user engagement as measured by the number of likes, shares and comments. Our results showed that video posts attracted the greatest amount of user engagement, although an analysis of a subset of the data suggested this may be a reflection of the Facebook algorithm, which governs what is and is not shown in user newsfeeds and appears to favour videos over other post types. Posts that featured a positive emotional appeal or provided factual information attracted higher levels of user engagement, while conventional marketing elements, such as sponsorships and the use of persons of authority, generally discouraged user engagement, with the exception of posts that included a celebrity or sportsperson. Our results give insight into post content that maximises user engagement and begins to fill the knowledge gap on effective use of Facebook by public health organisations.
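Negative binomial regression is the standard choice for over-dispersed count outcomes such as likes and shares. A sketch of the model family with statsmodels, on made-up post-level data (not the study's dataset; the dispersion parameter alpha is set arbitrarily):

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

posts = pd.DataFrame({
    "likes":     [120, 8, 45, 300, 12, 60, 250, 5],
    "post_type": ["video", "text", "photo", "video",
                  "text", "photo", "video", "text"],
    "emotional": [1, 0, 0, 1, 0, 1, 1, 0],  # positive emotional appeal coded 0/1
})

# Negative binomial GLM: engagement counts regressed on post characteristics.
model = smf.glm(
    "likes ~ C(post_type) + emotional",
    data=posts,
    family=sm.families.NegativeBinomial(alpha=1.0),
).fit()
print(model.summary())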
Digital Controller For Emergency Beacon
NASA Technical Reports Server (NTRS)
Ivancic, William D.
1990-01-01
Prototype digital controller intended for use in 406-MHz emergency beacon. Undergoing development according to international specifications, 406-MHz emergency beacon system includes satellites providing worldwide monitoring of beacons, with Doppler tracking to locate each beacon within 5 km. Controller turns beacon on and off and generates binary codes identifying source (e.g., ship, aircraft, person, or vehicle on land). Codes transmitted by phase modulation. Knowing code, monitor attempts to communicate with user and uses code information to dispatch rescue team appropriate to type and location of carrier.
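The core of the scheme is an identification code keyed into a phase-modulated carrier. A baseband sketch (the 24-bit ID, sample rate, and carrier frequency are stand-ins; real 406-MHz messages follow the COSPAS-SARSAT format, which is not reproduced here, though ±1.1 rad is the nominal phase deviation):

import numpy as np

ident = 0xA5F3C2  # hypothetical 24-bit beacon ID
bits = [(ident >> i) & 1 for i in range(23, -1, -1)]

fs, f0, bit_rate = 48_000, 1_200, 400  # Hz; baseband stand-ins
t_bit = np.arange(int(fs / bit_rate)) / fs

# Phase modulation: bit 0 -> +1.1 rad offset, bit 1 -> -1.1 rad offset.
signal = np.concatenate([
    np.sin(2 * np.pi * f0 * t_bit + (1.1 if b == 0 else -1.1))
    for b in bits
])
print(f"{len(bits)} bits modulated into {signal.size} samples")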
User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E)
Larentzos, James P.; et al. (U.S. Army Research Laboratory, Aberdeen Proving Ground, MD 21005-5069; report ARL-SR-290; dates covered: September 2013-February 2014)
2014-06-01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Yi; Fakcharoenphol, Perapon; Wang, Shihao
2013-12-01
TOUGH2-EGS-MP is a parallel numerical simulation program coupling geomechanics with fluid and heat flow in fractured and porous media, and is applicable for simulation of enhanced geothermal systems (EGS). TOUGH2-EGS-MP is based on the TOUGH2-MP code, the massively parallel version of TOUGH2. In TOUGH2-EGS-MP, the fully-coupled flow-geomechanics model is developed from linear elastic theory for thermo-poro-elastic systems and is formulated in terms of mean normal stress as well as pore pressure and temperature. Reservoir rock properties such as porosity and permeability depend on rock deformation, and the relationships between these two, obtained from poro-elasticity theories and empirical correlations, are incorporated into the simulation. This report provides the user with detailed information on the TOUGH2-EGS-MP mathematical model and instructions for using it for Thermal-Hydrological-Mechanical (THM) simulations. The mathematical model includes the fluid and heat flow equations, geomechanical equation, and discretization of those equations. In addition, the parallel aspects of the code, such as domain partitioning and communication between processors, are also included. Although TOUGH2-EGS-MP has the capability for simulating fluid and heat flows coupled with geomechanical effects, it is up to the user to select the specific coupling process, such as THM or only TH, in a simulation. There are several example problems illustrating applications of this program. These example problems are described in detail and their input data are presented. Their results demonstrate that this program can be used for field-scale geothermal reservoir simulation in porous and fractured media with fluid and heat flow coupled with geomechanical effects.
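One widely used empirical form for the stress dependence mentioned above (Rutqvist-and-Tsang-type exponential laws) ties porosity to effective mean stress and permeability to the porosity change. A sketch under the assumption that TOUGH2-EGS-MP uses correlations of this general shape (the constants are illustrative, not the code's defaults):

import numpy as np

def porosity(phi_r, phi_0, sigma_mean, c=5e-8):
    # Stress-dependent porosity: phi_r is residual porosity, phi_0 the
    # zero-stress porosity, sigma_mean the effective mean stress [Pa].
    return phi_r + (phi_0 - phi_r) * np.exp(-c * sigma_mean)

def permeability(k0, phi, phi_0, gamma=22.2):
    # Permeability coupled exponentially to the relative porosity change.
    return k0 * np.exp(gamma * (phi / phi_0 - 1.0))

sigma = np.array([0.0, 10e6, 30e6])  # Pa
phi = porosity(phi_r=0.02, phi_0=0.10, sigma_mean=sigma)
k = permeability(k0=1e-15, phi=phi, phi_0=0.10)
for s, p, kk in zip(sigma, phi, k):
    print(f"sigma = {s:8.1e} Pa   phi = {p:.4f}   k = {kk:.3e} m^2")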
Hamann, Cara J; Peek-Asa, Corinne
2017-05-01
Among roadway users, bicyclists are considered vulnerable due to their high risk for injury when involved in a crash. Little is known about the circumstances leading to near crashes, crashes, and related injuries or how these vary by age and gender. The purpose of this study was to examine the rates and characteristics of safety-relevant events (crashes, near crashes, errors, and traffic violations) among adult and child bicyclists. Bicyclist trips were captured using Pedal Portal, a data acquisition and coding system which includes a GPS-enabled video camera and graphical user interface. A total of 179 safety-relevant events were manually coded from trip videos. Overall, child errors and traffic violations occurred at a rate of 1.9 per 100 min of riding, compared to 6.3 for adults. However, children rode on the sidewalk 56.4% of the time, compared with 12.7% for adults. For both adults and children, the highest safety-relevant event rates occurred on paved roadways with no bicycle facilities present (adults = 8.6 and children = 7.2 per 100 min of riding). Our study, the first naturalistic study to compare safety-relevant events among adults and children, indicates large variation in riding behavior and exposure between child and adult bicyclists. The majority of identified events were traffic violations, and we were not able to code all risk-relevant data (e.g., subtle avoidance behaviors, failure to check for traffic, probability of collision). Future naturalistic cycling studies would benefit from enhanced instrumentation (e.g., additional camera views) and coding protocols able to fill these gaps.
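The reported rates are exposure-normalised counts. The arithmetic, with hypothetical raw counts chosen only to reproduce the quoted rates:

def rate_per_100min(events, riding_minutes):
    # Exposure-normalised event rate, as reported in the study.
    return 100.0 * events / riding_minutes

print(rate_per_100min(events=19, riding_minutes=1000))  # 1.9 (children)
print(rate_per_100min(events=63, riding_minutes=1000))  # 6.3 (adults)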
Rcount: simple and flexible RNA-Seq read counting.
Schmid, Marc W; Grossniklaus, Ueli
2015-02-01
Analysis of differential gene expression by RNA sequencing (RNA-Seq) is frequently done using feature counts, i.e. the number of reads mapping to a gene. However, commonly used count algorithms (e.g. HTSeq) do not address the problem of reads aligning with multiple locations in the genome (multireads) or reads aligning with positions where two or more genes overlap (ambiguous reads). Rcount specifically addresses these issues. Furthermore, Rcount allows the user to assign priorities to certain feature types (e.g. higher priority for protein-coding genes compared to rRNA-coding genes) or to add flanking regions. Rcount provides a fast and easy-to-use graphical user interface requiring no command line or programming skills. It is implemented in C++ using the SeqAn (www.seqan.de) and Qt (qt-project.org) libraries. Source code and 64-bit binaries for (Ubuntu) Linux, Windows (7) and Mac OS X are released under the GPLv3 license and are freely available on github.com/MWSchmid/Rcount. Contact: marcschmid@gmx.ch. Test data, genome annotation files, useful Python and R scripts and a step-by-step user guide (including run-time and memory usage tests) are available on github.com/MWSchmid/Rcount.
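The priority scheme can be pictured as picking, for each ambiguous read, the highest-priority feature it overlaps. A sketch of that idea in Python (Rcount itself is C++; the priority values and record layout here are invented for illustration):

FEATURE_PRIORITY = {"protein_coding": 3, "lincRNA": 2, "rRNA": 1}

def assign_read(overlapping_features):
    # Pick the highest-priority overlapped feature; None on an unresolved tie.
    if not overlapping_features:
        return None
    ranked = sorted(overlapping_features,
                    key=lambda f: FEATURE_PRIORITY.get(f["type"], 0),
                    reverse=True)
    if len(ranked) > 1 and (FEATURE_PRIORITY.get(ranked[0]["type"], 0)
                            == FEATURE_PRIORITY.get(ranked[1]["type"], 0)):
        return None  # still ambiguous even after prioritisation
    return ranked[0]["gene"]

read_hits = [{"gene": "GeneA", "type": "protein_coding"},
             {"gene": "rRNA-1", "type": "rRNA"}]
print(assign_read(read_hits))  # GeneA: protein-coding outranks rRNA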
NASA Technical Reports Server (NTRS)
Suhs, Norman E.; Dietz, William E.; Rogers, Stuart E.; Nash, Steven M.; Onufer, Jeffrey T.
2000-01-01
PEGASUS 5.1 is the latest version of the PEGASUS series of mesh interpolation codes. It is a fully three-dimensional code. The main purpose for the development of this latest version was to significantly decrease the number of user inputs required and to allow for easier operation of the code. This guide is to be used with the user's manual for version 4 of PEGASUS. The basic methods used in both versions are described in the Version 4 manual. A complete list of all user inputs used in version 5.1 is given in this guide.
Web Services Provide Access to SCEC Scientific Research Application Software
NASA Astrophysics Data System (ADS)
Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.
2003-12-01
Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
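The "web servlet" wrapping described above amounts to putting an HTTP front end on an unmodified command-line code. A minimal standard-library sketch (the wrapped command, echo, stands in for a real C/Fortran executable; none of the SCEC service endpoints are shown):

import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

class CodeWrapper(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        params = json.loads(self.rfile.read(length))
        # Run the legacy executable unmodified, passing user parameters.
        result = subprocess.run(["echo", params.get("input", "")],
                                capture_output=True, text=True)
        body = json.dumps({"stdout": result.stdout}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CodeWrapper).serve_forever()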
Environmental Policy Tools: A User’s Guide.
1995-09-01
[Recovered abbreviation-list fragments: RTUs, Remote Terminal Units; EG&S, Environmental Goods and Services; SAB, (EPA) Science Advisory Board; SCAQMD, South Coast Air Quality Management District; EPA, Environmental Protection Agency; EPCRA, Emergency Planning and Community Right-To-Know Act; SIC, Standard Industrial Code; SIP, State Implementation Plan. The remainder of this excerpt is interleaved column text and is not recoverable.]
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2010-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.
2013-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2009-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
Xyce parallel electronic simulator: users' guide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.
2011-05-01
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; (2) improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and (4) object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.
Tensoral for post-processing users and simulation authors
NASA Technical Reports Server (NTRS)
Dresselhaus, Eliot
1993-01-01
The CTR post-processing effort aims to make turbulence simulations and data more readily and usefully available to the research and industrial communities. The Tensoral language, which provides the foundation for this effort, is introduced here in the form of a user's guide. The Tensoral user's guide is presented in two main sections. Section one acts as a general introduction and guides database users who wish to post-process simulation databases. Section two gives a brief description of how database authors and other advanced users can make simulation codes and/or the databases they generate available to the user community via Tensoral database back ends. The two-part structure of this document conforms to the two-level design structure of the Tensoral language. Tensoral has been designed to be a general computer language for performing tensor calculus and statistics on numerical data. Tensoral's generality allows it to be used for stand-alone native coding of high-level post-processing tasks (as described in section one of this guide). At the same time, Tensoral's specialization to a minute task (namely, to numerical tensor calculus and statistics) allows it to be easily embedded into applications written partly in Tensoral and partly in other computer languages (here, C and Vectoral). Embedded Tensoral, aimed at advanced users for more general coding (e.g. of efficient simulations, for interfacing with pre-existing software, for visualization, etc.), is described in section two of this guide.
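For readers unfamiliar with the problem domain, the kind of computation Tensoral expresses is tensor calculus and statistics over simulation fields. The same sort of task written in plain numpy for illustration (this is not Tensoral syntax):

import numpy as np

n = 32
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
# Synthetic incompressible velocity field on a periodic grid.
u = np.stack([np.sin(X) * np.cos(Y), -np.cos(X) * np.sin(Y), np.zeros_like(Z)])

# Velocity-gradient tensor du_i/dx_j, shape (3, 3, n, n, n).
dx = x[1] - x[0]
grad = np.stack([np.stack([np.gradient(u[i], dx, axis=j) for j in range(3)])
                 for i in range(3)])

# Volume-averaged second-order statistics <u_i u_j>.
stats = np.einsum("ixyz,jxyz->ij", u, u) / n**3
print(stats)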
Comparison of EGS4 and MCNP Monte Carlo codes when calculating radiotherapy depth doses.
Love, P A; Lewis, D G; Al-Affan, I A; Smith, C W
1998-05-01
The Monte Carlo codes EGS4 and MCNP have been compared when calculating radiotherapy depth doses in water. The aims of the work were to study (i) the differences between calculated depth doses in water for a range of monoenergetic photon energies and (ii) the relative efficiency of the two codes for different electron transport energy cut-offs. The depth doses from the two codes agree with each other within the statistical uncertainties of the calculations (1-2%). The relative depth doses also agree with data tabulated in the British Journal of Radiology Supplement 25. A discrepancy in the dose build-up region may be attributed to the different electron transport algorithms used by EGS4 and MCNP. This discrepancy is considerably reduced when the improved electron transport routines are used in the latest (4B) version of MCNP. Timing calculations show that EGS4 is at least 50% faster than MCNP for the geometries used in the simulations.
Implementing TCP/IP and a socket interface as a server in a message-passing operating system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipp, E.; Wiltzius, D.
1990-03-01
The UNICOS 4.3BSD network code and socket transport interface are the basis of an explicit network server for NLTSS, a message-passing operating system on the Cray Y-MP. A BSD socket user library provides access to the network server using an RPC mechanism. The advantages of this server methodology are its modularity and extensibility to migrate to future protocol suites (e.g., OSI) and transport interfaces. In addition, the network server is implemented in an explicit multi-tasking environment to take advantage of the Cray Y-MP multi-processor platform. 19 refs., 5 figs.
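The design separates a transport-owning server from a thin client-side "user library" that hides the messaging behind ordinary calls. A toy Python analogue of that split (generic sockets only; NLTSS/UNICOS specifics are not shown):

import socket
import threading
import time

def serve(host="127.0.0.1", port=5050):
    # The "network server": owns the socket and answers one request.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"ack: " + data)  # trivial stand-in for an RPC reply

def client_call(payload, host="127.0.0.1", port=5050):
    # The "user library": hides the transport behind one function call.
    with socket.create_connection((host, port)) as s:
        s.sendall(payload)
        return s.recv(1024)

threading.Thread(target=serve, daemon=True).start()
time.sleep(0.2)  # crude wait for the server to start listening
print(client_call(b"hello"))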
An Open-source Community Web Site To Support Ground-Water Model Testing
NASA Astrophysics Data System (ADS)
Kraemer, S. R.; Bakker, M.; Craig, J. R.
2007-12-01
A community wiki web site has been created as a resource to support ground-water model development and testing. The Groundwater Gourmet wiki is a repository for user-supplied analytical and numerical recipes, how-tos, and examples. Members are encouraged to submit analytical solutions, including source code and documentation. A diversity of code snippets is sought in a variety of languages, including Fortran, C, C++, Matlab, and Python. In the spirit of a wiki, all contributions may be edited and altered by other users, and open-source licensing is promoted. Community-accepted contributions are graduated into the library of analytic solutions and organized into either a Strack (Groundwater Mechanics, 1989) or Bruggeman (Analytical Solutions of Geohydrological Problems, 1999) classification. The examples section of the wiki is meant to include laboratory experiments (e.g., Hele-Shaw), classical benchmark problems (e.g., the Henry problem), and controlled field experiments (e.g., the Borden landfill and Cape Cod tracer tests). Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.
User input verification and test driven development in the NJOY21 nuclear data processing code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trainer, Amelia Jo; Conlin, Jeremy Lloyd; McCartney, Austin Paul
Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides input validation to check user input. By providing rapid and helpful responses to users while writing input files, NJOY21 will prove to be more intuitive and easy to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
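Input validation plus strictly increasing test coverage suggests a workflow like the following: a validator that rejects malformed input cards with helpful messages, exercised by unit tests. A sketch (the card fields and bounds are invented; they are not NJOY21's actual input schema):

def validate_card(card: dict) -> list:
    # Return a list of human-readable problems; empty means the card is valid.
    errors = []
    if card.get("mat") is None or not (1 <= card["mat"] <= 9999):
        errors.append("mat: material number must be in 1..9999")
    if card.get("temp", -1.0) <= 0.0:
        errors.append("temp: temperature [K] must be positive")
    return errors

def test_validator_rejects_bad_temperature():
    assert "temp: temperature [K] must be positive" in \
        validate_card({"mat": 125, "temp": -300.0})

def test_validator_accepts_good_card():
    assert validate_card({"mat": 125, "temp": 293.6}) == []

test_validator_rejects_bad_temperature()
test_validator_accepts_good_card()
print("all validator tests pass")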
A Picture is Worth 1,000 Words. The Use of Clinical Images in Electronic Medical Records.
Ai, Angela C; Maloney, Francine L; Hickman, Thu-Trang; Wilcox, Allison R; Ramelson, Harley; Wright, Adam
2017-07-12
To understand how clinicians utilize image uploading tools in a homegrown electronic health records (EHR) system. A content analysis of patient notes containing non-radiological images from the EHR was conducted. Images from 4,000 random notes from July 1, 2009-June 30, 2010 were reviewed and manually coded. Codes were assigned to four properties of the image: (1) image type, (2) role of image uploader (e.g. MD, NP, PA, RN), (3) practice type (e.g. internal medicine, dermatology, ophthalmology), and (4) image subject. 3,815 images from image-containing notes stored in the EHR were reviewed and manually coded. Of those images, 32.8% were clinical and 66.2% were non-clinical. The most common types of clinical image were photographs (38.0%), diagrams (19.1%), and scanned documents (14.4%). MDs uploaded 67.9% of clinical images, followed by RNs with 10.2% and genetic counselors with 6.8%. Dermatology (34.9%), ophthalmology (16.1%), and general surgery (10.8%) uploaded the most clinical images. The content of clinical images referencing body parts varied, with 49.8% of those images focusing on the head and neck region, 15.3% on the thorax, and 13.8% on the lower extremities. The diversity of image types, content, and uploaders within a homegrown EHR system reflected the versatility and importance of the image uploading tool. Understanding how users utilize image uploading tools in a clinical setting highlights important considerations for designing better EHR tools and underscores the importance of interoperability between EHR systems and other health technology.
Chiavassa, S; Lemosquet, A; Aubineau-Lanièce, I; de Carlan, L; Clairand, I; Ferrer, L; Bardiès, M; Franck, D; Zankl, M
2005-01-01
This paper aims at comparing dosimetric assessments performed with three Monte Carlo codes: EGS4, MCNP4c2 and MCNPX2.5e, using a realistic voxel phantom, namely the Zubal phantom, in two configurations of exposure. The first one deals with an external irradiation corresponding to the example of a radiological accident. The results are obtained using the EGS4 and the MCNP4c2 codes and expressed in terms of the mean absorbed dose (in Gy per source particle) for brain, lungs, liver and spleen. The second one deals with an internal exposure corresponding to the treatment of a medullary thyroid cancer by 131I-labelled radiopharmaceutical. The results are obtained by EGS4 and MCNPX2.5e and compared in terms of S-values (expressed in mGy per kBq and per hour) for liver, kidney, whole body and thyroid. The results of these two studies are presented and differences between the codes are analysed and discussed.
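The S-values compared above plug into the MIRD bookkeeping: the dose to a target organ is the sum over source organs of cumulated activity times S(target <- source). A sketch with placeholder numbers (not the paper's Monte Carlo results):

# S-values in mGy per (kBq h): S[target][source]; placeholder magnitudes.
S = {
    "thyroid": {"thyroid": 1.5e-1, "whole_body": 2.0e-4},
    "liver":   {"thyroid": 3.0e-5, "whole_body": 1.8e-4},
}
cumulated_activity = {"thyroid": 800.0, "whole_body": 5000.0}  # kBq h

for target, row in S.items():
    dose = sum(row[src] * cumulated_activity[src] for src in row)
    print(f"{target}: absorbed dose = {dose:.3f} mGy")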
NASA Astrophysics Data System (ADS)
Cook, S. J.
2009-05-01
Aquarius is a Windows application that models fluid flow and heat transport under conditions in which fluid buoyancy can significantly impact patterns and magnitudes of fluid flow. The package is designed as a visualization tool through which users can examine flow systems in environments, both low temperature aquifers and regions with elevated PT regimes such as deep sedimentary basins, hydrothermal systems, and contact thermal aureoles. The package includes 4 components: (1) A finite-element mesh generator/assembler capable of representing complex geologic structures. Left-hand, right-hand and alternating linear triangles can be mixed within the mesh. Planar horizontal, planar vertical and cylindrical vertical coordinate sections are supported. (2) A menu-selectable system for setting properties and boundary/initial conditions. The design retains mathematical terminology for all input parameters such as scalars (e.g., porosity), tensors (e.g., permeability), and boundary/initial conditions (e.g., fixed potential). This makes the package an effective instructional aid by linking model requirements with the underlying mathematical concepts of partial differential equations and the solution logic of boundary/initial value problems. (3) Solution algorithms for steady-state and time-transient fluid flow/heat transport problems. For all models, the nonlinear global matrix equations are solved sequentially using over-relaxation techniques. Matrix storage design allows for large (e.g., 20000) element models to run efficiently on a typical PC. (4) A plotting system that supports contouring nodal data (e.g., head), vector plots for flux data (e.g., specific discharge), and colour gradient plots for elemental data (e.g., porosity), water properties (e.g., density), and performance measures (e.g., Peclet numbers). Display graphics can be printed or saved in standard graphic formats (e.g., jpeg). This package was developed from procedural codes in C written originally to model the hydrothermal flow system responsible for contact metamorphism of Utah's Alta Stock (Cook et al., AJS 1997). These codes were reprogrammed in Microsoft C# to take advantage of object oriented design and the capabilities of Microsoft's .NET framework. The package is available at no cost by e-mail request from the author.
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob.
Comparisons between MCNP, EGS4 and experiment for clinical electron beams.
Jeraj, R; Keall, P J; Ostwald, P M
1999-03-01
Understanding the limitations of Monte Carlo codes is essential in order to avoid systematic errors in simulations, and to suggest further improvement of the codes. MCNP and EGS4, Monte Carlo codes commonly used in medical physics, were compared and evaluated against electron depth dose data and experimental backscatter results obtained using clinical radiotherapy beams. Different physical models and algorithms used in the codes give significantly different depth dose curves and electron backscattering factors. The default version of MCNP calculates electron depth dose curves which are too penetrating. The MCNP results agree better with experiment if the ITS-style energy-indexing algorithm is used. EGS4 underpredicts electron backscattering for high-Z materials. The results slightly improve if optimal PRESTA-I parameters are used. MCNP simulates backscattering well even for high-Z materials. To conclude the comparison, a timing study was performed. EGS4 is generally faster than MCNP and use of a large number of scoring voxels dramatically slows down the MCNP calculation. However, use of a large number of geometry voxels in MCNP only slightly affects the speed of the calculation.
NASA Astrophysics Data System (ADS)
Menthe, R. W.; McColgan, C. J.; Ladden, R. M.
1991-05-01
The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.
Calculation of Dose for Skyshine Radiation From a 45 MeV Electron LINAC
NASA Astrophysics Data System (ADS)
Hori, M.; Hikoji, M.; Takahashi, H.; Takahashi, K.; Kitaichi, M.; Sawamura, S.; Nojiri, I.
1996-11-01
Dose estimation for skyshine plays an important role in the evaluation of the environment around nuclear facilities. We performed calculations of the skyshine radiation from a Hokkaido University 45 MeV linear accelerator using a general-purpose user's version of the EGS4 Monte Carlo code. To verify the accuracy of the code, the simulation results were compared with our experimental results, in which a gated counting method was used to measure low-level pulsed leakage radiation. In the experiment, measurements were carried out up to 600 m away from the LINAC. The simulation results are consistent with the experimental values at distances between 100 and 400 m from the LINAC. However, the agreement within 100 m of the LINAC is not as good, because of the simplification of the geometrical modeling in the simulation. It could be said that it is useful to apply this version to skyshine calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fried, L.E.
1994-06-24
CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0, and a host of others will be implemented in the future. In CHEETAH 1.0 I have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. I find that CHEETAH solves a wider variety of problems with no user intervention (e.g., no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. I hope that CHEETAH makes the use of thermochemical codes more attractive to practical explosive formulators. In the future I plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phases. A kinetics capability will be added to the code that will predict reaction zone thickness. CHEETAH is currently a numerical implementation of C-J theory; it will become an implementation of ZND theory. Further ease-of-use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.
Demand access communications for TDRSS users
NASA Technical Reports Server (NTRS)
Zillig, David; Weinberg, Aaron; Mcomber, Robert
1994-01-01
The Tracking and Data Relay Satellite System (TDRSS) has long been used to provide reliable low- and high-data-rate relay services between user spacecraft in Earth orbit and the ground. To date, these TDRSS services have been implemented via prior scheduling based upon estimates of user needs and mission event timelines. While this approach may be necessary for large users that require greater amounts of TDRSS resources, TDRSS can potentially offer the planned community of smaller science missions (e.g., the small explorer missions), and other emerging users, the unique opportunity for services on demand. In particular, innovative application of the existing TDRSS Multiple Access (MA) subsystem, with its phased-array antenna, could be used to implement true demand access services without modification to either the TDRSS satellites or the user transponder, thereby introducing operational and performance benefits to both the user community and the Space Network. In this paper, candidate implementations of demand access service via the TDRSS MA subsystem are examined in detail. Both forward and return link services are addressed, and a combination of qualitative and quantitative assessments is provided. The paper also identifies further areas for investigation in this ongoing activity, which is being conducted by GSFC/Code 531 under the NASA Code O Advanced Systems Program.
Prosocial coping and substance use during pregnancy.
Blechman, E A; Lowell, E S; Garrett, J
1999-01-01
In structured interviews of pregnant inner-city residents, 38 substance users reported more current liking of drugs and polysubstance use, disengagement coping, depressive symptoms, negative affect, and antisocial behavior than did 45 nonusers. During videotaped interviews, trained observers coded less warmth and less prosocial information exchange (e.g., self-disclosure, question asking) among users. Factor analysis of measures of coping and its concomitants yielded a three-factor (prosocial, antisocial, asocial) solution, with asocial and antisocial coping predominating among substance users. These results suggest that coping has emotional, social, and cognitive elements. This study is the first to demonstrate an association between a substance-using lifestyle and limited prosocial information exchange.
The EGS4 Code System: Solution of Gamma-ray and Electron Transport Problems
DOE R&D Accomplishments Database
Nelson, W. R.; Namito, Yoshihito
1990-03-01
In this paper we present an overview of the EGS4 Code System -- a general purpose package for the Monte Carlo simulation of the transport of electrons and photons. During the last 10-15 years EGS has been widely used to design accelerators and detectors for high-energy physics. More recently the code has been found to be of tremendous use in medical radiation physics and dosimetry. The problem-solving capabilities of EGS4 will be demonstrated by means of a variety of practical examples. To facilitate this review, we will take advantage of a new add-on package, called SHOWGRAF, to display particle trajectories in complicated geometries. These are shown as 2-D laser pictures in the written paper and as photographic slides of a 3-D high-resolution color monitor during the oral presentation. 11 refs., 15 figs.
VizieR Online Data Catalog: Radiative forces for stellar envelopes (Seaton, 1997)
NASA Astrophysics Data System (ADS)
Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.
2000-02-01
(1) Primary data files, stages.zz. These files give data for the calculation of radiative accelerations, GRAD, for elements with nuclear charge zz. Data are available for zz = 06, 07, 08, 10, 11, 12, 13, 14, 16, 18, 20, 24, 25, 26 and 28. Calculations are made using data from the Opacity Project (see papers SYMP and IXZ). The data are given for each ionisation stage, j. They are tabulated on a mesh of (T, Ne, CHI), where T is temperature, Ne is electron density and CHI is an abundance multiplier. The files include data for ionisation fractions for each (T, Ne). The file contents are described in the paper ACC and as comments in the code add.f. (2) Code add.f. This reads a file stages.zz and creates a file acc.zz giving radiative accelerations averaged over ionisation stages. The code prompts for the names of the input and output files. The code, as provided, gives equal weights (as defined in the paper ACC) to all stages. The weights are set in SUBROUTINE WEIGHTS, which can be changed to give any weights preferred by the user. The dependence of diffusion coefficients on ionisation stage is given by a function ZET, which is defined in SUBROUTINE ZETA. The expressions used for ZET are as given in the paper; the user can change that subroutine if other expressions are preferred. The output file contains values, ZETBAR, of ZET averaged over ionisation stages. (3) Files acc.zz. Radiative accelerations computed using add.f as provided. The user will need to run the code add.f only if the subroutines WEIGHTS or ZETA are to be changed. The contents of the files acc.zz are described in the paper ACC and in comments contained in the code add.f. (4) Code accfit.f. This code gives radiative accelerations, and some related data, for a stellar model. The methods used to interpolate data to the values of (T, RHO) for the stellar model are based on those used in the code opfit.for (see the paper OPF). The executable file accfit.com runs accfit.f. It uses a list of files given in accfit.files (see that file for further description). The mesh used for the abundance multiplier CHI on the output file will generally be finer than that used in the input files acc.zz; the mesh to be used is specified in a file chi.dat. For a test run, the stellar model used is given in the file 10000_4.2 (Teff = 10000 K, LOG10(g) = 4.2); the output file from that test run is acc100004.2. The contents of the output file are described in the paper ACC and as comments in the code accfit.f. (5) Code diff.f. This code reads the output file (e.g., acc1000004.2) created by accfit.f. For any specified depth point in the model and value of CHI, it gives values of the radiative accelerations, the quantity ZETBAR required for the calculation of diffusion coefficients, and Rosseland-mean opacities. The code prompts for input data and creates a file recording all data calculated. The code diff.f is intended for incorporation, as a set of subroutines, in codes for diffusion calculations. (1 data file).
A manual for PARTI runtime primitives
NASA Technical Reports Server (NTRS)
Berryman, Harry; Saltz, Joel
1990-01-01
Primitives are presented that are designed to help users efficiently program irregular problems (e.g., unstructured mesh sweeps, sparse matrix codes, adaptive mesh partial differential equations solvers) on distributed memory machines. These primitives are also designed for use in compilers for distributed memory multiprocessors. Communications patterns are captured at runtime, and the appropriate send and receive messages are automatically generated.
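The inspector/executor idea behind such runtime primitives can be sketched in a few lines of Python. This is an illustrative toy only, with hypothetical function and variable names; it is not the PARTI API, and the "communication" is simulated with dictionaries rather than real message passing.

    # Hypothetical sketch of the inspector/executor pattern that PARTI-style
    # primitives implement; names are illustrative, not PARTI's actual API.

    def inspector(global_indices, ownership):
        """Pass 1: from the irregular access pattern, build a communication
        schedule mapping each owning process to the indices needed from it."""
        schedule = {}
        for g in global_indices:
            schedule.setdefault(ownership[g], []).append(g)
        return schedule

    def executor(schedule, remote_data):
        """Pass 2: reuse the schedule to 'gather' the needed off-process values
        (a real implementation would post MPI sends/receives here)."""
        fetched = {}
        for owner, idxs in schedule.items():
            for g in idxs:
                fetched[g] = remote_data[owner][g]
        return fetched

    # Toy data: 8 global elements distributed over 2 processes.
    ownership = {g: g // 4 for g in range(8)}      # elements 0-3 on rank 0, 4-7 on rank 1
    remote_data = {0: {g: 10 * g for g in range(4)},
                   1: {g: 10 * g for g in range(4, 8)}}
    needs = [6, 1, 7, 2]                           # irregular mesh-sweep access pattern
    sched = inspector(needs, ownership)            # schedule built once at runtime...
    print(executor(sched, remote_data))            # ...then reused on every sweep

The payoff of the pattern is that the schedule built once by the inspector is reused across many sweeps, which is what makes runtime preprocessing economical for irregular problems.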
User's manual for COAST 4: a code for costing and sizing tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sink, D. A.; Iwinski, E. M.
1979-09-01
The purpose of this report is to document the computer program COAST 4 for the user/analyst. COAST (COst And Size Tokamak reactors) provides complete and self-consistent size models for the engineering features of D-T burning tokamak reactors and associated facilities, covering a continuum of performance from highly beam-driven through ignited plasma devices. TNS (The Next Step) devices with no tritium breeding or electrical power production are handled, as well as power-producing and fissile-producing fusion-fission hybrid reactors. The code has been normalized with a TFTR calculation which is consistent with the cost, size, and performance data published in the conceptual design report for that device. Information on code development, computer implementation, and detailed user instructions is included in the text.
MuSim, a Graphical User Interface for Multiple Simulation Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Thomas; Cummings, Mary Anne; Johnson, Rolland
2016-06-01
MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Supported simulation codes are G4beamline, MAD-X, and MCNP, with more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.
GGRNA: an ultrafast, transcript-oriented search engine for genes and transcripts
Naito, Yuki; Bono, Hidemasa
2012-01-01
GGRNA (http://GGRNA.dbcls.jp/) is a Google-like, ultrafast search engine for genes and transcripts. The web server accepts arbitrary words and phrases, such as gene names, IDs, gene descriptions, annotations of gene and even nucleotide/amino acid sequences through one simple search box, and quickly returns relevant RefSeq transcripts. A typical search takes just a few seconds, which dramatically enhances the usability of routine searching. In particular, GGRNA can search sequences as short as 10 nt or 4 amino acids, which cannot be handled easily by popular sequence analysis tools. Nucleotide sequences can be searched allowing up to three mismatches, or the query sequences may contain degenerate nucleotide codes (e.g. N, R, Y, S). Furthermore, Gene Ontology annotations, Enzyme Commission numbers and probe sequences of catalog microarrays are also incorporated into GGRNA, which may help users to conduct searches by various types of keywords. GGRNA web server will provide a simple and powerful interface for finding genes and transcripts for a wide range of users. All services at GGRNA are provided free of charge to all users. PMID:22641850
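As a rough illustration of the kind of query GGRNA handles, the following Python sketch scans a transcript for a short degenerate nucleotide query with a mismatch allowance. It is a naive stand-in (the sequence, query and linear scan are all illustrative assumptions); the actual GGRNA server uses far more efficient indexing.

    # Illustrative only: a tiny mismatch-tolerant, IUPAC-aware scan of the kind
    # GGRNA performs server-side (its real indexing is far more sophisticated).
    IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
             "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
             "K": "GT", "M": "AC", "N": "ACGT"}

    def find_hits(transcript, query, max_mismatches=3):
        hits = []
        q = len(query)
        for i in range(len(transcript) - q + 1):
            mism = sum(base not in IUPAC[code]
                       for base, code in zip(transcript[i:i + q], query))
            if mism <= max_mismatches:
                hits.append((i, mism))        # (position, mismatch count)
        return hits

    # Query contains degenerate codes R (A/G) and N (any base).
    print(find_hits("GGCATTACGTACGGATTACA", "RTTACN", max_mismatches=1))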
Tools for Designing and Analyzing Structures
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Structural Design and Analysis Toolset is a collection of approximately 26 Microsoft Excel spreadsheet programs, each of which performs calculations within a different subdiscipline of structural design and analysis. These programs present input and output data in user-friendly, menu-driven formats. Although these programs cannot solve complex cases like those treated by larger finite element codes, they do yield quick solutions to numerous common problems more rapidly than the finite element codes, thereby making it possible to quickly perform multiple preliminary analyses - e.g., to establish approximate limits prior to detailed analyses by the larger finite element codes. These programs perform different types of calculations, as follows: 1. determination of geometric properties for a variety of standard structural components; 2. analysis of static, vibrational, and thermal-gradient loads and deflections in certain structures (mostly beams and, in the case of thermal gradients, mirrors); 3. kinetic energies of fans; 4. detailed analysis of stress and buckling in beams, plates, columns, and a variety of shell structures; and 5. temperature-dependent properties of materials, including figures of merit that characterize strength, stiffness, and deformation response to thermal gradients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
EMAM, M; Eldib, A; Lin, M
2014-06-01
Purpose: An in-house Monte Carlo based treatment planning system (MC TPS) has been developed for modulated electron radiation therapy (MERT). Our preliminary MERT planning experience called for a more user-friendly graphical user interface. The current work aimed to design graphical windows and tools to facilitate the contouring and planning process. Methods: Our in-house GUI MC TPS is built on a set of EGS4 user codes, namely MCPLAN and MCBEAM, in addition to an in-house optimization code named MCOPTIM. The patient virtual phantom is constructed using the tomographic images in DICOM format exported from clinical treatment planning systems (TPS). Treatment target volumes and critical structures were usually contoured on the clinical TPS and then sent as a structure set file. In our GUI program we developed a visualization tool to allow the planner to visualize the DICOM images and delineate the various structures. We implemented an option in our code for automatic contouring of the patient body and lungs. We also created an interface window displaying a three-dimensional representation of the target and a graphical representation of the treatment beams. Results: The new GUI features helped streamline the planning process. The implemented contouring option eliminated the need for performing this step on the clinical TPS. The auto-detection option for contouring the outer patient body and lungs was tested on patient CTs and was shown to agree well with the contours from the clinical TPS. The three-dimensional representation of the target and the beams allows better selection of the gantry, collimator, and couch angles. Conclusion: An in-house GUI program has been developed for more efficient MERT planning. The aiding tools implemented in the program are time saving and give better control of the planning process.
NASA Technical Reports Server (NTRS)
Rodal, J. J. A.; French, S. E.; Witmer, E. A.; Stagliano, T. R.
1979-01-01
The CIVM-JET 4C computer program for the 'finite strain' analysis of 2-D transient structural responses of complete or partial rings and beams subjected to fragment impact is stored on tape as a series of individual files. The subroutines found in each of these files are described in detail. All references to the CIVM-JET 4C program are made assuming that the user has a copy of NASA CR-134907 (ASRL TR 154-9), which serves as a user's guide to (1) the CIVM-JET 4B computer code and (2) the CIVM-JET 4C computer code 'with the use of the modified input instructions' attached hereto.
DYNA3D/ParaDyn Regression Test Suite Inventory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jerry I.
2016-09-01
The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of "feature" has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except for problems involving features only available in serial mode. Many are strictly regression tests acting as a check that the codes continue to produce adequately repeatable results as development unfolds, compilers change, and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcome to submit documented problems for inclusion in the test suite, especially if they heavily exercise, and depend upon, features that are currently underrepresented.
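To make the bookkeeping concrete, here is a toy Python sketch of such a coverage table (the test and feature names are hypothetical, not the actual DYNA3D suite): each test maps to the set of features it exercises, and features touched by too few tests are flagged as underrepresented.

    # Toy coverage audit: rows are test problems, columns are feature groups.
    suite = {
        "bar_impact": {"material_models", "contact"},
        "plate_bend": {"material_models", "shells"},
        "blast_box":  {"contact", "loads"},
    }
    features = ["material_models", "contact", "shells", "loads", "ale"]

    # Count how many tests exercise each feature group.
    counts = {f: sum(f in feats for feats in suite.values()) for f in features}
    underrepresented = [f for f, n in counts.items() if n < 2]

    print(counts)
    print("needs more tests:", underrepresented)   # e.g. shells, loads, ale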
Emergency general surgery: definition and estimated burden of disease.
Shafi, Shahid; Aboutanos, Michel B; Agarwal, Suresh; Brown, Carlos V R; Crandall, Marie; Feliciano, David V; Guillamondegui, Oscar; Haider, Adil; Inaba, Kenji; Osler, Turner M; Ross, Steven; Rozycki, Grace S; Tominaga, Gail T
2013-04-01
Acute care surgery encompasses trauma, surgical critical care, and emergency general surgery (EGS). While the first two components are well defined, the scope of EGS practice remains unclear. This article describes the work of the American Association for the Surgery of Trauma to define EGS. A total of 621 unique International Classification of Diseases-9th Rev. (ICD-9) diagnosis codes were identified using billing data (calendar year 2011) from seven large academic medical centers that practice EGS. A modified Delphi methodology was used by the American Association for the Surgery of Trauma Committee on Severity Assessment and Patient Outcomes to review these codes and achieve consensus on the definition of primary EGS diagnosis codes. National Inpatient Sample data from 2009 were used to develop a national estimate of EGS burden of disease. Several unique ICD-9 codes were identified as primary EGS diagnoses. These encompass a wide spectrum of general surgery practice, including upper and lower gastrointestinal tract, hepatobiliary and pancreatic disease, soft tissue infections, and hernias. National Inpatient Sample estimates revealed over 4 million inpatient encounters nationally in 2009 for EGS diseases. This article provides the first list of ICD-9 diagnoses codes that define the scope of EGS based on current clinical practices. These findings have wide implications for EGS workforce training, access to care, and research.
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
An automated computer time distribution system broadcasts standard time to users using computers and modems via ... contributed to delays: software platform (50% of the delay), transmission speed of time codes (25%), telephone network (15%), and modem and others (10%). The ... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in ...
Performance Analysis of Hybrid ARQ Protocols in a Slotted Code Division Multiple-Access Network
1989-08-01
... achieved by using a low-rate (r = 0.5), high-constraint-length (e.g., 32) punctured convolutional code. Code puncturing provides for a variable-rate code ... investigated the use of convolutional codes in Type II Hybrid ARQ protocols. The error ...
Sierra/SolidMechanics 4.48 User's Guide: Addendum for Shock Capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose
This is an addendum to the Sierra/SolidMechanics 4.48 User's Guide that documents additional capabilities available only in alternate versions of the Sierra/SolidMechanics (Sierra/SM) code. These alternate versions are enhanced to provide capabilities that are regulated under the U.S. Department of State's International Traffic in Arms Regulations (ITAR) export-control rules. The ITAR-regulated codes are only distributed to entities that comply with the ITAR export-control requirements. The ITAR enhancements to Sierra/SM include material models with an energy-dependent pressure response (appropriate for very large deformations and strain rates) and capabilities for blast modeling. Since this is an addendum to the standard Sierra/SM user's guide, please refer to that document first for general descriptions of code capability and use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Begovich, C.L.; Eckerman, K.F.; Schlatter, E.C.
1981-08-01
The DARTAB computer code combines radionuclide environmental exposure data with dosimetric and health effects data to generate tabulations of the predicted impact of radioactive airborne effluents. DARTAB is independent of the environmental transport code used to generate the environmental exposure data and of the codes used to produce the dosimetric and health effects data. Therefore, human dose and risk calculations need not be added to every environmental transport code. Options are included in DARTAB to permit the user to request tabulations by various topics (e.g., cancer site, exposure pathway, etc.) to facilitate characterization of the human health impacts of the effluents. The DARTAB code was written at ORNL for the US Environmental Protection Agency, Office of Radiation Programs.
A manual for PARTI runtime primitives, revision 1
NASA Technical Reports Server (NTRS)
Das, Raja; Saltz, Joel; Berryman, Harry
1991-01-01
Primitives are presented that are designed to help users efficiently program irregular problems (e.g., unstructured mesh sweeps, sparse matrix codes, adaptive mesh partial differential equations solvers) on distributed memory machines. These primitives are also designed for use in compilers for distributed memory multiprocessors. Communications patterns are captured at runtime, and the appropriate send and receive messages are automatically generated.
FY15 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2015-09-30
This report summarizes the current status of NEAMS neutronics activities in FY2015. The tasks this year are (1) to improve solution methods for steady-state and transient conditions, (2) to develop features and improve user-friendliness to increase the usability and applicability of the code, (3) to improve and verify the multigroup cross section generation scheme, (4) to perform verification and validation tests of the code using SFR and thermal reactor cores, and (5) to support early users of PROTEUS and update the user manuals.
Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Marquart, Jed E.
2005-01-01
The development and analysis of turbomachinery components for industrial and aerospace applications has been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi 3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady-state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced user's manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user's manuals will enable new users to become proficient with the codes, as well as reduce the dependency of new users on the code authors. In order to achieve the objectives listed, the following tasks were accomplished: 1) parametric study of preconditioning parameters and other code inputs; 2) code modifications to reduce runtimes; 3) investigation of compiler options to reduce code runtime; and 4) development/enhancement of user's manuals for Aardvark and Phantom.
Dynamic quality of service differentiation using fixed code weight in optical CDMA networks
NASA Astrophysics Data System (ADS)
Kakaee, Majid H.; Essa, Shawnim I.; Abd, Thanaa H.; Seyedzadeh, Saleh
2015-11-01
The emergence of network-driven applications, such as internet, video conferencing, and online gaming, brings the need for network environments capable of providing diverse Quality of Service (QoS). In this paper, a new code family of novel spreading sequences, called Multi-Service (MS) codes, has been constructed to support multiple services in Optical Code Division Multiple Access (CDMA) systems. The proposed method uses a fixed weight for all services while reducing the number of interfering codewords for users requiring higher QoS. The performance of the proposed code is demonstrated using mathematical analysis. It is shown that the total number of served users with a satisfactory BER of 10^-9 using NB = 2 is 82, while it is only 36 and 10 when NB = 3 and 4, respectively. The developed MS code is compared with variable-weight codes such as Variable-Weight Khazani-Syed (VW-KS) and Multi-Weight Random Diagonal (MW-RD) codes. Different numbers of basic users (NB) are used to support triple-play services (audio, data, and video) with different QoS requirements. Furthermore, with reference to BERs of 10^-12, 10^-9, and 10^-3 for video, data, and audio, respectively, the system can support up to 45 total users. Hence, the results show that the technique provides clear relative QoS differentiation, and that a lower number of basic users can support a larger number of subscribers as well as better performance in terms of an acceptable BER of 10^-9 at fixed code weight.
‘Sweeter Than a Swisher’: amount and themes of little cigar and cigarillo content on Twitter
Kostygina, Ganna; Tran, Hy; Shi, Yaru; Kim, Yoonsang; Emery, Sherry
2016-01-01
Objective Despite recent increases in little cigar and cigarillo (LCC) use—particularly among urban youth, African-Americans and Latinos—research on targeted strategies for marketing these products is sparse. Little is known about the amount or content of LCC messages users see or share on social media, a popular communication medium among youth and communities of colour. Methods Keyword rules were used to collect tweets related to LCCs from the Twitter Firehose posted in October 2014 and March–April 2015. Tweets were coded for promotional content, brand references, co-use with marijuana and subculture references (eg, rap/hip-hop, celebrity endorsements) and were classified as commercial and ‘organic’/non-commercial using a combination of machine learning methods, keyword algorithms and human coding. Metadata associated with each tweet were used to categorise users as influencers (1000 and more followers) and regular users (under 1000 followers). Results Keyword filters captured over 4 372 293 LCC tweets. Analyses revealed that 17% of account users posting about LCCs were influencers and 1% of accounts were overtly commercial. Influencers were more likely to mention LCC brands and post promotional messages. Approximately 83% of LCC tweets contained references to marijuana and 29% of tweets were memes. Tweets also contained references to rap/hip-hop lyrics and urban subculture. Conclusions Twitter is a major information-sharing and marketing platform for LCCs. Co-use of tobacco and marijuana is common and normalised on Twitter. The presence and broad reach of LCC messages on social media warrants urgent need for surveillance and serious attention from public health professionals and policymakers. Future tobacco use prevention initiatives should be adapted to ensure that they are inclusive of LCC use. PMID:27697951
CFL3D Version 6.4-General Usage and Aeroelastic Analysis
NASA Technical Reports Server (NTRS)
Bartels, Robert E.; Rumsey, Christopher L.; Biedron, Robert T.
2006-01-01
This document contains the course notes for the computational fluid dynamics code CFL3D version 6.4. It is intended to provide users, from basic to advanced, the information necessary to successfully use the code for a broad range of cases. Much of the course covers capability that has been a part of previous versions of the code, with material compiled from a CFL3D v5.0 manual and from the CFL3D v6 web site prior to the current release. This part of the material is presented for users of the code who are not familiar with computational fluid dynamics. There is new capability in CFL3D version 6.4 presented here that has not previously been published. There are also outdated features no longer used or recommended in recent releases of the code. The information offered here supersedes earlier manuals and updates outdated usage; where current usage supersedes that of older versions, this is noted. These course notes also provide hints for usage, code installation, and examples not found elsewhere.
NASA Astrophysics Data System (ADS)
Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens
2015-04-01
Recent investments in HPC, cloud, and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity, and cost of the new infrastructures mean any software deployed has to be reliable, trusted, and reusable. Increasingly, software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and, in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed, and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund its development, to gain credit for the effort, IP, time, and dollars spent, and will facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. Registration of a code should include information about licensing and the hardware environments it can run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g., Geoscientific Model Development) to assist users in knowing which codes to trust. 3) Referencing will be accomplished by linking the software framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, the benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse, and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
TFaNS Tone Fan Noise Design/Prediction System. Volume 2; User's Manual; 1.4
NASA Technical Reports Server (NTRS)
Topol, David A.; Eversman, Walter
1999-01-01
TFaNS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Lewis (presently NASA Glenn). The purpose of this system is to predict tone noise emanating from a fan stage, including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. These effects have been added to an existing annular duct/isolated stator noise prediction capability. TFaNS consists of: the codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files; CUP3D, the Fan Noise Coupling Code, which reads these files, solves the coupling problem, and outputs the desired noise predictions; and AWAKEN, the CFD/Measured Wake Postprocessor, which reformats CFD wake predictions and/or measured wake data so they can be used by the system. This volume of the report provides information on code input and file structure essential for potential users of TFaNS. The report is divided into three volumes: Volume 1, System Description, CUP3D Technical Documentation, and Manual for Code Developers; Volume 2, User's Manual, TFaNS Version 1.4; Volume 3, Evaluation of System Codes.
Re-Engineering the United States Marine Corps’ Enlisted Assignment Model (EAM)
1998-06-01
... and Execution button opens the switchboard in Figure 12. This form accesses all of the VBA code that is associated with this form and the code that ... the prototype to prompt the user and to inform him of what he is about to do. Each of the buttons on these forms is connected to an SQL ... into the field for building a rule are the values 2;3;4;5;6;7. The user can enter an SQL statement that would determine the values, or the user could ...
A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.
Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H
2001-03-01
The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180-degree-geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms, and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained using the experimental X-ray fluorescence system and the simulated spectral shape obtained using the two Monte Carlo codes was relatively poor. The main reason for this disagreement lies in a basic assumption shared by the two codes: both assume a "free" electron model for Compton interactions. This assumption underestimates the results and invalidates direct comparison of the predicted spectra with experiment.
Survey of computer programs for heat transfer analysis
NASA Technical Reports Server (NTRS)
Noor, A. K.
1982-01-01
An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS 2, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with limited user communities, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.
Near-line Archive Data Mining at the Goddard Distributed Active Archive Center
NASA Astrophysics Data System (ADS)
Pham, L.; Mack, R.; Eng, E.; Lynnes, C.
2002-12-01
NASA's Earth Observing System (EOS) is generating immense volumes of data, in some cases too much to provide to users with data-intensive needs. As an alternative to moving the data to the user and his/her research algorithms, we are providing a means to move the algorithms to the data. The Near-line Archive Data Mining (NADM) system is the Goddard Earth Sciences Distributed Active Archive Center's (GES DAAC) web data mining portal to the EOS Data and Information System (EOSDIS) data pool, a 50-TB online disk cache. The NADM web portal enables registered users to submit and execute data mining algorithm codes on the data in the EOSDIS data pool. A web interface allows the user to access the NADM system. Users first develop personalized data mining code on their home platforms and then upload it to the NADM system. The C, FORTRAN, and IDL languages are currently supported. The user-developed code is automatically audited for potential security problems before it is installed within the NADM system and made available to the user. Once the code has been installed, the user is provided a test environment where the execution of the software can be tested against data sets of the user's choosing. When the user is satisfied with the results, the code can be promoted to the "operational" environment. From here the user can interactively run the code on the data available in the EOSDIS data pool. The user can also set up a processing subscription, which will automatically process new data as they become available in the EOSDIS data pool. The generated mined data products are then made available for FTP pickup. The NADM system uses the GES DAAC-developed Simple Scalable Script-based Science Processor (S4P) to automate tasks and perform the actual data processing. Users also have the option of selecting a DAAC-provided data mining algorithm and using it to process the data of their choice.
A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis
Wu, Yiping; Liu, Shuguang; Yan, Wende
2014-01-01
Mathematical models are useful in various fields of science and engineering. However, it is a challenge to make a model take advantage of the open and growing set of functions (e.g., for model inversion) on the R platform, because doing so normally requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that aims to convert a model developed in any computer language into an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.
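The template/instruction cycle that such a coupler automates can be sketched as follows (shown in Python for brevity, with hypothetical file and executable names; the actual Model-R Coupler wraps the model as an R function): parameters are substituted into a template input file, the unmodified model executable is run, and its outputs are read back.

    # A minimal sketch, assuming PEST-style @name@ placeholders in the template
    # and a one-value-per-line output file; all names here are hypothetical.
    import subprocess

    def run_model(params, template="model.in.tpl", infile="model.in",
                  exe="./model", outfile="model.out"):
        text = open(template).read()
        for name, value in params.items():       # substitute @K@, @n@, ...
            text = text.replace(f"@{name}@", str(value))
        open(infile, "w").write(text)
        subprocess.run([exe], check=True)         # the model stays a black box
        return [float(line.split()[-1]) for line in open(outfile)]

    # An optimizer (in R, via the coupler, or any caller) can now treat
    # run_model as a plain function of the parameters:
    # sim = run_model({"K": 0.35, "n": 2.1})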
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebert, D.
1997-07-01
This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA, November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal-hydraulic code development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming languages, code architectures, and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory; (b) preserve the ability to use the existing investment in plant transient analysis codes; (c) maintain essential experimental capabilities; (d) develop advanced measurement capabilities to support future code validation work; (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs; (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability; and (g) more effectively utilize user experience in modifying and improving the codes.
Optimization Model for Web Based Multimodal Interactive Simulations.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2015-07-15
This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications, where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in an unsatisfactory reduction in the quality of the simulation and in user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation while satisfying application-specific constraints. Our approach includes three distinct phases: identification, optimization, and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. These data are utilized in conjunction with user-specified design requirements in the optimization phase to ensure the best possible computational resource allocation. The optimum solution is used for rendering parameters (e.g., texture size, canvas resolution) and simulation parameters (e.g., simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach. PMID:26085713
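The flavor of such a resource-allocation model can be conveyed with a tiny mixed integer program. The numbers, variable names, and the use of the open-source PuLP solver below are illustrative assumptions, not the authors' model: one binary variable per candidate texture size, maximizing a quality score subject to a hardware budget.

    # A minimal sketch of the idea, assuming made-up (quality, cost) scores.
    from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

    options = {256: (1.0, 2.0), 512: (2.5, 5.0), 1024: (4.0, 11.0)}

    prob = LpProblem("render_config", LpMaximize)
    x = {s: LpVariable(f"tex_{s}", cat=LpBinary) for s in options}

    prob += lpSum(options[s][0] * x[s] for s in options)        # maximize quality
    prob += lpSum(x.values()) == 1                              # pick exactly one size
    prob += lpSum(options[s][1] * x[s] for s in options) <= 8.0 # hardware budget

    prob.solve()
    chosen = [s for s in options if x[s].value() == 1]
    print("chosen texture size:", chosen)                       # [512] under this budget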
ERIC Educational Resources Information Center
Park, Insu
2010-01-01
The purpose of this study is to explore system users' behavior on IS under various circumstances (e.g., email usage and malware threats, online communication at the individual level, and IS usage in organizations). Specifically, the first essay develops a method for analyzing and predicting the impact category of malicious code, particularly…
Wang, R; Li, X A
2001-02-01
The dose parameters for the beta-particle-emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At larger distances from the source, noticeable differences are seen in the parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify as well as to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. The data calculated and compared include the depth-dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, reaching 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple-scattering theories and Monte Carlo models for electron transport adopted in the three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4; the two calculations agree within 5% for radial distances <6 mm.
User's manual: Subsonic/supersonic advanced panel pilot code
NASA Technical Reports Server (NTRS)
Moran, J.; Tinoco, E. N.; Johnson, F. T.
1978-01-01
Sufficient instructions for running the subsonic/supersonic advanced panel pilot code are provided. This software was developed as a vehicle for numerical experimentation and should not be construed to represent a finished production program. The pilot code is based on a higher-order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given, as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses an overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.
Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.
Uzun, Vassilya; Bilgin, Sami
2016-01-01
For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
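Generating such a tag is straightforward; the following sketch uses the open-source Python "qrcode" package (an assumption for illustration, not the software described in the study) to encode a hypothetical per-person record URL as a printable QR image.

    # Illustrative sketch only: encode a hypothetical identity-record URL.
    import qrcode

    record_url = "https://example.org/qr-id/PATIENT-12345"   # hypothetical URL/ID
    img = qrcode.make(record_url)     # returns a PIL image of the QR symbol
    img.save("identity_tag.png")      # print for a bracelet, necklace, or ID card

Any smartphone scanner then resolves the tag to the record page, where the server, not the code itself, enforces which details each class of user (public, paramedic, physician) may see.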
SysSon - A Framework for Systematic Sonification Design
NASA Astrophysics Data System (ADS)
Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns
2015-04-01
SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used - e.g., in climate science. Both technical and socio-cultural barriers have to be overcome. The approach was further developed with climate scientists, who participated in contextual inquiries, usability tests, and a collaborative design workshop. These extensive user tests led to our final software framework. As the frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.
Performance Analysis of New Binary User Codes for DS-CDMA Communication
NASA Astrophysics Data System (ADS)
Usha, Kamle; Jaya Sankar, Kottareddygari
2016-03-01
This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over the additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using gray and inverse gray codes. In this paper, an n-bit gray code appended by its n-bit inverse gray code to construct a 2n-length binary user code is discussed. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are also available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and gold codes. The performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over the AWGN channel is also discussed. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
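A minimal sketch of the stated construction, under one reading of the terms: the n-bit Gray code of the user index is concatenated with its n-bit "inverse Gray code", taken here to be the standard Gray-to-binary inverse transform (the paper may define the term differently). Chips are mapped to ±1 so correlations can be checked.

    # Hedged sketch: 2n-chip codeword = n-bit Gray code + n-bit inverse Gray code.
    def gray(i):
        return i ^ (i >> 1)

    def inverse_gray(g):
        b = 0
        while g:            # prefix-XOR undoes the Gray transform
            b ^= g
            g >>= 1
        return b

    def codeword(i, n):
        bits = f"{gray(i):0{n}b}" + f"{inverse_gray(i):0{n}b}"
        return [1 if c == "1" else -1 for c in bits]   # map chips to +/-1

    n = 3
    codes = [codeword(i, n) for i in range(2 ** n)]
    # Cross-correlation of two users' codes at zero lag:
    xcorr = sum(a * b for a, b in zip(codes[2], codes[5]))
    print(codes[2], codes[5], xcorr)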
Xyce Parallel Electronic Simulator : users' guide, version 2.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont
2004-06-01
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving upon the current state-of-the-art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices; a client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI); and object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message-passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory, and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models, look-up tables, and mesh-level PDE device models. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important features of Xyce is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Ultimately, these capabilities are migrated to end users.
Liang, Tian; Wang, Ke; Lim, Christina; Wong, Elaine; Song, Tingting; Nirmalathas, Ampalavanapillai
2017-09-04
In this paper, we report a novel mechanism to simultaneously provide secure connections for multiple users in indoor optical wireless communication systems by employing a time-slot coding scheme together with a chaotic phase sequence. The chaotic phase sequence is generated according to the logistic map and applied to each symbol to secure the transmission. Proof-of-concept experiments are carried out for multiple system capacities based on both 4-QAM and 16-QAM modulation formats, i.e., 1.25 Gb/s, 2 Gb/s, and 2.5 Gb/s for 4-QAM, and 2.5 Gb/s, 3.33 Gb/s, and 4 Gb/s for 16-QAM. Experimental results show that in all cases the added chaotic phase does not degrade the legitimate user's signal quality, while the illegitimate user cannot detect the signal without the key.
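The scrambling step is simple to illustrate. A conceptual sketch in Python, with illustrative logistic-map parameters (r, x0) standing in for the shared key; removing the phase recovers the 4-QAM constellation exactly, mirroring the reported result that the legitimate user's signal quality is unaffected:

```python
import numpy as np

def chaotic_phases(n, r=3.99, x0=0.3141):
    """Phase sequence from the logistic map x <- r*x*(1-x); (r, x0) act as the key."""
    x, phases = x0, np.empty(n)
    for k in range(n):
        x = r * x * (1 - x)
        phases[k] = 2 * np.pi * x
    return phases

rng = np.random.default_rng(0)
bits = rng.integers(0, 4, 1000)
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))   # 4-QAM (QPSK) symbols

key = chaotic_phases(len(symbols))
tx = symbols * np.exp(1j * key)          # phase-scrambled transmission
rx_legit = tx * np.exp(-1j * key)        # legitimate user removes the key
rx_eaves = tx                            # eavesdropper sees a scrambled constellation

print(np.allclose(rx_legit, symbols))    # True: signal quality is preserved
```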
MAGIC Computer Simulation. Volume 1: User Manual
1970-07-01
...vulnerability and MAGIC programs. A three-digit code is assigned to each component of the target, such as armor or gun tube, and a two-digit code is assigned to... A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army...
Modular Engine Noise Component Prediction System (MCP) Program Users' Guide
NASA Technical Reports Server (NTRS)
Golub, Robert A. (Technical Monitor); Herkes, William H.; Reed, David H.
2004-01-01
This is a user's manual for the Modular Engine Noise Component Prediction System (MCP). This computer code allows the user to generate turbofan engine noise estimates. The program is based on an empirical procedure that has evolved over many years at The Boeing Company. The data used to develop the procedure include both full-scale engine data and small-scale model data, and include testing done by Boeing, by the engine manufacturers, and by NASA. In order to generate a noise estimate, the user specifies the appropriate engine properties (including both geometry and performance parameters), the microphone locations, the atmospheric conditions, and certain data processing options. The version of the program described here allows the user to predict three components: inlet-radiated fan noise, aft-radiated fan noise, and jet noise. MCP predicts one-third octave band noise levels over the frequency range of 50 to 10,000 Hertz. It also calculates overall sound pressure levels and certain subjective noise metrics (e.g., perceived noise levels).
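For reference, the stated 50 to 10,000 Hz range corresponds to the 24 standard one-third octave bands; a short sketch of the base-10 centre-frequency convention (not taken from the MCP manual):

```python
# Nominal one-third octave band centre frequencies covering 50 Hz to 10 kHz,
# the range over which MCP reports levels (base-10 convention).
centres = [1000 * 10 ** (0.1 * n) for n in range(-13, 11)]
print([round(f) for f in centres])   # 50, 63, 79, ..., 7943, 10000
```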
Monte Carlo simulation of electron beams from an accelerator head using PENELOPE.
Sempau, J; Sánchez-Reyes, A; Salvat, F; ben Tahar, H O; Jiang, S B; Fernández-Varea, J M
2001-04-01
The Monte Carlo code PENELOPE has been used to simulate electron beams from a Siemens Mevatron KDS linac with nominal energies of 6, 12 and 18 MeV. Owing to its accuracy, which stems from that of the underlying physical interaction models, PENELOPE is suitable for simulating problems of interest to the medical physics community. It includes a geometry package that allows the definition of complex quadric geometries, such as those of irradiation instruments, in a straightforward manner. Dose distributions in water simulated with PENELOPE agree well with experimental measurements using a silicon detector and a monitoring ionization chamber. Insertion of a lead slab in the incident beam at the surface of the water phantom produces sharp variations in the dose distributions, which are correctly reproduced by the simulation code. Results from PENELOPE are also compared with those of equivalent simulations with the EGS4-based user codes BEAM and DOSXYZ. Angular and energy distributions of electrons and photons in the phase-space plane (at the downstream end of the applicator) obtained from both simulation codes are similar, although significant differences do appear in some cases. These differences, however, are shown to have a negligible effect on the calculated dose distributions. Various practical aspects of the simulations, such as the calculation of statistical uncertainties and the effect of the 'latent' variance in the phase-space file, are discussed in detail.
Survey of computer programs for heat transfer analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with limited user communities, such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA, and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form, followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program, and is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.
FlexibleSUSY-A spectrum generator generator for supersymmetric models
NASA Astrophysics Data System (ADS)
Athron, Peter; Park, Jae-hyeon; Stöckinger, Dominik; Voigt, Alexander
2015-05-01
We introduce FlexibleSUSY, a Mathematica and C++ package, which generates a fast, precise C++ spectrum generator for any SUSY model specified by the user. The generated code is designed with both speed and modularity in mind, making it easy to adapt and extend with new features. The model is specified by supplying the superpotential, gauge structure and particle content in a SARAH model file; specific boundary conditions e.g. at the GUT, weak or intermediate scales are defined in a separate FlexibleSUSY model file. From these model files, FlexibleSUSY generates C++ code for self-energies, tadpole corrections, renormalization group equations (RGEs) and electroweak symmetry breaking (EWSB) conditions and combines them with numerical routines for solving the RGEs and EWSB conditions simultaneously. The resulting spectrum generator is then able to solve for the spectrum of the model, including loop-corrected pole masses, consistent with user specified boundary conditions. The modular structure of the generated code allows for individual components to be replaced with an alternative if available. FlexibleSUSY has been carefully designed to grow as alternative solvers and calculators are added. Predefined models include the MSSM, NMSSM, E6SSM, USSM, R-symmetric models and models with right-handed neutrinos.
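The solver loop that the generated spectrum generator implements can be caricatured in a few lines. A toy sketch of the fixed-point iteration between boundary conditions at two scales; the beta functions, scales, and conditions here are illustrative stand-ins, not FlexibleSUSY's generated code:

```python
import numpy as np

def run(g, t0, t1, beta, n=400):
    """Euler-integrate the RGE dg/dt = beta(g) from log-scale t0 to t1."""
    dt = (t1 - t0) / n
    for _ in range(n):
        g = g + dt * beta(g)
    return g

# Toy coupled one-loop RGEs for two couplings (stand-ins for a real model).
beta = lambda g: np.array([g[0] ** 3, g[0] ** 2 * g[1]]) / (16 * np.pi ** 2)

t_weak, t_gut = np.log(91.19), np.log(2.0e16)   # log of scales in GeV
g = np.array([0.46, 0.80])                      # initial weak-scale guess

for _ in range(10):                             # fixed-point iteration of the solver
    g = run(g, t_weak, t_gut, beta)             # run up to the GUT scale
    g[1] = g[0]                                 # impose a toy GUT-scale boundary condition
    g = run(g, t_gut, t_weak, beta)             # run back down to the weak scale
    g[0] = 0.46                                 # re-impose the weak-scale input

print(g)                                        # self-consistent weak-scale couplings
```

In the real generator the state vector holds all model parameters, the RGEs are the model's own, and the weak-scale step also solves the EWSB conditions.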
Laser Signature Prediction Using The VALUE Computer Program
NASA Astrophysics Data System (ADS)
Akerman, Alexander; Hoffman, George A.; Patton, Ronald
1989-09-01
A variety of enhancements are being made to the 1976-vintage LASERX computer code. These include: surface characterization with BRDF tabular data; specular reflection from transparent surfaces; generation of glint direction maps; generation of relative range imagery; an interface to the LOWTRAN atmospheric transmission code; an interface to the LEOPS laser sensor code; and user-friendly menu prompting for easy setup. Versions of VALUE have been written for both VAX/VMS and PC/DOS computer environments. Outputs have also been revised to be user friendly and include tables, plots, and images for (1) intensity, (2) cross section, (3) reflectance, (4) relative range, (5) region type, and (6) silhouette.
A User's Guide to the Zwikker-Kosten Transmission Line Code (ZKTL)
NASA Technical Reports Server (NTRS)
Kelly, J. J.; Abu-Khajeel, H.
1997-01-01
This user's guide documents updates to the Zwikker-Kosten Transmission Line Code (ZKTL). This code was developed for analyzing new liner concepts developed to provide increased sound absorption. Contiguous arrays of multi-degree-of-freedom (MDOF) liner elements serve as the model for these liner configurations, and Zwikker and Kosten's theory of sound propagation in channels is used to predict the surface impedance. Transmission matrices for the various liner elements incorporate both analytical and semi-empirical methods. This allows standard matrix techniques to be employed in the code to systematically calculate the composite impedance due to the individual liner elements. The ZKTL code consists of four independent subroutines: 1. Single channel impedance calculation - linear version (SCIC) 2. Single channel impedance calculation - nonlinear version (SCICNL) 3. Multi-channel, multi-segment, multi-layer impedance calculation - linear version (MCMSML) 4. Multi-channel, multi-segment, multi-layer impedance calculation - nonlinear version (MCMSMLNL) Detailed examples, comments, and explanations for each liner impedance computation module are included. Also contained in the guide are depictions of the interactive execution, input files and output files.
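The matrix bookkeeping at the heart of the code is compact. A minimal sketch of cascading 2x2 transmission matrices over a rigid backing to obtain a surface impedance, using a lossless-channel element as a stand-in (Zwikker-Kosten propagation replaces the wavenumber and characteristic impedance with complex, frequency-dependent values):

```python
import numpy as np

def channel_matrix(k, L, rho_c):
    """2x2 transmission matrix relating (pressure, velocity) at the two faces
    of a channel of length L with wavenumber k and characteristic impedance rho_c."""
    return np.array([[np.cos(k * L), 1j * rho_c * np.sin(k * L)],
                     [1j * np.sin(k * L) / rho_c, np.cos(k * L)]])

def surface_impedance(matrices):
    """Cascade elements (surface first) over a rigid backing (back velocity = 0)."""
    T = np.eye(2)
    for M in matrices:
        T = T @ M
    return T[0, 0] / T[1, 0]    # Z = p_surface / v_surface with v_back = 0

f, c, rho = 1000.0, 343.0, 1.21
k, rho_c = 2 * np.pi * f / c, rho * c
Z = surface_impedance([channel_matrix(k, 0.02, rho_c),    # two stacked liner layers
                       channel_matrix(k, 0.04, rho_c)])
print(Z / rho_c)                # normalized composite surface impedance
```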
1988-03-24
1430-1445 BREAK. 1445-1645 EM CODE USERS PANEL DISCUSSION, chaired by J. Breakall of LLNL: user community suggestions on needed enhancements for EM codes... "FINITE DIFFERENCE & FINITE ELEMENT METHODS", Moderator: David E. Stein, The LTV Aerospace and Defense Company. "A Finite Element Analysis of..." ...conduction (resulting from charge movement) or displacement (∂(ε₀E)/∂t) terms. The sum of these current densities is referred to as the Maxwell current.
Zirn, Stefan; Arndt, Susan; Aschendorff, Antje; Laszig, Roland; Wesarg, Thomas
2016-09-22
The ability to detect a target signal masked by noise is improved in normal-hearing listeners when interaural phase differences (IPDs) between the ear signals exist either in the masker or in the signal. To improve binaural hearing in bilaterally implanted cochlear implant (BiCI) users, a coding strategy providing the best possible access to IPD is highly desirable. In this study, we compared two coding strategies in BiCI users provided with CI systems from MED-EL (Innsbruck, Austria). The CI systems were bilaterally programmed either with the fine structure processing strategy FS4 or with the constant rate strategy high definition continuous interleaved sampling (HDCIS). Familiarization periods between 6 and 12 weeks were considered. The effect of IPD was measured in two types of experiments: (a) IPD detection thresholds with tonal signals addressing mainly one apical interaural electrode pair and (b) with speech in noise in terms of binaural speech intelligibility level differences (BILD) addressing multiple electrodes bilaterally. The results in (a) showed improved IPD detection thresholds with FS4 compared with HDCIS in four out of the seven BiCI users. In contrast, 12 BiCI users in (b) showed similar BILD with FS4 (0.6 ± 1.9 dB) and HDCIS (0.5 ± 2.0 dB). However, no correlation between results in (a) and (b) both obtained with FS4 was found. In conclusion, the degree of IPD sensitivity determined on an apical interaural electrode pair was not an indicator for BILD based on bilateral multielectrode stimulation.
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review
Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-01-01
Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883
'Sweeter Than a Swisher': amount and themes of little cigar and cigarillo content on Twitter.
Kostygina, Ganna; Tran, Hy; Shi, Yaru; Kim, Yoonsang; Emery, Sherry
2016-10-01
Despite recent increases in little cigar and cigarillo (LCC) use-particularly among urban youth, African-Americans and Latinos-research on targeted strategies for marketing these products is sparse. Little is known about the amount or content of LCC messages users see or share on social media, a popular communication medium among youth and communities of colour. Keyword rules were used to collect tweets related to LCCs from the Twitter Firehose posted in October 2014 and March-April 2015. Tweets were coded for promotional content, brand references, co-use with marijuana and subculture references (eg, rap/hip-hop, celebrity endorsements) and were classified as commercial and 'organic'/non-commercial using a combination of machine learning methods, keyword algorithms and human coding. Metadata associated with each tweet were used to categorise users as influencers (1000 and more followers) and regular users (under 1000 followers). Keyword filters captured over 4 372 293 LCC tweets. Analyses revealed that 17% of account users posting about LCCs were influencers and 1% of accounts were overtly commercial. Influencers were more likely to mention LCC brands and post promotional messages. Approximately 83% of LCC tweets contained references to marijuana and 29% of tweets were memes. Tweets also contained references to rap/hip-hop lyrics and urban subculture. Twitter is a major information-sharing and marketing platform for LCCs. Co-use of tobacco and marijuana is common and normalised on Twitter. The presence and broad reach of LCC messages on social media warrants urgent need for surveillance and serious attention from public health professionals and policymakers. Future tobacco use prevention initiatives should be adapted to ensure that they are inclusive of LCC use.
Fostering Team Awareness in Earth System Modeling Communities
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Lawson, A.; Strong, S.
2009-12-01
Existing global climate models are typically managed and controlled at a single site, with varied levels of participation by scientists outside the core lab. As these models evolve to encompass a wider set of earth systems, this central control of the modeling effort becomes a bottleneck. But such models cannot evolve to become fully distributed open source projects unless they address the imbalance in the availability of communication channels: scientists at the core site have access to regular face-to-face communication with one another, while those at remote sites have access to only a subset of these conversations, e.g. formally scheduled teleconferences and user meetings. Because of this imbalance, critical decision making can be hidden from many participants, their code contributions can interact in unanticipated ways, and the community loses awareness of who knows what. We have documented some of these problems in a field study at one climate modeling centre, and have started to develop tools to overcome them. We report on one such tool, TracSNAP, which analyzes the social network of the scientists contributing code to the model by extracting the data in an existing project code repository. The tool presents the results of this analysis to modelers and model users in a number of ways: recommendations of who has expertise on particular code modules, suggestions for code sections that are related to files being worked on, and visualizations of team communication patterns. The tool is currently available as a plugin for the Trac bug tracking system.
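The repository-mining idea is easy to sketch. A toy example of the expertise-recommendation analysis described above, with a made-up commit log standing in for a real project repository:

```python
from collections import defaultdict

# Hypothetical commit log: (author, files touched). TracSNAP would extract
# this from the project's actual code repository.
commits = [
    ("alice", ["ocean/mixing.f90", "ocean/tracers.f90"]),
    ("bob",   ["atmos/radiation.f90"]),
    ("alice", ["ocean/mixing.f90"]),
    ("carol", ["ocean/tracers.f90", "atmos/radiation.f90"]),
]

expertise = defaultdict(lambda: defaultdict(int))   # file -> author -> touch count
for author, files in commits:
    for f in files:
        expertise[f][author] += 1

def who_knows(path):
    """Recommend contributors for a module, ranked by how often they touched it."""
    return sorted(expertise[path].items(), key=lambda kv: -kv[1])

print(who_knows("ocean/mixing.f90"))   # [('alice', 2)]
```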
15 CFR 740.7 - Computers (APP).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... programmability. (ii) Technology and source code. Technology and source code eligible for License Exception APP..., reexports and transfers (in-country) for nuclear, chemical, biological, or missile end-users and end-uses...
1980-08-01
[Figure 14. Current profile.]
CSciBox: A Software System for Age-Model Construction and Evaluation
NASA Astrophysics Data System (ADS)
Bradley, E.; Anderson, K. A.; Marchitto, T. M., Jr.; de Vesine, L. R.; White, J. W. C.; Anderson, D. M.
2014-12-01
CSciBox is an integrated software system for the construction and evaluation of age models of paleo-environmental archives, both directly dated and cross-dated. The time has come to encourage cross-pollination between earth science and computer science in dating paleorecords, and this project addresses that need. The CSciBox code, which is being developed by a team of computer scientists and geoscientists, is open source and freely available on GitHub. The system employs modern database technology to store paleoclimate proxy data and analysis results in an easily accessible and searchable form. This makes it possible to do analysis on the whole core at once, in an interactive fashion, or to tailor the analysis to a subset of the core without loading the entire data file. CSciBox provides a number of 'components' that perform the common steps in age-model construction and evaluation: calibrations, reservoir-age correction, interpolations, statistics, and so on. The user employs these components via a graphical user interface (GUI) to go from raw data to finished age model in a single tool: e.g., an IntCal09 calibration of 14C data from a marine sediment core, followed by a piecewise-linear interpolation. CSciBox's GUI supports plotting of any measurement in the core against any other measurement, or against any of the variables in the calculation of the age model, with or without explicit error representations. Using the GUI, CSciBox's user can import a new calibration curve or other background data set and define a new module that employs that information. Users can also incorporate other software (e.g., Calib, BACON) as plug-ins. In the case of truly large data or significant computational effort, CSciBox is parallelizable across modern multicore processors, or clusters, or even the cloud. The next generation of the CSciBox code, currently in the testing stages, includes an automated reasoning engine that supports a more thorough exploration of plausible age models and cross-dating scenarios.
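As an illustration of the simplest of these components, a sketch of a piecewise-linear age model through dated control points (the depths and ages below are invented):

```python
import numpy as np

# Dated horizons from, e.g., calibrated 14C control points (illustrative values).
depth_cm = np.array([10.0, 55.0, 120.0, 300.0])
age_yr   = np.array([450.0, 2100.0, 5300.0, 11200.0])

def age_at(depth):
    """Piecewise-linear interpolation of age at any depth between control points."""
    return np.interp(depth, depth_cm, age_yr)

print(age_at(80.0))   # age estimate for an undated sample at 80 cm
```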
Mummah, Sarah A; King, Abby C; Gardner, Christopher D; Sutton, Stephen
2016-08-08
Mobile technology may serve as a cost-effective and scalable tool for delivering behavioral nutrition interventions. This research sought to iteratively develop a theory-driven mobile app, Vegethon, to increase vegetable consumption. Development of Vegethon followed the phases outlined by the IDEAS framework: (1) empathize with users (qualitative interviews, n = 18); (2) specify target behavior; (3) ground in behavioral theory; (4) ideate implementation strategies; (5) prototype potential products; (6) gather user feedback (qualitative interviews, n = 14; questionnaire, n = 41); (7) build minimum viable product; and (8) pilot potential efficacy and usability (pilot RCT, n = 17). Findings from each phase informed subsequent phases. The target population that informed intervention development was 18-50 years of age, had BMIs of 28-40 kg/m², and lived in the geographical area surrounding Stanford University. A full description of the final version of Vegethon is included in the paper. Qualitative findings that shaped initial intervention conception were participants' interest in accountability without judgment, their desire for simple and efficient dietary self-monitoring, and the importance of planning meals in advance. Qualitative findings identified during intervention refinement were the need for a focus on vegetable self-monitoring; inclusion of vegetable challenges; simplification of features; advice and inspiration for eating vegetables; reminder notifications; and peer comparison. Pilot RCT findings suggested the initial efficacy, acceptance, and feasibility of the intervention. The final version of Vegethon enabled easy self-monitoring of vegetable consumption and included a range of features designed to engage the user (e.g., surprise challenges; leaderboard; weekly reports). Vegethon was coded for its inclusion of 18 behavior change techniques (BCTs) (e.g., goal setting; feedback; social comparison; prompts/cues; framing/reframing; identity). Vegethon is a theory-based, user-informed mobile intervention that was systematically developed using the IDEAS framework. Vegethon targets increased vegetable consumption among overweight adults and is currently being evaluated in a randomized controlled efficacy trial. ClinicalTrials.gov: NCT01826591.
Ground Operations Aerospace Language (GOAL). Volume 4: Interpretive code translator
NASA Technical Reports Server (NTRS)
1973-01-01
This specification identifies and describes the principal functions and elements of the Interpretive Code Translator which has been developed for use with the GOAL Compiler. This translator enables the user to convert a compiled GOAL program to a highly general binary format which is designed to enable interpretive execution. The translator program provides user controls which are designed to enable the selection of various output types and formats. These controls provide a means for accommodating many of the implementation options which are discussed in the Interpretive Code Guideline document. The technical design approach is given. The relationship between the translator and the GOAL compiler is explained, and the principal functions performed by the Translator are described. Specific constraints regarding the use of the Translator are discussed. The control options are described. These options enable the user to select outputs to be generated by the translator and to control various aspects of the translation processing.
Peters, Betts; Bieker, Gregory; Heckman, Susan M; Huggins, Jane E; Wolf, Catherine; Zeitlin, Debra; Fried-Oken, Melanie
2015-03-01
More than 300 researchers gathered at the 2013 International Brain-Computer Interface (BCI) Meeting to discuss current practice and future goals for BCI research and development. The authors organized the Virtual Users' Forum at the meeting to provide the BCI community with feedback from users. We report on the Virtual Users' Forum, including initial results from ongoing research being conducted by 2 BCI groups. Online surveys and in-person interviews were used to solicit feedback from people with disabilities who are expert and novice BCI users. For the Virtual Users' Forum, their responses were organized into 4 major themes: current (non-BCI) communication methods, experiences with BCI research, challenges of current BCIs, and future BCI developments. Two authors with severe disabilities gave presentations during the Virtual Users' Forum, and their comments are integrated with the other results. While participants' hopes for BCIs of the future remain high, their comments about available systems mirror those made by consumers about conventional assistive technology. They reflect concerns about reliability (eg, typing accuracy/speed), utility (eg, applications and the desire for real-time interactions), ease of use (eg, portability and system setup), and support (eg, technical support and caregiver training). People with disabilities, as target users of BCI systems, can provide valuable feedback and input on the development of BCI as an assistive technology. To this end, participatory action research should be considered as a valuable methodology for future BCI research.
Efficient burst image compression using H.265/HEVC
NASA Astrophysics Data System (ADS)
Roodaki-Lavasani, Hoda; Lainema, Jani
2014-02-01
New imaging use cases are emerging as more powerful camera hardware enters consumer markets. One family of such use cases is based on capturing multiple pictures instead of just one when taking a photograph. That kind of camera operation allows, for example, selecting the most successful shot from a sequence of images, showing what happened right before or after the shot was taken, or combining the shots by computational means to improve either visible characteristics of the picture (such as dynamic range or focus) or the artistic aspects of the photo (e.g., by superimposing pictures on top of each other). Considering that photographic images are typically of high resolution and quality and the fact that these kinds of image bursts can consist of at least tens of individual pictures, an efficient compression algorithm is desired. However, traditional video coding approaches fail to provide the random access properties these use cases require to achieve near-instantaneous access to the pictures in the coded sequence. That feature is critical to allow users to browse the pictures in an arbitrary order, or imaging algorithms to extract desired pictures from the sequence quickly. This paper proposes coding structures that provide such random access properties while achieving coding efficiency superior to existing image coders. The results indicate that using the HEVC video codec with a single reference picture fixed for the whole sequence can achieve nearly as good compression as traditional IPPP coding structures. It is also shown that the selection of the reference frame can further improve the coding efficiency.
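The random-access argument can be made concrete with a toy count of decode work. A sketch contrasting how many frames must be decoded to reach frame k under chained IPPP referencing versus a single reference picture fixed for the whole sequence:

```python
def frames_to_decode_ippp(k):
    """IPPP: every frame references its predecessor, so reaching frame k
    requires decoding the whole chain 0..k."""
    return list(range(k + 1))

def frames_to_decode_fixed_ref(k):
    """Single fixed reference: every frame references only frame 0."""
    return [0, k] if k else [0]

print(len(frames_to_decode_ippp(19)))       # 20 decodes to reach frame 19
print(len(frames_to_decode_fixed_ref(19)))  # 2 decodes to reach frame 19
```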
Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components
NASA Technical Reports Server (NTRS)
Shivakumar, Kunigal; Challa, Preeli; Sree, Dave; Reddy, D.
1999-01-01
This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, the CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal, and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles; to set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public domain codes for all three types of analysis; (2) evaluate the codes for accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two, and six are being pursued. NPARC was selected and evaluated for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow-field analyses of various inlet geometries were conducted. Integration of the codes is continuing. The codes developed are being applied to a candidate example of the trailblazer engine proposed for space transportation. Successful development of the code will provide a simpler, faster, and user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable to high-speed civil transport and space missions.
1983-09-01
...to geometries not large in terms of wavelength, and the lack of analytical results which can provide physical insight into the problem. The first... [remainder is OCR residue of a FORTRAN subroutine listing] A User's Manual for Electromagnetic Surface Patch (ESP) Code, Version II, Ohio State Univ. Columbus, ElectroScience Lab.
Price-related promotions for tobacco products on Twitter.
Jo, Catherine L; Kornfield, Rachel; Kim, Yoonsang; Emery, Sherry; Ribisl, Kurt M
2016-07-01
This cross-sectional study examined price-related promotions for tobacco products on Twitter. Through the Twitter Firehose, we obtained access to all public tweets posted between 6 December 2012 and 20 June 2013 that contained a keyword suggesting a tobacco-related product or behaviour (eg, cigarette, vaping) in addition to a keyword suggesting a price promotion (eg, coupon, discount). From this data set of 155 249 tweets, we constructed a stratified sampling frame based on the price-related keywords and randomly sampled 5000 tweets (3.2%). Tweets were coded for product type and promotion type. Non-English tweets and tweets unrelated to a tobacco or cessation price promotion were excluded, leaving an analytic sample of 2847 tweets. The majority of tweets (97.0%) mentioned tobacco products while 3% mentioned tobacco cessation products. E-cigarettes were the most frequently mentioned product (90.1%), followed by cigarettes (5.4%). The most common type of price promotion mentioned across all products was a discount. About a third of all e-cigarette-related tweets included a discount code. Banned or restricted price promotions comprised about 3% of cigarette-related tweets. This study demonstrates that the vast majority of tweets offering price promotions focus on e-cigarettes. Future studies should examine the extent to which Twitter users, particularly youth, notice or engage with these price promotion tweets.
RELAP-7 Code Assessment Plan and Requirement Traceability Matrix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.
2016-10-01
RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods, and physical models over the last decades. Recently, INL has also been putting effort into establishing a code assessment plan, which aims to ensure improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international and domestic reports and research articles) addressing the desirable features generally required of advanced nuclear system safety analysis codes. In addition, the verification and validation (V&V) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yidong Xia; Mitch Plummer; Robert Podgorney
2016-02-01
Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that (1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, (2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and (3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite-element-based, fully implicit, fully coupled hydrothermal code, FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
12 CFR Appendix A to Subpart B of... - Commentary
Code of Federal Regulations, 2014 CFR
2014-01-01
...,” as defined in section 4A-501(b) of Article 4A, Funds Transfers, of the Uniform Commercial Code (UCC... refers to other provisions of the Uniform Commercial Code, e.g., definitions in article 1 of the UCC, these other provisions of the UCC, as approved by the National Conference of Commissioners on Uniform...
12 CFR Appendix A to Subpart B of... - Commentary
Code of Federal Regulations, 2010 CFR
2010-01-01
...,” as defined in section 4A-501(b) of Article 4A, Funds Transfers, of the Uniform Commercial Code (UCC... refers to other provisions of the Uniform Commercial Code, e.g., definitions in Article 1 of the UCC, these other provisions of the UCC, as approved by the National Conference of Commissioners on Uniform...
12 CFR Appendix A to Subpart B of... - Commentary
Code of Federal Regulations, 2011 CFR
2011-01-01
...,” as defined in section 4A-501(b) of Article 4A, Funds Transfers, of the Uniform Commercial Code (UCC... refers to other provisions of the Uniform Commercial Code, e.g., definitions in Article 1 of the UCC, these other provisions of the UCC, as approved by the National Conference of Commissioners on Uniform...
12 CFR Appendix A to Subpart B of... - Commentary
Code of Federal Regulations, 2013 CFR
2013-01-01
...,” as defined in section 4A-501(b) of Article 4A, Funds Transfers, of the Uniform Commercial Code (UCC... refers to other provisions of the Uniform Commercial Code, e.g., definitions in article 1 of the UCC, these other provisions of the UCC, as approved by the National Conference of Commissioners on Uniform...
12 CFR Appendix A to Subpart B of... - Commentary
Code of Federal Regulations, 2012 CFR
2012-01-01
...,” as defined in section 4A-501(b) of Article 4A, Funds Transfers, of the Uniform Commercial Code (UCC... refers to other provisions of the Uniform Commercial Code, e.g., definitions in Article 1 of the UCC, these other provisions of the UCC, as approved by the National Conference of Commissioners on Uniform...
NASA Astrophysics Data System (ADS)
Wetzstein, M.; Nelson, Andrew F.; Naab, T.; Burkert, A.
2009-10-01
We present a numerical code for simulating the evolution of astrophysical systems using particles to represent the underlying fluid flow. The code is written in Fortran 95 and is designed to be versatile, flexible, and extensible, with modular options that can be selected either at the time the code is compiled or at run time through a text input file. We include a number of general purpose modules describing a variety of physical processes commonly required in the astrophysical community and we expect that the effort required to integrate additional or alternate modules into the code will be small. In its simplest form the code can evolve the dynamical trajectories of a set of particles in two or three dimensions using a module which implements either a Leapfrog or Runge-Kutta-Fehlberg integrator, selected by the user at compile time. The user may choose to allow the integrator to evolve the system using individual time steps for each particle or with a single, global time step for all. Particles may interact gravitationally as N-body particles, and all or any subset may also interact hydrodynamically, using the smoothed particle hydrodynamic (SPH) method by selecting the SPH module. A third particle species can be included with a module to model massive point particles which may accrete nearby SPH or N-body particles. Such particles may be used to model, e.g., stars in a molecular cloud. Free boundary conditions are implemented by default, and a module may be selected to include periodic boundary conditions. We use a binary "Press" tree to organize particles for rapid access in gravity and SPH calculations. Modules implementing an interface with special purpose "GRAPE" hardware may also be selected to accelerate the gravity calculations. If available, forces obtained from the GRAPE coprocessors may be transparently substituted for those obtained from the tree, or both tree and GRAPE may be used as a combination GRAPE/tree code. The code may be run without modification on single processors or in parallel using OpenMP compiler directives on large-scale, shared memory parallel machines. We present simulations of several test problems, including a merger simulation of two elliptical galaxies with 800,000 particles. In comparison to the Gadget-2 code of Springel, the gravitational force calculation, which is the most costly part of any simulation including self-gravity, is ~4.6-4.9 times faster with VINE when tested on different snapshots of the elliptical galaxy merger simulation when run on an Itanium 2 processor in an SGI Altix. A full simulation of the same setup with eight processors is a factor of 2.91 faster with VINE. The code is available to the public under the terms of the Gnu General Public License.
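The simplest integrator option described above is compact enough to sketch. A kick-drift-kick Leapfrog step with direct-sum gravity, written here in Python rather than the code's Fortran 95; VINE itself would obtain forces from its tree or GRAPE hardware:

```python
import numpy as np

def accel(pos, mass, eps=1e-3):
    """Softened direct-sum gravitational acceleration on every particle (G = 1)."""
    d = pos[None, :, :] - pos[:, None, :]          # pairwise separation vectors
    r2 = (d ** 2).sum(-1) + eps ** 2
    np.fill_diagonal(r2, np.inf)                   # exclude self-interaction
    return (mass[None, :, None] * d / r2[..., None] ** 1.5).sum(1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick Leapfrog step."""
    vel = vel + 0.5 * dt * accel(pos, mass)        # half kick
    pos = pos + dt * vel                           # drift
    vel = vel + 0.5 * dt * accel(pos, mass)        # half kick
    return pos, vel

rng = np.random.default_rng(1)
pos, vel = rng.normal(size=(100, 3)), np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)
pos, vel = leapfrog_step(pos, vel, mass, 1e-3)
```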
BurnMan: Towards a multidisciplinary toolkit for reproducible deep Earth science
NASA Astrophysics Data System (ADS)
Myhill, R.; Cottaar, S.; Heister, T.; Rose, I.; Unterborn, C. T.; Dannberg, J.; Martin-Short, R.
2016-12-01
BurnMan (www.burnman.org) is an open-source toolbox to compute thermodynamic and thermoelastic properties as a function of pressure and temperature using published mineral physics parameters and equations of state. The framework is user-friendly, written in Python, and modular, allowing the user to implement their own equation-of-state, endmember, and solution model libraries, geotherms, and averaging schemes. Here we introduce various new modules, which can be used to: fit thermodynamic variables to data from high-pressure static and shock-wave experiments; calculate equilibrium assemblages given a bulk composition, pressure, and temperature; calculate chemical potentials and oxygen fugacities for given assemblages; compute 3D synthetic seismic models using output from geodynamic models and compare these results with global seismic tomographic models; and create input files for synthetic seismogram codes. Users can contribute scripts that reproduce the results from peer-reviewed articles and practical demonstrations (e.g. Cottaar et al., 2014).
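A hypothetical usage sketch in the spirit of BurnMan's examples; the mineral constructor and the evaluate() signature reflect one release of the package and should be checked against the current API:

```python
import numpy as np
from burnman import minerals

# An endmember from one of the bundled mineral physics libraries
# (module path is an assumption based on the SLB_2011 dataset name).
rock = minerals.SLB_2011.mg_perovskite()

pressures = np.linspace(25e9, 125e9, 11)           # Pa
temperatures = 1900.0 * np.ones_like(pressures)    # K, an illustrative geotherm

# Evaluate density and seismic velocities along the P-T path.
rho, vp, vs = rock.evaluate(['density', 'v_p', 'v_s'], pressures, temperatures)
print(rho[0], vp[0], vs[0])
```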
NASA Astrophysics Data System (ADS)
Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.
2018-05-01
A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an optical code-division multiple-access (OCDMA) based QKD network. A unique address assigned to each user, coupled with the degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, is an equally exceptional quality presented by the code, in contrast to the Optical Orthogonal Codes (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.
A flooding induced station blackout analysis for a pressurized water reactor using the RISMC toolkit
Mandelli, Diego; Prescott, Steven; Smith, Curtis; ...
2015-05-17
In this paper we evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., component/system activation) and to perform statistical analyses. In our case, the simulation of the flooding is performed by using an advanced smooth particle hydrodynamics code called NEUTRINO. The obtained results allow the user to investigate and quantify the impact of timing and sequencing of events on system safety. The impact of power uprate is determined in terms of both core damage probability and safety margins.
Replica Exchange Molecular Dynamics in the Age of Heterogeneous Architectures
NASA Astrophysics Data System (ADS)
Roitberg, Adrian
2014-03-01
The rise of GPU-based codes has allowed MD to reach timescales only dreamed of just 5 years ago. Even within this new paradigm there is still a need for advanced sampling techniques. Modern supercomputers (e.g., Blue Waters, Titan, Keeneland) have made available to users a significant number of GPUs and CPUs, which in turn translates into amazing opportunities for dream calculations. Replica-exchange-based methods can optimally use this combination of codes and architectures to explore conformational variability in large systems. I will show our recent work in porting the program Amber to GPUs, and the support for replica-exchange methods, where the replicated dimension can be temperature, pH, Hamiltonian, umbrella windows, or combinations of those schemes.
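The exchange step underlying these methods reduces to a Metropolis test between neighbouring replicas. A sketch of one temperature replica-exchange pass, with illustrative per-replica energies and an AMBER-style Boltzmann constant:

```python
import numpy as np

def try_swap(beta_i, beta_j, E_i, E_j, rng):
    """Accept a swap with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)])."""
    return rng.random() < np.exp(min(0.0, (beta_i - beta_j) * (E_i - E_j)))

kB = 0.0019872041                       # kcal/(mol K)
temps = [300.0, 310.0, 320.5, 331.3]    # illustrative temperature ladder
betas = [1.0 / (kB * T) for T in temps]
energies = [-1250.0, -1238.0, -1229.0, -1214.0]   # stand-in potential energies

rng = np.random.default_rng(7)
for i in range(len(temps) - 1):         # attempt neighbour swaps after an MD segment
    if try_swap(betas[i], betas[i + 1], energies[i], energies[i + 1], rng):
        energies[i], energies[i + 1] = energies[i + 1], energies[i]
print(energies)
```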
Comparing apples and oranges: the Community Intercomparison Suite
NASA Astrophysics Data System (ADS)
Schutgens, Nick; Stier, Philip; Pascoe, Stephen
2014-05-01
Visual representation and comparison of geoscientific datasets present a huge challenge due to the large variety of file formats and the spatio-temporal sampling of data (be they observations or simulations). The Community Intercomparison Suite (CIS) attempts to greatly simplify these tasks for users by offering an intelligent but simple command line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller, more manageable datasets. Our philosophy is to remove as much as possible the need for specialist knowledge by the user of the structure of a dataset. The colocation of observations with model data is as simple as: "cis col
NLTE4 Plasma Population Kinetics Database
National Institute of Standards and Technology Data Gateway
SRD 159 NLTE4 Plasma Population Kinetics Database (Web database for purchase) This database contains benchmark results for simulation of plasma population kinetics and emission spectra. The data were contributed by the participants of the 4th Non-LTE Code Comparison Workshop who have unrestricted access to the database. The only limitation for other users is in hidden labeling of the output results. Guest users can proceed to the database entry page without entering userid and password.
1983-05-01
empirical erosion model, with use of the debris-layer model optional. 1.1 INTERFACE WITH ISPP. ISPP is a collection of computer codes designed to calculate... expansion with the ODK code, 4. A two-dimensional, two-phase nozzle expansion with the TD2P code, 5. A turbulent boundary layer solution along the... [remainder is residue of an input flowchart: thermodynamic data below 300 K read if needed; SSP namelist covering ODE, BAL, ODK, TD2P, TBL, and nozzle geometry]
Li, Ang; Huang, Xiaoxiao; Hao, Bibo; O'Dea, Bridianne; Christensen, Helen; Zhu, Tingshao
2015-01-01
Introduction. Broadcasting a suicide attempt on social media has become a public health concern in many countries, particularly in China. In these cases, social media users are likely to be the first to witness the suicide attempt, and their attitudes may determine their likelihood of joining rescue efforts. This paper examines Chinese social media (Weibo) users' attitudes towards suicide attempts broadcast on Weibo. Methods. A total of 4,969 Weibo posts were selected from a customised Weibo User Pool which consisted of 1.06 million active users. The selected posts were then independently coded by two researchers using a coding framework that assessed: (a) Themes, (b) General attitudes, (c) Stigmatising attitudes, (d) Perceived motivations, and (e) Desired responses. Results and Discussion. More than one third of Weibo posts were coded as "stigmatising" (35%). Among these, 22%, 16%, and 15% of posts were coded as "deceitful," "pathetic," and "stupid," respectively. Among the posts which reflected different types of perceived motivations, 57% of posts were coded as "seeking attention." Among the posts which reflected desired responses, 37% were "not saving" and 28% were "encouraging suicide." Furthermore, among the posts with negative desired responses (i.e., "not saving" and "encouraging suicide"), 57% and 17% of them were related to different types of stigmatising attitudes and perceived motivations, respectively. Specifically, 29% and 26% of posts reflecting both stigmatising attitudes and negative desired responses were coded as "deceitful" and "pathetic," respectively, while 66% of posts reflecting both perceived motivations, and negative desired responses were coded as "seeking attention." Very few posts "promoted literacy" (2%) or "provided resources" (8%). Gender differences existed in multiple categories. Conclusions. This paper confirms the need for stigma reduction campaigns for Chinese social media users to improve their attitudes towards those who broadcast their suicide attempts on social media. Results of this study support the need for improved public health programs in China and may be insightful for other countries and other social media platforms.
A Cloud Based Framework For Monitoring And Predicting Subsurface System Behaviour
NASA Astrophysics Data System (ADS)
Versteeg, R. J.; Rodzianko, A.; Johnson, D. V.; Soltanian, M. R.; Dwivedi, D.; Dafflon, B.; Tran, A. P.; Versteeg, O. J.
2015-12-01
Subsurface system behavior is driven and controlled by the interplay of physical, chemical, and biological processes which occur at multiple temporal and spatial scales. Capabilities to monitor, understand, and predict this behavior in an effective and timely manner are needed both for scientific purposes and for effective subsurface system management. Such capabilities require three elements: models, data, and an enabling cyberinfrastructure which allows users to use these models and data in an effective manner. Under a DOE Office of Science funded STTR award, Subsurface Insights and LBNL have designed and implemented a cloud-based predictive assimilation framework (PAF) which automatically ingests, quality-controls, and stores heterogeneous physical and chemical subsurface data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of subsurface systems. PAF is implemented as a modular cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. On the server side, PAF uses ZF2 (a PHP web application framework) and Python, with both open-source (ODM2) and in-house developed data models. On the client side, PAF uses CSS and JS to allow for interactive data visualization and analysis. Client-side modularity (which allows for a responsive interface) is achieved by implementing each core capability of PAF (such as data visualization, user configuration and control, electrical geophysical monitoring, and email/SMS alerts on data streams) as a SPA (Single Page Application). One of the recent enhancements is the full integration of a number of flow, mass transport, and parameter estimation codes (e.g., MODFLOW, MT3DMS, PHT3D, TOUGH, PFLOTRAN) into this framework. This integration allows for autonomous and user-controlled modeling of hydrological and geochemical processes. In our presentation we will discuss our software architecture and present the results of using these codes and the overall performance of our framework, using hydrological, geochemical, and geophysical data from the LBNL SFA2 Rifle field site.
NASA Technical Reports Server (NTRS)
Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.
1973-01-01
This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN IV language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with an analysis of the AUTOCOM program.
TU-AB-BRC-08: Egs-brachy, a Fast and Versatile Monte Carlo Code for Brachytherapy Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamberland, M; Taylor, R; Rogers, D
2016-06-15
Purpose: To introduce egs-brachy, a new, fast, and versatile Monte Carlo code for brachytherapy applications. Methods: egs-brachy is an EGSnrc user-code based on the EGSnrc C++ class library (egs++). Complex phantom, applicator, and source model geometries are built using the egs++ geometry module. egs-brachy uses a tracklength estimator to score collision kerma in voxels. Interaction, spectrum, energy-fluence, and phase-space scoring are also implemented. Phase-space sources and particle recycling may be used to improve simulation efficiency. HDR treatments (e.g., stepping a source through dwell positions) can be simulated. Standard brachytherapy seeds, as well as electron and miniature x-ray tube sources, are fully modelled. Variance reduction techniques for electron source simulations are implemented (Bremsstrahlung cross-section enhancement, uniform Bremsstrahlung splitting, and Russian Roulette). TG-43 parameters of seeds are computed and compared to published values. Example simulations of various treatments are carried out on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core. Results: TG-43 parameters calculated with egs-brachy show excellent agreement with published values. Using a phase-space source, 2% average statistical uncertainty in the PTV ((2 mm)³ voxels) can be achieved in 10 s for 100 ¹²⁵I or ¹⁰³Pd seeds in a 36.2 cm³ prostate PTV, 31 s for 64 ¹⁰³Pd seeds in a 64 cm³ breast PTV, and 56 s for a miniature x-ray tube in a 27 cm³ breast PTV. Comparable uncertainty is reached in 12 s in a (1 mm)³ water voxel 5 mm away from a COMS 16 mm eye plaque with 13 ¹⁰³Pd seeds. Conclusion: The accuracy of egs-brachy has been demonstrated through benchmarking calculations. Calculation times are sufficiently fast to allow full MC simulations for routine treatment planning for diverse brachytherapy treatments (LDR, HDR, miniature x-ray tube). egs-brachy will be available as free and open-source software to the medical physics research community. This work is partially funded by the Canada Research Chairs program, the Natural Sciences and Engineering Research Council of Canada, and the Ontario Ministry of Research and Innovation (Ontario Early Researcher Award).
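The tracklength estimator named above has a simple core: each photon segment of length L through a voxel contributes weight x L x E x (mu_en/rho)(E) per unit voxel mass to collision kerma. A sketch with an approximate placeholder attenuation table (not EGSnrc data):

```python
import numpy as np

# Illustrative mu_en/rho values for water at low photon energies (cm^2/g, approx.).
E_grid = np.array([0.02, 0.03, 0.05, 0.10])      # MeV
mu_en_rho = np.array([0.54, 0.15, 0.042, 0.025])

def score(kerma, voxel, weight, L, E, voxel_mass):
    """Accumulate collision kerma (MeV/g) for one photon track segment."""
    kerma[voxel] += weight * L * E * np.interp(E, E_grid, mu_en_rho) / voxel_mass

kerma = np.zeros(10)                              # 1-D voxel array for brevity
score(kerma, voxel=4, weight=1.0, L=0.13, E=0.028, voxel_mass=0.001)
print(kerma[4])
```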
Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0
NASA Technical Reports Server (NTRS)
Knox, J. C.
1996-01-01
The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.
JBioWH: an open-source Java framework for bioinformatics data integration
Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor
2013-01-01
The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database schema and includes Java API parser functions for retrieving data from 20 public databases (e.g., NCBI, KEGG). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU-intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595
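As a hedged illustration of querying such a warehouse from Python (using the mysql-connector-python package), the sketch below connects to the demonstration server mentioned above; the credentials, database name, and table/column names are hypothetical placeholders, and the real schema is given in the JBioWH documentation.

import mysql.connector

conn = mysql.connector.connect(
    host="hydrax.icgeb.trieste.it", port=3307,        # demo server from the abstract
    user="demo", password="demo", database="jbiowh")  # hypothetical credentials
cur = conn.cursor()
# Hypothetical table and columns; substitute names from the actual JBioWH schema.
cur.execute("SELECT id, name FROM protein WHERE name LIKE %s LIMIT 10",
            ("%kinase%",))
for row in cur.fetchall():
    print(row)
conn.close()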
Kim, Annice E; Hopper, Timothy; Simpson, Sean; Nonnemaker, James; Lieberman, Alicea J; Hansen, Heather; Guillory, Jamie; Porter, Lauren
2015-01-01
Background: Marketing and use of electronic cigarettes (e-cigarettes) and other electronic nicotine delivery devices have increased exponentially in recent years, fueled in part by marketing and word-of-mouth communications via social media platforms such as Twitter. Objective: This study examines Twitter posts about e-cigarettes between 2008 and 2013 to gain insights into (1) marketing trends for selling and promoting e-cigarettes and (2) locations where people use e-cigarettes. Methods: We used keywords to gather tweets about e-cigarettes between July 1, 2008 and February 28, 2013. A randomly selected subset of tweets was manually coded as advertising (e.g., marketing, advertising, sales, promotion) or nonadvertising (e.g., individual users, consumers), and classification algorithms were trained to code the remaining data into these 2 categories. A combination of manual coding and natural language processing methods was used to indicate locations where people used e-cigarettes. Additional metadata were used to generate insights about users who tweeted most frequently about e-cigarettes. Results: We identified approximately 1.7 million tweets about e-cigarettes between 2008 and 2013, with the majority of these tweets being advertising (93.43%, 1,559,508/1,669,123). Tweets about e-cigarettes increased more than tenfold between 2009 and 2010, suggesting a rapid increase in the popularity of e-cigarettes and marketing efforts. The Twitter handles tweeting most frequently about e-cigarettes were a mixture of e-cigarette brands, affiliate marketers, and resellers of e-cigarette products. Of the 471 e-cigarette tweets mentioning a specific place, most mentioned e-cigarette use in class (39.1%, 184/471) followed by home/room/bed (12.5%, 59/471), school (12.1%, 57/471), in public (8.7%, 41/471), the bathroom (5.7%, 27/471), and at work (4.5%, 21/471). Conclusions: Twitter is being used to promote e-cigarettes by different types of entities, and the online marketplace is more diverse than offline product offerings and advertising strategies. E-cigarettes are also being used in public places, such as schools, underscoring the need for education and enforcement of policies banning e-cigarette use in public places. Twitter data can provide new insights on e-cigarettes to help inform future research, regulations, surveillance, and enforcement efforts. PMID:26545927
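The train-on-hand-coded-tweets step described above can be sketched with a small scikit-learn pipeline; the example tweets, labels, and choice of TF-IDF features plus logistic regression are illustrative assumptions, not the study's actual classifier.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Manually coded training subset (illustrative examples only).
hand_coded = ["50% off e-cig starter kits, free shipping!",
              "trying my friend's e-cig for the first time"]
labels = ["advertising", "nonadvertising"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(hand_coded, labels)                          # train on the hand-coded subset
print(clf.predict(["buy two e-cigs get one free"]))  # code the remaining tweets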
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
Information necessary to use the LOVES computer program in its existing state, or to modify the program to include studies not properly handled by the basic model, is provided. A user's guide, a programmer's manual, and several supporting appendices are included.
Climate tools in mainstream Linux distributions
NASA Astrophysics Data System (ADS)
McKinstry, Alastair
2015-04-01
Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs turns security weaknesses that are normally ignorable into real exposures. How tools are hardened, and what changes are required to handle security upgrades, are described. Second, integrating libraries and components (e.g., Python modules) requires planning by their authors: it is not sufficient to assume users can upgrade their code when incompatible changes are made. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran, and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g., serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
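A sketch of the pkg-config mechanism mentioned above, in Python: a build helper probes for whichever library variant is installed and falls back gracefully. The module names hdf5-openmpi and hdf5-serial are illustrative assumptions; the actual .pc names depend on the distribution's packaging.

import subprocess

def pkg_flags(module):
    # Return compile/link flags for a pkg-config module, or None if absent.
    probe = subprocess.run(["pkg-config", "--exists", module])
    if probe.returncode != 0:
        return None
    out = subprocess.run(["pkg-config", "--cflags", "--libs", module],
                         capture_output=True, text=True, check=True)
    return out.stdout.split()

flags = pkg_flags("hdf5-openmpi") or pkg_flags("hdf5-serial")
print(flags)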
User's Guide for TOUGH2-MP - A Massively Parallel Version of the TOUGH2 Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Earth Sciences Division; Zhang, Keni
TOUGH2-MP is a massively parallel (MP) version of the TOUGH2 code, designed for computationally efficient parallel simulation of isothermal and nonisothermal flows of multicomponent, multiphase fluids in one-, two-, and three-dimensional porous and fractured media. In recent years, computational requirements have become increasingly intensive in large or highly nonlinear problems for applications in areas such as radioactive waste disposal, CO2 geological sequestration, environmental assessment and remediation, reservoir engineering, and groundwater hydrology. The primary objective of developing the parallel-simulation capability is to significantly improve the computational performance of the TOUGH2 family of codes. The particular goal for the parallel simulator is to achieve orders-of-magnitude improvement in computational time for models with ever-increasing complexity. TOUGH2-MP is designed to perform parallel simulation on multi-CPU computational platforms. An earlier version of TOUGH2-MP (V1.0) was based on TOUGH2 Version 1.4 with the EOS3, EOS9, and T2R3D modules, software previously qualified for applications in the Yucca Mountain project, and was designed for execution on CRAY T3E and IBM SP supercomputers. The current version of TOUGH2-MP (V2.0) includes all fluid property modules of the standard version TOUGH2 V2.0. It provides computationally efficient capabilities using supercomputers, Linux clusters, or multi-core PCs, and also offers many user-friendly features. The parallel simulator inherits all process capabilities from V2.0 together with additional capabilities for handling fractured media from V1.4. This report provides a quick-start guide on how to set up and run the TOUGH2-MP program for users with a basic knowledge of running the (standard) version of the TOUGH2 code. The report also gives a brief technical description of the code, including a discussion of parallel methodology, code structure, and the mathematical and numerical methods used. To familiarize users with the parallel code, illustrative sample problems are presented.
Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems
2011-01-01
...reliability, e.g., Turbo Codes [2] and Low Density Parity Check (LDPC) codes [3]. The challenge to apply both MIMO and ECC into wireless systems is on... ...illustrates the performance of coded LR-aided detectors.
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (HP9000 SERIES 300/400 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. 
Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is expected to be available on media suitable for seven different machine platforms: 1) DEC VAX computers running VMS (TK50 cartridge in VAX BACKUP format), 2) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), 3) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), 4) HP9000 Series 300/400 computers running HP-UX (.25 inch HP-preformatted tape cartridge in UNIX tar format), 5) HP9000 Series 700 computers running HP-UX (HP 4mm DDS DAT tape cartridge in UNIX tar format), 6) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and 7) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2.
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. 
Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is expected to be available on media suitable for seven different machine platforms: 1) DEC VAX computers running VMS (TK50 cartridge in VAX BACKUP format), 2) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), 3) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), 4) HP9000 Series 300/400 computers running HP-UX (.25 inch HP-preformatted tape cartridge in UNIX tar format), 5) HP9000 Series 700 computers running HP-UX (HP 4mm DDS DAT tape cartridge in UNIX tar format), 6) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and 7) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2.
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.
Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-03-31
As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of studies that categorically coded tobacco-related Twitter data and to make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to make it easier to compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 31.03.2017.
LAS - LAND ANALYSIS SYSTEM, VERSION 5.0
NASA Technical Reports Server (NTRS)
Pease, P. B.
1994-01-01
The Land Analysis System (LAS) is an image analysis system designed to manipulate and analyze digital data in raster format and provide the user with a wide spectrum of functions and statistical tools for analysis. LAS offers these features under VMS with optional image display capabilities for IVAS and other display devices as well as the X-Windows environment. LAS provides a flexible framework for algorithm development as well as for the processing and analysis of image data. Users may choose between mouse-driven commands or the traditional command line input mode. LAS functions include supervised and unsupervised image classification, film product generation, geometric registration, image repair, radiometric correction, and image statistical analysis. Data files accepted by LAS include formats such as Multi-Spectral Scanner (MSS), Thematic Mapper (TM), and Advanced Very High Resolution Radiometer (AVHRR). The enhanced geometric registration package now includes both image-to-image and map-to-map transformations. The over 200 LAS functions fall into image processing scenario categories which include: arithmetic and logical functions, data transformations, Fourier transforms, geometric registration, hard copy output, image restoration, intensity transformation, multispectral and statistical analysis, file transfer, tape profiling, and file management, among others. Internal improvements to the LAS code have eliminated the VAX VMS dependencies and improved overall system performance. The maximum LAS image size has been increased to 20,000 lines by 20,000 samples with a maximum of 256 bands per image. The catalog management system used in earlier versions of LAS has been replaced by a more streamlined and maintenance-free method of file management. This system is not dependent on VAX/VMS and relies on file naming conventions alone to allow the use of identical LAS file names on different operating systems. While the LAS code has been improved, the original capabilities of the system have been preserved. These include maintaining associated image history, session logging, and batch, asynchronous, and interactive modes of operation. The LAS application programs are integrated under version 4.1 of an interface called the Transportable Applications Executive (TAE). TAE 4.1 has four modes of user interaction: menu, direct command, tutor (or help), and dynamic tutor. In addition, TAE 4.1 allows the operation of LAS functions using mouse-driven commands under the TAE-Facelift environment provided with TAE 4.1. These modes of operation allow users, from the beginner to the expert, to exercise specific application options. LAS is written in C and FORTRAN 77 for use with DEC VAX computers running VMS with approximately 16Mb of physical memory. This program runs under TAE 4.1. Since TAE 4.1 is not a current version of TAE, TAE 4.1 is included within the LAS distribution. Approximately 130,000 blocks (65Mb) of disk storage space are necessary to store the source code and files generated by the installation procedure for LAS, and 44,000 blocks (22Mb) of disk storage space are necessary for TAE 4.1 installation. The only other dependencies for LAS are the subroutine libraries for the specific display device(s) that will be used with LAS/DMS (e.g., X-Windows and/or IVAS). The standard distribution medium for LAS is a set of two 9-track 6250 BPI magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format.
This program was developed in 1986 and last updated in 1992.
DIM SUM: demography and individual migration simulated using a Markov chain.
Brown, Jeremy M; Savidge, Kevin; McTavish, Emily Jane B
2011-03-01
An increasing number of studies seek to infer demographic history, often jointly with genetic relationships. Despite numerous analytical methods for such data, few simulations have investigated the methods' power and robustness, especially when underlying assumptions have been violated. DIM SUM (Demography and Individual Migration Simulated Using a Markov chain) is a stand-alone Java program for the simulation of population demography and individual migration while recording ancestor-descendant relationships. It does not employ coalescent assumptions or discrete population boundaries. It is extremely flexible, allowing the user to specify border positions, reactions of organisms to borders, local and global carrying capacities, individual dispersal kernels, rates of reproduction, and strategies for sampling individuals. Spatial variables may be specified using image files (e.g., as exported from GIS software) and may vary through time. In combination with software for genetic marker simulation, DIM SUM will be useful for testing phylogeographic (e.g., nested clade phylogeographic analysis, coalescent-based tests, and continuous-landscape frameworks) and landscape-genetic methods, specifically regarding violations of coalescent assumptions. It can also be used to explore the qualitative features of proposed demographic scenarios (e.g., regarding biological invasions) and as a pedagogical tool. DIM SUM (with user's manual) can be downloaded from http://code.google.com/p/bio-dimsum. © 2010 Blackwell Publishing Ltd.
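For intuition, a minimal Python sketch of the kind of per-generation dispersal step such a simulator iterates is given below; the one-dimensional landscape, Gaussian kernel, and clamping reaction to borders are illustrative assumptions, not DIM SUM's implementation.

import random

def step(positions, sigma=1.0, xmin=0.0, xmax=100.0):
    # Move each individual by a Gaussian dispersal kernel; clamping at the
    # borders is one possible "reaction of organisms to borders".
    moved = []
    for x in positions:
        x += random.gauss(0.0, sigma)
        moved.append(min(max(x, xmin), xmax))
    return moved

population = [random.uniform(0.0, 100.0) for _ in range(50)]
for generation in range(10):
    population = step(population)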
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SUN4 VERSION WITH MOTIF)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SUN4 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
Impact of packet losses in scalable 3D holoscopic video coding
NASA Astrophysics Data System (ADS)
Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.
2014-05-01
Holoscopic imaging has become a prospective glasses-free 3D technology that provides more natural 3D viewing experiences to the end user. Additionally, holoscopic systems allow new post-production degrees of freedom, such as controlling the plane of focus or the viewing angle presented to the user. However, to successfully introduce this technology into the consumer market, a display-scalable coding approach is essential to achieve backward compatibility with legacy 2D and 3D displays. Moreover, to effectively transmit 3D holoscopic content over error-prone networks, e.g., wireless networks or the Internet, error resilience techniques are required to mitigate the impact of data impairments on the user's perceived quality. Therefore, it is essential to deeply understand the impact of packet losses on decoded video quality for the specific case of 3D holoscopic content, notably when a scalable approach is used. In this context, this paper studies the impact of packet losses when using a previously proposed three-layer display-scalable 3D holoscopic video coding architecture, where each layer represents a different level of display scalability (i.e., L0 - 2D, L1 - stereo or multiview, and L2 - full 3D holoscopic). For this, a simple error concealment algorithm is used, which exploits the inter-layer redundancy between multiview and 3D holoscopic content and the inherent correlation of the 3D holoscopic content to estimate lost data. Furthermore, a study of the influence of the 2D view generation parameters used in lower layers on the performance of the error concealment algorithm is also presented.
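The inter-layer concealment idea can be sketched in Python with numpy: a lost enhancement-layer (L2) block is estimated from the co-located lower-layer (L1) reconstruction by nearest-neighbour upsampling. The block size and upsampling rule are assumptions for illustration, not the paper's exact algorithm.

import numpy as np

def conceal_block(l2_frame, l1_frame, y, x, size=16):
    # Fill a lost size x size block of the L2 frame from the co-located
    # region of the (smaller) L1 frame, nearest-neighbour style.
    scale_y = l1_frame.shape[0] / l2_frame.shape[0]
    scale_x = l1_frame.shape[1] / l2_frame.shape[1]
    for dy in range(size):
        for dx in range(size):
            l2_frame[y + dy, x + dx] = l1_frame[int((y + dy) * scale_y),
                                                int((x + dx) * scale_x)]
    return l2_frame

l2 = np.zeros((128, 128), dtype=np.uint8)    # enhancement layer with a lost block
l1 = np.full((64, 64), 200, dtype=np.uint8)  # decoded lower-layer frame
conceal_block(l2, l1, 32, 48)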
Dispel4py: An Open-Source Python library for Data-Intensive Seismology
NASA Astrophysics Data System (ADS)
Filgueira, Rosa; Krause, Amrey; Spinuso, Alessandro; Klampanos, Iraklis; Danecek, Peter; Atkinson, Malcolm
2015-04-01
Scientific workflows are a necessary tool for many scientific communities as they enable easy composition and execution of applications on computing resources while scientists can focus on their research without being distracted by computation management. Nowadays, scientific communities (e.g., seismology) have access to a large variety of computing resources, and their computational problems are best addressed using parallel computing technology. However, successful use of these technologies requires a lot of additional machinery whose use is not straightforward for non-experts: different parallel frameworks (MPI, Storm, multiprocessing, etc.) must be used depending on the computing resources (local machines, grids, clouds, clusters) where applications are run. This implies that to achieve the best application performance, users usually have to change their codes depending on the features of the platform selected for running them. This work presents dispel4py, a new open-source Python library for describing abstract stream-based workflows for distributed data-intensive applications. Special care has been taken to provide dispel4py with the ability to map abstract workflows to different platforms dynamically at run time. Currently dispel4py has four mappings: Apache Storm, MPI, multi-threading, and sequential. The main goal of dispel4py is to provide an easy-to-use tool to develop and test workflows on local resources by using the sequential mode with a small dataset. Later, once a workflow is ready for long runs, it can be automatically executed on different parallel resources. dispel4py takes care of the underlying mappings by performing an efficient parallelisation. Processing Elements (PEs) represent the basic computational activities of any dispel4py workflow, which can be a seismological algorithm or a data transformation process. To create a dispel4py workflow, users only have to write a few lines of code to describe their PEs and how they are connected, using Python, which is widely supported on many platforms and is popular in many scientific domains, such as the geosciences. Once a dispel4py workflow is written, a user only has to select which mapping they would like to use, and everything else (parallelisation, distribution of data) is carried out by dispel4py without any cost to the user. Among all dispel4py features we would like to highlight the following: * The PEs are connected by streams, and not by writing to and reading from intermediate files, avoiding many IO operations. * The PEs can be stored in a registry, so different users can recombine PEs in many different workflows. * dispel4py has been enriched with a provenance mechanism to support runtime provenance analysis. We have adopted the W3C-PROV data model, which is accessible via a prototype browser-based user interface and a web API. It supports users with the visualisation of graphical products and offers combined operations to access and download the data, which may be selectively stored at runtime into dedicated data archives. dispel4py has already been used by seismologists in the VERCE project to develop different seismic workflows. One of them is the Seismic Ambient Noise Cross-Correlation workflow, which preprocesses and cross-correlates traces from several stations. First, this workflow was tested on a local machine using a small number of stations as input data. Later, it was executed on different parallel platforms (the SuperMUC cluster and the Terracorrelator machine), automatically scaling up by using the MPI and multiprocessing mappings with up to 1000 stations as input data. The results show that dispel4py achieves scalable performance in both mappings tested on the different parallel platforms.
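The following short example shows what a two-PE dispel4py workflow can look like (a producer streaming numbers into a consumer). The class and method names (GenericPE, WorkflowGraph, connect, write) follow the public dispel4py tutorials as we understand them; exact signatures should be checked against the project documentation.

from dispel4py.core import GenericPE
from dispel4py.workflow_graph import WorkflowGraph

class Producer(GenericPE):
    def __init__(self):
        GenericPE.__init__(self)
        self._add_output('output')
    def _process(self, inputs):
        for i in range(5):
            self.write('output', i)        # emit items onto the stream

class Doubler(GenericPE):
    def __init__(self):
        GenericPE.__init__(self)
        self._add_input('input')
        self._add_output('output')
    def _process(self, inputs):
        self.write('output', inputs['input'] * 2)

graph = WorkflowGraph()
producer, doubler = Producer(), Doubler()
graph.connect(producer, 'output', doubler, 'input')
# Run sequentially first (e.g., "dispel4py simple <module>"), then switch the
# mapping to MPI or Storm without changing the workflow code.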
Ant-Based Cyber Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glenn Fink, PNNL
2015-09-29
ABCD is a four-level hierarchy with human supervisors at the top, a top-level agent called a Sergeant controlling each enclave, Sentinel agents located at each monitored host, and mobile Sensor agents that swarm through the enclaves to detect cyber malice and misconfigurations. The code comprises four parts: (1) the core agent framework, (2) the user interface and visualization, (3) test-range software to create a network of virtual machines including a simulated Internet and user and host activity emulation scripts, and (4) a test harness to allow the safe running of adversarial code within the framework of monitored virtual machines.
Hazard Assessment Computer System HACS/UIM Users’ Operation Manual. Volume I.
1981-09-01
...to assist in obtaining the compound recognition code used to reference data for a particular chemical, a separate set of indices has been produced... and are given in a separate report. These indices enable a user of HACS to obtain a compound recognition code for a chemical given either the compound...
Program MAMO: Models for avian management optimization-user guide
Guillaumet, Alban; Paxton, Eben H.
2017-01-01
The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).
User's Manual for the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Cheatwood, F. McNeil
1996-01-01
This user's manual provides detailed instructions for the installation and application of version 4.1 of the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA), which simulates flow fields in thermochemical nonequilibrium around vehicles traveling at hypersonic velocities through the atmosphere. Earlier versions of LAURA were predominantly research codes, and they had minimal (or no) documentation. This manual describes UNIX-based utilities for customizing the code for special applications that also minimize system resource requirements. The algorithm is reviewed, and the various program options are related to specific equations and variables in the theoretical development.
Sweeney, Angela; Greenwood, Kathryn E; Williams, Sally; Wykes, Til; Rose, Diana S
2013-12-01
Health research is frequently conducted in multi-disciplinary teams, and these teams increasingly include service user researchers. Whilst it is common for service user researchers to be involved in data collection--most typically interviewing other service users--it is less common for them to be involved in data analysis and interpretation. This means that a unique and significant perspective on the data is absent. This study uses an empirical report of a study on Cognitive Behavioural Therapy for psychosis (CBTp) to demonstrate the value of multiple coding in enabling service users' voices to be heard in team-based qualitative data analysis. The CBTp study employed multiple coding to analyse service users' discussions of CBTp from the perspectives of a service user researcher, a clinical researcher, and a psychology assistant. Multiple coding was selected to enable multiple perspectives to analyse and interpret the data, to understand and explore differences, and to build multi-disciplinary consensus. Multiple coding enabled the team to understand where our views were commensurate and incommensurate and to discuss and debate differences. Through the process of multiple coding, we were able to build strong consensus about the data from multiple perspectives, including that of the service user researcher. Multiple coding is an important method for understanding and exploring multiple perspectives on data and for building team consensus. This can be contrasted with inter-rater reliability, which is only appropriate in limited circumstances. We conclude that multiple coding is an appropriate and important means of hearing service users' voices in qualitative data analysis. © 2012 John Wiley & Sons Ltd.
Ecological DYnamics Simulation Model - Light (EDYS-L): User’s Guide Version 4.6.4
2011-08-01
...dead), utilization potential, and competitive success for each specified species (e.g., insects, rodents, native ungulates, livestock, predators)... available disturbances. The default native herbivores are insects, rabbits, and deer. While multiple species occur within each category, and... native herbivores (insects, rabbits, and deer) is simulated as a uniform consumption rate across the entire landscape. The user has the choice of...
TOPAS Tool for Particle Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perl, Joseph
2013-05-30
TOPAS lets users simulate the passage of subatomic particles moving through any kind of radiation therapy treatment system, can import a patient geometry, can record dose and other quantities, has advanced graphics, and is fully four-dimensional (3D plus time) to handle the most challenging time-dependent aspects of modern cancer treatments. TOPAS unlocks the power of the most accurate particle transport simulation technique, the Monte Carlo (MC) method, while removing the painstaking coding work such methods used to require. Research physicists can use TOPAS to improve delivery systems towards safer and more effective radiation therapy treatments, easily setting up and running complex simulations that previously took months of preparation. Clinical physicists can use TOPAS to increase accuracy while reducing side effects, simulating patient-specific treatment plans at the touch of a button. TOPAS is designed as a user code layered on top of the Geant4 Simulation Toolkit. TOPAS includes the standard Geant4 toolkit, plus additional code to make Geant4 easier to control and to extend Geant4 functionality. TOPAS aims to make proton simulation both reliable and repeatable. Reliable means both accurate physics and a high likelihood of simulating precisely what the user intended to simulate, reducing issues of wrong units, wrong materials, wrong scoring locations, etc. Repeatable means not just getting the same result from one simulation to another, but being able to easily restore a previously used setup and reducing sources of error when a setup is passed from one user to another. The TOPAS control system incorporates key lessons from safety management, proactively removing possible sources of user error, such as line-ordering mistakes in control files. TOPAS has been used to model proton therapy treatment examples including the UCSF eye treatment head, the MGH stereotactic alignment in radiosurgery treatment head, and the MGH gantry treatment heads in passive scattering and scanning modes, and has demonstrated dose calculation based on patient-specific CT data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Grady, K; Davis, S D; Papaconstadopoulos, P
2014-08-15
A PTW microLion liquid ionization chamber and an Exradin A1SL air-filled ionization chamber have been modeled using the egs-chamber user code of the EGSnrc system to determine their perturbation effects in water in a 5 × 5 cm² 18 MV photon beam. A model of the Varian CL21EX linear accelerator was constructed using the BEAMnrc Monte Carlo code and validated by comparing measured PDDs and profiles from the microLion and A1SL chambers to calculated results that included the chamber models. Measured PDDs for a 5 × 5 cm² field for the microLion chamber agreed with calculations to within 1.5% beyond a depth of 0.5 cm, and the A1SL PDDs agreed within 1.0% beyond 1.0 cm. Measured and calculated profiles at 10 cm depth agreed within 1.0% for both chambers inside the field, and within 4.0% near the field edge. Local percent differences increased up to 15% at 4 cm outside the field. The ratio of dose to water in the absence of the chamber relative to dose in the chamber's active volume as a function of off-axis distance was calculated using the egs-chamber correlated sampling technique. The dose ratio was nearly constant inside the field and consistent with the stopping power ratios of water to detector material, but varied up to 3.3% near the field edge and 5.2% at 4 cm outside the field. Once these perturbation effects are fully characterized for more field sizes and detectors, they could be applied to clinical water tank measurements for improved dosimetric accuracy.
1991-03-01
[Scanned report; only table-of-contents fragments are recoverable: 3.3.2 Manual Frequency List Measurement; 3.3.3 Manual 200-kHz Spectrum Measurement; 4.2.1 Frequency List Measurements; 4.2.2 Calibration Measurements; Manual Frequency List Measurements; 4.3 Disk Files; 4.3.1 Program Disk.]
On-line content creation for photo products: understanding what the user wants
NASA Astrophysics Data System (ADS)
Fageth, Reiner
2015-03-01
This paper describes how videos can be implemented into printed photo books and greeting cards. We will show that, surprisingly or not, pictures from videos are used much like classical images to tell compelling stories. Videos can be taken with nearly every camera: digital point-and-shoot cameras, DSLRs, smartphones, and, increasingly, so-called action cameras mounted on sports devices. The implementation of videos, with QR codes and relevant pictures generated out of the video stream by a software implementation, was the content of last year's paper. This year we present first data about what content is displayed and how users represent their videos in printed products, e.g., CEWE PHOTOBOOKS and greeting cards. We report the share of the different video formats used.
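A hedged Python sketch of that pipeline (extract a representative still from the clip, then generate a QR code linking the printed page back to the video) is shown below, assuming the opencv-python and qrcode packages; the file names and URL are hypothetical.

import cv2
import qrcode

video = cv2.VideoCapture("holiday_clip.mp4")       # hypothetical input file
frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT))
video.set(cv2.CAP_PROP_POS_FRAMES, frames // 2)    # jump to the middle frame
ok, frame = video.read()
if ok:
    cv2.imwrite("photobook_still.jpg", frame)      # still image placed in the book
video.release()
qrcode.make("https://example.com/videos/holiday_clip").save("photobook_qr.png")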
Risk perceptions of smokeless tobacco among adolescents and adult users and nonusers
Liu, Sherry T.; Nemeth, Julianna M.; Klein, Elizabeth G.; Ferketich, Amy K.; Kwan, Mei-Po; Wewers, Mary Ellen
2015-01-01
The recent growth in smokeless tobacco (ST) consumption has raised questions about consumer risk perceptions of ST products, especially in high-risk vulnerable populations. This qualitative study examined risk perceptions of ST among adolescent and adult users and non-users in Ohio Appalachia. Focus groups and interviews were held with adolescents (n=53; mean age of 17 years) and adults (n=63; mean age of 34 years) from four Ohio Appalachian counties. Participants were asked about their perceptions of ST-related health risks, ST safety, and the relative safety of ST compared to cigarettes. Transcriptions were coded independently by two individuals. Overall, participants were knowledgeable about health problems from ST use (e.g., oral cancers, periodontal disease). Nearly all participants stated that ST use is not safe; however, there was disagreement about its relative safety. Some perceived all tobacco products as equally harmful; others believed that ST is safer than cigarettes for either the user or those around the user. Disagreements about ST relative safety may reflect mixed public health messages concerning the safety of ST. Comprehensive consumer messages about the relative safety of ST compared to cigarettes are needed. Messages should address the effect of ST on the health of the user as well as those exposed to the user. PMID:25832126
3D geospatial visualizations: Animation and motion effects on spatial objects
NASA Astrophysics Data System (ADS)
Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos
2018-02-01
Digital Elevation Models (DEMs), in combination with high quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an amazing navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) also makes it possible to create 3D spatial objects and apply any type of texture to them by utilizing open 3D representation models (e.g. Collada). Going one step further, WebGL frameworks (e.g. Cesium.js, three.js) can attach animation and motion effects to 3D models. However, major GIS-based functionalities combined with the above visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. To this end, we developed, and made available to the research community, an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.
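For illustration, a small Python sketch that emits the kind of interoperable markup involved: a minimal KML Placemark draping a Collada model at a georeferenced location (file names and coordinates are hypothetical):

    # Write a minimal KML file that places a Collada (.dae) 3D model on the
    # terrain; earth browsers that support KML can then render and animate it.
    KML = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>animated animal</name>
        <Model>
          <Location>
            <longitude>23.72</longitude><latitude>40.63</latitude><altitude>0</altitude>
          </Location>
          <Link><href>models/animal.dae</href></Link>
        </Model>
      </Placemark>
    </kml>"""

    with open("animal_on_hill.kml", "w") as f:
        f.write(KML)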
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epiney, A.; Canepa, S.; Zerkak, O.
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of a specific analysis aspect, including e.g. code version, transient-specific simulation methodology and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady-state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: first, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
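For illustration, a minimal Python sketch of such a multidimensional QoI-tracking grid (labels are hypothetical and the simulation call is a trivial stub):

    import itertools

    # QoI values indexed by the validation dimensions, so the drift of each
    # predicted quantity can be quantified dimension by dimension.
    code_versions = ["v5.0p3", "v5.0p5"]          # hypothetical TRACE labels
    nodalisations = ["coarse", "refined"]
    methodologies = ["imposed-power", "coupled-core"]

    def run_case(version, nodal, method):
        # Stub standing in for a full TRACE run; returns the tracked QoIs.
        return {"peak_pressure_MPa": 7.1 if version == "v5.0p3" else 7.3}

    qoi = {p: run_case(*p) for p in
           itertools.product(code_versions, nodalisations, methodologies)}

    base = qoi[("v5.0p3", "coarse", "imposed-power")]["peak_pressure_MPa"]
    new  = qoi[("v5.0p5", "coarse", "imposed-power")]["peak_pressure_MPa"]
    print(f"code-version sensitivity: {100 * (new - base) / base:+.1f} %")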
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.
1982-01-01
A user's manual describing the operation of three computer codes (the ADD, PTRAK, and VAPDIF codes) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.
NASA Astrophysics Data System (ADS)
Taber, J.; Bahavar, M.; Bravo, T. K.; Butler, R. F.; Kilb, D. L.; Trabant, C.; Woodward, R.; Ammon, C. J.
2011-12-01
Data from dense seismic arrays can be used to visualize the propagation of seismic waves, resulting in animations effective for teaching both general and advanced audiences. One of the first visualizations of this type was developed using Objective-C code and EarthScope/USArray data, which was then modified and ported to the Matlab platform and has now been standardized and automated as an IRIS Data Management System (IRIS-DMS) data product. These iterative code developments and improvements were completed by C. Ammon, R. Woodward and M. Bahavar, respectively. Currently, an automated script creates Ground Motion Visualizations (GMVs) for all global earthquakes over magnitude 6 recorded by EarthScope's USArray Transportable Array (USArray TA) network. The USArray TA network is a rolling array of 400 broadband stations deployed on a uniform 70-km grid. These near-real-time GMVs are typically available for download within 4 hours or less of an earthquake's occurrence (see: www.iris.edu/dms/products/usarraygmv/). The IRIS-DMS group has recently added a feature that allows users to highlight key elements within the GMVs, by providing an online tool for creating customized GMVs. This new interface allows users to select the stations, channels, and time window of interest, adjust the mapped areal extent of the view, and specify high- and low-pass filters. An online tutorial available from the IRIS Education and Public Outreach (IRIS-EPO) website steps through a teaching sequence that can be used to explain the basic features of the GMVs. For example, they can be used to demonstrate simple concepts such as relative P, S and surface wave velocities and corresponding wavelengths for middle-school students, or more advanced concepts such as the influence of focal mechanism on waveforms, or how seismic waves converge at an earthquake's antipode. For those who desire a greater level of customization, including the ability to use the GMV framework with data sets not stored within the IRIS-DMS, the Matlab GMV code is now also available from the IRIS-DMS website. These GMV codes have been applied to SAC-formatted data from the Quake Catcher Network (QCN). Through a collaboration between NSF-funded programs and projects (e.g., IRIS and QCN) we are striving to make these codes user-friendly enough to be routinely incorporated in undergraduate and graduate seismology classes. In this way, we will help provide a research tool for students to explore never-looked-at-before data, similar to actual seismology research. As technology advances quickly, we now have more data than seismologists can easily examine. Given this, we anticipate that students using our codes can perform a 'citizen scientist' role, helping us identify key signals within the vast unexamined data streams we are acquiring.
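For illustration, a compact Python sketch of the core GMV idea using synthetic data (all arrays are stand-ins for real station coordinates and waveforms):

    import numpy as np
    import matplotlib.pyplot as plt

    # One animation frame: color every station by its normalized ground motion
    # so that wavefronts sweep visibly across the mapped array.
    rng = np.random.default_rng(0)
    n_sta, n_t = 400, 600
    lon = rng.uniform(-125, -95, n_sta)            # synthetic station grid
    lat = rng.uniform(30, 50, n_sta)
    traces = rng.normal(size=(n_sta, n_t))         # stand-in waveforms

    norm = np.abs(traces).max(axis=1, keepdims=True)
    frames = traces / norm                         # per-station values in [-1, 1]

    snapshot = 300                                 # one instant in time
    plt.scatter(lon, lat, c=frames[:, snapshot], cmap="seismic", vmin=-1, vmax=1)
    plt.colorbar(label="normalized ground motion")
    plt.savefig("gmv_frame.png")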
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (DEC VAX ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. Data-driven graphical objects such as dials, thermometers, and strip charts are also included. TAE Plus updates the strip chart as the data values change. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. The Silicon Graphics version of TAE Plus now has a font caching scheme and a color caching scheme to make color allocation more efficient. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides an extremely powerful means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif Toolkit 1.1 or 1.1.1. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. 
Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus comes with InterViews and idraw, two software packages developed by Stanford University and integrated in TAE Plus. TAE Plus was developed in 1989 and version 5.1 was released in 1991. TAE Plus is currently available on media suitable for eight different machine platforms: 1) DEC VAX computers running VMS 5.3 or higher (TK50 cartridge in VAX BACKUP format), 2) DEC VAXstations running ULTRIX 4.1 or later (TK50 cartridge in UNIX tar format), 3) DEC RISC workstations running ULTRIX 4.1 or later (TK50 cartridge in UNIX tar format), 4) HP9000 Series 300/400 computers running HP-UX 8.0 (.25 inch HP-preformatted tape cartridge in UNIX tar format), 5) HP9000 Series 700 computers running HP-UX 8.05 (HP 4mm DDS DAT tape cartridge in UNIX tar format), 6) Sun3 series computers running SunOS 4.1.1 (.25 inch tape cartridge in UNIX tar format), 7) Sun4 (SPARC) series computers running SunOS 4.1.1 (.25 inch tape cartridge in UNIX tar format), and 8) SGI Indigo computers running IRIX 4.0.1 and IRIX/Motif 1.0.1 (.25 inch IRIS tape cartridge in UNIX tar format). An optional Motif Object Code License is available for either Sun version. TAE is a trademark of the National Aeronautics and Space Administration. X Window System is a trademark of the Massachusetts Institute of Technology. Motif is a trademark of the Open Software Foundation. DEC, VAX, VMS, TK50 and ULTRIX are trademarks of Digital Equipment Corporation. HP9000 and HP-UX are trademarks of Hewlett-Packard Co. Sun3, Sun4, SunOS, and SPARC are trademarks of Sun Microsystems, Inc. SGI and IRIS are registered trademarks of Silicon Graphics, Inc.
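For illustration, a toy Python sketch of the resource-file pattern described above (JSON stands in here; TAE's actual resource-file format is not reproduced): the interface description is data loaded at run time, so look-and-feel changes require no recompiling or relinking.

    import json

    # Hypothetical UI resource: window attributes and interaction objects are
    # described as data, not hard-coded in the application source.
    UI_RESOURCE = json.loads("""{
      "window": {"title": "Telemetry", "color": "gray80"},
      "items":  [{"type": "button",     "label": "Start"},
                 {"type": "stripchart", "label": "Voltage"}]
    }""")

    def build_interface(resource):
        # A real application would hand each description to its widget toolkit
        # (TAE Plus does this through WPT calls); here we only enumerate them.
        for item in resource["items"]:
            print(f"create {item['type']} labelled {item['label']!r}")

    build_interface(UI_RESOURCE)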
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (SUN3 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
Program description identical to the TAE+ 5.1 (DEC VAX ULTRIX VERSION) entry above.
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (SUN3 VERSION WITH MOTIF)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
Program description identical to the TAE+ 5.1 (DEC VAX ULTRIX VERSION) entry above.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (HP9000 SERIES 700/800 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (IBM RS/6000 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
Program description identical to the TAE+ 5.2 (HP9000 SERIES 700/800 VERSION) entry above.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
Program description identical to the TAE+ 5.2 (HP9000 SERIES 700/800 VERSION) entry above.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (DEC RISC ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
Program description identical to the TAE+ 5.2 (HP9000 SERIES 700/800 VERSION) entry above.
QuIN: A Web Server for Querying and Visualizing Chromatin Interaction Networks.
Thibodeau, Asa; Márquez, Eladio J; Luo, Oscar; Ruan, Yijun; Menghi, Francesca; Shin, Dong-Guk; Stitzel, Michael L; Vera-Licona, Paola; Ucar, Duygu
2016-06-01
Recent studies of the human genome have indicated that regulatory elements (e.g. promoters and enhancers) at distal genomic locations can interact with each other via chromatin folding and affect gene expression levels. Genomic technologies for mapping interactions between DNA regions, e.g., ChIA-PET and Hi-C, can generate genome-wide maps of interactions between regulatory elements. These interaction datasets are important resources for inferring distal gene targets of non-coding regulatory elements and for facilitating prioritization of critical loci for important cellular functions. With the increasing diversity and complexity of genomic information and public ontologies, making sense of these datasets demands integrative and easy-to-use software tools. Moreover, network representation of chromatin interaction maps enables effective data visualization, integration, and mining. Currently, there is no software that can take full advantage of network theory approaches for the analysis of chromatin interaction datasets. To fill this gap, we developed a web-based application, QuIN, which enables: 1) building and visualizing chromatin interaction networks, 2) annotating networks with user-provided private and publicly available functional genomics and interaction datasets, 3) querying network components based on gene name or chromosome location, and 4) utilizing network-based measures to identify and prioritize critical regulatory targets and their direct and indirect interactions. QuIN's web server is available at http://quin.jax.org. QuIN is developed in Java and JavaScript, utilizing an Apache Tomcat web server and a MySQL database; the source code is available under the GPLv3 license on GitHub: https://github.com/UcarLab/QuIN/.
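For illustration, a minimal Python sketch of the network representation such a tool builds on, using the networkx library (anchors and the gene annotation are toy values; this is not QuIN's own code):

    import networkx as nx

    # Nodes are interaction anchors, edges are chromatin contacts (e.g. from
    # ChIA-PET); network measures then rank candidate regulatory targets.
    edges = [("chr1:1000-2000",   "chr1:50000-51000"),
             ("chr1:50000-51000", "chr1:90000-91000"),
             ("chr1:1000-2000",   "chr1:90000-91000")]
    g = nx.Graph(edges)
    nx.set_node_attributes(g, {"chr1:50000-51000": "GENE_A"}, "gene")

    # Query by gene name, then rank its neighbors by degree centrality.
    target = next(n for n, d in g.nodes(data=True) if d.get("gene") == "GENE_A")
    centrality = nx.degree_centrality(g)
    print(sorted(g.neighbors(target), key=centrality.get, reverse=True))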
ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
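For illustration, a short Python sketch of the standard strain-rosette relation behind the orientation step (textbook 45-degree rosette equations with hypothetical readings; not the ASR4 source):

    import numpy as np

    def principal_angle(e0, e45, e90):
        # Major principal strain direction, in degrees from the 0-degree gage,
        # for a rectangular rosette with gages at 0, 45 and 90 degrees.
        return 0.5 * np.degrees(np.arctan2(2.0 * e45 - e0 - e90, e0 - e90))

    print(principal_angle(120e-6, 80e-6, 40e-6))  # hypothetical microstrains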
MATH77 - A LIBRARY OF MATHEMATICAL SUBPROGRAMS FOR FORTRAN 77, RELEASE 4.0
NASA Technical Reports Server (NTRS)
Lawson, C. L.
1994-01-01
MATH77 is a high quality library of ANSI FORTRAN 77 subprograms implementing contemporary algorithms for the basic computational processes of science and engineering. The portability of MATH77 meets the needs of present-day scientists and engineers who typically use a variety of computing environments. Release 4.0 of MATH77 contains 454 user-callable and 136 lower-level subprograms. Usage of the user-callable subprograms is described in 69 sections of the 416 page users' manual. The topics covered by MATH77 are indicated by the following list of chapter titles in the users' manual: Mathematical Functions, Pseudo-random Number Generation, Linear Systems of Equations and Linear Least Squares, Matrix Eigenvalues and Eigenvectors, Matrix Vector Utilities, Nonlinear Equation Solving, Curve Fitting, Table Look-Up and Interpolation, Definite Integrals (Quadrature), Ordinary Differential Equations, Minimization, Polynomial Rootfinding, Finite Fourier Transforms, Special Arithmetic, Sorting, Library Utilities, Character-based Graphics, and Statistics. Besides subprograms that are adaptations of public domain software, MATH77 contains a number of unique packages developed by the authors of MATH77. Instances of the latter type include (1) adaptive quadrature, allowing for exceptional generality in multidimensional cases, (2) the ordinary differential equations solver used in spacecraft trajectory computation for JPL missions, (3) univariate and multivariate table look-up and interpolation, allowing for "ragged" tables, and providing error estimates, and (4) univariate and multivariate derivative-propagation arithmetic. MATH77 release 4.0 is a subroutine library which has been carefully designed to be usable on any computer system that supports the full ANSI standard FORTRAN 77 language. It has been successfully implemented on a CRAY Y/MP computer running UNICOS, a UNISYS 1100 computer running EXEC 8, a DEC VAX series computer running VMS, a Sun4 series computer running SunOS, a Hewlett-Packard 720 computer running HP-UX, a Macintosh computer running MacOS, and an IBM PC compatible computer running MS-DOS. Accompanying the library is a set of 196 "demo" drivers that exercise all of the user-callable subprograms. The FORTRAN source code for MATH77 comprises 109K lines of code in 375 files with a total size of 4.5Mb. The demo drivers comprise 11K lines of code with a total size of 418Kb. Forty-four percent of the lines of the library code and 29% of those in the demo code are comment lines. The standard distribution medium for MATH77 is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 9-track 1600 BPI magnetic tape in VAX BACKUP format and a TK50 tape cartridge in VAX BACKUP format. An electronic copy of the documentation is included on the distribution media. Previous releases of MATH77 have been used over a number of years in a variety of JPL applications. MATH77 Release 4.0 was completed in 1992. MATH77 is a copyrighted work with all copyright vested in NASA.
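For illustration, a toy Python analogue of one of the unique MATH77 packages, derivative-propagation arithmetic (a dual-number class of our own devising; MATH77's FORTRAN 77 interface is not reproduced):

    from dataclasses import dataclass

    @dataclass
    class Dual:
        val: float  # function value
        der: float  # derivative w.r.t. the independent variable

        def __add__(self, o):
            return Dual(self.val + o.val, self.der + o.der)

        def __mul__(self, o):
            # Product rule propagates the derivative alongside the value.
            return Dual(self.val * o.val, self.val * o.der + self.der * o.val)

    x = Dual(3.0, 1.0)      # independent variable: dx/dx = 1
    y = x * x + x           # y = x**2 + x
    print(y.val, y.der)     # 12.0 and dy/dx = 2x + 1 = 7.0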
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, John; Jankovsky, Zachary; Metzroth, Kyle G
2018-04-04
The purpose of the ADAPT code is to generate Dynamic Event Trees (DETs) using a user-specified set of simulators. ADAPT can utilize any simulation tool which meets a minimal set of requirements. ADAPT is based on the concept of DETs, which use explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system (or other complex system) evolution along with stochastic modeling. When DETs are used to model various aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. DET branching occurs at user-specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at separate times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination), analysis of results, and user-friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g. biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
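For illustration, a minimal Python sketch of a dynamic-event-tree expansion loop (the simulator is replaced by a trivial stub and all state variables are hypothetical; ADAPT's real scheduler distributes such branches across a computing cluster):

    from collections import deque

    def simulate_until_branch(state):
        # Stub standing in for a real system simulator: advance time and,
        # before t = 10 s, demand a valve action with two possible outcomes.
        new = dict(state, t=state["t"] + 5.0)
        if new["t"] < 10.0:
            return new, [(0.98, {"valve": "open"}), (0.02, {"valve": "stuck"})]
        return new, []                      # no further branchings: end state

    queue = deque([{"t": 0.0, "prob": 1.0, "valve": "closed"}])  # initiator
    completed = []
    while queue:
        state = queue.popleft()
        new_state, outcomes = simulate_until_branch(state)
        if not outcomes:
            completed.append(new_state)
            continue
        for prob, change in outcomes:       # each outcome spawns a new branch
            queue.append(dict(new_state, prob=new_state["prob"] * prob, **change))

    for leaf in completed:
        print(leaf)                         # end states with path probabilities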
Sollie, Annet; Sijmons, Rolf H; Helsper, Charles; Numans, Mattijs E
2017-03-01
To assess quality and reusability of coded cancer diagnoses in routine primary care data, and to identify factors that influence data quality and areas for improvement, a dynamic cohort study in a Dutch network database containing 250,000 anonymized electronic medical records (EMRs) from 52 general practices was performed. Coded data from 2000 to 2011 for the three most common cancer types (breast, colon and prostate cancer) were compared to the Netherlands Cancer Registry. Data quality is expressed in Standardized Incidence Ratios (SIRs): the ratio between the number of coded cases observed in the primary care network database and the expected number of cases based on the Netherlands Cancer Registry. Ratios were multiplied by 100% for readability. The overall SIR was 91.5% (95% CI 88.5-94.5) and showed improvement over the years. SIRs differ between cancer types: from 71.5% for colon cancer in males to 103.9% for breast cancer. There are differences in data quality (SIRs 76.2%-99.7%) depending on the EMR system used, with SIRs up to 232.9% for breast cancer. Frequently observed errors in routine healthcare data can be classified as: lack of integrity checks, inaccurate use and/or lack of codes, and lack of EMR system functionality. Re-users of coded routine primary care EMR data should be aware that 30% of cancer cases can be missed, while up to 130% of the expected number of cases can be found in the EMR data, implying false-positive registrations. The type of EMR system and the type of cancer influence the quality of coded diagnosis registry. While data quality can be improved (e.g. through better system design and training of EMR system users), re-use should only be undertaken by appropriately trained experts.
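For illustration, a worked Python sketch of the paper's quality metric with hypothetical counts (the approximate 95% CI assumes Poisson-distributed observed cases):

    import math

    observed, expected = 915, 1000              # hypothetical case counts
    sir = 100.0 * observed / expected           # in %, as in the paper
    half = 100.0 * 1.96 * math.sqrt(observed) / expected
    print(f"SIR = {sir:.1f}% (95% CI {sir - half:.1f}-{sir + half:.1f})")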
A Shallow Layer Approach for Geo-flow emplacement
NASA Astrophysics Data System (ADS)
Costa, A.; Folch, A.; Mecedonio, G.
2009-04-01
Geophysical flows such as lahars or lava flows severely threaten communities located on or near volcano flanks. The risks and damage caused by the propagation of such flows require a quantitative description of the phenomenon and reliable tools for forecasting flow emplacement. Computational models are a valuable tool for planning risk mitigation countermeasures, such as human intervention to force flow diversion or artificial barriers, and allow for significant economic and social benefits. A FORTRAN 90 code based on a Shallow Layer Approach for Geo-flows (SLAG), describing the transport and emplacement of diluted lahars, water and lava, was developed in both serial and parallel versions. Three rheological models, describing (i) viscous, (ii) turbulent, and (iii) dilatant flows respectively, were implemented in order to describe the transport of lavas, water and diluted lahars. The code was made user-friendly by creating interfaces that allow the user to easily define the problem and to extract and interpolate the topography of the simulation domain. Moreover, SLAG outputs can be written in GRD format (e.g., Surfer) or NetCDF format, or visualized directly in Google Earth. In SLAG the governing equations are treated using a Godunov splitting method following the George (2008) algorithm, based on a Riemann solver for the shallow water equations that decomposes an augmented state variable (the depth, momentum, momentum flux, and bathymetry) into four propagating discontinuities or waves. For our application, the algorithm was generalized to solve the energy equation as well. To validate the code for simulating real geophysical flows, we performed simulations of the lava flow event of 3-4 January 1992 at Etna, the July 2001 Etna lava flows, and the January 2002 Nyiragongo lava flows, together with a few test cases simulating the transport of diluted lahars. Ref: George, D.L. (2008), Augmented Riemann Solvers for the Shallow Water Equations over Variable Topography with Steady States and Inundation, J. Comput. Phys., 227 (6), 3089-3113, doi:10.1016/j.jcp.2007.10.027.
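The finite-volume idea behind such solvers can be shown compactly. The sketch below advances the 1-D shallow water equations with a simple Rusanov (local Lax-Friedrichs) flux on a dam-break test; it is a much-simplified stand-in for SLAG's augmented Riemann solver of George (2008), with flat bathymetry and no energy equation.

```python
import numpy as np

g = 9.81

def flux(h, hu):
    """Physical flux of the 1-D shallow water equations."""
    u = hu / np.where(h > 0, h, 1.0)
    return np.array([hu, hu * u + 0.5 * g * h * h])

def rusanov_step(h, hu, dx, dt):
    """One finite-volume update with a Rusanov flux -- a much simpler
    stand-in for the augmented Riemann solver of George (2008)."""
    u = hu / np.where(h > 0, h, 1.0)
    c = np.sqrt(g * h)
    s = np.maximum(np.abs(u[:-1]) + c[:-1], np.abs(u[1:]) + c[1:])  # wave speed
    fL, fR = flux(h[:-1], hu[:-1]), flux(h[1:], hu[1:])
    F = 0.5 * (fL + fR) - 0.5 * s * np.array([h[1:] - h[:-1], hu[1:] - hu[:-1]])
    h[1:-1] -= dt / dx * (F[0][1:] - F[0][:-1])
    hu[1:-1] -= dt / dx * (F[1][1:] - F[1][:-1])
    return h, hu

# dam-break test: deep water on the left, shallow on the right
N, dx = 200, 0.05
h = np.where(np.arange(N) < N // 2, 2.0, 1.0)
hu = np.zeros(N)
for _ in range(100):
    dt = 0.4 * dx / np.max(np.abs(hu / h) + np.sqrt(g * h))  # CFL-limited step
    h, hu = rusanov_step(h, hu, dx, dt)
print(f"water depth at the dam site ~ {h[N // 2]:.3f} m")
```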
Users manual for the improved NASA Lewis ice accretion code LEWICE 1.6
NASA Technical Reports Server (NTRS)
Wright, William B.
1995-01-01
This report is intended as an update/replacement to NASA CR 185129, 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE)', and as an update to NASA CR 195387, 'Update to the NASA Lewis Ice Accretion Code LEWICE'. In addition to describing the changes made specifically for this version, information from the previous manuals is duplicated so that the user will not need three manuals to use this code.
Design implications for task-specific search utilities for retrieval and re-engineering of code
NASA Astrophysics Data System (ADS)
Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif
2017-05-01
The importance of information retrieval systems in modern society is unquestionable, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context, such as development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed to support their work-related activities. To investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded in the standard Google search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation achieved promising initial results on the precision and recall performance of the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Guoping; Mayes, Melanie; Parker, Jack C
2010-01-01
We implemented the widely used CXTFIT code in Excel to provide flexibility, and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for the one-dimensional equilibrium and nonequilibrium convection-dispersion equations were coded as VBA functions so that they can be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) could be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparison against a number of benchmarks from CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibility and advantages of CXTFIT/Excel. The VBA macros were designed for general use and can be applied to any parameter estimation/model calibration task whose forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.
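As an illustration of the kind of analytical solution involved, the sketch below evaluates the classical equilibrium CDE solution for a continuous step input (Ogata-Banks type), written here in Python rather than the paper's VBA; the parameter values are arbitrary.

```python
import numpy as np
from scipy.special import erfc, erfcx

def cde_step(x, t, v, D, R=1.0):
    """Relative concentration C/C0 for the 1-D equilibrium convection-dispersion
    equation with a continuous step input (Ogata-Banks type solution).
    v: pore velocity, D: dispersion coefficient, R: retardation factor.
    Python stand-in for the VBA worksheet functions described above."""
    vt, Dt = v * t / R, D * t / R
    a = (x - vt) / (2.0 * np.sqrt(Dt))
    b = (x + vt) / (2.0 * np.sqrt(Dt))
    # exp(v*x/D)*erfc(b), computed in the numerically stable scaled form
    # exp(v*x/D - b^2) * erfcx(b), where the exponent is always <= 0:
    term2 = np.exp(v * x / D - b * b) * erfcx(b)
    return 0.5 * (erfc(a) + term2)

# breakthrough curve at x = 10 cm for v = 1 cm/h, D = 0.5 cm^2/h
for t in (2.0, 5.0, 10.0, 15.0, 20.0):
    print(f"t = {t:5.1f} h   C/C0 = {cde_step(10.0, t, 1.0, 0.5):.4f}")
```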
LIME: Semiautomated line measurement and identification from stellar spectra
NASA Astrophysics Data System (ADS)
Sahin, T.
2017-09-01
We present LIME (Line Measurements from Echelle Spectra), an IDL-based code, as a powerful tool for semiautomated stellar line measurement and identification. Interactively selected line positions (i.e. wavelengths) are compared with a master line list of the user's selection. Each unknown line that the user interactively chooses is displayed together with potential identifications provided by the code in the vicinity of the selected line. The best identification is evaluated on the basis of several criteria (e.g., atomic/molecular line information, wavelength displacement, and theoretical equivalent width for solar atmospheric values). We examined the identifications by LIME in the spectra of the post-red-supergiant star HD 179821 over a range of signal-to-noise values and wavelength ranges. We found that the results obtained by LIME show virtually complete agreement with the manual identifications, for which the conventional, and also tedious, approach is to use a revised multiplet table as an initial guide and perform a systematic search using the lower excitation potential and gf-values. Comparison with previous identifications for HD 179821 in the literature revealed not only lines that were unmeasurable and/or blended but also misidentifications. While manual identification takes considerably longer even for an experienced spectroscopist, LIME can extract line information in a few hours with moderate user interaction.
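The core matching step can be pictured as a tolerance search against the master list, as in the toy sketch below (the line-list entries are illustrative; LIME's actual ranking also weighs atomic/molecular data, excitation potential and predicted equivalent width).

```python
# Toy version of the matching step: for each measured wavelength, list
# candidate identifications from a master line list within a tolerance,
# ranked by wavelength displacement.
import bisect

line_list = sorted([
    (6546.24, "Ti II"), (6546.27, "Fe I"), (6562.79, "H I"),
    (6572.78, "Ca I"), (6574.23, "Fe I"),
])  # (wavelength [Angstrom], species); values here are only illustrative
wavelengths = [w for w, _ in line_list]

def identify(measured_wl: float, tol: float = 0.15):
    """Return candidate identifications within +/- tol, nearest first."""
    i = bisect.bisect_left(wavelengths, measured_wl - tol)
    j = bisect.bisect_right(wavelengths, measured_wl + tol)
    cands = [(abs(w - measured_wl), w, s) for w, s in line_list[i:j]]
    return sorted(cands)

for d, w, species in identify(6546.30):
    print(f"{species:6s} {w:9.2f}  |d-lambda| = {d:.3f} A")
```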
SD-4060OCPLT4 program, user's guide
NASA Technical Reports Server (NTRS)
Glazer, J.
1973-01-01
A brief description of the Orbit Comparison Plot (OCPLT4) program is presented, along with user information and a source program listing. In addition to correcting several errors that existed in the original program, this program incorporates the following new features: (1) For any satellite whose observations are processed by the Definitive Orbit Determination System (DODS), the orbital uncertainty estimates (OUE) can be obtained via appropriate card input with no major modification to the program. (2) All satellite-related information (e.g., plotter scales, cutoff limits, plotting frequencies) is user controlled via card input. (3) Not all components of OUE must be obtained. The user has the option of obtaining only the radial component if there is no need for the other two components. (4) The altitude and time graph formats are controlled by the user and are not stored for specific satellites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.
2004-09-14
This document contains detailed user instructions for the suite of utility codes developed for Rev. 1 of the Systems Assessment Capability, a suite that performs many different functions.
Interdisciplinary Research in Viscoelasticity and Rheology
1989-09-01
[Report documentation page for accession number AD-A213 630; the remaining scanned text of this record is unrecoverable.]
NASA Astrophysics Data System (ADS)
Braun, N.; Hauth, T.; Pulvermacher, C.; Ritter, M.
2017-10-01
Today's analyses for high-energy physics (HEP) experiments involve processing large amounts of data with highly specialized algorithms. The contemporary workflow from recorded data to final results is based on the execution of small scripts - often written in Python or as ROOT macros which call complex compiled algorithms in the background - to perform fitting procedures and generate plots. In recent years, interactive programming environments such as Jupyter have become popular. Jupyter allows the development of Python-based applications, so-called notebooks, which bundle code, documentation and results, e.g. plots. Advantages over classical script-based approaches are the ability to recompute only parts of the analysis code, which allows for fast and iterative development, and a web-based user frontend, which can be hosted centrally and requires only a browser on the user side. In our novel approach, Python and Jupyter are tightly integrated into the Belle II Analysis Software Framework (basf2), currently being developed for the Belle II experiment in Japan. This makes it possible to develop code in Jupyter notebooks for every aspect of the event simulation, reconstruction and analysis chain. These interactive notebooks can be hosted as a centralized web service via jupyterhub with docker and used by all scientists of the Belle II Collaboration. Because of its generality and encapsulation, the setup can easily be scaled to large installations.
Bergamino, Maurizio; Hamilton, David J; Castelletti, Lara; Barletta, Laura; Castellan, Lucio
2015-03-01
In this study, we describe the development and utilization of a relational database designed to manage the clinical and radiological data of patients with brain tumors. The Brain Tumor (BT) Database was implemented using MySQL v.5.0, while the graphical user interface was created using PHP and HTML, thus making it easily accessible through a web browser. This web-based approach allows multiple institutions to access the database. The BT Database can record brain tumor patient information (e.g. clinical features, anatomical attributes, and radiological characteristics) and be used for clinical and research purposes. Analytic tools to automatically generate statistics and different plots are provided. The BT Database is a free and powerful user-friendly tool with a wide range of possible clinical and research applications in neurology and neurosurgery. The BT Database graphical user interface source code and manual are freely available at http://tumorsdatabase.altervista.org. © The Author(s) 2013.
Superwide-angle coverage code-multiplexed optical scanner.
Riza, Nabeel A; Arain, Muzammil A
2004-05-01
A superwide-angle-coverage code-multiplexed optical scanner is presented that has the potential to provide 4π-sr coverage. As a proof-of-concept experiment, an angular scan range of 288 degrees for six randomly distributed beams is demonstrated. The proposed scanner achieves its superwide coverage by exploiting a combination of phase-encoded transmission and reflection holography within an in-line hologram recording-retrieval geometry. The basic scanner unit consists of one phase-only digital-mode spatial light modulator for code entry (i.e., beam scan control) and a holographic material, from which we obtained what we believe is a first-of-its-kind scanner combining extremely wide coverage, a low component count, high speed (e.g., microsecond domain), and a large aperture (e.g., >1-cm diameter).
ogs6 - a new concept for porous-fractured media simulations
NASA Astrophysics Data System (ADS)
Naumov, Dmitri; Bilke, Lars; Fischer, Thomas; Rink, Karsten; Wang, Wenqing; Watanabe, Norihiro; Kolditz, Olaf
2015-04-01
OpenGeoSys (OGS) is a scientific open-source initiative for the numerical simulation of thermo-hydro-mechanical/chemical (THMC) processes in porous and fractured media, continuously developed since the mid-eighties. The basic concept is to provide a flexible numerical framework for solving coupled multi-field problems. OGS mainly targets applications in environmental geoscience, e.g. in the fields of contaminant hydrology, water resources management, waste deposits, or geothermal energy systems, but it has also recently been applied successfully to new topics in energy storage. OGS actively participates in several international benchmarking initiatives, e.g. DECOVALEX (waste management), CO2BENCH (CO2 storage and sequestration), SeSBENCH (reactive transport processes) and HM-Intercomp (coupled hydrosystems). Despite the broad applicability of OGS in geo-, hydro- and energy-sciences, several shortcomings became obvious: computational efficiency was limited, and the code structure had become too convoluted for further efficient development. OGS-5 was designed for object-oriented FEM applications; however, in many multi-field problems a certain flexibility in tailored numerical schemes is essential. Therefore, a new concept was designed to overcome the existing bottlenecks. The paradigms for ogs6 are: - Flexibility of numerical schemes (FEM/FVM/FDM), - Computational efficiency (PetaScale ready), - Developer- and user-friendliness. ogs6 has a module-oriented architecture based on thematic libraries (e.g. MeshLib, NumLib) on the large scale and uses an object-oriented approach for the small-scale interfaces. Use of a linear algebra library (Eigen3) for the mathematical operations, together with the ISO C++11 standard, increases the expressiveness of the code and makes it more developer-friendly. The new C++ standard also makes the template meta-programming code used for compile-time optimizations more compact. We have transitioned the main code development to the GitHub code hosting system (https://github.com/ufz/ogs). The very flexible revision control system Git, in combination with issue tracking, developer feedback and code review, improves the code quality and the development process in general. The continuous testing procedure for the benchmarks, as established for OGS-5, is maintained. Additionally, unit testing, automatically triggered by any code change, is executed by two continuous integration frameworks (Jenkins CI, Travis CI) which build and test the code on different operating systems (Windows, Linux, Mac OS), in multiple configurations and with different compilers (GCC, Clang, Visual Studio). To further improve the testing possibilities, XML-based file input formats are introduced, helping with automatic validation of user-contributed benchmarks. The first ogs6 prototype, version 6.0.1, has been implemented for solving generic elliptic problems. Next steps envisage transient, non-linear and coupled problems. Literature: [1] Kolditz O, Shao H, Wang W, Bauer S (eds) (2014): Thermo-Hydro-Mechanical-Chemical Processes in Fractured Porous Media: Modelling and Benchmarking - Closed Form Solutions. In: Terrestrial Environmental Sciences, Vol. 1, Springer, Heidelberg, ISBN 978-3-319-11893-2, 315pp.
http://www.springer.com/earth+sciences+and+geography/geology/book/978-3-319-11893-2 [2] Naumov D (2015): Computational Fluid Dynamics in Unconsolidated Sediments: Model Generation and Discrete Flow Simulations, PhD thesis, Technische Universität Dresden.
An assessment of multibody simulation tools for articulated spacecraft
NASA Technical Reports Server (NTRS)
Man, Guy K.; Sirlin, Samuel W.
1989-01-01
A survey of multibody simulation codes was conducted in the spring of 1988 to obtain an assessment of the state of the art in multibody simulation from the users of the codes. This survey covers the most often used articulated multibody simulation codes in the spacecraft and robotics community. There was no attempt to perform a complete survey of all available multibody codes in all disciplines. Furthermore, this is not an exhaustive evaluation of even robotics and spacecraft multibody simulation codes, as the survey was designed to capture feedback on the issues most important to the users of simulation codes. We must keep in mind that the information received was limited and that the technical background of the respondents varied greatly. Therefore, only the most often cited observations from the questionnaire are reported here. In this survey, it was found that no single code combined a large user base with an absence of reported limitations. The first section is a report on multibody code applications. Following applications is a discussion of execution time, which is the most troublesome issue for flexible multibody codes. The representation of component flexible bodies, which affects both simulation setup time and execution time, is presented next. Following component data preparation, two sections address the accessibility or usability of a code, evaluated by considering its user interface design and examining the overall integrated simulation environment. A summary of user efforts at code verification is reported, before a tabular summary of the questionnaire responses. Finally, some conclusions are drawn.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gwo, J.P.; Jardine, P.M.; Yeh, G.T.
Matrix diffusion, a diffusive mass transfer process in the structured soils and geologic units at ORNL, is believed to be an important subsurface mass transfer mechanism; it may affect off-site movement of radioactive wastes and remediation of waste disposal sites by locally exchanging wastes between the soil/rock matrix and macropores/fractures. Advective mass transfer also contributes to waste movement but has largely been neglected by researchers. This report presents the first documented 2-D multiregion solute transport code (MURT) that incorporates not only diffusive but also advective mass transfer and can be applied to heterogeneous porous media under transient flow conditions. In this report, the theoretical background is reviewed and the derivation of the multiregion solute transport equations is presented. As in MURF (Gwo et al. 1994), a multiregion subsurface flow code, multiple pore domains as suggested by previous investigators (e.g., Wilson and Luxmoore 1988) can be implemented in MURT. Transient or steady-state flow fields of the pore domains can be either calculated by MURF or supplied by modelers. The mass transfer process is briefly discussed through a three-pore-region multiregion solute transport mechanism. Mass transfer equations that describe mass flux across pore region interfaces are also presented, and the parameters needed to calculate mass transfer coefficients are detailed. Three applications of MURT (a tracer injection problem, a sensitivity analysis of advective and diffusive mass transfer, and a hillslope ponding infiltration and secondary source problem) were simulated and the results discussed. The program structure of MURT and the functions of MURT subroutines are discussed so that users can adapt the code; guides for input data preparation are provided in the appendices.
NASA Astrophysics Data System (ADS)
Zhang, Baocheng; Yuan, Yunbin
2017-04-01
A synthesis of two prevailing Global Navigation Satellite System (GNSS) positioning technologies, namely precise point positioning (PPP) and network-based real-time kinematic (NRTK) positioning, results in the emergence of PPP-RTK. This new concept integrates the typical advantage of PPP (e.g. flexibility) with that of NRTK (e.g. efficiency), enabling single-receiver users to achieve high positioning accuracy with reasonable timeliness through integer ambiguity resolution (IAR). The realization of PPP-RTK requires two sequential tasks. The first task is to determine a class of corrections including, necessarily, the satellite orbits, the satellite clocks and the satellite phase (and code, in the case of more than two frequencies) biases at the network level. With these corrections, the second task is to solve for the ambiguity-fixed, absolute position(s) at the user level. In this contribution, we revisit three variants (geometry-free, geometry-fixed, and geometry- and satellite-clock-fixed) of the undifferenced, uncombined PPP-RTK network model and discuss their implications for practical use. We carry out a case study using multi-day, dual-frequency GPS data from the Crustal Movement Observation Network of China (CMONOC), aiming to assess the (static and kinematic) positioning performance (in terms of time-to-first-fix and accuracy) achievable by PPP-RTK users across China.
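Schematically, the user-side step reduces to applying the network products to each raw observation so that the remaining carrier-phase ambiguity becomes integer-valued. The toy numbers below are chosen only to make the arithmetic land on a clean integer; they are not real GNSS data.

```python
# Schematic user-side correction step (toy numbers, single satellite, single
# frequency; real PPP-RTK processing is a full multi-satellite adjustment).
C = 299_792_458.0                # speed of light, m/s
WL = 0.19                        # toy carrier wavelength, m (GPS L1 ~ 0.1903)

def float_ambiguity(phi_cycles, range_m, dt_sat_s, bias_cycles):
    """Apply the network-derived satellite clock (dt_sat_s) and phase-bias
    (bias_cycles) corrections to a raw phase observation (phi_cycles), then
    remove the modelled geometric range: the remainder should be close to an
    integer ambiguity that the user can fix."""
    corrected = phi_cycles + C * dt_sat_s / WL - bias_cycles
    return corrected - range_m / WL

# toy numbers: range/WL = 100 cycles; the clock correction contributes 0.5 cycles
amb = float_ambiguity(phi_cycles=105.36, range_m=19.0,
                      dt_sat_s=0.5 * WL / C, bias_cycles=0.86)
print(f"float ambiguity = {amb:.3f} -> fixed to N = {round(amb)}")
```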
SinCHet: a MATLAB toolbox for single cell heterogeneity analysis in cancer.
Li, Jiannong; Smalley, Inna; Schell, Michael J; Smalley, Keiran S M; Chen, Y Ann
2017-09-15
Single-cell technologies allow characterization of transcriptomes and epigenomes for individual cells under different conditions and provide unprecedented resolution for researchers to investigate cellular heterogeneity in cancer. The SinCHet (SINgle cell HETerogeneity) toolbox is developed in MATLAB and has a graphical user interface (GUI) for visualization and user interaction. It analyzes both continuous (e.g. mRNA expression) and binary omics data (e.g. discretized methylation data). The toolbox not only quantifies cellular heterogeneity using the Shannon Profile (SP) at different clonal resolutions but also detects heterogeneity differences between two populations using a D statistic, defined as the area under the Profile of Shannon Difference (PSD). This flexible tool provides a default clonal resolution using the change point of the PSD detected by a multivariate adaptive regression splines model; it also allows user-defined clonal resolutions for further investigation. The tool provides insights into emerging or disappearing clones between conditions, and enables the prioritization of biomarkers for follow-up experiments based on heterogeneity or marker differences between and/or within cell populations. The SinCHet software is freely available for non-profit academic use. The source code, example datasets, and the compiled package are available at http://labpages2.moffitt.org/chen/software/ . ann.chen@moffitt.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
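A minimal sketch of the underlying quantities, under simplifying assumptions (Ward hierarchical clustering for the clonal structure, synthetic data, a simple discrete sum for the area): Shannon entropy of clone proportions at increasing clonal resolutions, and the D statistic as the area between two such profiles. SinCHet's exact definitions (including change-point detection via MARS) are richer than this.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def shannon(labels):
    """Shannon entropy of clone (cluster) proportions."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def shannon_profile(X, max_k=10):
    """Shannon index at clonal resolutions k = 1..max_k (Ward clustering)."""
    Z = linkage(X, method="ward")
    return np.array([shannon(fcluster(Z, k, criterion="maxclust"))
                     for k in range(1, max_k + 1)])

rng = np.random.default_rng(0)
pre = rng.normal(0.0, 1.0, size=(100, 5))                 # condition 1
post = np.vstack([rng.normal(0.0, 1.0, size=(70, 5)),     # condition 2:
                  rng.normal(4.0, 1.0, size=(30, 5))])    # an emerging clone
psd = np.abs(shannon_profile(post) - shannon_profile(pre))
print(f"D statistic (area under the PSD) = {psd.sum():.2f}")
```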
A narrowband CDMA communications payload for little LEOS applications
NASA Astrophysics Data System (ADS)
Michalik, H.; Hävecker, W.; Ginati, A.
1996-09-01
In recent years, Code Division Multiple Access (CDMA) techniques have been investigated for application in Local Area Networks [J. A. Salehi, IEEE Trans. Commun. 37 (1989)] as well as in Mobile Communications [R. Kohno et al., IEEE Commun. Mag., Jan (1995)]. The main attraction of these techniques is the potentially higher throughput and capacity of such systems, under certain conditions, compared to conventional multi-access schemes such as frequency and time division multiplexing. Mobile communication over a satellite link represents in some respects the "worst case" for operating a CDMA system. Considering e.g. the uplink from mobile to satellite, imperfections due to different and time-varying channel conditions add to the well-known effects of Multiple Access Interference (MAI) between the simultaneously active users at the satellite receiver. In addition, bandwidth constraints exist for small systems, due to the non-availability of large-bandwidth channels in the frequency bands of interest. As a result, for a given service in terms of user data rates, the practical code sequence lengths are limited, as is the available number of codes within a code set. In this paper a communications payload for small satellite applications with a CDMA uplink and a C/TDMA downlink, under the constraint of bandwidth limitations, is proposed. To optimise performance under the imperfections addressed above, the system provides power control and synchronisation for the CDMA uplink. The major objectives of this project are the study, development and testing of such a system for educational purposes and technology development at Hochschule Bremen.
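The basic mechanism, stripped of the channel imperfections discussed above, is shown in the synchronous toy example below: two users spread their symbols with orthogonal Walsh-Hadamard codes and the receiver despreads by correlation.

```python
# Minimal direct-sequence CDMA illustration: two synchronous users share the
# channel via orthogonal +/-1 spreading codes. Real systems (as in the paper)
# must also handle asynchronous users, power imbalance and MAI.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(8)                       # rows are orthogonal codes of length 8
code_a, code_b = H[1], H[2]

bits_a = np.array([1, -1, 1])         # BPSK symbols for user A
bits_b = np.array([-1, -1, 1])        # and for user B

tx = np.concatenate([b * code_a for b in bits_a]) + \
     np.concatenate([b * code_b for b in bits_b])   # chips add on the channel

rx = tx.reshape(-1, 8)                # one row of 8 chips per symbol period
print("user A:", np.sign(rx @ code_a))  # correlate-and-decide recovers the bits
print("user B:", np.sign(rx @ code_b))
```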
NASA Astrophysics Data System (ADS)
Lea, J.
2017-12-01
The quantification of glacier change is a key variable within glacier monitoring, and the method used can be crucial to ensuring that data can be appropriately compared with environmental data. The topic and timescales of study (e.g. land/marine-terminating environments; sub-annual/decadal/centennial/millennial timescales) often mean that different methods are more suitable for different problems. However, depending on the GIS/coding expertise of the user, some methods can be time consuming to undertake, making large-scale studies problematic. In addition, examples exist where different users have nominally applied the same methods in different studies, though with minor methodological inconsistencies in their approach. In turn, this has implications for data homogeneity where regional/global datasets are constructed. Here, I present a simple toolbox scripted in a Matlab® environment that requires only glacier margin and glacier centreline data to quantify glacier length, glacier change between observations, and the rate of change, in addition to other metrics. The toolbox includes the option to apply the established centreline or curvilinear box methods, or a new method, the variable box method, designed for tidewater margins, where the box width is defined as the total width of the individual terminus observation. The toolbox is extremely flexible and can be applied either through Matlab® functions within user scripts, or via a graphical user interface (GUI) for those unfamiliar with a coding environment. In both instances, the methods can be applied quickly to large datasets (100s-1000s of glaciers, with potentially similar numbers of observations each), ensuring large-scale methodological consistency (and therefore data homogeneity) and making regional/global scale analyses achievable for those with limited GIS/coding experience. The toolbox has been evaluated against idealised scenarios demonstrating its accuracy, while feedback from undergraduate students who have trialled the toolbox is that it is intuitive and simple to use. When released, the toolbox will be free and open source, allowing users to modify, improve and expand upon the current version.
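As a sketch of the centreline method only, written in Python with shapely rather than the toolbox's Matlab and using made-up coordinates, glacier length is measured where the digitised margin intersects the centreline:

```python
# Centreline-method sketch (illustrative coordinates; not the toolbox itself):
# glacier length = distance along the centreline to its intersection with the
# digitised terminus margin; differencing two epochs gives change and rate.
from shapely.geometry import LineString

centreline = LineString([(0, 0), (5000, 0)])              # metres
margin_2000 = LineString([(4200, -500), (4200, 500)])     # terminus in 2000
margin_2010 = LineString([(3900, -500), (3950, 500)])     # terminus in 2010

def length_at(margin):
    """Distance along the centreline to the margin crossing."""
    return centreline.project(centreline.intersection(margin))

dL = length_at(margin_2010) - length_at(margin_2000)
print(f"length change = {dL:.0f} m, rate = {dL / 10:.1f} m/yr")  # 2000 -> 2010
```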
MODEST: a web-based design tool for oligonucleotide-mediated genome engineering and recombineering
Bonde, Mads T.; Klausen, Michael S.; Anderson, Mads V.; Wallin, Annika I.N.; Wang, Harris H.; Sommer, Morten O.A.
2014-01-01
Recombineering and multiplex automated genome engineering (MAGE) offer the possibility to rapidly modify multiple genomic or plasmid sites at high efficiencies. This enables efficient creation of genetic variants including both single mutants with specifically targeted modifications as well as combinatorial cell libraries. Manual design of oligonucleotides for these approaches can be tedious, time-consuming, and may not be practical for larger projects targeting many genomic sites. At present, the change from a desired phenotype (e.g. altered expression of a specific protein) to a designed MAGE oligo, which confers the corresponding genetic change, is performed manually. To address these challenges, we have developed the MAGE Oligo Design Tool (MODEST). This web-based tool allows designing of MAGE oligos for (i) tuning translation rates by modifying the ribosomal binding site, (ii) generating translational gene knockouts and (iii) introducing other coding or non-coding mutations, including amino acid substitutions, insertions, deletions and point mutations. The tool automatically designs oligos based on desired genotypic or phenotypic changes defined by the user, which can be used for high efficiency recombineering and MAGE. MODEST is available for free and is open to all users at http://modest.biosustain.dtu.dk. PMID:24838561
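The essence of the oligo-design step can be sketched as below: place the desired mutation at the centre of a 90-mer and, optionally, take the reverse complement for lagging-strand targeting. This is a toy model only; MODEST's real design additionally handles replichore orientation, insertions/deletions, secondary structure and translation-rate tuning.

```python
# Toy illustration of the core MAGE oligo-design step (not MODEST's code).
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq: str) -> str:
    return seq.translate(COMP)[::-1]

def mage_oligo(genome: str, pos: int, new_base: str, length: int = 90,
               lagging_strand: bool = True) -> str:
    """Return an oligo carrying genome[pos] -> new_base, centred in the oligo."""
    half = length // 2
    region = genome[pos - half: pos + half]           # window around the site
    mutated = region[:half] + new_base + region[half + 1:]
    return revcomp(mutated) if lagging_strand else mutated

import random
random.seed(1)
genome = "".join(random.choice("ACGT") for _ in range(1000))  # stand-in genome
oligo = mage_oligo(genome, pos=500, new_base="A")
print(len(oligo), oligo[:30] + "...")
```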
Lim, Chun Shen; Brown, Chris M
2017-01-01
Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community.
Recent Developments in the VISRAD 3-D Target Design and Radiation Simulation Code
NASA Astrophysics Data System (ADS)
Macfarlane, Joseph; Golovkin, Igor; Sebald, James
2017-10-01
The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, Z, and LMJ. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling of laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. VISRAD includes a variety of user-friendly graphics for setting up targets and displaying results, can readily display views from any point in space, and can be used to generate image sequences for animations. We will discuss recent improvements to conveniently assess beam capture on target and beam clearance of diagnostic components, as well as plans for future developments.
Problem Management Module: An Innovative System to Improve Problem List Workflow
Hodge, Chad M.; Kuttler, Kathryn G.; Bowes, Watson A.; Narus, Scott P.
2014-01-01
Electronic problem lists are essential to modern health record systems, with a primary goal to serve as the repository of a patient’s current health issues. Additionally, coded problems can be used to drive downstream activities such as decision support, evidence-based medicine, billing, and cohort generation for research. Meaningful Use also requires use of a coded problem list. Over the course of three years, Intermountain Healthcare developed a problem management module (PMM) that provided innovative functionality to improve clinical workflow and boost problem list adoption, e.g. smart search, user customizable views, problem evolution, and problem timelines. In 23 months of clinical use, clinicians entered over 70,000 health issues, the percentage of free-text items dropped to 1.2%, completeness of problem list items increased by 14%, and more collaborative habits were initiated. PMID:25954372
48 CFR 304.7001 - Numbering acquisitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... contracting office identification codes currently in use is contained in the DCIS Users' Manual, available at... than one code may apply in a specific situation, or for additional codes, refer to the DCIS Users' Manual or consult with the cognizant DCIS coordinator/focal point for guidance on which code governs...
Uptake of an Incentive-Based mHealth App: Process Evaluation of the Carrot Rewards App
Oh, Paul; Alter, David; Leahey, Tricia; Kwan, Matthew; Faulkner, Guy
2017-01-01
Background Behavioral economics has stimulated renewed interest in financial health incentives worldwide. The Carrot Rewards app was developed as part of a public-private partnership to reward Canadians with loyalty points (eg, movies and groceries) for downloading the app, referring friends, and completing an average of 1 to 2 educational health quizzes per week (“micro-learning”), with long-term objectives of increasing health knowledge and encouraging healthy behaviors. Objective The main objective of this study was to evaluate uptake of a loyalty points-based mHealth app during the exclusive 3-month launch period in British Columbia (BC), Canada. The secondary aims were to describe the health and sociodemographic characteristics of users, as well as participation levels (eg, proportion of quizzes completed and friends referred). Methods The app was promoted via loyalty program email campaigns (1.64 million emails). Number of downloads and registrations (users enter age, gender, and valid BC postal code to register) were collected. Additional sociodemographics were inferred by linking postal codes with census data at the local health area (LHA) level. Health risk assessments were also deployed. Participation levels were collected over 3 months and descriptive data were presented. Results In 3 months, 67,464 individuals downloaded the app; in its first week, Carrot Rewards was the most downloaded health app in Canada. Among valid users (n=57,885; at least one quiz completed), the majority were female (62.96%; 36,446/57,885) and aged 18 to 34 years (54.34%; 31,459/57,885). More than half of the users (52.40%; 30,332/57,885) resided in LHAs where the median personal income was below the provincial average (Can $28,765). Furthermore, 64.42% (37,291/57,885) of users lived in metropolitan (ie, urban) LHAs, compared with 56.17% of the general BC population. The most prevalent risk factors were “not” meeting physical activity guidelines (72.70%; 31,765/43,692) and “not” getting the flu shot last year (67.69%; 30,286/44,739). Regarding participation, 60.05% (34,761/57,885) of users were classified as “very high” engagers (>75% quiz completion rate). Conclusions Early results suggest that loyalty points may promote mHealth app uptake. The app was downloaded by younger females especially, and BC residents from higher and lower income regions were equally represented. Loyalty points appear to have driven participation throughout the inaugural 3-month period (ie, quiz completion). PMID:28559224
Bahreyni Toossi, M T; Moradi, H; Zare, H
2008-01-01
In this work, the general-purpose Monte Carlo N-Particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'Diagnostic X-Ray Spectra by Monte Carlo Simulation' (DXRaySMCS) was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences were observed between the IPEM78 reported spectra and the simulated spectra generated in this study.
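The three-step workflow such an interface automates can be sketched as follows; the command name, placeholder token and output format below are hypothetical stand-ins, not the real MCNP-4C invocation or tally layout.

```python
# Schematic of what an interface like DXRaySMCS automates (all file names,
# the 'mcnp4c' command line and the tally parsing are placeholders):
# (i) edit the input deck, (ii) launch the run, (iii) reduce the output.
import re
import subprocess
from pathlib import Path

def run_simulation(template: str, kvp: float, workdir: Path):
    """Return a list of (energy bin, relative photon number) pairs."""
    inp, out = workdir / "xray.inp", workdir / "xray.out"
    inp.write_text(template.replace("{TUBE_VOLTAGE}", f"{kvp:.1f}"))  # (i) edit
    subprocess.run(["mcnp4c", f"i={inp}", f"o={out}"], check=True)    # (ii) run
    spectrum = []                                                     # (iii) parse
    for line in out.read_text().splitlines():
        m = re.match(r"\s*([\d.eE+-]+)\s+([\d.eE+-]+)\s*$", line)
        if m:
            spectrum.append((float(m.group(1)), float(m.group(2))))
    return spectrum
```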
MAC/GMC 4.0 User's Manual: Example Problem Manual. Volume 3
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2002-01-01
This document is the third volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, Volume 2 is the Keywords Manual, and this document is the Example Problems Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material, have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume provides in-depth descriptions of 43 example problems, which were specially designed to highlight many of the most important capabilities of the code. The actual input files associated with each example problem are distributed with the MAC/GMC 4.0 software; thus providing the user with a convenient starting point for their own specialized problems of interest.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Priedhorsky, Reid; Randles, Tim
Charliecloud is a set of scripts to let users run a virtual cluster of virtual machines (VMs) on a desktop or supercomputer. Key functions include: 1. Creating (typically by installing an operating system from vendor media) and updating VM images; 2. Running a single VM; 3. Running multiple VMs in a virtual cluster. The virtual machines can talk to one another over the network and (in some cases) the outside world. This is accomplished by calling external programs such as QEMU and the Virtual Distributed Ethernet (VDE) suite. The goal is to let users have a virtual cluster containing nodes where they have privileged access, while isolating that privilege within the virtual cluster so it cannot affect the physical compute resources. Host configuration enforces security; this is not included in Charliecloud, though security guidelines are included in its documentation and Charliecloud is designed to facilitate such configuration. Charliecloud manages passing information from host computers into and out of the virtual machines, such as parameters of the virtual cluster, input data specified by the user, output data from virtual compute jobs, VM console display, and network connections (e.g., SSH or X11). Parameters for the virtual cluster (number of VMs, RAM and disk per VM, etc.) are specified by the user or gathered from the environment (e.g., SLURM environment variables). Example job scripts are included. These include computation examples (such as a "hello world" MPI job) as well as performance tests. They also include a security test script to verify that the virtual cluster is appropriately sandboxed. Tests include: 1. Pinging hosts inside and outside the virtual cluster to explore connectivity; 2. Port scans (again inside and outside) to see what services are available; 3. Sniffing tests to see what traffic is visible to running VMs; 4. IP address spoofing to test network functionality in this case; 5. File access tests to make sure host access permissions are enforced. This test script is not a comprehensive scanner and does not test for specific vulnerabilities. Importantly, no information about physical hosts or network topology is included in this script (or any of Charliecloud); while part of a sensible test, such information is specified by the user when the test is run. That is, one cannot learn anything about the LANL network or computing infrastructure by examining Charliecloud code.
User's Manual for FEMOM3DR. Version 1.0
NASA Technical Reports Server (NTRS)
Reddy, C. J.
1998-01-01
FEMOM3DR is a computer code written in FORTRAN 77 to compute the radiation characteristics of antennas on a 3D body using the combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code is written to handle different feeding structures such as coaxial line, rectangular waveguide, and circular waveguide. This code uses tetrahedral elements with vector edge basis functions for the FEM, and triangular elements with roof-top basis functions for the MoM. By virtue of the FEM, this code can handle arbitrarily shaped three-dimensional bodies with inhomogeneous lossy materials, and owing to the MoM the computational domain can be terminated in any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.
DefEX: Hands-On Cyber Defense Exercise for Undergraduate Students
2011-07-01
Injection, and 4) File Upload. Next, the students patched the associated flawed Perl and PHP Hypertext Preprocessor (PHP) code. Finally, students ... underlying script. The Zora XSS vulnerability existed in a PHP file that echoed unfiltered user input back to the screen. To eliminate the ... vulnerability, students filtered the input using the PHP htmlentities function and retested the code. The htmlentities function translates certain ambiguous characters into their HTML entity equivalents so that they cannot be interpreted as markup.
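The same mitigation can be expressed in Python (shown here because the exercise's original PHP is only partially reproduced above): escape user-controlled text before echoing it, so the browser cannot interpret it as markup.

```python
# XSS mitigation sketch, analogous to the PHP htmlentities fix described above:
# html.escape converts < > & " ' into HTML entities before the text is echoed.
import html

def render_comment(user_input: str) -> str:
    return "<p>Your comment: " + html.escape(user_input, quote=True) + "</p>"

print(render_comment('<script>alert("xss")</script>'))
# -> <p>Your comment: &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```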
Rare and low-frequency coding variants alter human adult height
Marouli, Eirini; Graff, Mariaelisa; Medina-Gomez, Carolina; Lo, Ken Sin; Wood, Andrew R; Kjaer, Troels R; Fine, Rebecca S; Lu, Yingchang; Schurmann, Claudia; Highland, Heather M; Rüeger, Sina; Thorleifsson, Gudmar; Justice, Anne E; Lamparter, David; Stirrups, Kathleen E; Turcot, Valérie; Young, Kristin L; Winkler, Thomas W; Esko, Tõnu; Karaderi, Tugce; Locke, Adam E; Masca, Nicholas GD; Ng, Maggie CY; Mudgal, Poorva; Rivas, Manuel A; Vedantam, Sailaja; Mahajan, Anubha; Guo, Xiuqing; Abecasis, Goncalo; Aben, Katja K; Adair, Linda S; Alam, Dewan S; Albrecht, Eva; Allin, Kristine H; Allison, Matthew; Amouyel, Philippe; Appel, Emil V; Arveiler, Dominique; Asselbergs, Folkert W; Auer, Paul L; Balkau, Beverley; Banas, Bernhard; Bang, Lia E; Benn, Marianne; Bergmann, Sven; Bielak, Lawrence F; Blüher, Matthias; Boeing, Heiner; Boerwinkle, Eric; Böger, Carsten A; Bonnycastle, Lori L; Bork-Jensen, Jette; Bots, Michiel L; Bottinger, Erwin P; Bowden, Donald W; Brandslund, Ivan; Breen, Gerome; Brilliant, Murray H; Broer, Linda; Burt, Amber A; Butterworth, Adam S; Carey, David J; Caulfield, Mark J; Chambers, John C; Chasman, Daniel I; Chen, Yii-Der Ida; Chowdhury, Rajiv; Christensen, Cramer; Chu, Audrey Y; Cocca, Massimiliano; Collins, Francis S; Cook, James P; Corley, Janie; Galbany, Jordi Corominas; Cox, Amanda J; Cuellar-Partida, Gabriel; Danesh, John; Davies, Gail; de Bakker, Paul IW; de Borst, Gert J.; de Denus, Simon; de Groot, Mark CH; de Mutsert, Renée; Deary, Ian J; Dedoussis, George; Demerath, Ellen W; den Hollander, Anneke I; Dennis, Joe G; Di Angelantonio, Emanuele; Drenos, Fotios; Du, Mengmeng; Dunning, Alison M; Easton, Douglas F; Ebeling, Tapani; Edwards, Todd L; Ellinor, Patrick T; Elliott, Paul; Evangelou, Evangelos; Farmaki, Aliki-Eleni; Faul, Jessica D; Feitosa, Mary F; Feng, Shuang; Ferrannini, Ele; Ferrario, Marco M; Ferrieres, Jean; Florez, Jose C; Ford, Ian; Fornage, Myriam; Franks, Paul W; Frikke-Schmidt, Ruth; Galesloot, Tessel E; Gan, Wei; Gandin, Ilaria; Gasparini, Paolo; Giedraitis, Vilmantas; Giri, Ayush; Girotto, Giorgia; Gordon, Scott D; Gordon-Larsen, Penny; Gorski, Mathias; Grarup, Niels; Grove, Megan L.; Gudnason, Vilmundur; Gustafsson, Stefan; Hansen, Torben; Harris, Kathleen Mullan; Harris, Tamara B; Hattersley, Andrew T; Hayward, Caroline; He, Liang; Heid, Iris M; Heikkilä, Kauko; Helgeland, Øyvind; Hernesniemi, Jussi; Hewitt, Alex W; Hocking, Lynne J; Hollensted, Mette; Holmen, Oddgeir L; Hovingh, G. Kees; Howson, Joanna MM; Hoyng, Carel B; Huang, Paul L; Hveem, Kristian; Ikram, M. 
Arfan; Ingelsson, Erik; Jackson, Anne U; Jansson, Jan-Håkan; Jarvik, Gail P; Jensen, Gorm B; Jhun, Min A; Jia, Yucheng; Jiang, Xuejuan; Johansson, Stefan; Jørgensen, Marit E; Jørgensen, Torben; Jousilahti, Pekka; Jukema, J Wouter; Kahali, Bratati; Kahn, René S; Kähönen, Mika; Kamstrup, Pia R; Kanoni, Stavroula; Kaprio, Jaakko; Karaleftheri, Maria; Kardia, Sharon LR; Karpe, Fredrik; Kee, Frank; Keeman, Renske; Kiemeney, Lambertus A; Kitajima, Hidetoshi; Kluivers, Kirsten B; Kocher, Thomas; Komulainen, Pirjo; Kontto, Jukka; Kooner, Jaspal S; Kooperberg, Charles; Kovacs, Peter; Kriebel, Jennifer; Kuivaniemi, Helena; Küry, Sébastien; Kuusisto, Johanna; La Bianca, Martina; Laakso, Markku; Lakka, Timo A; Lange, Ethan M; Lange, Leslie A; Langefeld, Carl D; Langenberg, Claudia; Larson, Eric B; Lee, I-Te; Lehtimäki, Terho; Lewis, Cora E; Li, Huaixing; Li, Jin; Li-Gao, Ruifang; Lin, Honghuang; Lin, Li-An; Lin, Xu; Lind, Lars; Lindström, Jaana; Linneberg, Allan; Liu, Yeheng; Liu, Yongmei; Lophatananon, Artitaya; Luan, Jian'an; Lubitz, Steven A; Lyytikäinen, Leo-Pekka; Mackey, David A; Madden, Pamela AF; Manning, Alisa K; Männistö, Satu; Marenne, Gaëlle; Marten, Jonathan; Martin, Nicholas G; Mazul, Angela L; Meidtner, Karina; Metspalu, Andres; Mitchell, Paul; Mohlke, Karen L; Mook-Kanamori, Dennis O; Morgan, Anna; Morris, Andrew D; Morris, Andrew P; Müller-Nurasyid, Martina; Munroe, Patricia B; Nalls, Mike A; Nauck, Matthias; Nelson, Christopher P; Neville, Matt; Nielsen, Sune F; Nikus, Kjell; Njølstad, Pål R; Nordestgaard, Børge G; Ntalla, Ioanna; O'Connel, Jeffrey R; Oksa, Heikki; Loohuis, Loes M Olde; Ophoff, Roel A; Owen, Katharine R; Packard, Chris J; Padmanabhan, Sandosh; Palmer, Colin NA; Pasterkamp, Gerard; Patel, Aniruddh P; Pattie, Alison; Pedersen, Oluf; Peissig, Peggy L; Peloso, Gina M; Pennell, Craig E; Perola, Markus; Perry, James A; Perry, John R.B.; Person, Thomas N; Pirie, Ailith; Polasek, Ozren; Posthuma, Danielle; Raitakari, Olli T; Rasheed, Asif; Rauramaa, Rainer; Reilly, Dermot F; Reiner, Alex P; Renström, Frida; Ridker, Paul M; Rioux, John D; Robertson, Neil; Robino, Antonietta; Rolandsson, Olov; Rudan, Igor; Ruth, Katherine S; Saleheen, Danish; Salomaa, Veikko; Samani, Nilesh J; Sandow, Kevin; Sapkota, Yadav; Sattar, Naveed; Schmidt, Marjanka K; Schreiner, Pamela J; Schulze, Matthias B; Scott, Robert A; Segura-Lepe, Marcelo P; Shah, Svati; Sim, Xueling; Sivapalaratnam, Suthesh; Small, Kerrin S; Smith, Albert Vernon; Smith, Jennifer A; Southam, Lorraine; Spector, Timothy D; Speliotes, Elizabeth K; Starr, John M; Steinthorsdottir, Valgerdur; Stringham, Heather M; Stumvoll, Michael; Surendran, Praveen; Hart, Leen M ‘t; Tansey, Katherine E; Tardif, Jean-Claude; Taylor, Kent D; Teumer, Alexander; Thompson, Deborah J; Thorsteinsdottir, Unnur; Thuesen, Betina H; Tönjes, Anke; Tromp, Gerard; Trompet, Stella; Tsafantakis, Emmanouil; Tuomilehto, Jaakko; Tybjaerg-Hansen, Anne; Tyrer, Jonathan P; Uher, Rudolf; Uitterlinden, André G; Ulivi, Sheila; van der Laan, Sander W; Van Der Leij, Andries R; van Duijn, Cornelia M; van Schoor, Natasja M; van Setten, Jessica; Varbo, Anette; Varga, Tibor V; Varma, Rohit; Edwards, Digna R Velez; Vermeulen, Sita H; Vestergaard, Henrik; Vitart, Veronique; Vogt, Thomas F; Vozzi, Diego; Walker, Mark; Wang, Feijie; Wang, Carol A; Wang, Shuai; Wang, Yiqin; Wareham, Nicholas J; Warren, Helen R; Wessel, Jennifer; Willems, Sara M; Wilson, James G; Witte, Daniel R; Woods, Michael O; Wu, Ying; Yaghootkar, Hanieh; Yao, Jie; Yao, Pang; Yerges-Armstrong, Laura M; Young, 
Robin; Zeggini, Eleftheria; Zhan, Xiaowei; Zhang, Weihua; Zhao, Jing Hua; Zhao, Wei; Zhao, Wei; Zheng, He; Zhou, Wei; Rotter, Jerome I; Boehnke, Michael; Kathiresan, Sekar; McCarthy, Mark I; Willer, Cristen J; Stefansson, Kari; Borecki, Ingrid B; Liu, Dajiang J; North, Kari E; Heard-Costa, Nancy L; Pers, Tune H; Lindgren, Cecilia M; Oxvig, Claus; Kutalik, Zoltán; Rivadeneira, Fernando; Loos, Ruth JF; Frayling, Timothy M; Hirschhorn, Joel N; Deloukas, Panos; Lettre, Guillaume
2016-01-01
Summary Height is a highly heritable, classic polygenic trait with ∼700 common associated variants identified so far through genome-wide association studies. Here, we report 83 height-associated coding variants with lower minor allele frequencies (range of 0.1-4.8%) and effects of up to 2 cm/allele (e.g. in IHH, STC2, AR and CRISPLD2), >10 times the average effect of common variants. In functional follow-up studies, rare height-increasing alleles of STC2 (+1-2 cm/allele) compromised proteolytic inhibition of PAPP-A and increased cleavage of IGFBP-4 in vitro, resulting in higher bioavailability of insulin-like growth factors. These 83 height-associated variants overlap genes mutated in monogenic growth disorders and highlight new biological candidates (e.g. ADAMTS3, IL11RA, NOX4) and pathways (e.g. proteoglycan/glycosaminoglycan synthesis) involved in growth. Our results demonstrate that sufficiently large sample sizes can uncover rare and low-frequency variants of moderate to large effect associated with polygenic human phenotypes, and that these variants implicate relevant genes and pathways. PMID:28146470
Quality Scalability Aware Watermarking for Visual Content.
Bhowmik, Deepayan; Abhayaratne, Charith
2016-11-01
Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios while not affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms. The code-stream is generated by a new wavelet-domain blind watermarking algorithm guided by a quantization-based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting the watermark data from the watermarked images. The algorithm is further extended to incorporate a bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000. This improves the robustness against the quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality scalable content adaptation. Our proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows major improvement for video watermarking.
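For intuition, the sketch below embeds watermark bits by generic quantization-index modulation (QIM), one bit per coefficient on even/odd quantizer grids. This is a simplified stand-in for the paper's tree-guided wavelet-domain scheme, operating on synthetic coefficients.

```python
# Generic QIM embedding/extraction -- not the authors' algorithm, just the
# quantization idea underlying it: bit 0 snaps a coefficient to multiples of
# delta, bit 1 to half-integer multiples; extraction picks the nearer grid.
import numpy as np

def qim_embed(coeffs, bits, delta=8.0):
    q = np.round(coeffs / delta - bits / 2.0)
    return (q + bits / 2.0) * delta

def qim_extract(coeffs, delta=8.0):
    d0 = np.abs(coeffs - np.round(coeffs / delta) * delta)
    d1 = np.abs(coeffs - (np.round(coeffs / delta - 0.5) + 0.5) * delta)
    return (d1 < d0).astype(int)

rng = np.random.default_rng(2)
c = rng.normal(0, 20, 16)                    # stand-in wavelet coefficients
bits = rng.integers(0, 2, 16)
marked = qim_embed(c, bits, delta=8.0)
noisy = marked + rng.normal(0, 0.3, 16)      # mild compression-like distortion
assert (qim_extract(noisy) == bits).all()    # bits survive small perturbations
print("watermark recovered after distortion")
```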
Geuens, Jonas; Swinnen, Thijs Willem; Westhovens, Rene; de Vlam, Kurt; Geurts, Luc; Vanden Abeele, Vero
2016-10-13
Chronic arthritis (CA), an umbrella term for inflammatory rheumatic and other musculoskeletal diseases, is highly prevalent. Effective disease-modifying antirheumatic drugs are available for CA, with the exception of osteoarthritis, but they require a long-term commitment of patients to comply with the medication regimen and management program as well as a tight follow-up by the treating physician and health professionals. Additionally, patients are advised to participate in physical exercise programs. Adherence to exercise and physical activity programs is often very low. Patients would benefit from support to increase medication compliance as well as compliance with the physical exercise programs. To address these shortcomings, health apps for CA patients have been created. These mobile apps assist patients in self-management of overall health measures, health prevention, and disease management. By including persuasive principles designed to reinforce, change, or shape attitudes or behaviors, health apps can be transformed into support tools that motivate and stimulate users to achieve or maintain a target behavior, also called persuasive systems. However, the extent to which health apps for CA patients consciously and successfully employ such persuasive principles remains unknown. The objective of this study was to evaluate the number and type of persuasive principles present in current health apps for CA patients. A review of apps for arthritis patients was conducted across the three major app stores (Google Play, Apple App Store, and Windows Phone Store). Collected apps were coded according to 37 persuasive principles, based on an altered version of the Persuasive System Design taxonomy of Oinas-Kukkonen and Harjumaa and the taxonomy of Behavior Change Techniques of Michie and Abraham. In addition, user ratings, number of installs, and price of the apps were also coded. We coded 28 apps. On average, 5.8 of the 37 persuasive principles were used in each app. The most used category of persuasive principles was System Credibility, with an average of 2.6 principles. Task Support was the second most used, with an average of 2.3 persuasive principles. Next was Dialogue Support with an average of 0.5 principles. Social Support came last, with an average of only 0.01 persuasive principles. Current health apps for CA patients would benefit from adding Social Support techniques (eg, social media, user fora) and extending Dialogue Support techniques (eg, rewards, praise). The addition of automated tracking of health-related parameters (eg, physical activity, step count) could further reduce the effort for CA patients to manage their disease and thus increase Task Support. Finally, apps for health could benefit from a more evidence-based approach, both in developing the app and in ensuring that content is scientifically verifiable, which will result in enhanced System Credibility.
ARES: automated response function code. Users manual. [HPGAM and LSQVM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maung, T.; Reynolds, G.M.
This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step-by-step instructions for using the complete code package on an HP-1000 system. The code is designed to calculate response functions of NaI gamma-ray detectors with cylindrical or rectangular geometries.
Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools
NASA Astrophysics Data System (ADS)
Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.
2011-12-01
Sophisticated 3D models of particle acceleration and transport in solar flares, both currently available and soon to come, require a new level of user-friendly visualization and analysis tools that allow quick and easy adjustment of model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are under development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLLs and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or externally callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the capacity and generality of the tool, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate temporal evolution of the magnetic field structure and/or the fast electron population implied by electron acceleration and transport. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, an STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.
Cannabis Mobile Apps: A Content Analysis.
Ramo, Danielle E; Popova, Lucy; Grana, Rachel; Zhao, Shirley; Chavez, Kathryn
2015-08-12
Mobile technology is pervasive and widely used to obtain information about drugs such as cannabis, especially in a climate of rapidly changing cannabis policy; yet the content of available cannabis apps is largely unknown. Understanding the resources available to those searching for cannabis apps will clarify how this technology is being used to reflect and influence cannabis use behavior. We investigated the content of 59 cannabis-related mobile apps for Apple and Android devices as of November 26, 2014. The Apple and Google Play app stores were searched using the terms "cannabis" and "marijuana." Three trained coders classified the top 20 apps for each term and each store, using a coding guide. Apps were examined for the presence of 20 content codes derived by the researchers. Total apps available for each search term were 124 for cannabis and 218 for marijuana in the Apple App Store, and 250 each for cannabis and marijuana on Google Play. The top 20 apps in each category in each store were coded, yielding 59 independent apps (30 Apple, 29 Google Play). The three most common content areas were cannabis strain classification (33.9%), facts about cannabis (20.3%), and games (20.3%). In the Apple App Store, most apps were free (77%), all were rated "17+" years, and the average user rating was 3.9/5 stars. The most popular apps provided cannabis strain classifications (50%), dispensary information (27%), or general facts about cannabis (27%). Only one app (3%) provided information or resources related to cannabis abuse, addiction, or treatment. On Google Play, most apps were free (93%), rated "high maturity" (79%), and the average user rating was 4.1/5. The most popular app types offered games (28%), phone utilities (eg, wallpaper, clock; 21%), and cannabis food recipes (21%); no apps addressed abuse, addiction, or treatment. Cannabis apps are generally free and highly rated. Apps were most often informational (facts, strain classification) or recreational (games), likely reflecting and influencing the growing acceptance of cannabis for medical and recreational purposes. Apps addressing addiction or cessation were underrepresented among the most popular cannabis mobile apps. Differences among apps for the Apple and Android platforms likely reflect differences in the population of users, developer choice, and platform regulations.
WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNamara, A; Held, K; Paganetti, H
2016-06-15
Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.
Improved Iterative Decoding of Network-Channel Codes for Multiple-Access Relay Channel.
Majumder, Saikat; Verma, Shrish
2015-01-01
Cooperative communication using relay nodes is one of the most effective means of exploiting space diversity for low-cost nodes in a wireless network. In cooperative communication, users, besides communicating their own information, also relay the information of other users. In this paper we investigate a scheme where cooperation is achieved using a common relay node which performs network coding to provide space diversity for two information nodes transmitting to a base station. We propose a scheme which uses a Reed-Solomon error-correcting code for encoding the information bits at the user nodes and a convolutional code as the network code, instead of XOR-based network coding. Based on this encoder, we propose iterative soft decoding of the joint network-channel code by treating it as a concatenated Reed-Solomon/convolutional code. Simulation results show significant improvement in performance compared to an existing scheme based on compound codes.
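For contrast, a minimal sketch of the XOR-based network coding baseline that the proposed Reed-Solomon/convolutional scheme replaces; packet contents are illustrative:

```python
# XOR network coding baseline for a two-user multiple-access relay channel:
# the relay transmits the bitwise XOR of both users' packets, so the base
# station can recover either packet if exactly one direct link fails.
import numpy as np

rng = np.random.default_rng(1)
pkt_a = rng.integers(0, 2, 16)   # user A's information bits
pkt_b = rng.integers(0, 2, 16)   # user B's information bits
relay = pkt_a ^ pkt_b            # network-coded packet sent by the relay

# Suppose the base station receives pkt_b and the relay packet, but loses
# user A's direct transmission; XOR with pkt_b recovers pkt_a.
recovered_a = relay ^ pkt_b
assert np.array_equal(recovered_a, pkt_a)
```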
VizieR Online Data Catalog: Opacities from the Opacity Project (Seaton+, 1995)
NASA Astrophysics Data System (ADS)
Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.
1997-08-01
1 CODES

1.1 Code rop.for
This code reads opacity files written in standard OP format. Its main purpose is to provide documentation on the contents of the files. This code, like the other codes provided, prompts for the name of the file (or files) to be read. The file names read in response to the prompt may have up to 128 characters.

1.2 Code opfit.for
This code reads opacity files in standard OP format, and provides for interpolation of opacities to any required values of temperature and mass-density. The method used is described in OPF. The code prompts for the name of a file giving all required control parameters. As an example, the file opfit.dat is provided (users will need to change directory names and file names). The use of opfit.for is illustrated using opfit.dat. Most users will probably want to adapt opfit.for for use as a subroutine in other codes. Timings for a DEC 7000 ALPHA: 0.3 sec for data read and initialisations; then 0.0007 sec for each temperature-density point. Users who like OPAL formats should note that opfit.for has a facility to produce files of OP data in OPAL-type formats.

1.3 Code ixz.for
This code provides for interpolations to any required values of X and Z. See IXZ. It prompts for the name of a file giving all required control parameters. An example of such a file is provided, ixz.dat (the user will need to change directory and file names). The output files have names s92INT.'nnn'. The user specifies the first value of nnn, and the number of files to be produced.

2 DATA FILES

2.1 Data files for solar metal-mix
Data for solar metal-mix s92 as defined in SYMP. These files are from version 2 runs of December 1994 (see IXZ for details on Version 2). There are 213 files with names s92.'nnn', 'nnn'=201 to 413. Each file occupies 83762 bytes. The file s92.version2 gives values of X (hydrogen mass-fraction) and Z (metals mass-fraction) for each value of 'nnn'. The user can get s92.version2, select the values of 'nnn' required, then get the required files s92.'nnn'. The user can see the file in ftp, displayed on the screen, by typing "get s92.version2 -". The files s92.'nnn' can be used with opfit.for to obtain opacities for any required values of temperature and mass density. Files for other metal-mixtures will be added in due course. Send requests to mjs@star.ucl.ac.uk.

2.2 Files for interpolation in X and Z
The data files have names s92xz.'mmm', where 'mmm'=001 to 096. They differ from the standard OP files (such as s92.'nnn', section 2.1 above) in that they contain information giving derivatives of opacities with respect to X and Z. Each file s92xz.'mmm' occupies 148241 bytes. The interpolations to any required values of X and Z are made using ixz.for. Timings: on a DEC 7000 ALPHA, 2.16 sec for each new-mixture file. For interpolations to some specified values of X and Z, one requires just 4 files s92xz.'mmm'. Most users will not require the complete set of files s92xz.'mmm'. The file s92xz.index includes a table (starting on line 3) giving values, for each 'mmm' file, of x,y,z (abundances by number-fractions) and X,Y,Z (abundances by mass-fractions). Users are advised to get the file s92xz.index, select the values of 'mmm' for the files required, then get those files. The files produced by ixz.for are in standard OP format and can be used with opfit.for to obtain opacities for any required values of temperature and mass density.

3 RECOMMENDED PROCEDURE FOR USE OF OPACITY FILES
(1) Get the file s92.version2. (2) If the values of X and Z you require are available in the files s92.'nnn', then get those files. (3) If not, get the file s92xz.index. (4) Select from s92xz.index the values of 'mmm' which cover the range of X and Z in which you are interested. Get those files and use ixz.for to generate files for your exact required values of X and Z. (5) Note that the exact abundance mixtures used are specified in each file (see rop.for). Also, each run of opfit.for produces a table of abundances. (6) If you want a metal-mix different from that of s92, contact mjs@star.ucl.ac.uk.

4 FUTURE DEVELOPMENTS
(1) Data for the calculation of radiative forces are provided as the CDS catalog
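A minimal sketch of the table-interpolation operation that opfit.for performs, written in Python with SciPy for illustration; the tiny opacity table here is fabricated, and real OP files tabulate opacities on temperature-density grids in their own formats:

```python
# Bilinear interpolation of a tabulated log10(opacity) to arbitrary
# temperature and density, the generic operation opfit.for provides.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

log_T = np.linspace(3.5, 7.5, 9)        # log10 temperature grid [K]
log_rho = np.linspace(-10.0, 0.0, 11)   # log10 mass density grid [g/cm^3]
# hypothetical smooth table of log10(kappa), fabricated for this example
table = 0.3 * log_T[:, None] - 0.1 * log_rho[None, :]

interp = RegularGridInterpolator((log_T, log_rho), table)
log_kappa = interp([[5.2, -4.3]])[0]    # opacity at T = 10^5.2 K, rho = 10^-4.3
print(f"log10(kappa) = {log_kappa:.3f}")
```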
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doucet, M.; Durant Terrasson, L.; Mouton, J.
2006-07-01
Criticality safety evaluations implement requirements to demonstrate sufficient subcritical margins outside of the reactor environment, for example in fuel fabrication plants. Basic criticality data (i.e., criticality standards) are used in the determination of subcritical margins for all processes involving plutonium or enriched uranium. There are several international criticality standards, e.g., ARH-600, which is the one the US nuclear industry relies on. The French Nuclear Safety Authority (DGSNR and its advising body IRSN) has requested AREVA NP to review the criticality standards used for the evaluation of its Low Enriched Uranium fuel fabrication plants with CRISTAL V0, the recently updated French criticality evaluation package. Criticality safety is a concern for every phase of the fabrication process, including UF6 cylinder storage, UF6-UO2 conversion, powder storage, pelletizing, rod loading, assembly fabrication, and assembly transportation. Until 2003, the accepted criticality standards were based on the French CEA work performed in the late seventies with the APOLLO1 cell/assembly computer code. APOLLO1 is a spectral code used for evaluating the basic characteristics of fuel assemblies for reactor physics applications, which has been enhanced to perform criticality safety calculations. Throughout the years, CRISTAL, starting with APOLLO1 and MORET 3 (a 3D Monte Carlo code), has been improved to account for the growth of its qualification database and for increasing user requirements. Today, CRISTAL V0 is an up-to-date computational tool incorporating a modern basic microscopic cross section set based on JEF2.2 and the comprehensive APOLLO2 and MORET 4 codes. APOLLO2 is well suited for criticality standards calculations as it includes a sophisticated self-shielding approach, a Pij flux determination, and a 1D transport (Sn) process. CRISTAL V0 is the result of more than five years of development work focusing on theoretical approaches and the implementation of user-friendly graphical interfaces. Due to its comprehensive physical simulation and thanks to its broad qualification database with more than a thousand benchmark/calculation comparisons, CRISTAL V0 provides outstanding and reliable accuracy for criticality evaluations for configurations covering the entire fuel cycle (i.e., from enrichment, pellet/assembly fabrication, and transportation to fuel reprocessing). After a brief description of the calculation scheme and the physics algorithms used in this code package, results for the various fissile media encountered in a UO2 fuel fabrication plant will be detailed and discussed. (authors)
A Review on Spectral Amplitude Coding Optical Code Division Multiple Access
NASA Astrophysics Data System (ADS)
Kaur, Navpreet; Goyal, Rakesh; Rani, Monika
2017-06-01
This manuscript deals with analysis of a Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) system. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. The number of users and the type of codes used in the optical system directly determine system performance. MAI can be restricted by efficient design of optical codes and by implementing them with architectures that accommodate a larger number of users. Hence, there is a need for techniques such as spectral direct detection (SDD) with a modified double-weight code, which can provide better cardinality and good correlation properties.
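A minimal sketch of how code cross-correlation produces MAI in a SAC-OCDMA decoder; the chip patterns below are illustrative and not a specific modified double-weight design:

```python
# The decoder correlates the received spectral power with the desired user's
# code; any nonzero cross-correlation with other users' codes leaks through
# as multiple access interference (MAI).
import numpy as np

codes = np.array([
    [1, 1, 0, 1, 0, 0, 0, 0],   # user 1 spectral chip pattern
    [0, 0, 1, 0, 1, 1, 0, 0],   # user 2
    [0, 1, 0, 0, 0, 1, 1, 0],   # user 3
])
bits = np.array([1, 0, 1])      # bit sent by each user
received = bits @ codes         # superposed spectral power on the fiber

for k, c in enumerate(codes):
    decision = received @ c     # correlation with user k's own code
    mai = sum(bits[j] * (codes[j] @ c) for j in range(3) if j != k)
    print(f"user {k+1}: decision statistic = {decision}, MAI term = {mai}")
```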
Postdeployment User Guide: Transition Workbook for Combat Veterans
2009-12-07
including pork, and especially fatty cuts (eg, prime, chuck). 4. CARBOHYDRATES • Eat whole grains, fresh fruits, and vegetables that are high in fiber... periods of time without eating. Have a healthy snack between meals. 4. Increase complex carbohydrates (eg, whole grains) and starches in your diet... back to baseline at some point or having a particularly bad stretch does not mean you have to start again. It just means you may need to add some
NASA Technical Reports Server (NTRS)
Smith, S. D.
1984-01-01
A user's manual for the RAMP2 computer code is provided. The RAMP2 code can be used to model the dominant phenomena which affect the prediction of liquid and solid rocket nozzle and orbital plume flow fields. The general structure and operation of RAMP2 are discussed. A user input/output guide for the modified TRAN72 computer code and the RAMP2F code is given. The application and use of the BLIMPJ module are considered. Sample problems involving the space shuttle main engine and motor are included.
Application of Aeroelastic Solvers Based on Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Srivastava, Rakesh
1998-01-01
A pre-release version of the Navier-Stokes solver (TURBO) was obtained from MSU. Working with Dr. Milind Bakhle of the University of Toledo, subroutines for aeroelastic analysis were developed and added to the TURBO code to produce versions 1 and 2 of the TURBO-AE code. For a specified mode shape, frequency, and inter-blade phase angle, the code calculates the work done by the fluid on the rotor for a prescribed sinusoidal motion. Positive work on the rotor indicates instability of the rotor. Version 1 of the code calculates the work for in-phase blade motions only. In version 2 of the code, the capability for analyzing all possible inter-blade phase angles was added. Version 2 of the TURBO-AE code was validated and delivered to NASA and the industry partners of the AST project. The capabilities and features of the code are summarized in Refs. [1] & [2]. To release version 2 of TURBO-AE, a workshop was organized at NASA Lewis by Dr. Srivastava and Dr. M. A. Bakhle, both of the University of Toledo, in October of 1996 for the industry partners of NASA Lewis. The workshop provided the potential users of TURBO-AE all the relevant information required for preparing the input data, executing the code, interpreting the results, and benchmarking the code on their computer systems. After the code was delivered to the industry partners, user support was also provided. A new version of the Navier-Stokes solver (TURBO) was later released by MSU. This version had significant changes and upgrades over the previous version. This new version was merged with the TURBO-AE code. Also, new boundary conditions for 3-D unsteady non-reflecting boundaries were developed by researchers from UTRC, Ref. [3]. Time was spent on understanding, familiarizing with, executing, and implementing the new boundary conditions in the TURBO-AE code. Work was started on the phase-lagged (time-shifted) boundary condition version (version 4) of the code. This will allow users to calculate non-zero inter-blade phase angles using only one blade passage for analysis.
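A minimal sketch of the work-per-cycle stability test described above: for a prescribed sinusoidal modal motion, integrate the unsteady aerodynamic force times the modal velocity over one period, with positive net work indicating flutter. The force amplitude and phase are illustrative inputs, not TURBO-AE output:

```python
# Work per cycle W = integral of F(t) * dh/dt over one vibration period;
# W > 0 means the fluid feeds energy into the blade motion (flutter).
import numpy as np

omega = 2.0 * np.pi * 100.0          # vibration frequency [rad/s]
t = np.linspace(0.0, 2.0 * np.pi / omega, 2001)
h = 1e-3 * np.sin(omega * t)         # prescribed modal displacement [m]
phi = np.deg2rad(30.0)               # phase of force relative to displacement
f = 50.0 * np.sin(omega * t + phi)   # unsteady aerodynamic modal force [N]

dt = t[1] - t[0]
v = np.gradient(h, t)                # modal velocity dh/dt
work = float(np.sum(f * v) * dt)     # numerical integral over the cycle
print(f"work per cycle = {work:.4e} J -> {'unstable' if work > 0 else 'stable'}")
```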
User's Manual for FEMOM3DS. Version 1.0
NASA Technical Reports Server (NTRS)
Reddy, C.J.; Deshpande, M. D.
1997-01-01
FEMOM3DS is a computer code written in FORTRAN 77 to compute the electromagnetic (EM) scattering characteristics of a three-dimensional object with complex materials using a combined Finite Element Method (FEM)/Method of Moments (MoM) technique. This code uses tetrahedral elements, with vector edge basis functions, for FEM in the volume of the cavity, and triangular elements, with basis functions similar to those described for MoM, at the outer boundary. By virtue of FEM, this code can handle any arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.
Towards a Consolidated Approach for the Assessment of Evaluation Models of Nuclear Power Reactors
Epiney, A.; Canepa, S.; Zerkak, O.; ...
2016-11-02
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role in achieving a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to, e.g., detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of a specific analysis aspect, including, e.g., code version, transient-specific simulation methodology, and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady-state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: first, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case, imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
Data Parallel Line Relaxation (DPLR) Code User Manual: Acadia - Version 4.01.1
NASA Technical Reports Server (NTRS)
Wright, Michael J.; White, Todd; Mangini, Nancy
2009-01-01
The Data-Parallel Line Relaxation (DPLR) code is a computational fluid dynamics (CFD) solver that was developed at NASA Ames Research Center to help mission support teams generate high-value predictive solutions for hypersonic flow field problems. The DPLR Code Package is an MPI-based, parallel, full three-dimensional Navier-Stokes CFD solver with generalized models for finite-rate reaction kinetics, thermal and chemical non-equilibrium, accurate high-temperature transport coefficients, and ionized flow physics incorporated into the code. DPLR also includes a large selection of generalized realistic surface boundary conditions and links to enable loose coupling with external thermal protection system (TPS) material response and shock layer radiation codes.
QuIN: A Web Server for Querying and Visualizing Chromatin Interaction Networks
Thibodeau, Asa; Márquez, Eladio J.; Luo, Oscar; Ruan, Yijun; Shin, Dong-Guk; Stitzel, Michael L.; Ucar, Duygu
2016-01-01
Recent studies of the human genome have indicated that regulatory elements (e.g. promoters and enhancers) at distal genomic locations can interact with each other via chromatin folding and affect gene expression levels. Genomic technologies for mapping interactions between DNA regions, e.g., ChIA-PET and Hi-C, can generate genome-wide maps of interactions between regulatory elements. These interaction datasets are important resources to infer distal gene targets of non-coding regulatory elements and to facilitate prioritization of critical loci for important cellular functions. With the increasing diversity and complexity of genomic information and public ontologies, making sense of these datasets demands integrative and easy-to-use software tools. Moreover, network representation of chromatin interaction maps enables effective data visualization, integration, and mining. Currently, there is no software that can take full advantage of network theory approaches for the analysis of chromatin interaction datasets. To fill this gap, we developed a web-based application, QuIN, which enables: 1) building and visualizing chromatin interaction networks, 2) annotating networks with user-provided private and publicly available functional genomics and interaction datasets, 3) querying network components based on gene name or chromosome location, and 4) utilizing network-based measures to identify and prioritize critical regulatory targets and their direct and indirect interactions. Availability: QuIN's web server is available at http://quin.jax.org. QuIN is developed in Java and JavaScript, utilizing an Apache Tomcat web server and a MySQL database; the source code is available under the GPLv3 license on GitHub: https://github.com/UcarLab/QuIN/. PMID:27336171
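A minimal sketch of the network view QuIN builds from interaction maps, using the networkx package: nodes are genomic anchor regions, edges are observed interactions, and simple network measures prioritize connected loci. The interaction pairs are fabricated, and this is not QuIN's own API:

```python
# Build a chromatin interaction network and query a locus's direct and
# indirect partners; degree ranking stands in for target prioritization.
import networkx as nx

interactions = [
    ("chr1:1000-2000", "chr1:50000-51000"),   # e.g. enhancer <-> promoter
    ("chr1:50000-51000", "chr1:90000-91000"),
    ("chr2:3000-4000", "chr1:50000-51000"),
]
g = nx.Graph()
g.add_edges_from(interactions)

locus = "chr1:50000-51000"
print("direct partners:", list(g.neighbors(locus)))
print("within 2 hops:", sorted(nx.single_source_shortest_path_length(g, locus, cutoff=2)))
print("degree ranking:", sorted(g.degree, key=lambda kv: -kv[1]))
```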
NASA Technical Reports Server (NTRS)
Johnston, William E.; Gannon, Dennis; Nitzberg, Bill
2000-01-01
We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., realtime interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment (instead of just with instrument control); (5) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert who uses this data to index into detailed design databases, and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g. the rotocraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization, for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g. 
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provides a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.
The UEA sRNA Workbench (version 4.4): a comprehensive suite of tools for analyzing miRNAs and sRNAs.
Stocks, Matthew B; Mohorianu, Irina; Beckers, Matthew; Paicu, Claudia; Moxon, Simon; Thody, Joshua; Dalmay, Tamas; Moulton, Vincent
2018-05-02
RNA interference, a highly conserved regulatory mechanism, is mediated via small RNAs. Recent technical advances have enabled the analysis of larger, complex datasets and the investigation of microRNAs and the less well known small interfering RNAs. However, the size and intricacy of current data require a comprehensive set of tools able to discriminate the patterns from the low-level, noise-like variation; numerous and varied suggestions from the community represent an invaluable source of ideas for future tools, so the ability of the community to contribute to this software is essential. We present a new version of the UEA sRNA Workbench, reconfigured to allow easy insertion of new tools and workflows. In its released form, it comprises a suite of tools in a user-friendly environment, with enhanced capabilities for comprehensive processing of sRNA-seq data, e.g. tools for accurate prediction of sRNA loci (CoLIde) and miRNA loci (miRCat2), as well as workflows that guide users through common steps, such as quality checking of the input data, normalization of abundances, and detection of differential expression, which represent the first steps in sRNA-seq analyses. The UEA sRNA Workbench is available at: http://srna-workbench.cmp.uea.ac.uk The source code is available at: https://github.com/sRNAworkbenchuea/UEA_sRNA_Workbench. v.moulton@uea.ac.uk.
2010-03-01
proposed scheme for power and code allocation for the secondary user is outlined in Fig. 2. V. SIMULATION STUDIES: We consider a primary DS-CDMA system... COGNITIVE CDMA CHANNELIZATION... Cognitive CDMA Channelization, Kanke
Computer Aided Self-Forging Fragment Design,
1978-06-01
This value is reached so quickly that HEMP solutions using work hardening and those using only elastic-perfectly plastic formulations are quite... Elastic-Plastic Flow, UCRL-7322, Lawrence Radiation Laboratory, Livermore, California (1969). 4. Giroux, E. D., HEMP Users Manual, UCRL-51079... Laboratory, the HEMP computer code has been developed to serve as an effective design tool to simplify this task considerably. Using this code, warheads
Comparing apples and oranges: the Community Intercomparison Suite
NASA Astrophysics Data System (ADS)
Schutgens, Nick; Stier, Philip; Kershaw, Philip; Pascoe, Stephen
2015-04-01
Visual representation and comparison of geoscientific datasets present a huge challenge due to the large variety of file formats and the spatio-temporal sampling of data (be they observations or simulations). The Community Intercomparison Suite attempts to greatly simplify these tasks for users by offering an intelligent but simple command line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller, more manageable datasets. Our philosophy is to remove as much as possible the need for specialist knowledge by the user of the structure of a dataset. The colocation of observations with model data is as simple as: "cis col
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lichtner, Peter C.; Hammond, Glenn E.; Lu, Chuan
PFLOTRAN solves a system of generally nonlinear partial differential equations describing multi-phase, multicomponent and multiscale reactive flow and transport in porous materials. The code is designed to run on massively parallel computing architectures as well as workstations and laptops (e.g. Hammond et al., 2011). Parallelization is achieved through domain decomposition using the PETSc (Portable Extensible Toolkit for Scientific Computation) libraries for the parallelization framework (Balay et al., 1997). PFLOTRAN has been developed from the ground up for parallel scalability and has been run on up to 2^18 processor cores with problem sizes up to 2 billion degrees of freedom. Written in object-oriented Fortran 90, the code requires the latest compilers compatible with Fortran 2003. At the time of this writing this requires gcc 4.7.x, Intel 12.1.x, and PGI compilers. As a requirement of running problems with a large number of degrees of freedom, PFLOTRAN allows reading input data that is too large to fit into the memory allotted to a single processor core. The current limitation on the problem size PFLOTRAN can handle is the limitation of the HDF5 file format used for parallel IO to 32-bit integers. Noting that 2^32 = 4,294,967,296, this gives an estimate of the maximum problem size that can currently be run with PFLOTRAN. Hopefully this limitation will be remedied in the near future.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1993-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty-printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update '92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update '93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as a forward engineering mode, with a level of flexibility suitable for practical application. An overview of the GRASP/Ada project with an emphasis on the current update is provided.
Assessing and managing breast cancer risk: clinicians' current practice and future needs.
Collins, Ian M; Steel, Emma; Mann, G Bruce; Emery, Jon D; Bickerstaffe, Adrian; Trainer, Alison; Butow, Phyllis; Pirotta, Marie; Antoniou, Antonis C; Cuzick, Jack; Hopper, John; Phillips, Kelly-Anne; Keogh, Louise A
2014-10-01
Decision support tools for the assessment and management of breast cancer risk may improve uptake of prevention strategies. End-user input in the design of such tools is critical to increase clinical use. Before developing such a computerized tool, we examined clinicians' practice and future needs. Twelve breast surgeons, 12 primary care physicians, and 5 practice nurses participated in 4 focus groups. These were recorded, coded, and analyzed to identify key themes. Participants identified difficulties assessing risk, including a lack of available tools to standardize practice. Most expressed confidence identifying women at potentially high risk, but not moderate risk. Participants felt a tool could especially reassure young women at average risk. Desirable features included: evidence-based, accessible (e.g. web-based), and displaying absolute (not relative) risks in multiple formats. The potential to create anxiety was a concern. Development of future tools should address these issues to optimize translation of knowledge into clinical practice.
NASA Technical Reports Server (NTRS)
Hartle, M.; McKnight, R. L.
2000-01-01
This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the user of CSTEM (the Coupled Structural Thermal Electromagnetic computer code) needs a basic understanding of what the code is actually doing in order to use it properly. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, which routine regularly accesses the file, and which routine opens the file, as well as special features included in CSTEM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, Scott Carlton; Roberts, Jesse D.
2014-03-01
This document describes the marine hydrokinetic (MHK) input file and subroutines for the Sandia National Laboratories Environmental Fluid Dynamics Code (SNL-EFDC), which is a combined hydrodynamic, sediment transport, and water quality model based on the Environmental Fluid Dynamics Code (EFDC) developed by John Hamrick [1], formerly sponsored by the U.S. Environmental Protection Agency, and now maintained by Tetra Tech, Inc. SNL-EFDC has been previously enhanced with the incorporation of the SEDZLJ sediment dynamics model developed by Ziegler, Lick, and Jones [2-4]. SNL-EFDC has also been upgraded to more accurately simulate algae growth, with specific application to optimizing biomass in an open-channel raceway for biofuels production [5]. A detailed description of the input file containing data describing the MHK device/array is provided, along with a description of the MHK FORTRAN routine. Both a theoretical description of the MHK dynamics as incorporated into SNL-EFDC and an explanation of the source code are provided. This user manual is meant to be used in conjunction with the original EFDC [6] and sediment dynamics SNL-EFDC manuals [7]. Through this document, the authors provide information for users who wish to model the effects of an MHK device (or array of devices) on a flow system with EFDC and who also seek a clear understanding of the source code, which is available from staff in the Water Power Technologies Department at Sandia National Laboratories, Albuquerque, New Mexico.
NASA Astrophysics Data System (ADS)
Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald
2017-09-01
In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for dealing with a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach efficiency similar to that of other mature engineering sciences, such as finite element analyses (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago, thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.
Special issue on network coding
NASA Astrophysics Data System (ADS)
Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly
2017-12-01
Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as multi-view video streaming or underwater sensor networks. One can find papers that show how NC makes data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve the overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.
Squish: Near-Optimal Compression for Archival of Relational Datasets
Gao, Yihan; Parameswaran, Aditya
2017-01-01
Relational datasets are being generated at an alarmingly rapid rate across organizations and industries. Compressing these datasets could significantly reduce storage and archival costs. Traditional compression algorithms, e.g., gzip, are suboptimal for compressing relational datasets since they ignore the table structure and relationships between attributes. We study compression algorithms that leverage the relational structure to compress datasets to a much greater extent. We develop Squish, a system that uses a combination of Bayesian networks and arithmetic coding to capture multiple kinds of dependencies among attributes and achieve a near-entropy compression rate. Squish also supports user-defined attributes: users can instantiate new data types by simply implementing five functions for a new class interface. We prove the asymptotic optimality of our compression algorithm and conduct experiments to show the effectiveness of our system: Squish achieves a reduction of over 50% in storage size on a variety of real datasets, relative to systems developed in prior work. PMID:28180028
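A minimal floating-point sketch of the arithmetic coding stage Squish builds on, with a fixed symbol model for illustration; Squish's actual coder is integer-based and driven by Bayesian-network models:

```python
# Arithmetic coding narrows an interval of [0, 1) according to each symbol's
# probability; more probable symbols shrink the interval less, so they cost
# fewer bits. Fixed probabilities here stand in for a learned model.
probs = {"a": 0.5, "b": 0.3, "c": 0.2}
cum, edge = {}, 0.0
for s, p in probs.items():
    cum[s] = (edge, edge + p)
    edge += p

def encode(msg):
    lo, hi = 0.0, 1.0
    for s in msg:
        span = hi - lo
        c_lo, c_hi = cum[s]
        lo, hi = lo + span * c_lo, lo + span * c_hi
    return (lo + hi) / 2            # any number inside the final interval

def decode(x, n):
    out = []
    for _ in range(n):
        for s, (c_lo, c_hi) in cum.items():
            if c_lo <= x < c_hi:
                out.append(s)
                x = (x - c_lo) / (c_hi - c_lo)   # rescale and continue
                break
    return "".join(out)

msg = "abacab"
assert decode(encode(msg), len(msg)) == msg
```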
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan, Lehua; Oldenburg, Curtis M.
TOGA is a numerical reservoir simulator for modeling non-isothermal flow and transport of water, CO2, multicomponent oil, and related gas components for applications including CO2-enhanced oil recovery (CO2-EOR) and geologic carbon sequestration in depleted oil and gas reservoirs. TOGA uses an approach based on the Peng-Robinson equation of state (PR-EOS) to calculate the thermophysical properties of the gas and oil phases, including the gas/oil components dissolved in the aqueous phase, and uses a mixing model to estimate the thermophysical properties of the aqueous phase. The phase behavior (e.g., occurrence and disappearance of the three phases: gas + oil + aqueous) and the partitioning of non-aqueous components (e.g., CO2, CH4, and n-oil components) between coexisting phases are modeled using K-values derived from equal-fugacity assumptions that have been demonstrated to be very accurate, as shown by comparison to measured data. Models for saturated (water) vapor pressure and water solubility (in the oil phase) are used to calculate the partitioning of the water (H2O) component between the gas and oil phases. All components (e.g., CO2, H2O, and n hydrocarbon components) are allowed to be present in all phases (aqueous, gaseous, and oil). TOGA uses a multiphase version of Darcy's Law to model flow and transport through porous media of mixtures with up to three phases over a range of pressures and temperatures appropriate to hydrocarbon recovery and geologic carbon sequestration systems. Transport of the gaseous and dissolved components is by advection and Fickian molecular diffusion. The new methods for phase partitioning and thermophysical property modeling in TOGA have been validated against experimental data published in the literature for describing phase partitioning and phase behavior. Flow and transport have been verified by testing against related TOUGH2 EOS modules and CMG. The code has also been validated against a CO2-EOR experimental core flood involving flow of three phases and 12 components. Results of simulations of a hypothetical 3D CO2-EOR problem involving three phases and multiple components are presented to demonstrate the field-scale capabilities of the new code. This user guide provides instructions for use and sample problems for verification and demonstration.
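A minimal sketch of the Peng-Robinson evaluation underlying TOGA's property calculations: solving the PR cubic for the compressibility factor of a pure component (CO2 here, with standard critical constants); mixtures add mixing rules on top of this:

```python
# Solve the Peng-Robinson cubic Z^3 - (1-B)Z^2 + (A-3B^2-2B)Z - (AB-B^2-B^3) = 0
# for a pure component; the largest/smallest real roots are the vapor-like and
# liquid-like compressibility factors.
import numpy as np

R = 8.314462618                              # J/(mol K)
Tc, Pc, omega = 304.13, 7.3773e6, 0.2239     # CO2 critical constants
T, P = 320.0, 10.0e6                         # state of interest [K, Pa]

kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
a = 0.45724 * R**2 * Tc**2 / Pc * alpha
b = 0.07780 * R * Tc / Pc
A = a * P / (R * T)**2
B = b * P / (R * T)

roots = np.roots([1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B,
                  -(A * B - B**2 - B**3)])
real = roots[np.abs(roots.imag) < 1e-9].real
print("real Z roots:", np.sort(real))        # one root in supercritical states
```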
Emerge - A Python environment for the modeling of subsurface transfers
NASA Astrophysics Data System (ADS)
Lopez, S.; Smai, F.; Sochala, P.
2014-12-01
The simulation of subsurface mass and energy transfers often relies on specific codes that were mainly developed in compiled languages, which usually ensure computational efficiency at the expense of relatively long development times and relatively rigid software. Even if a very detailed, possibly graphical, user interface is developed, the core numerical aspects are rarely accessible, and the smallest modification will always need a compilation step. Thus, user-defined physical laws or alternative numerical schemes may be relatively difficult to use. Over the last decade, Python has emerged as a popular and widely used language in the scientific community. There already exist several libraries for the pre- and post-treatment of input and output files for reservoir simulators (e.g. pytough). Development times in Python are considerably reduced compared to compiled languages, and programs can be easily interfaced with libraries written in compiled languages, with several comprehensive numerical libraries providing sequential and parallel solvers (e.g. PETSc, Trilinos…). The core objective of the Emerge project is to explore the possibility of developing a modeling environment in full Python. Consequently, we are developing an open Python package with the classes/objects necessary to express, discretize and solve the physical problems encountered in the modeling of subsurface transfers. We rely heavily on Python to have a convenient and concise way of manipulating potentially complex concepts with a few lines of code and a high level of abstraction. Our result aims to be a friendly numerical environment targeting both numerical engineers and physicists or geoscientists, with the possibility to quickly specify and handle geometries, arbitrary meshes, spatially or temporally varying properties, PDE formulations, boundary conditions…
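As an illustration of the conciseness the project aims for, a minimal 1D transient diffusion solve in generic NumPy/SciPy (not the Emerge package's own API); parameters are arbitrary:

```python
# Implicit Euler solution of 1D diffusion (e.g. a pressure front) on a
# uniform grid, expressed in a few lines with the scientific Python stack.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n, dx, dt, D = 100, 1.0, 0.5, 1.0
r = D * dt / dx**2
u = np.zeros(n)
u[0] = 1.0                                   # fixed-value boundary on the left

main = (1 + 2 * r) * np.ones(n)
A = sp.diags([-r * np.ones(n - 1), main, -r * np.ones(n - 1)], [-1, 0, 1]).tolil()
A[0, :] = 0.0; A[0, 0] = 1.0                 # Dirichlet rows
A[n - 1, :] = 0.0; A[n - 1, n - 1] = 1.0
A = A.tocsc()

for _ in range(200):                         # implicit time stepping
    rhs = u.copy()
    rhs[0], rhs[n - 1] = 1.0, 0.0
    u = spla.spsolve(A, rhs)
print(u[:5])                                 # diffused profile near the boundary
```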
User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.
1982-01-01
This User's Manual contains a complete description of the computer code known as the AXISYMMETRIC DIFFUSER DUCT (ADD) code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculations with experimental flows. The input/output and general use of the code are described in the first volume. The second volume contains a detailed description of the code, including the global structure of the code, a list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code, which generates coordinate systems for arbitrary axisymmetric ducts.
User's manual for CBS3DS, version 1.0
NASA Astrophysics Data System (ADS)
Reddy, C. J.; Deshpande, M. D.
1995-10-01
CBS3DS is a computer code written in FORTRAN 77 to compute the backscattering radar cross section of cavity-backed apertures in an infinite ground plane and slots in a thick infinite ground plane. CBS3DS implements the hybrid Finite Element Method (FEM) and Method of Moments (MoM) techniques. This code uses tetrahedral elements, with vector edge basis functions, for FEM in the volume of the cavity/slot, and triangular elements, with the corresponding basis functions, for MoM at the apertures. By virtue of FEM, this code can handle any arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials; due to MoM, the apertures can be of any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computer on which the code is intended to run.
NASA Technical Reports Server (NTRS)
Chambers, Lin Hartung
1994-01-01
The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the equilibrium limit. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented, and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.
Coding and decoding for code division multiple user communication systems
NASA Technical Reports Server (NTRS)
Healy, T. J.
1985-01-01
A new algorithm is introduced which decodes code division multiple user communication signals. The algorithm makes use of the distinctive form or pattern of each signal to separate it from the composite signal created by the multiple users. Although the algorithm is presented in terms of frequency-hopped signals, the actual transmitter modulator can use any of the existing digital modulation techniques. The algorithm is applicable to error-free codes or to codes where controlled interference is permitted. It can be used when block synchronization is assumed, and in some cases when it is not. The paper also discusses briefly some of the codes which can be used in connection with the algorithm, and relates the algorithm to past studies which use other approaches to the same problem.
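The separation idea lends itself to a compact numerical illustration. The sketch below is a generic matched-filter demonstration under assumed random signature patterns, not Healy's algorithm itself: each user's bits are recovered from the composite signal by correlating against that user's distinctive pattern.

import numpy as np

rng = np.random.default_rng(0)
n_users, chips, n_bits = 3, 32, 8
signatures = rng.choice([-1.0, 1.0], size=(n_users, chips))   # per-user patterns (assumed)
bits = rng.choice([-1.0, 1.0], size=(n_users, n_bits))        # data to send

# Composite channel: every user transmits simultaneously (plus a little noise).
composite = np.einsum("uc,ub->bc", signatures, bits)          # shape (n_bits, chips)
composite += 0.1 * rng.standard_normal(composite.shape)

# Receiver: correlate with each signature and take the sign (matched filter).
decoded = np.sign(composite @ signatures.T).T                 # shape (n_users, n_bits)
print("bit errors:", int(np.sum(decoded != bits)))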
Some User's Insights Into ADIFOR 2.0D
NASA Technical Reports Server (NTRS)
Giesy, Daniel P.
2002-01-01
Insights are presented which were gained by one user through experience with the ADIFOR 2.0D software for automatic differentiation of Fortran code. These insights concern mainly the user interface to the generated derivative code, particularly the actual form of the interface and the use of derivative objects, including "seed" matrices. Some remarks are given on how to iterate the application of ADIFOR in order to generate second derivative code.
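For readers unfamiliar with seed matrices, the following sketch illustrates the concept generically (a finite-difference directional derivative stands in for the derivative code an AD tool would generate; nothing here is ADIFOR's actual interface): seeding with identity columns recovers the full Jacobian, while a single seed vector yields one directional derivative at a fraction of the cost.

import numpy as np

def f(x):
    # Example function from R^3 to R^2, built from smooth primitives.
    return np.array([x[0] * x[1] + np.sin(x[2]), x[1] ** 2 * x[2]])

def jvp(f, x, v, eps=1e-7):
    # Directional derivative J(x) @ v, approximated by finite differences here.
    return (f(x + eps * v) - f(x)) / eps

x = np.array([1.0, 2.0, 3.0])
seed = np.eye(3)                          # identity seed matrix -> full Jacobian
J = np.column_stack([jvp(f, x, seed[:, k]) for k in range(3)])
print(J)                                  # 2x3 Jacobian of f at x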
Implementation of the Regulatory Authority Information System in Egypt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, S.D.; Schetnan, R.; Hasan, A.
2006-07-01
As part of the implementation of a bar-code-based system to track radioactive sealed sources (RSS) in Egypt, the Regulatory Authority Information System Personal Digital Assistant (RAIS PDA) Application was developed to extend the functionality of the International Atomic Energy Agency's (IAEA's) RAIS database by allowing users to download RSS data from the database to a portable PDA equipped with a bar-code scanner. [1, 4] The system allows users in the field to verify radioactive sealed source data, gather radioactive sealed source audit information, and upload that data to the RAIS database. This paper describes the development of the RAIS PDA Application, its features, and how it will be implemented in Egypt.
NASA Technical Reports Server (NTRS)
Book, W. J.
1974-01-01
The Flexible Manipulator Analysis Program (FMAP) is a collection of FORTRAN code that allows easy analysis of the flexible dynamics of mechanical arms. The user specifies the arm configuration and parameters and any or all of several frequency domain analyses to be performed, while the time domain impulse response is obtained by inverse Fourier transformation of the frequency response. A detailed explanation of how to use FMAP is provided.
14 CFR 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... to NASA Headquarters, Code OX, Space Network Division, Washington, DC 20546. Upon review and... submitted in writing to both NASA Headquarters, Code OX, Space Network Division, and GSFC, Code 501.... Request for services within priority groups shall be negotiated with non-NASA users on a first come, first...
31 CFR 29.106 - Representative payees.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the same circumstances as each plan permits for non-Federal Benefit Payments under the plan. (See e.g., section 4-629(b) of the D.C. Code (1997) (applicable to the Police and Firefighters Plan).) ...
31 CFR 29.106 - Representative payees.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the same circumstances as each plan permits for non-Federal Benefit Payments under the plan. (See e.g., section 4-629(b) of the D.C. Code (1997) (applicable to the Police and Firefighters Plan).) ...
31 CFR 29.106 - Representative payees.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the same circumstances as each plan permits for non-Federal Benefit Payments under the plan. (See e.g., section 4-629(b) of the D.C. Code (1997) (applicable to the Police and Firefighters Plan).) ...
31 CFR 29.106 - Representative payees.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the same circumstances as each plan permits for non-Federal Benefit Payments under the plan. (See e.g., section 4-629(b) of the D.C. Code (1997) (applicable to the Police and Firefighters Plan).) ...
31 CFR 29.106 - Representative payees.
Code of Federal Regulations, 2012 CFR
2012-07-01
... the same circumstances as each plan permits for non-Federal Benefit Payments under the plan. (See e.g., section 4-629(b) of the D.C. Code (1997) (applicable to the Police and Firefighters Plan).) ...
Make safety awareness a priority: Use a login software in your research facility
Camino, Fernando E.
2017-01-21
We report on a facility login software whose objective is to improve safety in multi-user research facilities. Its most important safety features are: 1) it blocks users from entering the lab after being absent for more than a predetermined number of days; 2) it gives users a random safety quiz question, which they need to answer satisfactorily in order to use the facility; 3) it blocks unauthorized users from using the facility after hours; and 4) it displays the current users in the facility. Besides restricting access by unauthorized users, the software keeps users mindful of key safety concepts. In addition, integration of the software with a door controller system can convert it into an effective physical safety mechanism. Depending on DOE approval, the code may be available as open source.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamberland, Marc; Taylor, Randle E.P.; Rogers, Da
2016-08-15
Purpose: egs-brachy is a fast, new EGSnrc user-code for brachytherapy applications. This study characterizes egs-brachy features that enhance simulation efficiency. Methods: Calculations are performed to characterize efficiency gains from various features. Simulations include radionuclide and miniature x-ray tube sources in water phantoms and idealized prostate, breast, and eye plaque treatments. Features characterized include voxel indexing of sources to reduce boundary checks during radiation transport, scoring collision kerma via a tracklength estimator, recycling photons emitted from sources, and using phase space data to initiate simulations. Bremsstrahlung cross section enhancement (BCSE), uniform bremsstrahlung splitting (UBS), and Russian Roulette (RR) are considered for electronic brachytherapy. Results: Efficiency is enhanced by a factor of up to 300 using tracklength versus interaction scoring of collision kerma, and by up to 2.7 and 2.6 using phase space sources and particle recycling, respectively, compared to simulations in which particles are initiated within sources. On a single 2.5 GHz Intel Xeon E5-2680 processor core, simulations approximating prostate and breast permanent implant ((2 mm)^3 voxels) and eye plaque ((1 mm)^3) treatments take as little as 9 s (prostate, eye) and up to 31 s (breast) to achieve 2% statistical uncertainty on doses within the PTV. For electronic brachytherapy, BCSE, UBS, and RR enhance efficiency by a factor of >2000, compared to a factor of >10^4 using a phase space source. Conclusion: egs-brachy features provide substantial efficiency gains, resulting in calculation times sufficiently fast for full Monte Carlo simulations for routine brachytherapy treatment planning.
NASA Astrophysics Data System (ADS)
Khakpour, Mohammad; Paulik, Christoph; Hahn, Sebastian
2016-04-01
Communication about remote sensing data quality between data providers and users, as well as among users, is often difficult. Users have a hard time figuring out whether a product has known problems over their region of interest, and data providers have to spend a lot of effort to make this information available, if it exists at all. Scientific publications are one tool for communicating with the user base, but they are static and mostly one-way. For a data provider it is also often difficult to make feedback received from users available to the complete user base. The Geo Issue Tracking System (GeoITS) is an open source web application which has been developed to mitigate these problems. GeoITS combines a mapping interface (Google Maps) with a simple wiki platform. It allows users to give region-specific feedback on a remote sensing product by drawing a polygon on the map and describing the problems they had using the remote sensing product in this area. These geolocated wiki entries are then viewable by other users as well as by the data providers, who can modify and extend the entries. In this way the conversations between the users and the data provider are no longer hidden in, e.g., emails, but are open to all users of the dataset. This new kind of communication platform can enable better cooperation between users and data providers. It will also provide data providers with the ability to track problems their dataset might have in certain areas and resolve them with new product releases. The source code is available via http://github.com/TUW-GEO/geoits_dev and a running instance can be tried at https://geoits.herokuapp.com/
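A geolocated feedback entry of the kind described can be represented compactly in standard GeoJSON; the sketch below uses hypothetical field names and values, since the abstract does not specify GeoITS's actual schema:

import json

# Hypothetical feedback entry: a drawn polygon plus a free-text problem description.
entry = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[16.3, 48.2], [16.5, 48.2], [16.5, 48.3],
                         [16.3, 48.3], [16.3, 48.2]]],   # closed ring, lon/lat
    },
    "properties": {
        "product": "ASCAT soil moisture",   # hypothetical product name
        "comment": "Retrievals degraded here in winter due to frozen soil.",
        "author": "user42",
    },
}
print(json.dumps(entry, indent=2))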
NASA Technical Reports Server (NTRS)
Lahti, G. P.
1972-01-01
A two- or three-constraint, two-dimensional radiation shield weight optimization procedure and a computer program, DOPEX, are described. The DOPEX code uses the steepest descent method to alter a set of initial (input) thicknesses for a shield configuration to achieve a minimum weight while simultaneously satisfying dose constraints. The code assumes an exponential dose-shield thickness relation with parameters specified by the user. The code also assumes that dose rates in each principal direction depend only on thicknesses in that direction. Code input instructions, a FORTRAN 4 listing, and a sample problem are given. Typical computer time required to optimize a seven-layer shield is about 0.1 minute on an IBM 7094-2.
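A minimal sketch of this optimization scheme, with assumed densities, attenuation coefficients and dose limit (the abstract gives none), might look as follows: the shield weight is reduced by steepest descent while a penalty term enforces the exponential dose constraint.

import numpy as np

rho = np.array([7.8, 1.0, 11.3])      # layer densities (assumed values)
mu = np.array([0.5, 0.1, 1.2])        # dose attenuation coefficients (assumed)
D0, D_limit = 1e4, 1.0                # unshielded dose and dose constraint (assumed)

def dose(t):
    return D0 * np.exp(-np.dot(mu, t))        # exponential dose-thickness relation

def objective(t, penalty=1e3):
    w = np.dot(rho, t)                        # shield weight (per unit area)
    violation = max(0.0, dose(t) - D_limit)   # penalize exceeding the dose limit
    return w + penalty * violation

def grad(t, h=1e-6):
    g = np.zeros_like(t)
    for i in range(len(t)):
        e = np.zeros_like(t); e[i] = h
        g[i] = (objective(t + e) - objective(t - e)) / (2 * h)
    return g

t = np.full(3, 5.0)                           # initial (input) thicknesses
for _ in range(5000):
    t = np.maximum(t - 1e-3 * grad(t), 0.0)   # steepest descent step, keep t >= 0
print("thicknesses:", t.round(3), "dose:", round(dose(t), 4))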
Ainsworth, Neha Prasad; Vargo, Elisabeth Julie; Petróczi, Andrea
2018-02-01
2,4-Dinitrophenol (2,4-DNP) is a compound with multiple industrial purposes. Currently unlicensed for human consumption, it is used by the gym-going population for drastic, short-term body fat loss. Nonetheless, its physiological mechanisms can lead to potentially fatal hyperthermia. Reported fatal incidents have caused concern and highlighted the need for intervention. Understanding the decision-making leading to 2,4-DNP use, alongside the perceived outgroup attitudes, is vital to forming effective harm minimisation policies targeting current and potential users. First-hand accounts from this elusive population are scarce. Fourteen novice and experienced users (13 male, 1 female) were recruited via "snowballing" techniques. Semi-structured interviews were conducted, comprising 28 questions. Thematic content analysis was conducted using 37 codes. Four characteristic themes emerged: 1. Users considered the Internet to be a crucial multifunctional resource directly impacting their 2,4-DNP use. 2. Users "respected" 2,4-DNP, proactively taking harm reduction measures. 3. Attitudinal polarisation towards 2,4-DNP within the gym-going community was consistent in all accounts. 4. Users perceived outgroup populations to have inherently negative attitudes towards their use. These themes fell under the all-encompassing theme of "being in control". For the first time, this study offers rich detail of attitudes toward 2,4-DNP use by giving a collective voice to users. The element of control over every aspect of the user's life appears to be a significant contributor to the successful risk-management of 2,4-DNP use. In the absence of an established safe upper limit and effective regulatory control, education is critical to harm minimisation. Copyright © 2017 Elsevier B.V. All rights reserved.
MAC/GMC 4.0 User's Manual: Keywords Manual. Volume 2
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2002-01-01
This document is the second volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, this document is the Keywords Manual, and Volume 3 is the Example Problem Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, applications of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume describes the basic information required to use the MAC/GMC 4.0 software, including a 'Getting Started' section, and an in-depth description of each of the 22 keywords used in the input file to control the execution of the code.
Database-driven web interface automating gyrokinetic simulations for validation
NASA Astrophysics Data System (ADS)
Ernst, D. R.
2010-11-01
We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSplus over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality Java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development utilized support from the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].
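The pattern of filling an uploaded input-file template from stored session data is easy to sketch; the fragment below uses a hypothetical namelist template and parameter names, not the actual GS2 or GYRO input formats:

import sqlite3
from string import Template

# Hypothetical namelist template; real GS2/GYRO inputs are far richer.
template = Template("&parameters\n  tprim = $tprim\n  fprim = $fprim\n  shot = $shot\n/\n")

def create_input(shot, tprim, fprim, db="sessions.db"):
    text = template.substitute(shot=shot, tprim=tprim, fprim=fprim)
    # Record the session so the same analysis can be restored for a new discharge.
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS sessions (shot INT, tprim REAL, fprim REAL)")
    con.execute("INSERT INTO sessions VALUES (?, ?, ?)", (shot, tprim, fprim))
    con.commit(); con.close()
    return text

print(create_input(shot=123456, tprim=2.5, fprim=0.8))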
Airborne antenna radiation pattern code user's manual
NASA Technical Reports Server (NTRS)
Burnside, Walter D.; Kim, Jacob J.; Grandchamp, Brett; Rojas, Roberto G.; Law, Philip
1985-01-01
The use of a newly developed computer code to analyze the radiation patterns of antennas mounted on an ellipsoid and in the presence of a set of finite flat plates is described. It is shown how the code allows the user to simulate a wide variety of complex electromagnetic radiation problems using the ellipsoid/plates model. The code is capable of calculating radiation patterns around an arbitrary conical cut specified by the user. The organization of the code, definitions of input and output data, and numerous practical examples are also presented. The analysis is based on the Uniform Geometrical Theory of Diffraction (UTD), and most of the computed patterns are compared with experimental results to show the accuracy of this solution.
Computer code for controller partitioning with IFPC application: A user's manual
NASA Technical Reports Server (NTRS)
Schmidt, Phillip H.; Yarkhan, Asim
1994-01-01
A user's manual for the computer code for partitioning a centralized controller into decentralized subcontrollers, with applicability to Integrated Flight/Propulsion Control (IFPC), is presented. Partitioning of a centralized controller into two subcontrollers is described, and the algorithm on which the code is based is discussed. The algorithm uses parameter optimization of a cost function, which is also described. The major data structures and functions are described, and specific usage instructions are given. The user is led through an example of an IFPC application.
Domestic Ice Breaking (DOMICE) Simulation Model User Guide
2013-02-01
Second, add new ice data to the variable “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (D9_historical_ice_d3), which contains the...within that “NBL” scheme. The interpretation of the SIGRID ice codes into ice thickness estimates is also contained within the sub-module “District 9...User Guide) “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (see Section 5.1.1.3.2 of this User Guide) “Historical District 1 Weekly Air
1982-11-01
Service code exceeded operational code in the ratio of 10 : 1. No redundant information was required. It was modular. Internal parts of the program...to NASA's analyses. We were to try to find an existing finite element program of a quality that would be worth recommending to all NASA Centers. We...Distinct manuals were published for users, programmers, theory, and demonstration problems. It abounded with service code to provide user conveniences
NASA Astrophysics Data System (ADS)
Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.
2012-12-01
The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large-scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data are supplied in open standard formats using international standards like GeoSciML. A VGL user employs a web mapping interface to discover and filter the data sources, using spatial and attribute filters to define a subset. Once the data are selected, the user is not required to download them. VGL collates the service query information for later in the processing workflow, where it is staged directly to the computing facilities. The combination of deferring data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger-scale inversions and more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally, the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration, as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with access to an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both the research communities and commercial suppliers (e.g. Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also be used for, for example, natural hazards, satellite processing, soil geochemistry, climate modeling, or agricultural crop modeling.
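The data-access step can be illustrated with a plain OGC WFS GetFeature request; the endpoint and feature type below are placeholders, since the abstract does not list the actual AuScope service URLs. The request and bounding-box parameters themselves are part of the OGC WFS standard.

import requests

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "gsml:Borehole",          # hypothetical GeoSciML feature type
    "bbox": "140.0,-35.0,145.0,-30.0",    # spatial filter (lon/lat extent)
    "maxFeatures": "10",
}
# Placeholder endpoint, not a real AuScope service URL.
resp = requests.get("https://example.org/geoserver/wfs", params=params, timeout=30)
print(resp.status_code, len(resp.content))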
Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L.
2017-05-01
MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of new models added, changes to existing models, and the effect of code changes during this code development cycle (rev 6342 to rev 9496), together with a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as in the User Guide and Reference Manuals.
Molmil: a molecular viewer for the PDB and beyond.
Bekker, Gert-Jan; Nakamura, Haruki; Kinjo, Akira R
2016-01-01
We have developed a new platform-independent web-based molecular viewer using JavaScript and WebGL. The molecular viewer, Molmil, has been integrated into several services offered by Protein Data Bank Japan and can be easily extended with new functionality by third party developers. Furthermore, the viewer can be used to load files in various formats from the user's local hard drive without uploading the data to a server. Molmil is available for all platforms supporting WebGL (e.g. Windows, Linux, iOS, Android) from http://gjbekker.github.io/molmil/. The source code is available at http://github.com/gjbekker/molmil under the LGPLv3 licence.
Interfacing WIPL-D with Mechanical CAD Software
NASA Technical Reports Server (NTRS)
Bliznyuk, Nataliya; Janic, Bojan
2007-01-01
of almost any popular CAD format, e.g. IGES, Parasolid, DXF, ACIS, etc. The solid models are processed (simplified) and meshed in GiD(R), and then converted into a WIPL-D Pro input file by simple Fortran or Matlab code. This algorithm allows the user to control the mesh of imported geometry and to assign electric properties to metallic and dielectric surfaces. Implementation of the algorithm is demonstrated by examples obtained from the NASA Discovery mission, Phoenix Lander 2008. Results for the radiation pattern of the Phoenix Lander UHF relay antenna, including the effect of the Martian surface, both simulated in WIPL-D Pro and measured, are shown for comparison.
Incerti, S; Kyriakou, I; Bernal, M A; Bordage, M C; Francis, Z; Guatelli, S; Ivanchenko, V; Karamitros, M; Lampe, N; Lee, S B; Meylan, S; Min, C H; Shin, W G; Nieminen, P; Sakata, D; Tang, N; Villagrasa, C; Tran, H; Brown, J M C
2018-06-14
This Special Report presents a description of Geant4-DNA user applications dedicated to the simulation of track structures (TS) in liquid water and associated physical quantities (e.g. range, stopping power, mean free path…). These example applications are included in the Geant4 Monte Carlo toolkit and are available in open access. Each application is described and comparisons to recent international recommendations are shown (e.g. ICRU, MIRD), when available. The influence of physics models available in Geant4-DNA for the simulation of electron interactions in liquid water is discussed. Thanks to these applications, the authors show that the most recent sets of physics models available in Geant4-DNA (the so-called "option 4" and "option 6" sets) enable more accurate simulation of stopping powers, dose point kernels and W-values in liquid water than the default set of models ("option 2") initially provided in Geant4-DNA. They also serve as reference applications for Geant4-DNA users interested in TS simulations. This article is protected by copyright. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, T.D. Jr.
1996-05-01
The Monte Carlo Model System (MCMS) for the Washington State University (WSU) Radiation Center provides a means through which core criticality and power distributions can be calculated, as well as providing a method for the neutron and photon transport necessary for BNCT epithermal neutron beam design. The computational code used in this Model System is MCNP4A. The geometric capability of this Monte Carlo code allows the WSU system to be modeled very accurately. A working knowledge of the MCNP4A neutron transport code increases the flexibility of the Model System and is recommended; however, the eigenvalue/power density problems can be run with little direct knowledge of MCNP4A. Neutron and photon particle transport require more experience with the MCNP4A code. The Model System consists of two coupled subsystems: the Core Analysis and Source Plane Generator Model (CASP), and the BeamPort Shell Particle Transport Model (BSPT). The CASP Model incorporates the S(α,β) thermal treatment, and is run as a criticality problem, yielding the system eigenvalue (k_eff), the core power distribution, and an implicit surface source for subsequent particle transport in the BSPT Model. The BSPT Model uses the source plane generated by a CASP run to transport particles through the thermal column beamport. The user can create filter arrangements in the beamport and then calculate characteristics necessary for assessing the BNCT potential of the given filter arrangement. Examples of the characteristics to be calculated are: neutron fluxes, neutron currents, fast neutron KERMAs and gamma KERMAs. The MCMS is a useful tool for the WSU system. Those unfamiliar with the MCNP4A code can use the MCMS transparently for core analysis, while more experienced users will find the particle transport capabilities very powerful for BNCT filter design.
X-Antenna: A graphical interface for antenna analysis codes
NASA Technical Reports Server (NTRS)
Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.
1995-01-01
This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.
SPLICER - A GENETIC ALGORITHM TOOL FOR SEARCH AND OPTIMIZATION, VERSION 1.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Wang, L.
1994-01-01
SPLICER is a genetic algorithm tool which can be used to solve search and optimization problems. Genetic algorithms are adaptive search procedures (i.e. problem solving methods) based loosely on the processes of natural selection and Darwinian "survival of the fittest." SPLICER provides the underlying framework and structure for building a genetic algorithm application. These algorithms apply genetically-inspired operators to populations of potential solutions in an iterative fashion, creating new populations while searching for an optimal or near-optimal solution to the problem at hand. SPLICER 1.0 was created using a modular architecture that includes a Genetic Algorithm Kernel, interchangeable Representation Libraries, Fitness Modules and User Interface Libraries, and well-defined interfaces between these components. The architecture supports portability, flexibility, and extensibility. SPLICER comes with all source code and several examples. For instance, a "traveling salesperson" example searches for the minimum distance through a number of cities visiting each city only once. Stand-alone SPLICER applications can be used without any programming knowledge. However, to fully utilize SPLICER within new problem domains, familiarity with C language programming is essential. SPLICER's genetic algorithm (GA) kernel was developed independent of representation (i.e. problem encoding), fitness function or user interface type. The GA kernel comprises all functions necessary for the manipulation of populations. These functions include the creation of populations and population members, the iterative population model, fitness scaling, parent selection and sampling, and the generation of population statistics. In addition, miscellaneous functions are included in the kernel (e.g., random number generators). Different problem-encoding schemes and functions are defined and stored in interchangeable representation libraries. This allows the GA kernel to be used with any representation scheme. The SPLICER tool provides representation libraries for binary strings and for permutations. These libraries contain functions for the definition, creation, and decoding of genetic strings, as well as multiple crossover and mutation operators. Furthermore, the SPLICER tool defines the appropriate interfaces to allow users to create new representation libraries. Fitness modules are the only component of the SPLICER system a user will normally need to create or alter to solve a particular problem. Fitness functions are defined and stored in interchangeable fitness modules which must be created using C language. Within a fitness module, a user can create a fitness (or scoring) function, set the initial values for various SPLICER control parameters (e.g., population size), create a function which graphically displays the best solutions as they are found, and provide descriptive information about the problem. The tool comes with several example fitness modules, while the process of developing a fitness module is fully discussed in the accompanying documentation. The user interface is event-driven and provides graphic output in windows. SPLICER is written in Think C for Apple Macintosh computers running System 6.0.3 or later and Sun series workstations running SunOS. The UNIX version is easily ported to other UNIX platforms and requires MIT's X Window System, Version 11 Revision 4 or 5, MIT's Athena Widget Set, and the Xw Widget Set. Example executables and source code are included for each machine version. 
The standard distribution medium for the Macintosh version is a set of three 3.5 inch Macintosh format diskettes. The standard distribution medium for the UNIX version is a .25 inch streaming magnetic tape cartridge in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. SPLICER was developed in 1991.
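The modular architecture described above (a representation-independent kernel plus interchangeable representation libraries and fitness modules) can be sketched in a few lines; the Python fragment below is a toy illustration of that structure, not SPLICER's C implementation:

import random

def ga_kernel(rep, fitness, pop_size=40, generations=60):
    # Kernel: manipulates populations with no knowledge of the encoding.
    pop = [rep["create"]() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]                 # truncation selection
        pop = parents + [
            rep["mutate"](rep["crossover"](random.choice(parents),
                                           random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
    return max(pop, key=fitness)

# Binary-string representation library; interchangeable with e.g. permutations.
n = 20
binary_rep = {
    "create": lambda: [random.randint(0, 1) for _ in range(n)],
    "crossover": lambda a, b: a[: n // 2] + b[n // 2 :],
    "mutate": lambda g: [1 - x if random.random() < 0.05 else x for x in g],
}

# Fitness module: the only part a user normally writes (here, count of ones).
best = ga_kernel(binary_rep, fitness=sum)
print(best, sum(best))

Swapping in a different representation library or fitness module requires no change to the kernel, which is the point of the design.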
Simon, Nadine; Käthner, Ivo; Ruf, Carolin A; Pasqualotto, Emanuele; Kübler, Andrea; Halder, Sebastian
2014-01-01
Brain-computer interfaces (BCIs) can serve as muscle-independent communication aids. Persons who are unable to control their eye muscles (e.g., in the completely locked-in state) or have severe visual impairments for other reasons need BCI systems that do not rely on the visual modality. For this reason, BCIs that employ auditory stimuli were suggested. In this study, a multiclass BCI spelling system was implemented that uses animal voices with directional cues to code the rows and columns of a letter matrix. To reveal possible training effects with the system, 11 healthy participants performed spelling tasks on 2 consecutive days. In a second step, the system was tested by a participant with amyotrophic lateral sclerosis (ALS) in two sessions. In the first session, healthy participants spelled with an average accuracy of 76% (3.29 bits/min) that increased to 90% (4.23 bits/min) on the second day. Spelling accuracy by the participant with ALS was 20% in the first and 47% in the second session. The results indicate a strong training effect for both the healthy participants and the participant with ALS. While healthy participants reached high accuracies in both sessions, accuracies for the participant with ALS were not sufficient for satisfactory communication in either session. More training sessions might be needed to improve spelling accuracies. The study demonstrated the feasibility of the auditory BCI with healthy users and stresses the importance of training with auditory multiclass BCIs, especially for potential end-users of BCIs with disease.
User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, S.B.; Rainey, R.H.
1979-05-01
The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report represents a practical user's guide to SEPHIS-Thorex, containing a program description, user information, a program listing, and sample input and output.
Metal matrix composite analyzer (METCAN) user's manual, version 4.0
NASA Technical Reports Server (NTRS)
Lee, H.-J.; Gotsis, P. K.; Murthy, P. L. N.; Hopkins, D. A.
1992-01-01
The Metal Matrix Composite Analyzer (METCAN) is a computer code developed at Lewis Research Center to simulate the high temperature nonlinear behavior of metal matrix composites. An updated version of the METCAN User's Manual is presented. The manual provides the user with a step-by-step outline of the procedure necessary to run METCAN. The preparation of the input file is demonstrated, and the output files are explained. Sample problems are presented to highlight various features of METCAN. An overview of the geometric conventions, the micromechanical unit cell, and the nonlinear constitutive relationships is also provided.
The Gift Code User Manual. Volume I. Introduction and Input Requirements
1975-07-01
The GIFT code is a FORTRAN computer program. The basic input to the GIFT code is data called...
NASA Lewis steady-state heat pipe code users manual
NASA Technical Reports Server (NTRS)
Tower, Leonard K.; Baker, Karl W.; Marks, Timothy S.
1992-01-01
The NASA Lewis heat pipe code was developed to predict the performance of heat pipes in the steady state. The code can be used as a design tool on a personal computer or with a suitable calling routine, as a subroutine for a mainframe radiator code. A variety of wick structures, including a user input option, can be used. Heat pipes with multiple evaporators, condensers, and adiabatic sections in series and with wick structures that differ among sections can be modeled. Several working fluids can be chosen, including potassium, sodium, and lithium, for which monomer-dimer equilibrium is considered. The code incorporates a vapor flow algorithm that treats compressibility and axially varying heat input. This code facilitates the determination of heat pipe operating temperatures and heat pipe limits that may be encountered at the specified heat input and environment temperature. Data are input to the computer through a user-interactive input subroutine. Output, such as liquid and vapor pressures and temperatures, is printed at equally spaced axial positions along the pipe as determined by the user.
Near Zone: Basic scattering code user's manual with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Silvestro, J. W.
1989-01-01
The Electromagnetic Code - Basic Scattering Code, Version 3, is a user-oriented computer code to analyze near and far zone patterns of antennas in the presence of scattering structures, to provide coupling between antennas in a complex environment, and to determine radiation hazard calculations at UHF and above. The analysis is based on uniform asymptotic techniques formulated in terms of the Uniform Geometrical Theory of Diffraction (UTD). Complicated structures can be simulated by arbitrarily oriented flat plates and an infinite ground plane that can be perfectly conducting or dielectric. Also, perfectly conducting finite elliptic cylinders, elliptic cone frustum sections, and finite composite ellipsoids can be used to model the superstructure of a ship, the body of a truck, an airplane, a satellite, etc. This manual gives special consideration to space station modeling applications. It is a user manual designed to give an overall view of the operation of the computer code, to instruct a user in how to model structures, and to show the validity of the code by comparing various computed results against measured and alternative calculations, such as method of moments, whenever available.
Users manual and modeling improvements for axial turbine design and performance computer code TD2-2
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, M. T.
MELTSPREAD3 is a transient one-dimensional computer code that has been developed to predict the gravity-driven flow and freezing behavior of molten reactor core materials (corium) in containment geometries. Predictions can be made for corium flowing across surfaces under either dry or wet cavity conditions. The spreading surfaces that can be selected are steel, concrete, a user-specified material (e.g., a ceramic), or an arbitrary combination thereof. The corium can have a wide range of compositions of reactor core materials that includes distinct oxide phases (predominantly Zr, and steel oxides) plus metallic phases (predominantly Zr and steel). The code requires input that describes the containment geometry, melt "pour" conditions, and cavity atmospheric conditions (i.e., pressure, temperature, and cavity flooding information). For cases in which the cavity contains a preexisting water layer at the time of RPV failure, melt jet breakup and particle bed formation can be calculated mechanistically given the time-dependent melt pour conditions (input data) as well as the heatup and boiloff of water in the melt impingement zone (calculated). For core debris impacting either the containment floor or previously spread material, the code calculates the transient hydrodynamics and heat transfer which determine the spreading and freezing behavior of the melt. The code predicts conditions at the end of the spreading stage, including melt relocation distance, depth and material composition profiles, substrate ablation profile, and wall heatup. Code output can be used as input to other models such as CORQUENCH that evaluate long term core-concrete interaction behavior following the transient spreading stage. MELTSPREAD3 was originally developed to investigate BWR Mark I liner vulnerability, but has been substantially upgraded and applied to other reactor designs (e.g., the EPR), and more recently to the plant accidents at Fukushima Daiichi. The most recent round of improvements documented in this report has been specifically implemented to support industry in developing Severe Accident Water Management (SAWM) strategies for Boiling Water Reactors.
VisIVO: A Library and Integrated Tools for Large Astrophysical Dataset Exploration
NASA Astrophysics Data System (ADS)
Becciani, U.; Costa, A.; Ersotelos, N.; Krokos, M.; Massimino, P.; Petta, C.; Vitello, F.
2012-09-01
VisIVO provides an integrated suite of tools and services that can be used in many scientific fields. VisIVO development started in the Virtual Observatory framework. VisIVO allows users to meaningfully visualize highly complex, large-scale datasets and create movies of these visualizations based on distributed infrastructures. VisIVO supports high-performance, multi-dimensional visualization of large-scale astrophysical datasets. Users can rapidly obtain meaningful visualizations while preserving full and intuitive control of the relevant parameters. VisIVO consists of VisIVO Desktop, a stand-alone application for interactive visualization on standard PCs; VisIVO Server, a platform for high performance visualization; VisIVO Web, a custom designed web portal; VisIVO Smartphone, an application to exploit the VisIVO Server functionality; and the latest VisIVO feature, the VisIVO Library, which allows a job running on a computational system (grid, HPC, etc.) to produce movies directly with the code's internal data arrays, without the need to produce intermediate files. This is particularly important when running on large computational facilities, where the user wants to have a look at the results during the data production phase. For example, in grid computing facilities, images can be produced directly in the grid catalogue while the user code is running in a system that cannot be directly accessed by the user (a worker node). The deployment of VisIVO on the DG and gLite is carried out with the support of the EDGI and EGI-Inspire projects. Depending on the structure and size of the datasets under consideration, the data exploration process could take several hours of CPU time for creating customized views, and the production of movies could potentially last several days. For this reason an MPI parallel version of VisIVO could play a fundamental role in increasing performance, e.g. it could be automatically deployed on nodes that are MPI aware. A central concept in our development is thus to produce unified code that can run either on serial nodes or in parallel by using HPC oriented grid nodes. Another important aspect, to obtain as high performance as possible, is the integration of VisIVO processes with grid nodes where GPUs are available. We have selected CUDA for implementing a range of computationally heavy modules. VisIVO is supported by the EGI-Inspire, EDGI and SCI-BUS projects.
The Code of the Street and Violent Versus Property Crime Victimization.
McNeeley, Susan; Wilcox, Pamela
2015-01-01
Previous research has shown that individuals who adopt values in line with the code of the street are more likely to experience violent victimization (e.g., Stewart, Schreck, & Simons, 2006). This study extends this literature by examining the relationship between the street code and multiple types of violent and property victimization. This research investigates the relationship between street code-related values and 4 types of victimization (assault, breaking and entering, theft, and vandalism) using Poisson-based multilevel regression models. Belief in the street code was associated with higher risk of experiencing assault, breaking and entering, and vandalism, whereas theft victimization was not related to the street code. The results suggest that the code of the street influences victimization broadly--beyond violence--by increasing behavior that provokes retaliation from others in various forms.
Implementation of statistical process control for proteomic experiments via LC MS/MS.
Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J
2014-04-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
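The control-chart logic is simple to illustrate. The sketch below computes generic Shewhart-style three-sigma limits from user-defined QC runs and flags out-of-control retention times; it is a schematic of the approach with made-up numbers, not SProCoP's R source:

import numpy as np

# Retention times (minutes) from user-defined QC standard runs (assumed values).
baseline_rt = np.array([12.31, 12.28, 12.35, 12.30, 12.33, 12.29])
center, sigma = baseline_rt.mean(), baseline_rt.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma     # +/- 3-sigma control limits

new_runs = np.array([12.32, 12.36, 12.70])            # incoming measurements
for i, rt in enumerate(new_runs, 1):
    status = "OK" if lcl <= rt <= ucl else "OUT OF CONTROL"
    print(f"run {i}: RT={rt:.2f} min -> {status}")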
Experiment Software and Projects on the Web with VISPA
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, B.; Fischer, R.; Geiser, E.; Glaser, C.; Müller, G.; Rieger, M.; Urban, M.; von Cube, R. F.; Welling, C.
2017-10-01
The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that makes it possible to interface a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that is capable of browsing interactively through the event content of several data formats, e.g., MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users and role-based permissions. Thereby, tailored projects are enabled, e.g. for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible to members of the collaboration. We present the extension mechanism, including corresponding applications, and give an outlook onto the new permission system.
NASA Astrophysics Data System (ADS)
Duffy, Alan; Yates, Brian; Takacs, Peter
2012-09-01
The Optical Metrology Facility at the Canadian Light Source (CLS) has recently purchased the MountainsMap surface analysis software from Digital Surf, and we report here our experiences with this package and its usefulness as a tool for examining metrology data of synchrotron x-ray mirrors. The package has a number of operators that are useful for determining surface roughness and slope error, including compliance with ISO standards (viz. ISO 4287 and ISO 25178). The software is extensible with MATLAB scripts, either by loading an m-file or by a user-written script. This makes it possible to apply a custom operator to measurement data sets. Using this feature we have applied the simple six-line MATLAB code for the direct least-squares fitting of ellipses developed by Fitzgibbon et al. to investigate the residual slope error of elliptical mirrors upon removal of the best-fit ellipse. The software includes support for many instruments (e.g. Zygo, MicroMap, etc.) and can import ASCII data (e.g. LTP data). The stitching module allows the user to assemble overlapping images, and we report on our experiences with this feature applied to MicroMap surface roughness data. The power spectral density function was determined for the stitched and unstitched data and compared.
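For reference, the direct least-squares ellipse fit of Fitzgibbon et al. translates almost line for line into NumPy/SciPy; the sketch below follows the published algorithm (a generalized eigenproblem with the 4ac - b^2 = 1 constraint) and is independent of the MountainsMap integration discussed above:

import numpy as np
from scipy.linalg import eig

def fit_ellipse(x, y):
    # Design matrix for the conic a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0.
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    S = D.T @ D                       # scatter matrix
    C = np.zeros((6, 6))              # constraint matrix enforcing 4ac - b^2 = 1
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    w, v = eig(S, C)                  # generalized eigenproblem S a = lambda C a
    w = np.real(w)
    k = np.argmax((w > 0) & np.isfinite(w))   # the single positive finite eigenvalue
    return np.real(v[:, k])                   # conic coefficients [a, b, c, d, e, f]

# Synthetic noisy points on an ellipse, for demonstration only.
theta = np.linspace(0, 2 * np.pi, 50)
rng = np.random.default_rng(1)
x = 3 * np.cos(theta) + 0.5 + 0.01 * rng.standard_normal(theta.size)
y = 1.5 * np.sin(theta) - 0.2 + 0.01 * rng.standard_normal(theta.size)
print(fit_ellipse(x, y))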
Use or Non-Use of Gerontechnology—A Qualitative Study
Chen, Ke; Chan, Alan Hoi-shou
2013-01-01
This study employed a qualitative approach to explore the attitudes and experiences of older people towards using gerontechnology, and to determine the underlying reasons that might account for their use and non-use of gerontechnology. Four focus group discussions and 26 individual interviews were undertaken. Qualitative data were analyzed using NVivo software and were categorized using coding and grounded theory techniques. The results indicated that older people in Hong Kong had an overall positive attitude toward technology. Positive attitudes were most frequently related to enhanced convenience and advanced features. Negative attitudes were most frequently associated with health risks and social problems arising from using technology (e.g., social isolation and addiction). Usage of technology is driven by outcome expectations and social influences, and supported by facilitators, whereas non-use of gerontechnology relates to the personal (e.g., health and functional capacities), technological (e.g., cost and complexity), and environmental barriers experienced. Use of gerontechnology is a synthesis of person, technology, and environment. To encourage non-users to adopt technology, there is a need to remove barriers at the personal, technological, and environmental levels. PMID:24084674
NASA Astrophysics Data System (ADS)
Fadakar Alghalandis, Younes
2017-05-01
The rapidly growing field of discrete fracture network engineering (DFNE) has already attracted many talented researchers from diverse disciplines in academia and industry around the world to challenge difficult problems related to mining, geothermal, civil, oil and gas, water and many other projects. Although there are a few commercial software packages capable of providing some functionality fundamental to DFNE, their cost, closed-source (black box) distribution, and hence limited programmability and tractability encouraged us to respond to this rising demand with a new solution. This paper introduces an open source comprehensive software package for stochastic modeling of fracture networks in two and three dimensions in a discrete formulation. Functionalities included are geometric modeling (e.g., complex polygonal fracture faces, and utilizing directional statistics), simulations, characterizations (e.g., intersection, clustering and connectivity analyses) and applications (e.g., fluid flow). The package is written entirely in the MATLAB scripting language. Significant efforts have been made to bring maximum flexibility to the functions in order to solve problems in both two and three dimensions in an easy and unified way that is suitable for beginner, advanced and experienced users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.
Comment on ‘egs_brachy: a versatile and fast Monte Carlo code for brachytherapy’
NASA Astrophysics Data System (ADS)
Yegin, Gultekin
2018-02-01
In a recent paper, Chamberland et al (2016 Phys. Med. Biol. 61 8214) develop a new Monte Carlo code called egs_brachy for brachytherapy treatments. It is based on EGSnrc and written in the C++ programming language. In order to benchmark the egs_brachy code, the authors use it in various test case scenarios in which complex geometry conditions exist. Another EGSnrc-based brachytherapy dose calculation engine, BrachyDose, is used for dose comparisons. The authors fail to prove that egs_brachy can produce reasonable dose values for brachytherapy sources in a given medium. The dose comparisons in the paper are erroneous and misleading. egs_brachy should not be used in any further research studies unless and until all the potential bugs are fixed in the code.
Development of Web Interfaces for Analysis Codes
NASA Astrophysics Data System (ADS)
Emoto, M.; Watanabe, T.; Funaba, H.; Murakami, S.; Nagayama, Y.; Kawahata, K.
Several codes have been developed to analyze plasma physics; however, most of them were developed to run on supercomputers, so users who typically work on personal computers (PCs) find them difficult to use. In order to facilitate the widespread use of these codes, a user-friendly interface is required. The authors propose Web interfaces for these codes and, to demonstrate the usefulness of this approach, have developed Web interfaces for two analysis codes. The first is for FIT, developed by Murakami, which is used to analyze NBI heat deposition and related quantities. Because it requires electron density profiles, electron temperatures, and ion temperatures as polynomial expressions, those unfamiliar with the experiments, especially visitors from other institutes, find it difficult to use. The second is for visualizing the lines of force in the LHD (Large Helical Device), developed by Watanabe. This code is used to analyze the interference caused by the lines of force resulting from the various structures installed in the vacuum vessel of the LHD. It runs on PCs but requires that the necessary parameters be edited manually. Using these Web interfaces, users can execute both codes interactively.
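A minimal sketch of this kind of wrapper, assuming a hypothetical batch executable named fit_code driven by a JSON parameter file (neither is the real FIT interface), could expose the analysis through a single HTTP endpoint:

```python
from flask import Flask, request, jsonify
import subprocess, tempfile, json

app = Flask(__name__)

@app.post("/run-fit")
def run_fit():
    # profiles arrive as polynomial coefficients, e.g. {"ne": [...], "te": [...]}
    coeffs = request.get_json()
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump(coeffs, f)
        params = f.name
    # run the (hypothetical) batch code and return its output to the browser
    out = subprocess.run(["fit_code", "--input", params],
                         capture_output=True, text=True, timeout=600)
    return jsonify(stdout=out.stdout, returncode=out.returncode)

if __name__ == "__main__":
    app.run()
```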
Fiedler, Jan; Baker, Andrew H; Dimmeler, Stefanie; Heymans, Stephane; Mayr, Manuel; Thum, Thomas
2018-05-23
Non-coding RNAs are increasingly recognized not only as regulators of various biological functions but also as targets for a new generation of RNA therapeutics and biomarkers. We hereby review recent insights relating to non-coding RNAs including microRNAs (e.g. miR-126, miR-146a), long non-coding RNAs (e.g. MIR503HG, GATA6-AS, SMILR) and circular RNAs (e.g. cZNF292) and their role in vascular diseases. This includes identification and therapeutic use of hypoxia-regulated non-coding RNAs and endogenous non-coding RNAs that regulate intrinsic smooth muscle cell signalling, age-related non-coding RNAs and non-coding RNAs involved in the regulation of mitochondrial biology and metabolic control. Finally, we discuss non-coding RNA species with biomarker potential.
Leveraging the NLM map from SNOMED CT to ICD-10-CM to facilitate adoption of ICD-10-CM.
Cartagena, F Phil; Schaeffer, Molly; Rifai, Dorothy; Doroshenko, Victoria; Goldberg, Howard S
2015-05-01
Develop and test web services to retrieve and identify the most precise ICD-10-CM code(s) for a given clinical encounter. Facilitate creation of user interfaces that 1) provide an initial shortlist of candidate codes, ideally visible on a single screen; and 2) enable code refinement. To satisfy our high-level use cases, the analysis and design process involved reviewing available maps and crosswalks, designing the rule adjudication framework, determining necessary metadata, retrieving related codes, and iteratively improving the code refinement algorithm. The Partners ICD-10-CM Search and Mapping Services (PI-10 Services) are SOAP web services written using Microsoft's .NET 4.0 Framework, Windows Communication Foundation, and SQL Server 2012. The services cover 96% of the Partners problem list subset of SNOMED CT codes that map to ICD-10-CM codes and can return up to 76% of the 69,823 billable ICD-10-CM codes prior to creation of custom mapping rules. We consider ways to increase 1) the coverage ratio of the Partners problem list subset of SNOMED CT codes and 2) the upper bound of returnable ICD-10-CM codes by creating custom mapping rules. Future work will investigate the utility of the transitive closure of SNOMED CT codes and other methods to assist in custom rule creation and, ultimately, to provide more complete coverage of ICD-10-CM codes. ICD-10-CM will be easier for clinicians to manage if applications display short lists of candidate codes from which clinicians can subsequently select a code for further refinement. The PI-10 Services support ICD-10 migration by implementing this paradigm and enabling users to consistently and accurately find the best ICD-10-CM code(s) without translation from ICD-9-CM. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
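The "shortlist then refine" paradigm can be illustrated with a toy lookup; the two-entry map and the laterality rule below are invented for illustration and are not the NLM map or the PI-10 rule framework:

```python
# Toy SNOMED CT -> ICD-10-CM shortlist with one refinement attribute.
SNOMED_TO_ICD10 = {
    "22298006": ["I21.9"],                           # myocardial infarction
    "429554009": ["M25.511", "M25.512", "M25.519"],  # shoulder joint pain
}

def shortlist(snomed_code):
    """Initial candidate codes, suitable for a single-screen shortlist."""
    return SNOMED_TO_ICD10.get(snomed_code, [])

def refine(candidates, laterality=None):
    """Narrow candidates by laterality (ICD-10-CM encodes it in the 6th char)."""
    suffix = {"right": "1", "left": "2"}.get(laterality)
    if suffix is None:
        return candidates
    refined = [c for c in candidates if c.endswith(suffix)]
    return refined or candidates                     # fall back if no match

print(refine(shortlist("429554009"), laterality="left"))  # ['M25.512']
```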
Swinnen, Thijs Willem; Westhovens, Rene; de Vlam, Kurt; Geurts, Luc; Vanden Abeele, Vero
2016-01-01
Background Chronic arthritis (CA), an umbrella term for inflammatory rheumatic and other musculoskeletal diseases, is highly prevalent. Effective disease-modifying antirheumatic drugs are available for CA, with the exception of osteoarthritis, but require a long-term commitment of patients to comply with the medication regimen and management program as well as a tight follow-up by the treating physician and health professionals. Additionally, patients are advised to participate in physical exercise programs. Adherence to exercises and physical activity programs is often very low. Patients would benefit from support to increase medication compliance as well as compliance to the physical exercise programs. To address these shortcomings, health apps for CA patients have been created. These mobile apps assist patients in self-management of overall health measures, health prevention, and disease management. By including persuasive principles designed to reinforce, change, or shape attitudes or behaviors, health apps can transform into support tools that motivate and stimulate users to achieve or keep up with target behavior, also called persuasive systems. However, the extent to which health apps for CA patients consciously and successfully employ such persuasive principles remains unknown. Objective The objective of this study was to evaluate the number and type of persuasive principles present in current health apps for CA patients. Methods A review of apps for arthritis patients was conducted across the three major app stores (Google Play, Apple App Store, and Windows Phone Store). Collected apps were coded according to 37 persuasive principles, based on an altered version of the Persuasive System Design taxonomy of Oinas-Kukkonen and Harjumaa and the taxonomy of Behavior Change Techniques of Michie and Abraham. In addition, user ratings, number of installs, and price of the apps were also coded. Results We coded 28 apps. On average, 5.8 out of 37 persuasive principles were used in each app. The most used category of persuasive principles was System Credibility with an average of 2.6 principles. Task Support was the second most used, with an average of 2.3 persuasive principles. Next was Dialogue Support with an average of 0.5 principles. Social Support was last with an average of 0.01 persuasive principles only. Conclusions Current health apps for CA patients would benefit from adding Social Support techniques (e.g., social media, user fora) and extending Dialogue Support techniques (e.g., rewards, praise). The addition of automated tracking of health-related parameters (e.g., physical activity, step count) could further reduce the effort for CA patients to manage their disease and thus increase Task Support. Finally, apps for health could benefit from a more evidence-based approach, both in developing the app as well as ensuring that content can be verified as scientifically proven, which will result in enhanced System Credibility. PMID:27742604
Clinical toxicology of newer recreational drugs.
Hill, Simon L; Thomas, Simon H L
2011-10-01
Novel synthetic 'designer' drugs with stimulant, ecstasy-like (entactogenic) and/or hallucinogenic properties have become increasingly popular among recreational drug users in recent years. The substances used change frequently in response to market trends and legislative controls, and it is an important challenge for poisons centres and clinical toxicologists to remain updated on the pharmacological and toxicological effects of these emerging agents. To review the available information on newer synthetic stimulant, entactogenic and hallucinogenic drugs, provide a framework for classification of these drugs based on chemical structure and describe their pharmacology and clinical toxicology. A comprehensive review of the published literature was performed using PUBMED and Medline databases, together with additional non-peer-reviewed information sources, including books, media reports, government publications and internet resources, including drug user web forums. Novel synthetic stimulant, entactogenic or hallucinogenic designer drugs are increasingly available to users as demonstrated by user surveys, poisons centre calls, activity on internet drug forums, hospital attendance data and mortality data. Some population subgroups, such as younger adults who attend dance music clubs, are more likely to use these substances. The internet plays an important role in determining the awareness of and availability of these newer drugs of abuse. Most novel synthetic stimulant, entactogenic or hallucinogenic drugs of abuse can be classified according to chemical structure as piperazines (e.g. benzylpiperazine (BZP), trifluoromethylphenylpiperazine), phenethylamines (e.g. 2C or D-series of ring-substituted amfetamines, benzodifurans, cathinones, aminoindans), tryptamines (e.g. dimethyltryptamine, alpha-methyltryptamine, ethyltryptamine, 5-methoxy-alpha-methyltryptamine) or piperidines and related substances (e.g. desoxypipradrol, diphenylprolinol). Alternatively, classification may be based on clinical effects as either primarily stimulant, entactogenic or hallucinogenic, although most drugs have a combination of such effects. CLINICAL TOXICOLOGY: Piperazines, phenethylamines, tryptamines and piperidines have actions at multiple central nervous system (CNS) receptor sites, with patterns of effects varying between agents. Predominantly stimulant drugs (e.g. benzylpiperazine, mephedrone, naphyrone, diphenylprolinol) inhibit monoamine (especially dopamine) reuptake and are characteristically associated with a sympathomimetic toxidrome. Entactogenic drugs (e.g. phenylpiperazines, methylone) provoke central serotonin release, while newer hallucinogens (e.g. 5-methoxy-N,N-diisopropyltryptamine (5-MeO-DiPT), 2,5-dimethoxy-4-bromoamfetamine (DOB)) are serotonin receptor agonists. As a result, serotoninergic effects predominate in toxicity. There are limited reliable data to guide clinicians managing patients with toxicity due to these substances. The harms associated with emerging recreational drugs are not fully documented, although it is clear that they are not without risk. Management of users with acute toxic effects is pragmatic and primarily extrapolated from experience with longer-established stimulant or hallucinogenic drugs such as amfetamines, 3,4-methylenedioxymethamfetamine (MDMA) and lysergic acid diethylamide (LSD).
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user in describing the helicopter geometry, selecting the method of computation, constructing the desired high- or low-frequency model, and displaying the results.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1992-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and the X Window System. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode, with a level of flexibility suitable for practical application.
Summary of papers on current and anticipated uses of thermal-hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The author reviews a range of recent papers which discuss possible uses and future development needs for thermal-hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines, or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; and build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).
Measuring diagnoses: ICD code accuracy.
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-10-01
To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.
ERIC Educational Resources Information Center
Uehara, Suwako; Noriega, Edgar Josafat Martinez
2016-01-01
The availability of user-friendly coding software is increasing, yet teachers might hesitate to use this technology to develop for educational needs. This paper discusses studies related to technology for educational uses and introduces an evaluation application being developed. Through questionnaires by student users and open-ended discussion by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.
1994-02-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code, with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model, with emphasis on running the code; the user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code, the program structure, and each of the program elements.
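The Monte Carlo treatment of uncertainty mentioned in item (2) amounts to sampling uncertain inputs from assumed distributions and propagating each sample through the model; a generic sketch (with a stand-in model and invented distributions, not RATCHET's physics) is:

```python
import numpy as np

# Generic Monte Carlo propagation of input uncertainty: sample uncertain
# inputs, evaluate the model per sample, summarize the output distribution.
rng = np.random.default_rng(0)

def model(release_rate, wind_speed, deposition_velocity):
    # toy ground-level concentration: dilution by wind plus crude depletion
    return release_rate / (wind_speed * 1000.0) * np.exp(-deposition_velocity)

n = 10_000
conc = model(
    release_rate=rng.lognormal(mean=0.0, sigma=0.5, size=n),
    wind_speed=rng.normal(4.0, 1.0, size=n).clip(0.5),
    deposition_velocity=rng.uniform(0.001, 0.01, size=n),
)
print(f"median {np.median(conc):.3e}, "
      f"95th percentile {np.percentile(conc, 95):.3e}")
```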
ITS Version 6: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William
2008-04-01
ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.
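An order-independent keyword input scheme with defaults and internal error checking, of the general kind described above, can be sketched as follows (the keywords, defaults, and comment character are invented, not the actual ITS input language):

```python
# Order-independent keyword input with defaults and error checking.
DEFAULTS = {"energy-cutoff": 1e-3, "histories": 100_000, "geometry": "slab"}
ALLOWED_GEOMETRIES = {"slab", "cylinder", "sphere"}

def parse_input(lines):
    opts = dict(DEFAULTS)                       # unspecified keywords keep defaults
    for raw in lines:
        line = raw.split("!")[0].strip()        # strip trailing comments
        if not line:
            continue
        key, _, value = line.partition(" ")
        if key not in DEFAULTS:
            raise ValueError(f"unknown keyword: {key}")
        opts[key] = type(DEFAULTS[key])(value.strip())  # coerce to default's type
    if opts["geometry"] not in ALLOWED_GEOMETRIES:
        raise ValueError(f"bad geometry: {opts['geometry']}")
    return opts

# keywords may appear in any order, and omitted ones fall back to defaults
print(parse_input(["geometry sphere", "histories 1000000"]))
```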
Data Sciences Summer Institute Topology Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Seth
DSSI_TOPOPT is a 2D topology optimization code that designs stiff structures made of a single linear elastic material and void space. The code generates a finite element mesh of a rectangular design domain on which the user specifies displacement and load boundary conditions. The code iteratively designs a structure that minimizes the compliance (maximizes the stiffness) of the structure under the given loading, subject to an upper bound on the amount of material used. Depending on user options, the code can evaluate the performance of a user-designed structure, or create a design from scratch. Output includes the finite element mesh, the design, and visualizations of the design.
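The compliance-minimization loop such codes implement can be illustrated on a deliberately tiny 1D analogue: SIMP-penalized densities updated by the classic optimality-criteria rule, with a bisection on the volume multiplier. Everything below (a tip-loaded bar of elements in series, unit properties) is a stand-in for the code's 2D finite element model:

```python
import numpy as np

# Toy SIMP topology optimization of a 1D tip-loaded bar: minimize
# compliance C = sum F^2*Le/(E*A*x^p) subject to sum(x) <= volfrac*n.
n, p, volfrac = 40, 3.0, 0.4
x = np.linspace(0.1, 0.9, n)              # deliberately non-uniform start
F, E, A, Le = 1.0, 1.0, 1.0, 1.0          # unit load and properties

for it in range(60):
    dC = -p * F**2 * Le / (E * A * x**(p + 1))     # compliance sensitivity (< 0)
    lo, hi = 1e-9, 1e9                             # bisect the volume multiplier
    while hi - lo > 1e-10 * (lo + hi):
        lam = 0.5 * (lo + hi)
        x_new = np.clip(x * np.sqrt(-dC / lam), 1e-3, 1.0)  # OC update
        if x_new.sum() > volfrac * n:
            lo = lam                               # too much material: raise lam
        else:
            hi = lam
    x = x_new

C = (F**2 * Le / (E * A * x**p)).sum()
print(f"final compliance {C:.2f}, volume fraction {x.mean():.3f}")
# for this 1D bar the optimum is uniform density x = volfrac, which the
# optimality-criteria iteration recovers
```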
Manual for obscuration code with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Takacs, L.
1986-01-01
The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far-zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multiply-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-Basic Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.
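The core geometric test in such an obscuration code is a ray-structure intersection for each far-zone direction. A minimal sketch for a single flat plate (illustrative geometry, not the SHADOW input format or its frustum handling) is:

```python
import numpy as np

# Far-zone shadow map for one square plate: for each direction, test
# whether the ray from the antenna hits the plate.
antenna = np.array([0.0, 0.0, 0.0])
p0 = np.array([1.0, -0.5, -0.5])                  # plate corner
e1, e2 = np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])  # unit edges
normal = np.cross(e1, e2)

def shadowed(direction):
    d = direction / np.linalg.norm(direction)
    denom = d @ normal
    if abs(denom) < 1e-12:
        return False                              # ray parallel to plate
    t = ((p0 - antenna) @ normal) / denom
    if t <= 0:
        return False                              # plate is behind the antenna
    hit = antenna + t * d - p0
    u, v = hit @ e1, hit @ e2                     # e1, e2 orthonormal here
    return 0 <= u <= 1 and 0 <= v <= 1

# sample the far-zone sphere coarsely and accumulate the shadow map
thetas = np.linspace(0.01, np.pi, 37)
phis = np.linspace(0, 2 * np.pi, 73)
shadow = [[shadowed(np.array([np.sin(t) * np.cos(p),
                              np.sin(t) * np.sin(p), np.cos(t)]))
           for p in phis] for t in thetas]
print(f"shadowed fraction of sample directions: {np.mean(shadow):.3f}")
```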
CHEETAH: A next generation thermochemical code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fried, L.; Souers, P.
1994-11-01
CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0. We have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. We find that CHEETAH solves a wider variety of problems with no user intervention (e.g., no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. CHEETAH will make the use of thermochemical codes more attractive to practical explosive formulators. We have also made an extensive effort to improve over the results of TIGER. CHEETAH's version of the BKW equation of state (BKWC) is able to accurately reproduce energies from cylinder tests, something that other BKW parameter sets have been unable to do. Calculations performed with BKWC execute very quickly; typical run times are under 10 seconds on a workstation. In the future we plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phase. A kinetics capability will be added to the code that will predict reaction zone thickness. Further ease-of-use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.
Performance optimization of Qbox and WEST on Intel Knights Landing
NASA Astrophysics Data System (ADS)
Zheng, Huihuo; Knight, Christopher; Galli, Giulia; Govoni, Marco; Gygi, Francois
We present the optimization of the electronic structure codes Qbox and WEST targeting the Intel® Xeon Phi™ processor, codenamed Knights Landing (KNL). Qbox is an ab initio molecular dynamics code based on plane-wave density functional theory (DFT), and WEST is a post-DFT code for excited state calculations within many-body perturbation theory. Both Qbox and WEST employ highly scalable algorithms which enable accurate large-scale electronic structure calculations on leadership-class supercomputer platforms beyond 100,000 cores, such as Mira and Theta at the Argonne Leadership Computing Facility. In this work, features of the KNL architecture (e.g., hierarchical memory) are explored to achieve higher performance in key algorithms of the Qbox and WEST codes and to develop a road map for further development targeting next-generation computing architectures. In particular, the optimizations of the Qbox and WEST codes on the KNL platform will target efficient large-scale electronic structure calculations of nanostructured materials exhibiting complex structures and prediction of their electronic and thermal properties for use in solar and thermal energy conversion devices. This work was supported by MICCoM, as part of the Comp. Mats. Sci. Program funded by the U.S. DOE, Office of Sci., BES, MSE Division. This research used resources of the ALCF, which is a DOE Office of Sci. User Facility under Contract DE-AC02-06CH11357.
NASA Astrophysics Data System (ADS)
Miki, Nobuhiko; Kishiyama, Yoshihisa; Higuchi, Kenichi; Sawahashi, Mamoru; Nakagawa, Masao
In the Evolved UTRA (UMTS Terrestrial Radio Access) downlink, Orthogonal Frequency Division Multiplexing (OFDM) based radio access was adopted because of its inherent immunity to multipath interference and flexible accommodation of different spectrum arrangements. This paper presents the optimum adaptive modulation and channel coding (AMC) scheme when multiple resource blocks (RBs) are simultaneously assigned to the same user under frequency- and time-domain channel-dependent scheduling in downlink OFDMA radio access with single-antenna transmission. We start by presenting selection methods for the modulation and coding scheme (MCS) employing mutual information, both for RB-common and RB-dependent modulation schemes. Simulation results show that, irrespective of the application of power adaptation to RB-dependent modulation, the improvement in the achievable throughput of the RB-dependent modulation scheme compared to that of the RB-common modulation scheme is slight, i.e., 4 to 5%. In addition, the number of required control signaling bits in the RB-dependent modulation scheme becomes greater than that for the RB-common modulation scheme. Therefore, we conclude that the RB-common modulation and channel coding rate scheme is preferred when multiple RBs of the same coded stream are assigned to one user in the case of single-antenna transmission.
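The mutual-information-based MCS selection can be sketched as follows; the per-symbol mutual information is approximated here by Shannon capacity capped at the modulation order, and the MCS table and thresholds are invented for illustration rather than taken from the paper or the 3GPP specifications:

```python
import numpy as np

# RB-common MCS selection by mutual-information averaging: compute the
# average per-symbol mutual information over the assigned RBs and pick
# the highest MCS whose requirement is met.
MCS = [  # (name, bits/symbol, code rate, required avg MI in bits/symbol)
    ("QPSK 1/2",  2, 0.50, 1.0),
    ("QPSK 3/4",  2, 0.75, 1.5),
    ("16QAM 1/2", 4, 0.50, 2.0),
    ("16QAM 3/4", 4, 0.75, 3.0),
    ("64QAM 3/4", 6, 0.75, 4.5),
]

def mutual_info(snr_db, bits_per_symbol):
    # Shannon capacity capped at the modulation order, as a stand-in for
    # the exact constellation-constrained mutual information curves.
    snr = 10 ** (np.asarray(snr_db) / 10)
    return np.minimum(np.log2(1 + snr), bits_per_symbol)

def select_rb_common_mcs(rb_snrs_db):
    best = MCS[0][0]
    for name, m, r, mi_req in MCS:           # table is in increasing order
        if mutual_info(rb_snrs_db, m).mean() >= mi_req:
            best = name
    return best

print(select_rb_common_mcs([3.0, 8.0, 12.0, 15.0]))  # one MCS for all RBs
```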
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility of the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast, robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-11-30
The PeakWorks software is designed to assist in the quantitative analysis of atom probe tomography (APT)-generated mass spectra. Specifically, through an interactive user interface, mass peaks can be identified automatically (defined by a threshold) and/or identified manually. The software then provides a means to assign specific elemental isotopes (including more than one) to each peak. The software also provides a means for the user to choose background subtraction of each peak based on background fitting functions, the choice of which is left to the user's discretion. Peak ranging (the mass range over which peaks are integrated) is also automated, allowing the user to choose a quantitative range (e.g., full-width half-maximum). The software then integrates all identified peaks, providing a background-subtracted composition, which also includes the deconvolution of peaks (i.e., those peaks that happen to have overlapping isotopic masses). The software is also able to output a 'range file' that can be used in other software packages, such as IVAS. A range file lists the peak identities, the mass range of each identified peak, and a color code for the peak. The software is also able to generate 'dummy' peak ranges within an outputted range file that can be used within IVAS to provide a means for background-subtracted proximity histogram analysis.
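The per-peak bookkeeping described above (threshold identification, background subtraction over a range, integration to a composition) can be sketched on a synthetic spectrum; the Gaussian peaks, the decaying background, and the local-linear background model are all illustrative, not PeakWorks' fitting functions:

```python
import numpy as np

# Synthetic mass spectrum: two Gaussian peaks on a decaying background.
mz = np.linspace(0, 100, 5000)
spectrum = (1000 * np.exp(-0.5 * ((mz - 27.0) / 0.05) ** 2)
            + 400 * np.exp(-0.5 * ((mz - 54.0) / 0.05) ** 2)
            + 50 * np.exp(-mz / 40))

def find_peaks(y, threshold):
    """Local maxima above a user-set threshold."""
    return [i for i in range(1, len(y) - 1)
            if y[i] > threshold and y[i] >= y[i - 1] and y[i] >= y[i + 1]]

def integrate_peak(i, half_width_bins=25):
    """Integrate a ranged window with a local linear background drawn
    between the shoulders of the window."""
    lo, hi = i - half_width_bins, i + half_width_bins
    bg = np.linspace(spectrum[lo], spectrum[hi], hi - lo)
    return (spectrum[lo:hi] - bg).sum()

counts = {mz[i]: integrate_peak(i) for i in find_peaks(spectrum, threshold=100)}
total = sum(counts.values())
for m, c in counts.items():
    print(f"m/z {m:6.2f}: {100 * c / total:5.1f} at.%")
```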
SMOG 2: A Versatile Software Package for Generating Structure-Based Models.
Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C
2016-03-01
Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
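The defining idea of a structure-based model, that contacts in the known structure set the minima of the potential, can be sketched in a few lines; the four-bead "structure" and the 12-10 contact form below are illustrative choices, not SMOG 2 output or its template definitions:

```python
import numpy as np

# Four coarse-grained "residues" standing in for a known native structure (nm).
native = np.array([[0.00, 0.00, 0.0], [0.38, 0.00, 0.0],
                   [0.76, 0.00, 0.0], [0.38, 0.38, 0.0]])

def native_contacts(coords, cutoff=0.6):
    """Non-neighbor pairs within the cutoff define the contact map, each
    stored with its native separation r0."""
    return {(i, j): np.linalg.norm(coords[i] - coords[j])
            for i in range(len(coords)) for j in range(i + 2, len(coords))
            if np.linalg.norm(coords[i] - coords[j]) < cutoff}

CONTACTS = native_contacts(native)

def contact_energy(coords, eps=1.0):
    """12-10 Go-type potential: each contact pair has its energy minimum
    (-eps) exactly at its native separation r0."""
    E = 0.0
    for (i, j), r0 in CONTACTS.items():
        r = np.linalg.norm(coords[i] - coords[j])
        E += eps * (5 * (r0 / r) ** 12 - 6 * (r0 / r) ** 10)
    return E

print(f"native-state contact energy: {contact_energy(native):.2f}")  # -eps/contact
```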
Support for Systematic Code Reviews with the SCRUB Tool
NASA Technical Reports Server (NTRS)
Holzmann, Gerald J.
2010-01-01
SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) as well as individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include Codesonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface that facilitates browsing code and reports. Improvements over existing software include significant simplification and the leveraging of a range of commercial, static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access, which means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use. The recommended code review process that is supported by the SCRUB tool consists of three phases: Code Review, Developer Response, and Closeout Resolution. In the Code Review phase, all tool-based analysis reports are generated, and specific comments from expert code reviewers are entered into the SCRUB tool. In the second phase, Developer Response, the developer is asked to respond to each comment and tool report that was produced, either agreeing or disagreeing to provide a fix that addresses the issue that was raised. In the third phase, Closeout Resolution, all disagreements are discussed in a meeting of all parties involved, and a resolution is made for all disagreements. The first two phases generally take one week each, and the third phase is concluded in a single closeout meeting.
PASCO: Structural panel analysis and sizing code: User's manual - Revised
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.
1981-01-01
A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.
Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code
NASA Technical Reports Server (NTRS)
Freeh, Josh
2003-01-01
Integration of the entire system includes: fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) optimization tools; 2) gas turbine models for hybrid systems; 3) increased interplay between subsystems; 4) off-design modeling capabilities; 5) altitude effects; and 6) existing transient modeling architecture. Other factors include: 1) easier transfer between users and groups of users; 2) general aerospace industry acceptance and familiarity; and 3) a flexible analysis tool that can also be used for ground power applications.
Software Management Environment (SME) release 9.4 user reference material
NASA Technical Reports Server (NTRS)
Hendrick, R.; Kistler, D.; Manter, K.
1992-01-01
This document contains user reference material for the Software Management Environment (SME) prototype, developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides an overview of the SME, a description of all functions, and detailed instructions concerning the software's installation and use.
Decoding the disease-associated proteins encoded in the human chromosome 4.
Chen, Lien-Chin; Liu, Mei-Ying; Hsiao, Yung-Chin; Choong, Wai-Kok; Wu, Hsin-Yi; Hsu, Wen-Lian; Liao, Pao-Chi; Sung, Ting-Yi; Tsai, Shih-Feng; Yu, Jau-Song; Chen, Yu-Ju
2013-01-04
Chromosome 4 is the fourth largest chromosome, containing approximately 191 megabases (~6.4% of the human genome) with 757 protein-coding genes. A number of marker genes for many diseases have been found in this chromosome, including genetic diseases (e.g., hepatocellular carcinoma) and biomedical research (cardiac system, aging, metabolic disorders, immune system, cancer and stem cell) related genes (e.g., oncogenes, growth factors). As a pilot study for the chromosome 4-centric human proteome project (Chr 4-HPP), we present here a systematic analysis of the disease association, protein isoforms, coding single nucleotide polymorphisms of these 757 protein-coding genes and their experimental evidence at the protein level. We also describe how the findings from the chromosome 4 project might be used to drive the biomarker discovery and validation study in disease-oriented projects, using the examples of secretomic and membrane proteomic approaches in cancer research. By integrating with cancer cell secretomes and several other existing databases in the public domain, we identified 141 chromosome 4-encoded proteins as cancer cell-secretable/shedable proteins. Additionally, we also identified 54 chromosome 4-encoded proteins that have been classified as cancer-associated proteins with successful selected or multiple reaction monitoring (SRM/MRM) assays developed. From literature annotation and topology analysis, 271 proteins were recognized as membrane proteins while 27.9% of the 757 proteins do not have any experimental evidence at the protein-level. In summary, the analysis revealed that the chromosome 4 is a rich resource for cancer-associated proteins for biomarker verification projects and for drug target discovery projects.
Putting Home Data Management into Perspective
2009-12-01
approaches. However, users of home and personal storage live it. Popular interfaces (e.g., iTunes, iPhoto, and even drop-down lists of recently-opened Word documents) allow users to navigate file...
The development of a program analysis environment for Ada: Reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1991-01-01
The Graphical Representations of Algorithms, Structures, and Processes for Ada (GRASP/Ada) has successfully created and prototyped a new algorithm level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and thus improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under the Virtual Memory System (VMS) on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. In Phase 3 of the project, the prototype was prepared for limited distribution (GRASP/Ada Version 3.0) to facilitate evaluation. The user interface was extensively reworked. The current prototype provides the capability for the user to generate CSD from Ada source code in a reverse engineering mode with a level of flexibility suitable for practical application.
ERIC Educational Resources Information Center
Cole, Charles; Mandelblatt, Bertie
2000-01-01
Uses Kintsch's proposition-based construction-integration theory of discourse comprehension to detail the user coding operations that occur in each of the three subsystems (Perception, Comprehension, Application) in which users process an information retrieval systems (IRS) message. Describes an IRS device made up of two separate parts that enable…
User's manual for the BNW-I optimization code for dry-cooled power plants. Volume III. [PLCIRI]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braun, D.J.; Daniel, D.J.; De Mier, W.V.
1977-01-01
This appendix to the User's Manual for the BNW-I Optimization Code for Dry-Cooled Power Plants provides a listing of the BNW-I optimization code for determining, for a particular size power plant, the optimum dry cooling tower design using a plastic tube cooling surface and a circular tower arrangement of the tube bundles. (LCL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William
ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.
Coupled Monte Carlo neutronics and thermal hydraulics for power reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernnat, W.; Buck, M.; Mattes, M.
The availability of high-performance computing resources increasingly enables the use of detailed Monte Carlo models, even for full-core power reactors. The detailed structure of the core can be described by lattices, modeled by so-called repeated structures, e.g., in Monte Carlo codes such as MCNP5 or MCNPX. For cores with mainly uniform material compositions and fuel and moderator temperatures, there is no problem in constructing core models. However, when the material composition and the temperatures vary strongly, a huge number of different material cells must be described, which complicates the input and in many cases exceeds code or memory limits. A second problem arises with the preparation of corresponding temperature-dependent cross sections and thermal scattering laws. Only if these problems are solved is a realistic coupling of Monte Carlo neutronics with an appropriate thermal-hydraulics model possible. In this paper a method for the treatment of detailed material and temperature distributions in MCNP5 is described, based on user-specified internal functions which assign distinct elements of the core cells to material specifications (e.g., water density) and temperatures from a thermal-hydraulics code. The core grid itself can be described with a uniform material specification. The temperature dependency of cross sections and thermal neutron scattering laws is taken into account by interpolation, requiring only a limited number of data sets generated for different temperatures. Applications will be shown for the stationary part of the Purdue PWR benchmark, using ATHLET for thermal-hydraulics, and for a generic modular high-temperature reactor, using THERMIX for thermal-hydraulics. (authors)
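The interpolation step described above can be sketched as follows; the square-root-of-temperature abscissa is a common choice for Doppler-broadened data, and the grid values here are invented for illustration:

```python
import numpy as np

# Cross sections generated at a few temperatures only; values at the
# cell temperatures supplied by the thermal-hydraulics code are obtained
# by sqrt(T) interpolation.
T_grid = np.array([300.0, 600.0, 900.0, 1200.0])   # K
sigma_grid = np.array([2.10, 1.95, 1.86, 1.80])    # barns at T_grid (illustrative)

def sigma_at(T):
    return np.interp(np.sqrt(T), np.sqrt(T_grid), sigma_grid)

# temperatures of four core cells from a thermal-hydraulics field
cell_T = np.array([345.0, 512.0, 871.0, 1033.0])
print([f"{sigma_at(T):.3f}" for T in cell_T])
```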
Zradziński, Patryk; Karpowicz, Jolanta; Gryz, Krzysztof; Leszko, Wiesław
2017-06-27
A low frequency magnetic field, inducing an electric field (Ein) inside conductive structures, may directly affect the human body, e.g., by electrostimulation in the nervous system. In addition, the spatial distribution and level of Ein are disturbed in tissues neighbouring a medical implant. Numerical models of a magneto-therapeutic applicator (emitting a sinusoidal magnetic field of frequency 100 Hz) and of the user of a hearing implant (based on bone conduction: Bonebridge type, IS-BB, or BAHA (bone-anchored hearing aid) type, IS-BAHA) were worked out. Values of Ein were analyzed in the model of the implant user's head, e.g., a physiotherapist's, placed next to the applicator. It was demonstrated that the use of IS-BB or IS-BAHA makes electromagnetic hazards significantly higher (up to 4-fold) compared to a person without an implant exposed to a magnetic field heterogeneous in space. Hazards for IS-BAHA users are higher than those for IS-BB users. It was found that, applying the principles of Directive 2013/35/EU, at exposure to magnetic fields below the exposure limits the direct biophysical effects of exposure in hearing prosthesis users may exceed the relevant limits; whereas, applying the principles and limits set up by Polish labor law or the International Commission on Non-Ionizing Radiation Protection (ICNIRP) guidelines, compliance with the exposure limits also ensures compliance with the relevant limits on the electric field induced in the body of a hearing implant user. It is necessary to assess electromagnetic hazards concerning hearing implant users individually, bearing in mind the significantly higher hazards to them compared to persons without implants and the differences between the levels of hazards faced by users of implants of various structural or technological solutions. Med Pr 2017;68(4):469-477. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
Applied Computational Transonic Aerodynamics,
1982-08-01
contributions. Considering first the body integral (2.95), we now have the situation that, with the effect of the boundary layer represented, e.g. through ... effects, (3) static aeroelastic distortion, (4) up to three interfering bodies of nacelle or store type, and (5) an improved method of treating ... tip. To date, no modeling of nacelle or store pylons has been included in this code. In the NLR code [64], the effect of (finite) bodies and wing ...
A graphic user interface for efficient 3D photo-reconstruction based on free software
NASA Astrophysics Data System (ADS)
Castillo, Carlos; James, Michael; Gómez, Jose A.
2015-04-01
Recently, different studies have stressed the applicability of 3D photo-reconstruction based on Structure from Motion algorithms in a wide range of geoscience applications. For the purpose of image photo-reconstruction, a number of commercial and freely available software packages have been developed (e.g., Agisoft Photoscan, VisualSFM). The workflow typically involves different stages such as image matching, sparse and dense photo-reconstruction, point cloud filtering and georeferencing. For approaches using open and free software, each of these stages usually requires a different application. In this communication, we present an easy-to-use graphic user interface (GUI) developed in Matlab® code as a tool for efficient 3D photo-reconstruction making use of powerful existing software: VisualSFM (Wu, 2015) for photo-reconstruction and CloudCompare (Girardeau-Montaut, 2015) for point cloud processing. The GUI performs as a manager of configurations and algorithms, taking advantage of the command-line modes of the existing software, which allows an intuitive and automated processing workflow for the geoscience user. The GUI includes several additional features: a) a routine for significantly reducing the duration of the image matching operation, normally the most time-consuming stage; b) graphical outputs for understanding the overall performance of the algorithm (e.g., camera connectivity, point cloud density); c) a number of useful options typically performed before and after the photo-reconstruction stage (e.g., removal of blurry images, image renaming, vegetation filtering); d) a manager of batch processing for the automated reconstruction of different image datasets. In this study we explore the advantages of this new tool by testing its performance using imagery collected in several soil erosion applications. References: Girardeau-Montaut, D. 2015. CloudCompare documentation, accessed at http://cloudcompare.org/. Wu, C. 2015. VisualSFM documentation, accessed at http://ccwu.me/vsfm/doc.html#.
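The manager-of-command-line-tools design can be sketched with two subprocess calls; the VisualSFM ("sfm+pmvs") and CloudCompare ("-SILENT", "-O", "-SS") invocations follow their documented CLI modes but should be checked against the installed versions, and all paths are illustrative:

```python
import subprocess
from pathlib import Path

# Drive the two external tools from a script rather than reimplementing
# them; the PMVS output location in particular varies with configuration.
images = Path("survey_photos")
out = Path("reconstruction")
out.mkdir(exist_ok=True)

# 1) sparse + dense reconstruction via VisualSFM's command-line mode
subprocess.run(["VisualSFM", "sfm+pmvs", str(images), str(out / "model.nvm")],
               check=True)

# 2) spatially subsample the dense cloud via CloudCompare's command line
dense = out / "model.nvm.cmvs" / "00" / "models" / "option-0000.ply"
subprocess.run(["CloudCompare", "-SILENT", "-O", str(dense),
                "-SS", "SPATIAL", "0.01"],
               check=True)
```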
Developing Toolsets for AirBorne Data (TAD): Overview of Design Concept
NASA Astrophysics Data System (ADS)
Parker, L.; Perez, J.; Chen, G.; Benson, A.; Peeters, M. C.
2013-12-01
NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of the trace gases and aerosol properties. Even though the spatial and temporal coverage is limited, the aircraft data offer high resolution and comprehensive simultaneous coverage of many variables, e.g. ozone precursors, intermediate photochemical species, and photochemical products. The recent NASA Earth Venture Program has generated an unprecedented amount of aircraft observations in terms of the sheer number of measurements and data volume. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community needs for aircraft data for scientific research on climate change and air quality relevant issues, particularly: 1) Provide timely access to a broad user community, 2) Provide an intuitive user interface to facilitate quick discovery of the variables and data, 3) Provide data products and tools to facilitate model assessment activities, e.g., merge files and data subsetting capabilities, 4) Provide simple utility 'calculators', e.g., unit conversion and aerosol size distribution processing, and 5) Provide Web Coverage Service capable tools to enhance the data usability. The general strategy and design of TAD will be presented.
Pretest prediction of Semiscale Test S-07-10B. [PWR]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobbe, C A
A best-estimate prediction of Semiscale Test S-07-10B was performed at INEL by EG&G Idaho as part of the RELAP4/MOD6 code assessment effort and as the Nuclear Regulatory Commission pretest calculation for the Small Break Experiment. The RELAP4/MOD6 Update 4 and RELAP4/MOD7 computer codes were used to analyze Semiscale Test S-07-10B, a 10% communicative cold leg break experiment. The Semiscale Mod-3 system utilized an electrically heated simulated core operating at a power level of 1.94 MW. The initial system pressure and temperature in the upper plenum were 2276 psia and 604°F, respectively.
Purple L1 Milestone Review Panel TotalView Debugger Functionality and Performance for ASC Purple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, M
2006-12-12
ASC code teams require a robust software debugging tool to help developers quickly find bugs in their codes and get their codes running. Development debugging commonly runs up to 512 processes. Production jobs run up to full ASC Purple scale, and at times require introspection while running. Developers want a debugger that runs on all their development and production platforms and that works with all compilers and runtimes used with ASC codes. The TotalView Multiprocess Debugger, made by Etnus, was specified for ASC Purple to address this needed capability. The ASC Purple environment builds on the environment seen by TotalView on ASCI White. The debugger must now operate with the Power5 CPU, Federation switch, AIX 5.3 operating system including large pages, IBM compilers 7 and 9, POE 4.2 parallel environment, and the rs6000 SLURM resource manager. Users require robust, basic debugger functionality with acceptable performance at development debugging scale. A TotalView installation must be provided at the beginning of the early user access period that meets these requirements. A functional enhancement, fast conditional data watchpoints, and a scalability enhancement, capability up to 8192 processes, are to be demonstrated.
Current and anticipated uses of the thermal hydraulics codes at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The focus of thermal-hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from thermal-hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs, by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software, by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run-times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.
Measuring Diagnoses: ICD Code Accuracy
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-01-01
Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999
ERIC Educational Resources Information Center
Simkin, Mark G.
2008-01-01
Data-validation routines enable computer applications to test data to ensure their accuracy, completeness, and conformance to industry or proprietary standards. This paper presents five programming cases that require students to validate five different types of data: (1) simple user data entries, (2) UPC codes, (3) passwords, (4) ISBN numbers, and…
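To make the flavor of such validation routines concrete, here is a minimal sketch (in Python, not tied to any of the assigned cases' languages) of the standard UPC-A check-digit rule: digits in odd positions are weighted by 3, digits in even positions by 1, and the total including the check digit must be divisible by 10.

def upc_a_is_valid(code: str) -> bool:
    # Reject anything that is not exactly 12 decimal digits.
    if len(code) != 12 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    odd_sum = sum(digits[0:11:2])    # positions 1, 3, ..., 11
    even_sum = sum(digits[1:10:2])   # positions 2, 4, ..., 10
    check = (10 - (3 * odd_sum + even_sum) % 10) % 10
    return digits[11] == check

assert upc_a_is_valid("036000291452")   # a commonly cited valid UPC-A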
Veinot, Tiffany C; Campbell, Terrance R; Kruger, Daniel J; Grodzinski, Alison
2013-01-01
We investigated the user requirements of African-American youth (aged 14-24 years) to inform the design of a culturally appropriate, network-based informatics intervention for the prevention of HIV and other sexually transmitted infections (STI). We conducted 10 focus groups with 75 African-American youth from a city with high HIV/STI prevalence. Data analyses involved coding using qualitative content analysis procedures and memo writing. Unexpectedly, the majority of participants' design recommendations concerned trust. Youth expressed distrust towards people and groups, which was amplified within the context of information technology-mediated interactions about HIV/STI. Participants expressed distrust in the reliability of condoms and the accuracy of HIV tests. They questioned the benevolence of many institutions, and some rejected authoritative HIV/STI information. Therefore, reputational information, including rumor, influenced HIV/STI-related decision making. Participants' design requirements also focused on trust-related concerns. Accordingly, we developed a novel trust-centered design framework to guide intervention design. Current approaches to online trust for health informatics do not consider group-level trusting patterns. Yet, trust was the central intervention-relevant issue among African-American youth, suggesting an important focus for culturally informed design. Our design framework incorporates: intervention objectives (eg, network embeddedness, participation); functional specifications (eg, decision support, collective action, credible question and answer services); and interaction design (eg, member control, offline network linkages, optional anonymity). Trust is a critical focus for HIV/STI informatics interventions for young African Americans. Our design framework offers practical, culturally relevant, and systematic guidance to designers to reach this underserved group better.
NASA Technical Reports Server (NTRS)
Steinke, R. J.
1982-01-01
A FORTRAN computer code is presented for off-design performance prediction of axial-flow compressors. Stage and compressor performance is obtained by a stage-stacking method that uses representative velocity diagrams at rotor inlet and outlet meanline radii. The code has options for: (1) direct user input or calculation of nondimensional stage characteristics; (2) adjustment of stage characteristics for off-design speed and blade setting angle; (3) adjustment of rotor deviation angle for off-design conditions; and (4) SI or U.S. customary units. Correlations from experimental data are used to model real flow conditions. Calculations are compared with experimental data.
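As a rough illustration of the stage-stacking idea (a generic sketch, not the NASA code; the constant-characteristic stages and all names are invented for the example), each stage characteristic returns a pressure ratio and adiabatic efficiency, and stages are stacked by feeding each one the exit conditions of its predecessor. A real code would also update the flow coefficient from continuity at each stage inlet.

GAMMA = 1.4  # ratio of specific heats for air

def stack_stages(stages, p_in, t_in, phi):
    """stages: callables mapping a flow coefficient to (PR, efficiency)."""
    p, t = p_in, t_in
    for stage in stages:
        pr, eta = stage(phi)
        t_ideal = t * pr ** ((GAMMA - 1.0) / GAMMA)  # isentropic exit T
        t += (t_ideal - t) / eta                     # actual exit T
        p *= pr
    return p / p_in, t

# toy compressor: eight identical stages, PR 1.3 at 88% efficiency
stages = [lambda phi: (1.3, 0.88)] * 8
overall_pr, t_exit = stack_stages(stages, 101325.0, 288.15, 0.5)
print(f"overall PR = {overall_pr:.2f}, exit T = {t_exit:.0f} K")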
ELEFANT: a user-friendly multipurpose geodynamics code
NASA Astrophysics Data System (ADS)
Thieulot, C.
2014-07-01
A new finite element code for the solution of the Stokes and heat transport equations is presented. It has purposely been designed to address geological flow problems in two and three dimensions at crustal and lithospheric scales. The code relies on the Marker-in-Cell technique: Lagrangian markers are used to track materials in the simulation domain, which allows recording of the integrated history of deformation; their (number) density is variable and dynamically adapted. A variety of rheologies has been implemented, including nonlinear thermally activated dislocation and diffusion creep and brittle (or plastic) frictional models. The code is built on the Arbitrary Lagrangian Eulerian kinematic description: the computational grid deforms vertically and allows for a true free surface while the computational domain remains of constant width in the horizontal direction. The solution to the large system of algebraic equations resulting from the finite element discretisation and linearisation of the set of coupled partial differential equations to be solved is obtained by means of the efficient parallel direct solver MUMPS, whose performance is thoroughly tested, or by means of the WSMP and AGMG iterative solvers. The code accuracy is assessed by means of many geodynamically relevant benchmark experiments which highlight specific features or algorithms, e.g., the implementation of the free surface stabilisation algorithm, the (visco-)plastic rheology implementation, the temperature advection, and the capacity of the code to handle large viscosity contrasts. A two-dimensional application to salt tectonics, presented as a case study, illustrates the potential of the code to model large-scale, high-resolution, thermo-mechanically coupled free surface flows.
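A minimal sketch of the Marker-in-Cell bookkeeping described above (illustrative only; ELEFANT's actual implementation differs, and a production code would use a higher-order time integrator than the forward Euler step shown here):

import numpy as np

def advect_markers(xm, ym, u, v, dx, dy, dt):
    """Advect marker coordinates through a velocity field stored on a
    regular grid, using bilinear interpolation; the markers themselves
    keep carrying their material identity and deformation history."""
    i = np.clip((xm / dx).astype(int), 0, u.shape[0] - 2)
    j = np.clip((ym / dy).astype(int), 0, u.shape[1] - 2)
    fx, fy = xm / dx - i, ym / dy - j
    def bilin(f):
        return ((1 - fx) * (1 - fy) * f[i, j] + fx * (1 - fy) * f[i + 1, j]
                + (1 - fx) * fy * f[i, j + 1] + fx * fy * f[i + 1, j + 1])
    return xm + dt * bilin(u), ym + dt * bilin(v)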
User's Manual for PCSMS (Parallel Complex Sparse Matrix Solver). Version 1.
NASA Technical Reports Server (NTRS)
Reddy, C. J.
2000-01-01
PCSMS (Parallel Complex Sparse Matrix Solver) is a computer code written to make use of existing real sparse direct solvers to solve complex, sparse matrix linear equations. PCSMS converts complex matrices into real matrices and uses real, sparse direct matrix solvers to factor and solve the real matrices. The solution vector is reconverted to complex numbers. Though this utility is written for Silicon Graphics (SGI) real sparse matrix solution routines, it is general in nature and can be easily modified to work with any real sparse matrix solver. The User's Manual is written to acquaint the user with the installation and operation of the code. Driver routines are given to help users integrate PCSMS routines into their own codes.
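The conversion PCSMS performs can be sketched as follows (a dense NumPy illustration of the standard real-equivalent form; PCSMS itself works with sparse storage and SGI solver routines): the complex system (A + iB)(x + iy) = b_r + i*b_i is rewritten as a real block system [[A, -B], [B, A]] [x; y] = [b_r; b_i], solved with a real solver, and the two halves of the solution are recombined.

import numpy as np

def solve_complex_via_real(C, b):
    """Solve C z = b for complex C using only a real solver."""
    n = C.shape[0]
    K = np.block([[C.real, -C.imag],
                  [C.imag,  C.real]])        # real 2n x 2n system
    xy = np.linalg.solve(K, np.concatenate([b.real, b.imag]))
    return xy[:n] + 1j * xy[n:]              # reconvert to complex

C = np.array([[2 + 1j, 1j], [0.5, 3 - 2j]])
b = np.array([1 + 0j, 2 - 1j])
assert np.allclose(solve_complex_via_real(C, b), np.linalg.solve(C, b))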
Tools4miRs – one place to gather all the tools for miRNA analysis
Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr
2016-01-01
Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626
Tools4miRs - one place to gather all the tools for miRNA analysis.
Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr
2016-09-01
MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
ERDA-NASA wind energy project ready to involve users
NASA Technical Reports Server (NTRS)
Thomas, R.; Puthoff, R.; Savino, J.; Johnson, W.
1976-01-01
The NASA contribution to the Wind Energy Project is discussed. NASA is responsible for the following: (1) identification of cost-effective configurations and sizes of wind-conversion systems, (2) the development of technology needed to produce these systems, (3) the design of wind-conversion systems that are compatible with user requirements, particularly utility networks, and (4) technology transfer obtained from the program to stimulate rapid commercial application of wind systems. Various elements of the NASA program are outlined, including industry-built user operation, the evaluation phase, the proposed plan and schedule for site selection and user involvement, supporting research and technology (e.g., energy storage), and component and subsystem technology development.
CFL3D User's Manual (Version 5.0)
NASA Technical Reports Server (NTRS)
Krist, Sherrie L.; Biedron, Robert T.; Rumsey, Christopher L.
1998-01-01
This document is the User's Manual for the CFL3D computer code, a thin-layer Reynolds-averaged Navier-Stokes flow solver for structured multiple-zone grids. Descriptions of the code's input parameters, non-dimensionalizations, file formats, boundary conditions, and equations are included. Sample 2-D and 3-D test cases are also described, and many helpful hints for using the code are provided.
EG and G and NASA face seal codes comparison
NASA Technical Reports Server (NTRS)
Basu, Prit
1994-01-01
This viewgraph presentation presents the following results for the example comparison: the EG&G code with face deformations suppressed and SPIRALG agree well with each other as well as with the experimental data; 0 rpm stiffness data calculated by the EG&G code are about 70-100 percent lower than those calculated by SPIRALG; there is no appreciable difference between 0 rpm and 16,000 rpm stiffness and damping coefficients calculated by SPIRALG; and the film damping above 500 psig calculated by SPIRALG is much higher than the O-ring secondary seal damping (e.g., 50 lbf·s/in).
75 FR 32459 - Notice Announcing Preliminary Permit Drawing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-08
... accordingly. \2\ 18 CFR 4.37 (2009). See, e.g., BPUS Generation Development, LLC, 126 FERC ¶ 61,132 (2009). On... drawing. Kimberly D. Bose, Secretary. [FR Doc. 2010-13559 Filed 6-7-10; 8:45 am] BILLING CODE 6717-01-P ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, John C; Peplow, Douglas E.; Mosher, Scott W
2011-01-01
This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally of order 10² to 10⁴, have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.
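The core CADIS relations can be stated compactly (a discrete sketch under the usual definitions; the production codes compute the adjoint flux with Denovo rather than taking it as given): with a source q and adjoint flux φ† on a space-energy grid, the response estimate is R = Σ q·φ†, the biased source is q·φ†/R, and the weight-window centers are R/φ†, so source particles are born with weights that match their windows.

import numpy as np

def cadis_parameters(q, phi_adj):
    """q, phi_adj: arrays on the same space-energy grid."""
    R = np.sum(q * phi_adj)                  # estimated detector response
    q_biased = q * phi_adj / R               # sums to 1 by construction
    ww_center = np.full_like(phi_adj, np.inf, dtype=float)
    np.divide(R, phi_adj, out=ww_center, where=phi_adj > 0.0)
    return R, q_biased, ww_center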
Development of MPEG standards for 3D and free viewpoint video
NASA Astrophysics Data System (ADS)
Smolic, Aljoscha; Kimata, Hideaki; Vetro, Anthony
2005-11-01
An overview of 3D and free viewpoint video is given in this paper with special focus on related standardization activities in MPEG. Free viewpoint video allows the user to freely navigate within real world visual scenes, as known from virtual worlds in computer graphics. Suitable 3D scene representation formats are classified and the processing chain is explained. Examples are shown for image-based and model-based free viewpoint video systems, highlighting standards conform realization using MPEG-4. Then the principles of 3D video are introduced providing the user with a 3D depth impression of the observed scene. Example systems are described again focusing on their realization based on MPEG-4. Finally multi-view video coding is described as a key component for 3D and free viewpoint video systems. MPEG is currently working on a new standard for multi-view video coding. The conclusion is that the necessary technology including standard media formats for 3D and free viewpoint is available or will be available in the near future, and that there is a clear demand from industry and user side for such applications. 3DTV at home and free viewpoint video on DVD will be available soon, and will create huge new markets.
NIRP Core Software Suite v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitener, Dustin Heath; Folz, Wesley; Vo, Duong
The NIRP Core Software Suite is a core set of code that supports multiple applications. It includes miscellaneous base code for data objects, mathematical equations, and user interface components; the framework also includes several fully developed software applications that exist as stand-alone tools to complement other applications. The stand-alone tools are described below. Analyst Manager: an application to manage contact information for people (analysts) that use the software products. This information is often included in generated reports and may be used to identify the owners of calculations. Radionuclide Viewer: an application for viewing the DCFPAK radiological data; complements the Mixture Manager tool. Mixture Manager: an application to create and manage radionuclide mixtures that are commonly used in other applications. High Explosive Manager: an application to manage explosives and their properties. Chart Viewer: an application to view charts of data (e.g., meteorology charts). Other applications may use this framework to create charts specific to their data needs.
LTCP 2D Graphical User Interface. Application Description and User's Guide
NASA Technical Reports Server (NTRS)
Ball, Robert; Navaz, Homayun K.
1996-01-01
A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.
Zero-forcing pre-coding for MIMO WiMAX transceivers: Performance analysis and implementation issues
NASA Astrophysics Data System (ADS)
Cattoni, A. F.; Le Moullec, Y.; Sacchi, C.
Next generation wireless communication networks are expected to achieve ever-increasing data rates. Multi-User Multiple-Input-Multiple-Output (MU-MIMO) is a key technique for obtaining the expected performance, because it combines the high capacity achievable with a MIMO channel with the benefits of space division multiple access. In MU-MIMO systems, the base stations transmit signals to two or more users over the same channel; as a result, every user can experience inter-user interference. This paper provides a capacity analysis of an online, interference-based pre-coding algorithm able to mitigate the multi-user interference of MU-MIMO systems in the context of a realistic WiMAX application scenario. Simulation results show that pre-coding can significantly increase the channel capacity. Furthermore, the paper presents several feasibility considerations for implementation of the analyzed technique in a possible FPGA-based software defined radio.
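For a concrete picture of the pre-coding itself (a textbook zero-forcing sketch, not the specific online algorithm analyzed in the paper; all parameters below are invented for illustration): with K single-antenna users and an M-antenna base station (M ≥ K), the downlink channel is a K×M matrix H, and the pre-coder W = Hᴴ(HHᴴ)⁻¹ makes HW the identity, cancelling inter-user interference at the cost of a transmit-power penalty.

import numpy as np

rng = np.random.default_rng(0)
K, M = 4, 8                                   # users, transmit antennas
H = (rng.standard_normal((K, M))
     + 1j * rng.standard_normal((K, M))) / np.sqrt(2)  # Rayleigh channel
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)         # ZF pre-coder
assert np.allclose(H @ W, np.eye(K))   # each user sees only its own symbol
# In practice W is rescaled to satisfy the transmit-power constraint,
# which is where the zero-forcing power penalty appears.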
Reporting Codes and Fuel Pathways for the EPA Moderated Transaction System (EMTS)
Users should reference this document for a complete list of all reporting codes and all possible fuel pathways for Renewable Fuel Standard (RFS) and Fuels Averaging, Banking and Trading (ABT) users of the EPA Moderated Transaction System (EMTS).
Uhlirova, Hana; Tian, Peifang; Kılıç, Kıvılcım; Thunemann, Martin; Sridhar, Vishnu B; Chmelik, Radim; Bartsch, Hauke; Dale, Anders M; Devor, Anna; Saisan, Payam A
2018-05-04
The importance of sharing experimental data in neuroscience grows with the amount and complexity of data acquired and the various techniques used to obtain and process these data. However, the majority of experimental data, especially from individual studies of regular-sized laboratories, never reaches the wider research community. A graphical user interface (GUI) engine called Neurovascular Network Explorer 2.0 (NNE 2.0) has been created as a tool for simple and low-cost sharing and exploring of vascular imaging data. NNE 2.0 interacts with a database containing optogenetically evoked dilation/constriction time-courses of individual vessels measured in mouse somatosensory cortex in vivo by 2-photon microscopy. NNE 2.0 enables selection and display of the time-courses based on different criteria (subject, branching order, cortical depth, vessel diameter, arteriolar tree) as well as simple mathematical manipulation (e.g., averaging, peak-normalization) and data export. It supports visualization of the vascular network in 3D and enables localization of the individual functional vessel diameter measurements within vascular trees. NNE 2.0, its source code, and the corresponding database are freely downloadable from the UCSD Neurovascular Imaging Laboratory website. The source code can be utilized by users to explore the associated database or as a template for databasing and sharing their own experimental results, provided the appropriate format is used.
Phonological coding during reading.
Leinenger, Mallorie
2014-11-01
The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Phonological coding during reading
Leinenger, Mallorie
2014-01-01
The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679
Data Mining and Knowledge Discovery tools for exploiting big Earth-Observation data
NASA Astrophysics Data System (ADS)
Espinoza Molina, D.; Datcu, M.
2015-04-01
The continuous increase in the size of the archives and in the variety and complexity of Earth-Observation (EO) sensors require new methodologies and tools that allow the end-user to access a large image repository, to extract and to infer knowledge about the patterns hidden in the images, to retrieve dynamically a collection of relevant images, and to support the creation of emerging applications (e.g.: change detection, global monitoring, disaster and risk management, image time series, etc.). In this context, we are concerned with providing a platform for data mining and knowledge discovery content from EO archives. The platform's goal is to implement a communication channel between Payload Ground Segments and the end-user who receives the content of the data coded in an understandable format associated with semantics that is ready for immediate exploitation. It will provide the user with automated tools to explore and understand the content of highly complex images archives. The challenge lies in the extraction of meaningful information and understanding observations of large extended areas, over long periods of time, with a broad variety of EO imaging sensors in synergy with other related measurements and data. The platform is composed of several components such as 1.) ingestion of EO images and related data providing basic features for image analysis, 2.) query engine based on metadata, semantics and image content, 3.) data mining and knowledge discovery tools for supporting the interpretation and understanding of image content, 4.) semantic definition of the image content via machine learning methods. All these components are integrated and supported by a relational database management system, ensuring the integrity and consistency of Terabytes of Earth Observation data.
Effects of Norfolk Harbor Deepening on Management of Craney Island Disposal Area.
1983-04-01
NASA Technical Reports Server (NTRS)
Sorenson, Reese L.; Alter, Stephen J.
1995-01-01
This document is a users' manual for a new three-dimensional structured multiple-block volume grid generator called 3DGRAPE/AL. It is a significantly improved version of the previously released and widely distributed programs 3DGRAPE and 3DMAGGS. It generates volume grids by iteratively solving the Poisson equations in three dimensions. The right-hand-side terms are designed so that user-specified grid cell heights and user-specified grid cell skewness near boundary surfaces result automatically, with little user intervention. The code is written in Fortran-77, and can be installed with or without a simple graphical user interface which allows the user to watch as the grid is generated. An introduction describing the improvements over the antecedent 3DGRAPE code is presented first. Then follows a chapter on the basic grid generator program itself, with comments on installing it. The input is then described in detail. After that is a description of the graphical user interface. Five example cases are shown next, with plots of the results. Following that is a chapter on two input filters which allow use of input data generated elsewhere. Last is a treatment of the theory embodied in the code.
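The flavor of the iterative solution can be shown with a heavily simplified two-dimensional sketch (a Laplace-limit relaxation with the right-hand-side forcing terms set to zero; the real code's Poisson source terms, which enforce the near-boundary spacing and skewness control described above, and the full cross-derivative coefficients of the inverted equations are omitted):

import numpy as np

def smooth_grid(x, y, iters=500):
    """Relax interior node coordinates of a structured 2-D grid toward
    the average of their four neighbors, with boundaries held fixed."""
    for _ in range(iters):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1]
                                + x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1]
                                + y[1:-1, 2:] + y[1:-1, :-2])
    return x, y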
Improvements in the MGA Code Provide Flexibility and Better Error Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruhter, W D; Kerr, J
2005-05-26
The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.
The Programming Language Python In Earth System Simulations
NASA Astrophysics Data System (ADS)
Gross, L.; Imranullah, A.; Mora, P.; Saez, E.; Smillie, J.; Wang, C.
2004-12-01
Mathematical models in the earth sciences are based on the solution of systems of coupled, non-linear, time-dependent partial differential equations (PDEs). The spatial and temporal scales vary from planetary scale and millions of years for convection problems to 100 km and 10 years for fault system simulations. Various techniques are in use to deal with the time dependency (e.g., Crank-Nicholson), with the non-linearity (e.g., Newton-Raphson) and with weakly coupled equations (e.g., non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g., the finite element method (FEM), the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g., in the case of fault systems) or features in the solution (e.g., in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open, and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment python (see www.python.org). Key concepts introduced are Data objects, which hold values on nodes or elements of the finite element mesh, and linearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we will show the basic concepts of escript and how escript is used to implement a simulation code for interacting fault systems. We will show some results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
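The style of model definition described can be sketched schematically (the class and argument names here are illustrative assumptions patterned on the abstract, not a statement of the actual escript API): a steady heat-conduction problem -div(k grad T) = f is expressed through the coefficients of a linearPDE object and handed to the discretization backend.

class LinearPDE:                 # hypothetical stand-in for the real class
    def __init__(self, domain): self.domain = domain
    def setValue(self, A=None, Y=None, q=None, r=None): ...
    def getSolution(self): ...

def solve_heat(domain, k, f, fixed_mask, fixed_T):
    pde = LinearPDE(domain)
    # A multiplies grad(u) in the flux term, Y is the volume source,
    # q marks and r sets the constrained (Dirichlet) degrees of freedom
    pde.setValue(A=k, Y=f, q=fixed_mask, r=fixed_T)
    return pde.getSolution()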
Computer program BL2D for solving two-dimensional and axisymmetric boundary layers
NASA Technical Reports Server (NTRS)
Iyer, Venkit
1995-01-01
This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of the transition zone and turbulence modeling is also included. The code is a result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard fortran77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.
NASA Astrophysics Data System (ADS)
Kempton, Eliza M.-R.; Lupu, Roxana; Owusu-Asare, Albert; Slough, Patrick; Cale, Bryson
2017-04-01
We present Exo-Transmit, a software package to calculate exoplanet transmission spectra for planets of varied composition. The code is designed to generate spectra of planets with a wide range of atmospheric composition, temperature, surface gravity, and size, and is therefore applicable to exoplanets ranging in mass and size from hot Jupiters down to rocky super-Earths. Spectra can be generated with or without clouds or hazes with options to (1) include an optically thick cloud deck at a user-specified atmospheric pressure or (2) to augment the nominal Rayleigh scattering by a user-specified factor. The Exo-Transmit code is written in C and is extremely easy to use. Typically the user will only need to edit parameters in a single user input file in order to run the code for a planet of their choosing. Exo-Transmit is available publicly on Github with open-source licensing at https://github.com/elizakempton/Exo_Transmit.
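A back-of-envelope companion to such calculations (generic transmission-spectroscopy arithmetic, not Exo-Transmit's radiative transfer; the planet parameters below are invented for illustration): the transit depth is (Rp/Rs)², and spectral features modulate it by roughly a few atmospheric scale heights H = kT/(μg).

K_B = 1.380649e-23     # Boltzmann constant [J/K]
AMU = 1.660539e-27     # atomic mass unit [kg]

def feature_amplitude(r_p, r_s, t_eq, mu_amu, g, n_scale_heights=2.0):
    h = K_B * t_eq / (mu_amu * AMU * g)          # scale height [m]
    depth = (r_p / r_s) ** 2                     # baseline transit depth
    d_depth = 2.0 * n_scale_heights * h * r_p / r_s ** 2
    return depth, d_depth

# illustrative hot Jupiter around a Sun-like star
depth, d_depth = feature_amplitude(7.0e7, 6.96e8, 1500.0, 2.3, 10.0)
print(f"depth ~ {depth * 1e6:.0f} ppm, features ~ {d_depth * 1e6:.0f} ppm")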
A user's manual for the Electromagnetic Surface Patch code: ESP version 3
NASA Technical Reports Server (NTRS)
Newman, E. H.; Dilsavor, R. L.
1987-01-01
This report serves as a user's manual for Version III of the Electromagnetic Surface Patch Code, or ESP code. ESP is user-oriented, based on the method of moments (MM) for treating geometries consisting of an interconnection of thin wires and perfectly conducting polygonal plates. Wire/plate junctions must be about 0.1 lambda or more from any plate edge. Several plates may intersect along a common edge. Excitation may be by either a delta-gap voltage generator or a plane wave. The thin wires may have finite conductivity and may also contain lumped loads. The code computes most of the usual quantities of interest such as current distribution, input impedance, radiation efficiency, mutual coupling, far-zone gain patterns (both polarizations), and radar cross section (both co- and cross-polarizations).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. Q.; Shemon, E. R.; Thomas, J. W.
SHARP is an advanced modeling and simulation toolkit for the analysis of nuclear reactors. It comprises several components, including physical modeling tools, tools to integrate the physics codes for multi-physics analyses, and a set of tools to couple the codes within the MOAB framework. Physics modules currently include the neutronics code PROTEUS, the thermal-hydraulics code Nek5000, and the structural mechanics code Diablo. This manual focuses on performing multi-physics calculations with the SHARP ToolKit. Manuals for the three individual physics modules are available with the SHARP distribution to help the user either carry out the primary multi-physics calculation with basic knowledge or perform further advanced development with in-depth knowledge of these codes. This manual provides step-by-step instructions on employing SHARP, including how to download and install the code, how to build the drivers for a test case, how to perform a calculation, and how to visualize the results. Since SHARP has some specific library and environment dependencies, it is highly recommended that the user read this manual prior to installing SHARP. Verification test cases are included to check proper installation of each module. It is suggested that the new user first follow the step-by-step instructions provided for a test problem in this manual to understand the basic procedure of using SHARP before using SHARP for his/her own analysis. Both reference output and scripts are provided along with the test cases in order to verify correct installation and execution of the SHARP package. At the end of this manual, detailed instructions are provided on how to create a new test case so that the user can perform novel multi-physics calculations with SHARP. Frequently asked questions are listed at the end of this manual to help the user troubleshoot issues.
Use of CCSDS and OSI Protocols on the Advanced Communications Technology Satellite
NASA Technical Reports Server (NTRS)
Chirieleison, Don
1996-01-01
Although ACTS (Advanced Communications Technology Satellite) provides an almost error-free channel during much of the day and under most conditions, there are times when it is not suitable for reliably error-free data communications when operating in the uncoded mode. Because coded operation is not always available to every earth station, measures must be taken in the end system to maintain adequate throughput when transferring data under adverse conditions. The most effective approach that we tested to improve performance was the addition of an 'outer' Reed-Solomon code through use of CCSDS (Consultative Committee for Space Data Systems) GOS 2 (a forward error correcting code). This addition can benefit all users of an ACTS channel, including those applications that do not require totally reliable transport, but it is somewhat expensive because additional hardware is needed. Although we could not characterize the link noise statistically (it appeared to resemble uncorrelated white noise, the type that block codes are least effective in correcting), we did find that CCSDS GOS 2 gave an essentially error-free link at BERs (bit error rates) as high as 6×10⁻⁴. For users that demand reliable transport, an ARQ (Automatic Repeat Queuing) protocol such as TCP (Transmission Control Protocol) or TP4 (Transport Protocol, Class 4) will probably be used. In this category, it comes as no surprise that the best choice of the protocol suites tested over ACTS was TP4 using CCSDS GOS 2. TP4 behaves very well over an error-free link, which GOS 2 provides up to a point. Without forward error correction, however, TP4 service begins to degrade in the 10⁻⁷ to 10⁻⁶ range, and by 4×10⁻⁶ it barely gives any throughput at all. If Congestion Avoidance is used in TP4, the degradation is even more pronounced. Fortunately, as demonstrated here, this effect can be more than compensated for by choosing the Selective Acknowledgment option. In fact, this option can enable TP4 to deliver some throughput at error rates as high as 10⁻⁵.
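Some simple arithmetic shows why the outer code matters (a sketch assuming independent bit errors, consistent with the white-noise observation above; the 1500-byte frame size is an arbitrary illustration): at a BER of 6×10⁻⁴ an uncoded frame almost never arrives intact, while an RS(255,223) codeword (the CCSDS-recommended Reed-Solomon parameters), which corrects up to 16 of its 255 eight-bit symbols, fails only when 17 or more symbols are hit.

from math import comb

BER = 6e-4
p_frame = (1 - BER) ** (1500 * 8)       # uncoded 1500-byte frame survives
print(f"uncoded frame success: {p_frame:.1e}")    # roughly 7.5e-4

p_sym = 1 - (1 - BER) ** 8              # 8-bit symbol error probability
p_cw_fail = sum(comb(255, k) * p_sym ** k * (1 - p_sym) ** (255 - k)
                for k in range(17, 256))
print(f"RS(255,223) codeword failure: {p_cw_fail:.1e}")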
Workflow Management for Complex HEP Analyses
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.
2017-10-01
We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large-scale workflow systems, e.g. Apache's Airavata[1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step may define the execution of a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. those created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the steps' perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross-section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
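A toy of the make-like dependency execution (illustrative only; AWM's real steps, reports, and run locations are far richer than this, and all names below are invented):

done = {}

def run(steps, name):
    """Run a step after its dependencies, memoizing completed results
    the way bookkeeping reports let finished steps be skipped."""
    if name not in done:
        step = steps[name]
        inputs = [run(steps, dep) for dep in step.get("deps", [])]
        done[name] = step["func"](*inputs)
    return done[name]

steps = {
    "select": {"func": lambda: [4, 8, 15]},               # event selection
    "fit":    {"deps": ["select"], "func": lambda ev: sum(ev)},
    "plot":   {"deps": ["select", "fit"],
               "func": lambda ev, s: f"{len(ev)} events, stat = {s}"},
}
print(run(steps, "plot"))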
Operational manual for two-dimensional transonic code TSFOIL
NASA Technical Reports Server (NTRS)
Stahara, S. S.
1978-01-01
This code solves the two-dimensional, transonic, small-disturbance equations for flow past lifting airfoils in both free air and various wind-tunnel environments by using a variant of the finite-difference method. A description of the theoretical and numerical basis of the code is provided, together with complete operating instructions and sample cases for the general user. In addition, a programmer's manual is also presented to assist the user interested in modifying the code. Included in the programmer's manual are a dictionary of subroutine variables in common and a detailed description of each subroutine.
User interfaces for computational science: A domain specific language for OOMMF embedded in Python
NASA Astrophysics Data System (ADS)
Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans
2017-05-01
Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
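The embedded-DSL idea can be sketched as follows (an illustrative toy; the class names and the composition-by-addition shown here are assumptions in the spirit of the paper, not the published interface): the user writes the energy equation as a sum of term objects, and a backend walks that specification to emit an OOMMF input file.

class Term:
    def __init__(self, **params): self.params = params
    def __add__(self, other): return Energy([self, other])

class Energy:
    def __init__(self, terms): self.terms = terms
    def __add__(self, other): return Energy(self.terms + [other])

class Exchange(Term): pass   # params: A, the exchange constant [J/m]
class Demag(Term): pass      # magnetostatic self-energy
class Zeeman(Term): pass     # params: H, the applied field [A/m]

energy = Exchange(A=1.3e-11) + Demag() + Zeeman(H=(8e4, 0, 0))
print([type(t).__name__ for t in energy.terms])  # what a backend would walk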
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, A.W.
1990-04-01
This paper describes an approach to solving air quality problems which frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the time frame expected. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) analysis for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: emission source data files; emission rates contained in spreadsheets; and impact assessment model codes. The spreadsheets contain user-written codes (macros) that calculate emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies.
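The macro arithmetic for a fleet of combustion sources has roughly this shape (an illustrative sketch; the structure follows the factors listed above, and the example numbers are invented):

def emission_rate(n_sources, hp, load_factor, duty_cycle,
                  ef_g_per_hp_h, control_eff=0.0):
    """Emission rate [g/h] = count * horsepower * load factor * duty
    cycle * emission factor [g/hp-h] * (1 - control efficiency)."""
    return (n_sources * hp * load_factor * duty_cycle
            * ef_g_per_hp_h * (1.0 - control_eff))

# e.g. five 300-hp engines at 60% load, 50% duty, EF of 8 g/hp-h
print(emission_rate(5, 300.0, 0.6, 0.5, 8.0), "g/h")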
DFMSPH14: A C-code for the double folding interaction potential of two spherical nuclei
NASA Astrophysics Data System (ADS)
Gontchar, I. I.; Chushnyakova, M. V.
2016-09-01
This is a new version of the DFMSPH code designed to obtain the nucleus-nucleus potential by using the double folding model (DFM) and in particular to find the Coulomb barrier. The new version uses the charge, proton, and neutron density distributions provided by the user. Also, we added an option for fitting the DFM potential by the Gross-Kalinowski profile. The main functionalities of the original code (e.g. the nucleus-nucleus potential as a function of the distance between the centers of mass of colliding nuclei, the Coulomb barrier characteristics, etc.) have not been modified.
Catalog identifier: AEFH_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFH_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License, version 3
No. of lines in distributed program, including test data, etc.: 7211
No. of bytes in distributed program, including test data, etc.: 114404
Distribution format: tar.gz
Programming language: C
Computer: PC and Mac
Operating system: Windows XP and higher, MacOS, Unix/Linux
Memory required to execute with typical data: below 10 Mbyte
Classification: 17.9
Catalog identifier of previous version: AEFH_v1_0
Journal reference of previous version: Comp. Phys. Comm. 181 (2010) 168
Does the new version supersede the previous version?: Yes
Nature of physical problem: The code calculates in a semimicroscopic way the bare interaction potential between two colliding spherical nuclei as a function of the center-of-mass distance. The height and the position of the Coulomb barrier are found. The calculated potential is approximated by an analytical profile (Woods-Saxon or Gross-Kalinowski) near the barrier. Dependence of the barrier parameters upon the characteristics of the effective NN forces (e.g., the range of the exchange part of the nuclear term) can be investigated.
Method of solution: The nucleus-nucleus potential is calculated using the double folding model with the Coulomb and the effective M3Y NN interactions. For the direct parts of the Coulomb and the nuclear terms, the Fourier transform method is used. In order to calculate the exchange parts, the density matrix expansion method is applied.
Typical running time: less than 1 minute.
Reason for new version: Many users asked us how to implement their own density distributions in DFMSPH; now this option has been added. Also, we found that the calculated double-folding potential (DFP) is approximated more accurately by the Gross-Kalinowski (GK) profile, so this option has been added as well.
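For reference, the direct part of the double-folding potential that such codes evaluate has the standard form (a sketch consistent with the method-of-solution description above, in LaTeX notation)

V_{\mathrm{D}}(R) = \int d^{3}r_{1} \int d^{3}r_{2}\, \rho_{1}(\mathbf{r}_{1})\, \rho_{2}(\mathbf{r}_{2})\, v_{NN}\bigl(\lvert \mathbf{R} + \mathbf{r}_{2} - \mathbf{r}_{1} \rvert\bigr),

where the densities are the nucleon (or, for the Coulomb term, charge) distributions and v_NN is the effective M3Y interaction; the exchange part has the same folded structure but with a density- and energy-dependent kernel, handled via the density matrix expansion as stated above.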
RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphreys, S.L.; Miller, L.A.; Monroe, D.K.
1998-04-01
This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
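The removal bookkeeping in such models reduces, for a single well-mixed compartment, to summing rate constants (a generic sketch, not RADTRAD's actual numerics; the example rate constants are invented):

import math

def airborne_activity(a0, t_h, lam_decay, lam_spray, lam_dep, lam_filter):
    """Airborne activity after t_h hours when radioactive decay, spray
    removal, natural deposition, and filtration act simultaneously."""
    lam_total = lam_decay + lam_spray + lam_dep + lam_filter   # [1/h]
    return a0 * math.exp(-lam_total * t_h)

# e.g. sprays at 5/h dominating decay and deposition over two hours
print(airborne_activity(1.0, 2.0, 0.01, 5.0, 0.1, 0.0))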
Issues central to a useful image understanding environment
NASA Astrophysics Data System (ADS)
Beveridge, J. Ross; Draper, Bruce A.; Hanson, Allen R.; Riseman, Edward M.
1992-04-01
A recent DARPA initiative has sparked interest in software environments for computer vision. The goal is a single environment to support both basic research and technology transfer. This paper lays out six fundamental attributes such a system must possess: (1) support for both C and Lisp, (2) extensibility, (3) data sharing, (4) data query facilities tailored to vision, (5) graphics, and (6) code sharing. The first three attributes fundamentally constrain the system design. Support for both C and Lisp demands some form of database or data-store for passing data between languages. Extensibility demands that system support facilities, such as spatial retrieval of data, be readily extended to new user-defined datatypes. Finally, data sharing demands that data saved by one user, including data of a user-defined type, must be readable by another user.
Rocha, Rodrigo dos Santos; Meireles, José Roberto Cardoso; de Moraes Marcílio Cerqueira, Eneida
2014-01-01
Chromosomal damage and apoptosis were analyzed in users of mouthwash and/or alcoholic beverages, using the micronucleus test on exfoliated oral mucosa cells. Samples from four groups of 20 individuals each were analyzed: three exposed groups (EG1, EG2 and EG3) and a control group (CG). EG1 comprised mouthwash users; EG2 comprised drinkers, and EG3 users of both mouthwashes and alcoholic beverages. Cell material was collected by gently scraping the insides of the cheeks. Then the cells were fixed in a methanol/acetic acid (3:1) solution and stained and counterstained, respectively, with Schiff reactive and fast green. Endpoints were computed on 2,000 cells in a blind test. Statistical analysis showed that chromosomal damage and apoptosis were significantly higher in individuals of groups EG1 and EG3 than in controls (p < 0.005 and p < 0.001, respectively). No significant difference in chromosomal damage and apoptosis was observed between the exposed groups. In EG2, only the occurrence of apoptosis was significantly higher than in the controls. These results suggest that mouthwashes alone or in association with alcoholic drinks induce genotoxic effects, manifested as chromosomal damage and apoptosis. They also suggest that alcoholic drinks are effective for stimulating the process of apoptosis. However, these data need to be confirmed in larger samples. PMID:25505845
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baeza, J.A.; Ureba, A.; Jimenez-Ortega, E.
Purpose: Although several radiotherapy research platforms exist, such as CERR, the most widely used and referenced; SlicerRT, which allows treatment plan comparison from various sources; and MMCTP, a full MCTP system; a full MCTP toolset is still needed that provides users complete control of calculation grids, interpolation methods and filters in order to "fairly" compare results from different TPSs, supporting verification with experimental measurements. Methods: This work presents CARMEN, a MatLab-based platform including multicore and GPGPU accelerated functions for loading RT data, designing treatment plans, and evaluating dose matrices and experimental data. CARMEN supports anatomic and functional imaging in DICOM format, as well as RTSTRUCT, RTPLAN and RTDOSE. Besides, it contains numerous tools to accomplish the MCTP process, managing egs4phant and phase space files. CARMEN's planning mode assists in designing IMRT, VMAT and MERT treatments via both inverse and direct optimization. The evaluation mode contains a comprehensive toolset (e.g. 2D/3D gamma evaluation, difference matrices, profiles, DVH, etc.) to compare datasets from commercial TPSs, MC simulations (i.e. 3ddose) and radiochromic film in a user-controlled manner. Results: CARMEN has been validated against commercial TPSs and well-established evaluation tools, showing coherent behavior of its multiple algorithms. Furthermore, the CARMEN platform has been used to generate competitive complex treatments that have been published in comparative studies. Conclusion: A new research-oriented MCTP platform with a customized validation toolset has been presented. Despite being coded in a high-level programming language, CARMEN is agile due to the use of parallel algorithms. The widespread use of MatLab provides straightforward access to CARMEN's algorithms for most researchers. Similarly, our platform can benefit from the MatLab community's scientific developments, such as filters, registration algorithms, etc. Finally, CARMEN underscores the importance of grid and filtering control in treatment plan comparison.
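The gamma evaluation named above can be sketched in one dimension (the 2D/3D versions in the toolset generalize the same metric; the 3 mm / 3% criteria below are a common choice, used here only for illustration): for each evaluated point, gamma is the minimum over reference points of sqrt((Δr/DTA)² + (ΔD/ΔD_crit)²), and gamma ≤ 1 counts as a pass.

import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta=3.0, dd=0.03):
    """Brute-force 1-D gamma index; doses assumed normalized to 1."""
    g = np.empty_like(d_eval)
    for k, (x, d) in enumerate(zip(x_eval, d_eval)):
        g[k] = np.min(np.sqrt(((x_ref - x) / dta) ** 2
                              + ((d_ref - d) / dd) ** 2))
    return g

x = np.linspace(0.0, 100.0, 201)               # position [mm]
ref = np.exp(-((x - 50.0) / 20.0) ** 2)        # reference profile
ev = 1.01 * np.exp(-((x - 51.0) / 20.0) ** 2)  # 1 mm shift, 1% scaling
print(f"pass rate: {(gamma_1d(x, ref, x, ev) <= 1.0).mean():.1%}")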
NEQAIR96,Nonequilibrium and Equilibrium Radiative Transport and Spectra Program: User's Manual
NASA Technical Reports Server (NTRS)
Whiting, Ellis E.; Park, Chul; Liu, Yen; Arnold, James O.; Paterson, John A.
1996-01-01
This document is the User's Manual for a new version of the NEQAIR computer program, NEQAIR96. The program is a line-by-line and line-of-sight code. It calculates the emission and absorption spectra for atoms and diatomic molecules and the transport of radiation through a nonuniform gas mixture to a surface. The program has been rewritten to make it easier to use, run faster, and include many run-time options that tailor a calculation to the user's requirements. The accuracy and capability have also been improved by including the rotational Hamiltonian matrix formalism for calculating rotational energy levels and Hoenl-London factors for dipole- and spin-allowed singlet, doublet, triplet, and quartet transitions. Three sample cases are included to help the user become familiar with the steps taken to produce a spectrum. A new user interface that uses check locations to select run-time options and to enter selected run data makes NEQAIR96 easier to use than older versions of the code. The ease of its use and the speed of its algorithms make NEQAIR96 a valuable educational code as well as a practical spectroscopic prediction and diagnostic code.
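The line-of-sight transport such a code performs can be illustrated with a much-simplified discretized sketch; this is not NEQAIR96's algorithm, just optically thin emission per cell attenuated on the way to the surface, with made-up coefficients:

```python
import numpy as np

def intensity_at_surface(emission, absorption, ds):
    """Spectral intensity reaching a surface along one line of sight.

    emission:   emission coefficient per cell (arbitrary units)
    absorption: absorption coefficient per cell [1/m]
    ds:         cell path length [m]; cell 0 is adjacent to the surface
    """
    # optical depth from the surface to the near edge of each cell
    tau = np.cumsum(absorption * ds) - absorption * ds
    # each cell's emission is attenuated by everything between it and the surface
    return np.sum(emission * ds * np.exp(-tau))

# nonuniform slab: a hot emitting layer behind a cooler absorbing layer
em = np.array([0.0, 0.0, 5.0, 5.0])   # emission per cell
ab = np.array([2.0, 2.0, 0.1, 0.1])   # absorption per cell (1/m)
print(intensity_at_surface(em, ab, ds=0.01))
```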
The Navy/NASA Engine Program (NNEP89): A user's manual
NASA Technical Reports Server (NTRS)
Plencner, Robert M.; Snyder, Christopher A.
1991-01-01
An engine simulation computer code called NNEP89 was written to perform 1-D steady-state thermodynamic analysis of turbine engine cycles. Using a very flexible method of input, a set of standard components is connected at execution time to simulate almost any turbine engine configuration the user could imagine. The code has been used to simulate a wide range of engine cycles, from turboshafts and turboprops to air turborockets and supersonic cruise variable cycle engines. Off-design performance is calculated through the use of component performance maps. A chemical equilibrium model is incorporated to adequately predict chemical dissociation and to model virtually any fuel. NNEP89 is written in standard FORTRAN77 with clear structured programming and extensive internal documentation. The standard FORTRAN77 programming allows it to be installed on most mainframe computers and workstations without modification. The NNEP89 code was derived from the Navy/NASA Engine Program (NNEP). NNEP89 provides many improvements and enhancements to the original NNEP code and incorporates features that make it easier for the novice user. This is a comprehensive user's guide for the NNEP89 code.
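The component-by-component bookkeeping such a cycle code performs can be illustrated by a design-point compressor/turbine work balance. This sketch is not NNEP89 code; it uses air properties throughout for simplicity, and all numbers are illustrative:

```python
# Design-point work balance between compressor and turbine for an ideal gas.
cp, gam = 1005.0, 1.4                     # J/(kg K), ratio of specific heats (air)
T2, PR_c, eta_c, eta_t = 288.0, 12.0, 0.85, 0.90

T3s = T2 * PR_c ** ((gam - 1.0) / gam)    # isentropic compressor exit temperature
w_comp = cp * (T3s - T2) / eta_c          # actual specific compressor work, J/kg

T4 = 1500.0                               # turbine inlet temperature, K
dT_turb = w_comp / cp                     # actual turbine temperature drop (work balance)
T5s = T4 - dT_turb / eta_t                # isentropic exit temperature via efficiency
PR_t = (T5s / T4) ** (gam / (gam - 1.0))  # turbine expansion pressure ratio p5/p4

print(f"compressor work {w_comp/1e3:.1f} kJ/kg, turbine pressure ratio {PR_t:.3f}")
```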
Jabłoński, Michał; Starčuková, Jana; Starčuk, Zenon
2017-01-23
Proton magnetic resonance spectroscopy is a non-invasive measurement technique which provides information about the concentrations of up to 20 metabolites participating in intracellular biochemical processes. In order to obtain any metabolic information from measured spectra, processing must be done in specialized software such as jMRUI. The processing is interactive and complex and often requires many trials before a correct result is obtained. This paper proposes a jMRUI enhancement for efficient and unambiguous history tracking and file identification. A database storing all processing steps, parameters, and files used in processing was developed for jMRUI. The solution was developed in Java; the authors used an SQL database for robust storage of parameters and SHA-256 hash codes for unambiguous file identification. The developed system is integrated directly in jMRUI and will be publicly available. A graphical user interface was implemented in order to make the user experience more comfortable. The database operation is invisible from the point of view of the common user; all tracking operations are performed in the background. The implemented jMRUI database is a tool that can significantly help the user track the processing history performed on data in jMRUI. The created tool is oriented to be user-friendly, robust, and easy to use. The database GUI allows the user to browse the whole processing history of a selected file and to learn, e.g., what processing led to the results, where the original data are stored, and the list of all processing actions performed on spectra.
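The hashing-plus-database idea can be sketched in a few lines. This is not the jMRUI Java implementation; the table layout and function names are hypothetical, and SQLite stands in for the SQL database:

```python
import hashlib
import pathlib
import sqlite3

def sha256_of(path: pathlib.Path) -> str:
    """Hash a file in chunks so large spectra need not be loaded at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

db = sqlite3.connect("history.db")
db.execute("""CREATE TABLE IF NOT EXISTS steps (
    file_hash TEXT, action TEXT, params TEXT,
    ts DATETIME DEFAULT CURRENT_TIMESTAMP)""")

def record_step(path: str, action: str, params: str) -> None:
    """Log one processing action against the unambiguous file identity."""
    db.execute("INSERT INTO steps (file_hash, action, params) VALUES (?, ?, ?)",
               (sha256_of(pathlib.Path(path)), action, params))
    db.commit()

# record_step("spectrum.mrui", "apodization", "lorentzian, 5 Hz")
```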
Studying User Income through Language, Behaviour and Affect in Social Media
Preoţiuc-Pietro, Daniel; Volkova, Svitlana; Lampos, Vasileios; Bachrach, Yoram; Aletras, Nikolaos
2015-01-01
Automatically inferring user demographics from social media posts is useful for both social science research and a range of downstream applications in marketing and politics. We present the first extensive study where user behaviour on Twitter is used to build a predictive model of income. We apply non-linear methods for regression, i.e. Gaussian Processes, achieving strong correlation between predicted and actual user income. This allows us to shed light on the factors that characterise income on Twitter and analyse their interplay with user emotions and sentiment, perceived psycho-demographics and language use expressed through the topics of their posts. Our analysis uncovers correlations between different feature categories and income, some of which reflect common beliefs (e.g. higher perceived education and intelligence indicate higher earnings) and known differences (e.g. gender and age), while others are novel findings: for example, higher income users express more fear and anger, whereas lower income users more often express emotions and opinions. PMID:26394145
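A minimal sketch of the regression approach, using scikit-learn's Gaussian process regressor on synthetic stand-in features; the paper's actual features and data are not reproduced here:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # per-user features (topics, sentiment, ...)
# synthetic income proxy with a known linear signal plus noise
y = X @ np.array([0.8, -0.3, 0.5, 0.0, 0.2]) + 0.1 * rng.normal(size=200)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(),
                              normalize_y=True)
gp.fit(X[:150], y[:150])               # train on 150 users
pred = gp.predict(X[150:])             # predict for held-out users
print(np.corrcoef(pred, y[150:])[0, 1])  # correlation of predicted vs. actual
```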
A Risk-Continuum Categorization of Product Use Among US Youth Tobacco Users
El-Toukhy, Sherine
2016-01-01
Introduction: To examine prevalence and correlates of five mutually exclusive tobacco-use patterns among US youth tobacco users. Methods: A nationally representative sample of tobacco users (N = 3202, 9–17 years) was classified into five product-use patterns. Weighted multinomial and multivariate logistic regression models were used to examine the prevalence of product-use patterns by gender, race and ethnicity, and grade level, and the associations between product-use patterns and perceived accessibility of tobacco products, exposure and receptivity to pro-tobacco marketing, social benefits of smoking, and tobacco-associated risks. Results: Dual use (ie, use of two product categories) was the most prevalent pattern (30.5%), followed by non-cigarette combustible only (26.7%), polytobacco (ie, use of three product categories; 17.5%), cigarette only (14.9%), and noncombustible only (10.4%) use. Product-use patterns differed by gender, race, and ethnicity. Compared to cigarette-only users, dual and polytobacco users were more likely to be exposed to and receptive to pro-tobacco marketing, and were less likely to acknowledge tobacco-use-related risks (Ps < .05). Conclusions: Curbing tobacco use warrants research on users of more than one tobacco-product category according to the risk-continuum categorization. Implications: We present a risk-continuum categorization of product-use patterns among tobacco users not older than 17 years. We classify tobacco users into five mutually exclusive product-use patterns: cigarette-only users, non-cigarette combustible-only users, noncombustible-only users, dual users, and polytobacco users. This categorization overcomes limitations in the current literature on tobacco-use patterns, which include the exclusion of certain products (eg, e-cigarettes) and product-use patterns (eg, exclusive users of non-cigarette products), and inconsistent classification of tobacco users. It is parsimonious yet complex enough to retain the differential characteristics of subgroups of tobacco users based on the number (single, dual, polytobacco) and categories (cigarettes, non-cigarette combustibles, noncombustibles) of tobacco products consumed. PMID:26764258
NASA Astrophysics Data System (ADS)
Gao, Shanghua; Fu, Guangyu; Liu, Tai; Zhang, Guoqing
2017-03-01
Tanaka et al. (Geophys J Int 164:273-289, 2006; Geophys J Int 170:1031-1052, 2007) proposed the spherical dislocation theory (SDT) for a spherically symmetric, self-gravitating visco-elastic earth model. However, to date there have been no reports of easily adopted, widely used software that utilizes Tanaka's theory. In this study we introduce a new code to compute post-seismic deformations (PSD), including displacements as well as geoid and gravity changes, caused by a seismic source at any position. This new code is based on the above-mentioned SDT. The code consists of two parts. The first part is the numerical frame of the dislocation Green function (DGF), which contains a set of two-dimensional discrete numerical frames of DGFs on a symmetric earth model. The second part is an integration function, which performs bi-quadratic spline interpolation operations on the frame of DGFs. The inputs are the information on the seismic fault models and on the observation points. After the user prepares the inputs in a file with the given format, the code automatically computes the PSD. As an example, we use the new code to calculate the co-seismic displacements caused by the Tohoku-Oki Mw 9.0 earthquake. We compare the result with observations and with the result from a full-elastic SDT, and find that the root mean square error between the calculated and observed results is 7.4 cm, which verifies the suitability of our new code. Finally, we discuss several issues that require attention when using the code, which should be helpful for users.
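The second part of the code, spline interpolation over a precomputed grid of Green functions, can be sketched with SciPy. The grid values below are placeholders, not real DGFs; kx=ky=2 gives the bi-quadratic order mentioned in the abstract:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Precomputed Green-function grid: value vs. source depth and epicentral distance.
depths = np.linspace(5.0, 60.0, 12)        # km (hypothetical grid)
dists = np.linspace(0.1, 20.0, 40)         # degrees (hypothetical grid)
grid = np.exp(-dists[None, :] / 5.0) / depths[:, None]   # stand-in for tabulated DGFs

# bi-quadratic spline in both directions
spline = RectBivariateSpline(depths, dists, grid, kx=2, ky=2)
print(spline(23.7, 4.2)[0, 0])             # DGF at an off-grid (depth, distance) point
```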
Levin, David C; Rao, Vijay M; Parker, Laurence; Frangos, Andrea J; Sunshine, Jonathan H
2011-01-01
Radiologists have always been considered the physicians who "control" noninvasive diagnostic imaging (NDI) and are primarily responsible for its growth. Yet nonradiologists have become increasingly aggressive in their performance and interpretation of imaging. The purpose of this study was to track overall Medicare payments to radiologists and nonradiologist physicians in recent years. The Medicare Part B files covering all fee-for-service physician payments for 1998 to 2008 were the data source. All codes for discretionary NDI were selected. Procedures mandated by the patient's clinical condition (eg, supervision and interpretation codes for interventional procedures, radiation therapy planning) were excluded, as were nonimaging radionuclide tests. Medicare physician specialty codes were used to identify radiologists and nonradiologists. Payments in all places of service were included. Overall Medicare NDI payments to radiologists and nonradiologist physicians from 1998 through 2008 were compared. A separate analysis of NDI payments to cardiologists was conducted, because next to radiologists, they are the highest users of imaging. In 1998, overall Part B payments to radiologists for discretionary NDI were $2.563 billion, compared with $2.020 billion to nonradiologists (ie, radiologists' payments were 27% higher). From 1998 to 2006, payments to nonradiologists increased by 166%, compared with 107% to radiologists. By 2006, payments to nonradiologists exceeded those to radiologists. By 2008, the second year after implementation of the Deficit Reduction Act, payments to radiologists had dropped by 13%, compared with 11% to nonradiologists. In 2008, nonradiologists received $4.807 billion for discretionary NDI, and radiologists received $4.638 billion. Payments to cardiologists for NDI increased by 195% from 1998 to 2006, then dropped by 8% by 2008. The growth in fee-for-service payments to nonradiologists for NDI was considerably more rapid than the growth for radiologists between 1998 and 2006. Then, by the end of 2008, 2 years after the implementation of the Deficit Reduction Act, steeper revenue losses had been experienced by radiologists. The result was that by 2008, overall Medicare fee-for-service payments for NDI were 4% higher to nonradiologists than they were to radiologists. Copyright © 2011 American College of Radiology. Published by Elsevier Inc. All rights reserved.
SIERRA Code Coupling Module: Arpeggio User Manual Version 4.44
DOE Office of Scientific and Technical Information (OSTI.GOV)
Subia, Samuel R.; Overfelt, James R.; Baur, David G.
2017-04-01
The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly, Arpeggio orchestrates the execution of applications that participate in the coupling. This document describes the various components of Arpeggio and their operability. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.
Interactive QR code beautification with full background image embedding
NASA Astrophysics Data System (ADS)
Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo
2017-06-01
QR (Quick Response) code is a kind of two-dimensional barcode first developed in the automotive industry. Nowadays, QR codes are widely used in commercial applications such as product promotion, mobile payment, and product information management. Traditional QR codes conforming to the international standard are reliable and fast to decode, but lack the aesthetic appeal needed to convey visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts interactive user strokes as hints to remove undesired parts of the QR code modules, relying on the QR code error-correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and thus achieves results that are more pleasing to users, while keeping high machine readability.
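The error-correction headroom such methods rely on can be demonstrated with the third-party Python qrcode package (Pillow required for image output); the payload and file name are placeholders, and the beautification step itself is not shown:

```python
import qrcode

# Error-correction level H tolerates roughly 30% codeword damage, which is the
# headroom that beautification methods spend when removing or restyling modules.
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H,
                   box_size=8, border=4)
qr.add_data("https://example.com/product")   # placeholder payload
qr.make(fit=True)                            # smallest QR version that fits
img = qr.make_image(fill_color="black", back_color="white")
img.save("promo_qr.png")                     # still decodable after limited edits
```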
In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for EV-CIS, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and EV-CIS Submitters.
Flexible Generation of Kalman Filter Code
NASA Technical Reports Server (NTRS)
Richardson, Julian; Wilson, Edward
2006-01-01
Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce estimates comparable to those produced by a hand-coded estimator.
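For reference, the kind of filter such a system synthesizes reduces to the standard linear predict/update recursion. This is a generic hand-written sketch, not AUTOFILTER output, with an illustrative 1-D constant-velocity model:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter."""
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# constant-velocity model, position measurements only
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[0.05]])
x, P = np.zeros(2), np.eye(2)
for z in [0.11, 0.22, 0.28, 0.40]:       # noisy position readings
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(x)                                 # estimated [position, velocity]
```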
Adolescent dual-product users: Acquisition and situational use of cigarettes and cigars.
Trapl, Erika S; Koopman Gonzalez, Sarah J; Fryer, Craig S
2018-05-16
Little is known about how adolescents who smoke both cigarettes and cigar products obtain and use these products. This study sought to explore cigarette and cigar acquisition and situational use among high school smokers. Data are drawn from the 2011 Cuyahoga County Youth Risk Behavior Survey. Analysis was limited to youth who smoked cigarettes as well as cigars, cigarillos, and little cigars (CCLC) in the past month (N = 649). Consumption of both products was calculated and used to create four subtypes of users based on high or low use of each product (Dual High, Dual Low, High CCLC/Low Cigarette, and Low CCLC/High Cigarette users). Current users were asked to identify situations in which they use cigarettes and CCLC and ways in which they obtain these products. Data were analyzed overall and by user subtype. Youth reported acquiring cigarettes and CCLC in similar ways, although youth were more likely to take cigarettes from family members than CCLC (11.1% vs. 4.8%). Several differences were observed between cigarettes and CCLC in situational use. While both products are frequently used in social situations (e.g., with friends), cigarettes were more likely to be used in solitary situations (e.g., before bed). Further, significant differences were observed among the four user subtypes. Study results highlight important, nuanced differences in how young multi-tobacco users obtain such products and the situations in which they use them. Importantly, these findings vary by user subtype, informing future interventions to prevent and reduce smoking among the most vulnerable subgroups of youth. Copyright © 2018 Elsevier B.V. All rights reserved.
Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei
2017-01-01
Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
Intrinsic Radiation Source Generation with the ISC Package: Data Comparisons and Benchmarking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solomon, Clell J. Jr.
The characterization of radioactive emissions from unstable isotopes (intrinsic radiation) is necessary for shielding and radiological-dose calculations for radioactive materials. While most radiation transport codes, e.g., MCNP [X-5 Monte Carlo Team, 2003], provide the capability to input user-prescribed source definitions, such as radioactive emissions, they do not provide the capability to calculate the correct radioactive-source definition given the material compositions. Special modifications to MCNP have been developed in the past to allow the user to specify an intrinsic source, but these modifications have not been implemented into the primary source base [Estes et al., 1988]. To facilitate the description of the intrinsic radiation source from a material with a specific composition, the Intrinsic Source Constructor library (LIBISC) and MCNP Intrinsic Source Constructor (MISC) utility have been written. The combination of LIBISC and MISC is herein referred to as the ISC package. LIBISC is a statically linkable C++ library that provides the functionality needed to construct the intrinsic-radiation source generated by a material. Furthermore, LIBISC provides the ability to use different particle-emission databases, radioactive-decay databases, and natural-abundance databases, allowing the user flexibility in the specification of the source if one database is preferred over others. LIBISC also provides functionality for aging materials and for producing a thick-target bremsstrahlung photon-source approximation from the electron emissions. The MISC utility links to LIBISC and facilitates the description of intrinsic-radiation sources in a format directly usable with the MCNP transport code. Through a series of input keywords and arguments the MISC user can specify the material, age the material if desired, and produce a source description of the radioactive emissions from the material in an MCNP-readable format. Further details on using the MISC utility can be obtained from the user guide [Solomon, 2012]. The remainder of this report presents a discussion of the databases available to LIBISC and MISC, a discussion of the models employed by LIBISC, a comparison of the thick-target bremsstrahlung model employed, a benchmark comparison to plutonium and depleted-uranium spheres, and a comparison of the available particle-emission databases.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Advisory... by teleconference. Please dial (877) 930-8819 and enter code 1579739. Web links: Windows Connection-2: http://wm.onlinevideoservice.com/CDC2 Flash Connection-4 (For Safari and Google Chrome Users): http...
PRESTO Digital Computer Code User’s Guide. Volume I. System Overview. Revision A.
1980-10-31
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... when discharges change as a result of the use of an alternative methodology or data for computing flood... land use regulation. (ii) It must be well-documented including source codes and user's manuals. (iii... projects that may effect map changes when they are completed. (4) The datum and date of releveling of...
Applicable Testing and Associated Challenges
DOT National Transportation Integrated Search
2014-12-04
NovAtel Context to Set 1 GPS L1 Only - NovAtel receivers are wideband, at a minimum of 20MHz to adequately capture the full L1 CA main lobe - To achieve 4 cm code and 0.5 mm carrier phase measurements on GPS L1 - GPS L1 only users are typically a SW ...
Elan4/SPARC V9 Cross Loader and Dynamic Linker
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebaillif-Delamare, Fabien; Petrini, Fabrizio
2004-10-25
The Elan4/SPARC V9 Cross Loader and Linker is part of the Linux system software that allows the dynamic loading and linking of user code on the Quadrics QsNETII network interface, also called Elan4. Elan4 uses a thread processor based on the SPARC V9 assembly instruction set. The software is integrated as a Linux kernel module in the Linux 2.6.5 release.
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy
NASA Astrophysics Data System (ADS)
Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.
2016-12-01
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)3 voxels) and eye plaque (with (1 mm)3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
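The dose calculations egs_brachy performs are full Monte Carlo, but the TG-43 formalism it is benchmarked against can be sketched for a point source; all source parameters and tabulated values below are made up for illustration and are not data for any real source:

```python
import numpy as np

# Point-source TG-43 dose rate: D(r) = Sk * Lambda * (r0/r)^2 * g(r) * phi_an(r)
Sk, Lam, r0 = 1.0, 1.1, 1.0     # air-kerma strength (U), dose-rate constant
                                # (cGy h^-1 U^-1), reference distance (cm)
r_tab = np.array([0.5, 1.0, 2.0, 3.0, 5.0])        # tabulated radii, cm
g_tab = np.array([1.04, 1.00, 0.87, 0.75, 0.50])   # radial dose function (made up)

def dose_rate(r, phi_an=1.0):
    """Point-source dose rate at radius r (cm), cGy/h."""
    g = np.interp(r, r_tab, g_tab)                 # interpolate tabulated g(r)
    return Sk * Lam * (r0 / r) ** 2 * g * phi_an

print(dose_rate(2.5))
```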
EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual
NASA Technical Reports Server (NTRS)
Raju, M. S.
1998-01-01
EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing, and unstructured grids. It is designed to be massively parallel and can easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure, as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.
Warner, Echo L; Ellington, Lee; Kirchhoff, Anne C; Cloyes, Kristin G
2018-04-01
Social media (SM) is a burgeoning source of social support for young adults (YAs). We explored the language used to communicate about YA cancer on Instagram and indicators of social support (i.e., numbers of likes and comments). Instagram posts using #youngadultcancer were randomly selected (N = 50). Text and hashtags were collected, and posts were coded for gender (female and male), treatment status (active treatment and survivorship), type of user (individual and organization), and caregiver status (yes and no). Indicators of social support, valence (e.g., positive vs. negative terms), and lexical content (e.g., emotional terms and pronouns) were measured using Yoshikoder and Linguistic Inquiry Word Count and compared by gender, treatment status, type of user, and caregiver status. Survivors' posts had more likes than those of users in active treatment (mean: 54.5 vs. 32.3, p = 0.03). Individuals' posts had more comments than those of organizations (mean: 5.3 vs. 1.2, p = 0.01). More positive (30%) than negative (13%) terms were used by survivors (p < 0.01) and by those in active treatment (20% vs. 9%, p = 0.04). Individuals used more positive than negative language (p < 0.01), whereas organizations used positive and negative terms equally. Survivors used more emotional terms (79.6% vs. 34.9%, p < 0.01) and fewer pronouns (mean: 39.5 vs. 71.7, p = 0.01) than those in active treatment. Organizations (71.0%) used more emotional terms than individuals (55.9%, p = 0.03). We describe how Instagram users communicate about YA cancer and whether the language they use garners social support. Studying online language use may help YA patients, caregivers, and organizations use SM to gain social support.
PETSc Users Manual Revision 3.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.; Brown, J.; Buschelman, K.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear and nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms needed within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran, and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than "rolling them" yourself. For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to "try out" the PETSc solvers; the resulting code will not be scalable, however, because MATLAB is currently inherently not scalable. Further, PETSc should not be used to attempt to provide a "parallel linear solver" in an otherwise sequential code: certainly not all parts of a previously sequential code need be parallelized, but the matrix-generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then "use PETSc" to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.
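A minimal serial sketch using petsc4py, the Python bindings mentioned above; the matrix, solver type, and preconditioner are chosen only for illustration, and a real parallel code would assemble only its locally owned rows:

```python
from petsc4py import PETSc

n = 8
A = PETSc.Mat().createAIJ([n, n])
A.setUp()
for i in range(n):                      # assemble a 1-D Laplacian row by row
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = PETSc.Vec().createSeq(n)
b.set(1.0)                              # right-hand side of all ones
x = b.duplicate()

ksp = PETSc.KSP().create()              # Krylov solver with a simple preconditioner
ksp.setOperators(A)
ksp.setType('cg')
ksp.getPC().setType('jacobi')
ksp.solve(b, x)
print(x.getArray())
```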
32 CFR 623.4 - Accounting procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... or O) indicating which loan of the day is first; e.g., A-first, B-second, etc. 51 “M”. 52-53 “G4” for... commercial bills of lading (CBL). Freight charges will be paid by the borrower. The CBL will cite proper project codes. NOTE: In emergencies where use of CBL would delay shipment, government bills of lading (GBL...
32 CFR 623.4 - Accounting procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... or O) indicating which loan of the day is first; e.g., A-first, B-second, etc. 51 “M”. 52-53 “G4” for... commercial bills of lading (CBL). Freight charges will be paid by the borrower. The CBL will cite proper project codes. NOTE: In emergencies where use of CBL would delay shipment, government bills of lading (GBL...
Optimal Codes for the Burst Erasure Channel
NASA Technical Reports Server (NTRS)
Hamkins, Jon
2010-01-01
Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure protection. As can be seen, the simple interleaved RS codes have substantially lower inefficiency over a wide range of transmission lengths.
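The block-interleaved MDS construction can be demonstrated with the simplest MDS code, a single parity check. This sketch, with hypothetical symbol counts and interleave depth, recovers a burst of D consecutive erasures by spreading it across D codewords, one erasure each:

```python
import numpy as np

K, D = 7, 5          # data symbols per codeword, interleave depth

def encode(data):
    """data: D x K array of byte symbols -> interleaved stream of D*(K+1) symbols."""
    parity = np.bitwise_xor.reduce(data, axis=1, keepdims=True)  # SPC parity
    cw = np.concatenate([data, parity], axis=1)   # D codewords of length K+1
    return cw.T.reshape(-1)                       # transmit column-wise (interleaved)

def decode(stream, erased):
    """Repair up to one erasure per codeword, then return the data symbols."""
    cw = stream.reshape(K + 1, D).T.copy()
    lost = np.zeros((D, K + 1), dtype=bool)
    for e in erased:
        lost[e % D, e // D] = True                # stream position -> (codeword, symbol)
    for row in range(D):
        idx = np.flatnonzero(lost[row])
        if len(idx) == 1:                         # SPC repairs exactly one erasure
            others = [c for c in range(K + 1) if c != idx[0]]
            cw[row, idx[0]] = np.bitwise_xor.reduce(cw[row, others])
    return cw[:, :K]

data = np.arange(D * K).reshape(D, K) % 256
tx = encode(data)
rx = tx.copy()
burst = list(range(12, 12 + D))                   # a burst of D consecutive erasures
rx[burst] = 0
assert (decode(rx, burst) == data).all()          # data fully recovered
```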
Direct-Y: Fast Acquisition of the GPS PPS Signal
NASA Technical Reports Server (NTRS)
Namoos, Omar M.; DiEsposti, Raymond S.
1996-01-01
The NAVSTAR Global Positioning System (GPS) provides positioning and time information to military users via the Precise Positioning Service (PPS), which typically allows users a significant margin of precision over the commercially available Standard Positioning Service (SPS). Military sets that rely on first acquiring the SPS Coarse/Acquisition (C/A) code read from the data message the handover word (HOW), which provides the time-of-signal transmission needed to acquire and lock onto the PPS Y-code. Under extreme battlefield conditions, the use of GPS would be denied to the warfighter who cannot pick up the unencrypted C/A code. Studies are underway at the GPS Joint Program Office (JPO) at the Space and Missile Center, Los Angeles Air Force Base, aimed at developing the capability to directly acquire the Y-code without first acquiring the C/A code. This paper briefly outlines efforts to develop 'direct-Y' acquisition and various approaches to solving this problem. The potential ramifications of direct-Y for military users are also discussed.
Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller
NASA Astrophysics Data System (ADS)
Perdikis, S.; Leeb, R.; Williamson, J.; Ramsay, A.; Tavella, M.; Desideri, L.; Hoogerwerf, E.-J.; Al-Khodairy, A.; Murray-Smith, R.; Millán, J. d. R.
2014-06-01
Objective. While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. Approach. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. Main results. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. Significance. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of a 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.
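One way to see why a minimum command accuracy arises, offered here as an illustration rather than the paper's actual model: treat code entry as a biased random walk in which each erroneous command must be undone, so the expected number of commands grows as 1/(2p-1) and diverges as the per-command accuracy p approaches 50%:

```python
def expected_commands(n_steps: int, p: float) -> float:
    """Expected commands to make n_steps net correct selections when each
    command is correct with probability p and every error must be undone
    (a +1/-1 biased random walk): E = n_steps / (2p - 1)."""
    assert p > 0.5, "below 50% accuracy the user never finishes on average"
    return n_steps / (2.0 * p - 1.0)

for p in (0.95, 0.90, 0.80, 0.70, 0.60):
    print(f"p = {p:.2f}: {expected_commands(10, p):6.1f} commands per 10 correct steps")
```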
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philip, Bobby
2012-06-01
The Advanced Multi-Physics (AMP) code, in its present form, will allow a user to build a multi-physics application code for existing mechanics and diffusion operators and extend them with user-defined material models and new physics operators. There are examples that demonstrate mechanics, thermo-mechanics, coupled diffusion, and mechanical contact. The AMP code is designed to leverage a variety of mathematical solvers (PETSc, Trilinos, SUNDIALS, and AMP solvers) and mesh databases (LibMesh and AMP) in a consistent interchangeable approach.
Functional Requirements of a Target Description System for Vulnerability Analysis
1979-11-01
called GIFT.[1,2] Together, the COMGEOM description model and GIFT codes make up the BRL's target description system. The significance of a target ... and modifying target descriptions are described. [1] Lawrence W. Bain, Jr. and Mathew J. Reisinger, "The GIFT Code User Manual; Volume 1 ..." [2] "The GIFT Code User Manual; Volume II, The Output Options," unpublished draft of BRL report. II. UNDERLYING PHILOSOPHY. The BRL has a computer ...
Medical Applications of the PHITS Code (3): User Assistance Program for Medical Physics Computation.
Furuta, Takuya; Hashimoto, Shintaro; Sato, Tatsuhiko
2016-01-01
DICOM2PHITS and PSFC4PHITS are user-assistance programs for medical physics applications of PHITS. DICOM2PHITS is a program to construct the voxel PHITS simulation geometry from patient CT DICOM image data by using a conversion table from CT number to material composition. PSFC4PHITS is a program to convert IAEA phase-space file data to the PHITS format to be used as a simulation source in PHITS. Both programs are useful for users who want to apply PHITS simulation to the verification of radiation therapy treatment plans. We are now developing a program to convert dose distributions obtained by PHITS to the DICOM RT-dose format. We also plan to develop a program that implements treatment information included in other DICOM files (RT-plan and RT-structure).
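The CT-number-to-material conversion such a tool performs can be sketched with pydicom; the HU-to-material table below is hypothetical and far coarser than a real conversion table, and the file path is a placeholder:

```python
import numpy as np
import pydicom

# Hypothetical CT-number-to-material table: (upper HU bound, material name).
HU_TABLE = [(-950, "air"), (-100, "lung"), (100, "soft tissue"), (3000, "bone")]

ds = pydicom.dcmread("ct_slice.dcm")       # one CT slice (placeholder path)
# convert stored pixel values to Hounsfield units
hu = ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)

bounds = np.array([b for b, _ in HU_TABLE])
# assign each voxel the first material whose upper bound exceeds its HU value
idx = np.minimum(np.searchsorted(bounds, hu), len(HU_TABLE) - 1)
print({name: int((idx == i).sum()) for i, (_, name) in enumerate(HU_TABLE)})
```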
Dissemination and support of ARGUS for accelerator applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.
Development of a Detailed Surface Chemistry Framework in DSMC
NASA Technical Reports Server (NTRS)
Swaminathan-Gopalan, K.; Borner, A.; Stephani, K. A.
2017-01-01
Many current direct simulation Monte Carlo (DSMC) codes still employ only simple surface-catalysis models. These include only basic mechanisms such as dissociation, recombination, and exchange reactions, without any provision for adsorption and finite-rate kinetics. Incorporating finite-rate chemistry at the surface is increasingly becoming a necessity for various applications such as high-speed re-entry flows over thermal protection systems (TPS), micro-electro-mechanical systems (MEMS), and surface catalysis. In recent years, relatively few works have examined finite-rate surface reaction modeling using the DSMC method. In this work, a generalized finite-rate surface chemistry framework incorporating a comprehensive list of reaction mechanisms is developed and implemented into the DSMC solver SPARTA. The mechanisms include adsorption, desorption, Langmuir-Hinshelwood (LH), Eley-Rideal (ER), collision-induced (CI), condensation, and sublimation. The approach is to stochastically model the various competing reactions occurring on a set of active sites. Both gas-surface (e.g., ER, CI) and pure-surface (e.g., LH, desorption) reaction mechanisms are incorporated. The reaction mechanisms can also be catalytic or surface-altering, depending on the participation of bulk-phase species (e.g., bulk carbon atoms). Marschall and MacLean developed a general formulation in which multiple phases and surface sites are used, and we adopt a similar convention in the current work. Microscopic parameters of reaction probabilities (for gas-surface reactions) and frequencies (for pure-surface reactions) required for DSMC are computed from the surface properties and macroscopic parameters such as rate constants, sticking coefficients, etc. The energy and angular distributions of the products are determined by the reaction type and input parameters. Thus, the user has the capability to model various surface reactions via user-specified reaction-rate constants, surface properties, and parameters.
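The stochastic selection among competing surface mechanisms can be sketched as follows; the mechanism list and probabilities are invented for illustration and this is not SPARTA code:

```python
import random

# Hypothetical per-collision event probabilities for one gas-surface pair,
# normalized so that "no reaction" absorbs the remainder.
MECHANISMS = [("adsorption", 0.30), ("eley_rideal", 0.05), ("collision_induced", 0.02)]

def pick_surface_event(rng: random.Random) -> str:
    """Select one outcome for a particle striking an active site."""
    u, acc = rng.random(), 0.0
    for name, prob in MECHANISMS:
        acc += prob
        if u < acc:
            return name
    return "reflection"        # no reaction: diffuse reflection off the surface

rng = random.Random(42)
events = [pick_surface_event(rng) for _ in range(100000)]
print({m: events.count(m) for m in set(events)})   # observed event frequencies
```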
Veinot, Tiffany C; Campbell, Terrance R; Kruger, Daniel J; Grodzinski, Alison
2013-01-01
Objective We investigated the user requirements of African-American youth (aged 14–24 years) to inform the design of a culturally appropriate, network-based informatics intervention for the prevention of HIV and other sexually transmitted infections (STI). Materials and Methods We conducted 10 focus groups with 75 African-American youth from a city with high HIV/STI prevalence. Data analyses involved coding using qualitative content analysis procedures and memo writing. Results Unexpectedly, the majority of participants’ design recommendations concerned trust. Youth expressed distrust towards people and groups, which was amplified within the context of information technology-mediated interactions about HIV/STI. Participants expressed distrust in the reliability of condoms and the accuracy of HIV tests. They questioned the benevolence of many institutions, and some rejected authoritative HIV/STI information. Therefore, reputational information, including rumor, influenced HIV/STI-related decision making. Participants’ design requirements also focused on trust-related concerns. Accordingly, we developed a novel trust-centered design framework to guide intervention design. Discussion Current approaches to online trust for health informatics do not consider group-level trusting patterns. Yet, trust was the central intervention-relevant issue among African-American youth, suggesting an important focus for culturally informed design. Our design framework incorporates: intervention objectives (eg, network embeddedness, participation); functional specifications (eg, decision support, collective action, credible question and answer services); and interaction design (eg, member control, offline network linkages, optional anonymity). Conclusions Trust is a critical focus for HIV/STI informatics interventions for young African Americans. Our design framework offers practical, culturally relevant, and systematic guidance to designers to reach this underserved group better. PMID:23512830
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
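The code-generation idea behind a tool like CREATE-SCHEMA can be sketched in a few lines; the schema description format, table, and function name are hypothetical, since the abstract does not show the tool's actual input templates:

```python
# Hypothetical schema description: table name -> list of (column, SQL type).
SCHEMA = {
    "samples": [("sample_id", "INTEGER PRIMARY KEY"),
                ("collected_on", "DATE"),
                ("operator", "VARCHAR(40)")],
}

def create_table_sql(schema: dict) -> str:
    """Emit the CREATE TABLE statements that define the database."""
    stmts = []
    for table, cols in schema.items():
        body = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in cols)
        stmts.append(f"CREATE TABLE {table} (\n  {body}\n);")
    return "\n".join(stmts)

print(create_table_sql(SCHEMA))
```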
Letter order is not coded by open bigrams
Kinoshita, Sachiko; Norris, Dennis
2013-01-01
Open bigram (OB) models (e.g., SERIOL: Whitney, 2001, 2008; Binary OB, Grainger & van Heuven, 2003; Overlap OB, Grainger et al., 2006; Local combination detector model, Dehaene et al., 2005) posit that letter order in a word is coded by a set of ordered letter pairs. We report three experiments using bigram primes in the same-different match task, investigating the effects of order reversal and the number of letters intervening between the letters in the target. Reversed bigrams (e.g., fo-OF, ob-ABOLISH) produced robust priming, in direct contradiction to the assumption that letter order is coded by the presence of ordered letter pairs. Also in contradiction to the core assumption of current open bigram models, non-contiguous bigrams spanning three letters in the target (e.g., bs-ABOLISH) showed robust priming effects, equivalent in size to contiguous bigrams (e.g., bo-ABOLISH). These results question the role of open bigrams in coding letter order. PMID:23914048
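For concreteness, the open-bigram coding assumed by the models under test can be written directly; the limit of two intervening letters follows common OB formulations:

```python
from itertools import combinations

def open_bigrams(word: str, max_gap: int = 2):
    """Ordered letter pairs with up to max_gap intervening letters."""
    return {(a, b) for (i, a), (j, b) in combinations(enumerate(word), 2)
            if j - i <= max_gap + 1}

print(sorted(open_bigrams("form")))   # includes ('f','o'), ('f','r'), ('o','m'), ...
# Order matters: ('f','o') is coded for FORM but ('o','f') is not, which is why
# the reversed-bigram priming reported above (fo-OF) challenges these models.
```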
NASA Astrophysics Data System (ADS)
Bailey, M.; Shipley, D. R.; Manning, J. W.
2015-02-01
Empirical fits are developed for depth-compensated wall- and cavity-replacement perturbations in the PTW Roos 34001 and IBA / Scanditronix NACP-02 parallel-plate ionisation chambers, for electron beam qualities from 4 to 22 MeV for depths up to approximately 1.1 × R50,D. These are based on calculations using the Monte Carlo radiation transport code EGSnrc and its user codes with a full simulation of the linac treatment head modelled using BEAMnrc. These fits are used with calculated restricted stopping-power ratios between air and water to match measured depth-dose distributions in water from an Elekta Synergy clinical linear accelerator at the UK National Physical Laboratory. Results compare well with those from recent publications and from the IPEM 2003 electron beam radiotherapy Code of Practice.
NASA Astrophysics Data System (ADS)
Ness, P. H.; Jacobson, H.
1984-10-01
The thrust of 'group technology' is toward the exploitation of similarities in component design and manufacturing process plans to achieve assembly line flow cost efficiencies for small batch production. The systematic method devised for the identification of similarities in component geometry and processing steps is a coding and classification scheme implemented by interactive CAD/CAM systems. This coding and classification scheme has led to significant increases in computer processing power, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.
Adding Pluggable and Personalized Natural Control Capabilities to Existing Applications
Lamberti, Fabrizio; Sanna, Andrea; Carlevaris, Gilles; Demartini, Claudio
2015-01-01
Advancements in input device and sensor technologies led to the evolution of the traditional human-machine interaction paradigm based on the mouse and keyboard. Touch-, gesture- and voice-based interfaces are integrated today in a variety of applications running on consumer devices (e.g., gaming consoles and smartphones). However, to allow existing applications running on desktop computers to utilize natural interaction, significant re-design and re-coding efforts may be required. In this paper, a framework designed to transparently add multi-modal interaction capabilities to applications to which users are accustomed is presented. Experimental observations confirmed the effectiveness of the proposed framework and led to a classification of those applications that could benefit more from the availability of natural interaction modalities. PMID:25635410
Representing videos in tangible products
NASA Astrophysics Data System (ADS)
Fageth, Reiner; Weiting, Ralf
2014-03-01
Videos can be taken with nearly every camera: digital point-and-shoot cameras, DSLRs, smartphones, and increasingly so-called action cameras mounted on sports devices. The implementation of videos, by generating QR codes and representative pictures out of the video stream in software, was the subject of last year's paper. This year we present first data about what content is displayed and how users represent their videos in printed products, e.g. CEWE PHOTOBOOKS and greeting cards. We report the share of the different video formats used, the number of images extracted from a video in order to represent it, the positions in the book, and different design strategies compared to regular books.
AMIDE: a free software tool for multimodality medical image analysis.
Loening, Andreas Markus; Gambhir, Sanjiv Sam
2003-07-01
Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.
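The key mechanism here is on-demand reslicing of freely shifted and rotated volumes. A minimal sketch of that generic technique using scipy (not AMIDE's actual GTK/C implementation; the volume and rotation are invented examples):

```python
import numpy as np
from scipy.ndimage import affine_transform

# Illustrative on-demand reslicing: resample a volume onto a grid rotated
# about the z-axis, with interpolation handled from the original data.

def reslice(volume, angle_deg, order=1):
    """Rotate the sampling grid by angle_deg about z and resample."""
    a = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                    [np.sin(a),  np.cos(a), 0.0],
                    [0.0,        0.0,       1.0]])
    center = (np.array(volume.shape) - 1) / 2.0
    offset = center - rot @ center      # keep rotation about the volume centre
    return affine_transform(volume, rot, offset=offset, order=order)

vol = np.random.rand(64, 64, 32)        # stand-in for a PET/CT/MRI volume
resliced = reslice(vol, angle_deg=30.0)
print(resliced.shape)
```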
Automatic Data Traffic Control on DSM Architecture
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)
2000-01-01
We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods that users can employ to improve data traffic in their codes. We report on the implementation of a tool that detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code, detection of code constructs having abnormally high cache or TLB misses, and generation of data placement constructs. We demonstrate the capabilities of the tool in experiments with the NAS parallel benchmarks and with a simple computational fluid dynamics application, ARC3D.
Pitcher, T J; Lam, M E; Ainsworth, C; Martindale, A; Nakamura, K; Perry, R I; Ward, T
2013-10-01
This paper reports recent developments in Rapfish, a normative, scalable and flexible rapid appraisal technique that integrates both ecological and human dimensions to evaluate the status of fisheries in reference to a norm or goal. Appraisal status targets may be sustainability, compliance with a standard (such as the UN code of conduct for responsible fisheries) or the degree of progress in meeting some other goal or target. The method combines semi-quantitative (e.g. ecological) and qualitative (e.g. social) data via multiple evaluation fields, each of which is assessed through scores assigned to 6 to 12 attributes or indicators; the scoring method allows the user flexibility to adopt a wide range of utility relationships. For assessing sustainability, six evaluation fields have been developed: ecological, technological, economic, social, ethical and institutional. Each field can be assessed directly with a set of scored attributes, or several of the fields can be dealt with in greater detail using nested subfields that themselves comprise multidimensional Rapfish assessments (e.g. the hierarchical institutional field encompasses both governance and management, including a detailed analysis of legality). The user has the choice of including all or only some of the available sustainability fields. For the attributes themselves, there will rarely be quantitative data, but scoring allows these items to be estimated. Indeed, within a normative framework, one important advantage of Rapfish is the transparency of the rigour, quality and replicability of the scores. The Rapfish technique employs a constrained multidimensional ordination that is scaled to situate data points within evaluation space. Within each evaluation field, results may be presented as a two-dimensional plot or in a one-dimensional rank order. Uncertainty is expressed through the probability distribution of Monte Carlo simulations that use the confidence limits on each original observation. Overall results of the multidisciplinary analysis may be shown using kite diagrams that compare different locations, time periods (including future projections) and management scenarios, making policy trade-offs explicit. These enhancements are now available in the R programming language and on an open website, where users can run Rapfish analyses by downloading the software or uploading their data to a user interface. © 2013 The Authors. Journal of Fish Biology © 2013 The Fisheries Society of the British Isles.
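The core computation is an ordination of attribute scores with Monte Carlo uncertainty from confidence limits on each score. The sketch below imitates that pipeline in Python with scikit-learn's unconstrained MDS and uniform score jittering; the released Rapfish tools are in R and use a constrained, rotated ordination, so this is only a schematic stand-in.

```python
import numpy as np
from sklearn.manifold import MDS

# Schematic Rapfish-style appraisal (NOT the Rapfish algorithm): fisheries
# are scored on attributes within one evaluation field, ordinated to 2-D by
# MDS, and uncertainty is explored by Monte Carlo jittering of the scores
# within assumed confidence limits. Real Rapfish constrains and rotates the
# ordination so axes are comparable between runs; plain MDS does not.

rng = np.random.default_rng(0)
n_fisheries, n_attributes = 8, 10
scores = rng.uniform(0, 10, size=(n_fisheries, n_attributes))
half_cl = 1.0                                   # assumed +/- confidence limit

def ordinate(s):
    return MDS(n_components=2, random_state=0).fit_transform(s)

baseline = ordinate(scores)
replicates = np.stack([
    ordinate(np.clip(scores + rng.uniform(-half_cl, half_cl, scores.shape),
                     0, 10))
    for _ in range(20)                          # Monte Carlo replicates
])
print(baseline[:3])
print("per-fishery spread:", replicates.std(axis=0).mean(axis=1))
```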
GPU Optimizations for a Production Molecular Docking Code*
Landaverde, Raphael; Herbordt, Martin C.
2015-01-01
Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER), which achieved a roughly 5× speed-up over a contemporaneous 4-core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server, which has over 4000 active users. PMID:26594667
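For context, FFT-based docking scores every ligand translation at once as a grid cross-correlation, which is why the 3D FFT library dominates the runtime. A generic numpy sketch of that technique, with invented grids rather than PIPER's actual energy terms:

```python
import numpy as np

# Illustrative FFT-based rigid-docking correlation (the generic technique
# behind codes like PIPER, not PIPER itself): the score of every cyclic
# ligand translation is a cross-correlation of receptor and ligand grids,
# computed with three 3D FFTs.

def correlation_scores(receptor_grid, ligand_grid):
    R = np.fft.fftn(receptor_grid)
    L = np.fft.fftn(ligand_grid)
    return np.fft.ifftn(R * np.conj(L)).real

n = 64
receptor = np.random.rand(n, n, n)             # stand-in potential grid
ligand = np.zeros((n, n, n))
ligand[:8, :8, :8] = np.random.rand(8, 8, 8)   # small ligand footprint
scores = correlation_scores(receptor, ligand)
best = np.unravel_index(np.argmax(scores), scores.shape)
print("best translation:", best)
```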
Hernández, Yözen; Bernstein, Rocky; Pagan, Pedro; Vargas, Levy; McCaig, William; Ramrattan, Girish; Akther, Saymon; Larracuente, Amanda; Di, Lia; Vieira, Filipe G; Qiu, Wei-Gang
2018-03-02
Automated bioinformatics workflows are more robust, easier to maintain, and their results more reproducible when built with command-line utilities than with custom-coded scripts. Command-line utilities offer the further benefit of relieving bioinformatics developers of the need to learn, or to interact directly with, biological software libraries. There is, however, a lack of command-line utilities that leverage popular Open Source biological software toolkits such as BioPerl ( http://bioperl.org ) to make many of the well-designed, robust, and routinely used biological classes available to a wider base of end users. Designed as standard utilities for UNIX-family operating systems, BpWrapper makes the functionality of some of the most popular BioPerl modules readily accessible on the command line to novice as well as experienced bioinformatics practitioners. The initial release of BpWrapper includes four utilities with concise command-line user interfaces, bioseq, bioaln, biotree, and biopop, specialized for the manipulation of molecular sequences, sequence alignments, phylogenetic trees, and DNA polymorphisms, respectively. Over a hundred methods are currently available as command-line options, and new methods are easily incorporated. Performance of the BpWrapper utilities lags that of precompiled utilities but is equivalent to that of other utilities based on BioPerl. BpWrapper has been tested on BioPerl Release 1.6, Perl versions 5.10.1 to 5.25.10, and operating systems including Apple macOS, Microsoft Windows, and GNU/Linux. Release code is available from the Comprehensive Perl Archive Network (CPAN) at https://metacpan.org/pod/Bio::BPWrapper . Source code is available on GitHub at https://github.com/bioperl/p5-bpwrapper . BpWrapper improves on existing sequence utilities by following the design principles of Unix text utilities, such as a concise user interface, extensive command-line options, and standard input/output for serialized operations. Further, dozens of novel methods for the manipulation of sequences, alignments, and phylogenetic trees, unavailable in existing utilities (e.g., EMBOSS, Newick Utilities, and FAST), are provided. Bioinformaticians should find BpWrapper useful for rapid prototyping of command-line workflows for comparative genomics and other bioinformatics applications without creating custom scripts.
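The Unix-filter design the authors follow (read stdin, write stdout, compose in pipelines) is easy to picture with a toy example. The stand-alone filter below is a hypothetical illustration of that style in Python, not one of the BpWrapper tools:

```python
#!/usr/bin/env python3
"""Toy FASTA reverse-complement filter in the Unix text-utility style the
paper advocates (stdin -> stdout, composable in pipelines). Illustration of
the design principle only, not a BpWrapper utility."""
import sys

COMP = str.maketrans("ACGTacgt", "TGCAtgca")

def emit(header, seq):
    if header is not None:
        print(header)
        print(seq.translate(COMP)[::-1])   # reverse complement

header, seq = None, ""
for line in sys.stdin:
    line = line.rstrip("\n")
    if line.startswith(">"):
        emit(header, seq)                  # flush previous record
        header, seq = line, ""
    else:
        seq += line
emit(header, seq)

# hypothetical usage:  cat seqs.fasta | ./revcomp.py | head
```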
NASA Technical Reports Server (NTRS)
Kumar, A.
1984-01-01
A computer program NASCRIN has been developed for analyzing two-dimensional flow fields in high-speed inlets. It solves the two-dimensional Euler or Navier-Stokes equations in conservation form by an explicit, two-step finite-difference method. An explicit-implicit method can also be used at the user's discretion for viscous flow calculations. For turbulent flow, an algebraic, two-layer eddy-viscosity model is used. The code is operational on the CDC CYBER 203 computer system and is highly vectorized to take full advantage of the vector-processing capability of the system. It is highly user oriented and is structured in such a way that for most supersonic flow problems, the user has to make only a few changes. Although the code is primarily written for supersonic internal flow, it can be used with suitable changes in the boundary conditions for a variety of other problems.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
Scalable Computing of the Mesh Size Effect on Modeling Damage Mechanics in Woven Armor Composites
2008-12-01
... manner of a user-defined material subroutine to provide overall stress increments to the parallel LS-DYNA3D, a Lagrangian explicit code used in ... finite element code, as a user-defined material subroutine. The ability of this subroutine to model the effect of the progressions of a select number ... is added as a user-defined material subroutine to parallel LS-DYNA3D. The computations of the global mesh are handled by LS-DYNA3D and are spread ...
User's guide to resin infusion simulation program in the FORTRAN language
NASA Technical Reports Server (NTRS)
Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.
1992-01-01
RTMCL is a user friendly computer code which simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step by step description of how to run the program and enter and modify the input data set. Sample input and output files are included along with an explanation of the results. Finally, a complete listing of the program is provided.
High-density digital recording
NASA Technical Reports Server (NTRS)
Kalil, F. (Editor); Buschman, A. (Editor)
1985-01-01
The problems associated with high-density digital recording (HDDR) are discussed. Five independent users of HDDR systems describe their problems, solutions, and insights as guidance for other users of HDDR systems. Various pulse code modulation coding techniques are reviewed. Introductions to error detection and correction, head optimization theory, and perpendicular recording are provided. Competing tape recorder manufacturers apply the above theories and techniques and present their offerings. The methodology used by the HDDR Users Subcommittee of THIC to evaluate parallel HDDR systems is presented.
A users manual for the method of moments Aircraft Modeling Code (AMC), version 2
NASA Technical Reports Server (NTRS)
Peters, M. E.; Newman, E. H.
1994-01-01
This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.
A user's manual for the method of moments Aircraft Modeling Code (AMC)
NASA Technical Reports Server (NTRS)
Peters, M. E.; Newman, E. H.
1989-01-01
This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.
TOOKUIL: A case study in user interface development for safety code application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.L.; Harkins, C.K.; Hoole, J.G.
1997-07-01
Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present-day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.
Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1994-01-01
An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.
Computer Aided Design of Polyhedron Solids to Model Air in Com-Geom Descriptions
1983-08-01
"The GIFT Code User Manual, Volume I, Introduction and Input Requirements," BRL Report No. 1802, July 1975 (Unclassified). (AD B0060Z7LK 2G) ... Kuehl, L. Bain and M. Reisinger, "The GIFT Code User Manual, Volume II, The Output Options," BRL Report ARBRL-TR-02189, September 1979. ... is generated from the GIFT code under option XSECT. This option produces plot files which define cross-sectional views of the COM-GEOM
NASA Technical Reports Server (NTRS)
Walowit, Jed A.; Shapiro, Wilbur
2005-01-01
The SPIRALI code predicts the performance characteristics of incompressible cylindrical and face seals with or without the inclusion of spiral grooves. Performance characteristics include load capacity (for face seals), leakage flow, power requirements and dynamic characteristics in the form of stiffness, damping and apparent mass coefficients in 4 degrees of freedom for cylindrical seals and 3 degrees of freedom for face seals. These performance characteristics are computed as functions of seal and groove geometry, load or film thickness, running and disturbance speeds, fluid viscosity, and boundary pressures. A derivation of the equations governing the performance of turbulent, incompressible, spiral groove cylindrical and face seals along with a description of their solution is given. The computer codes are described, including an input description, sample cases, and comparisons with results of other codes.
Roland, Carl L; Lake, Joanita; Oderda, Gary M
2016-12-01
We conducted a systematic review to evaluate worldwide English-language literature published from 2009 to 2014 on the prevalence of opioid misuse/abuse in retrospective databases where International Classification of Diseases (ICD) codes were used. Inclusion criteria for the studies were use of a retrospective database; measurement of abuse, dependence, and/or poisoning using ICD codes; a stated prevalence or one that could be derived; and a documented time frame. A meta-analysis was not performed. A qualitative narrative synthesis was used, and 16 studies were included for data abstraction. ICD code use varies; 10 studies used ICD codes that encompassed all three terms: abuse, dependence, or poisoning. Eight studies limited determination of misuse/abuse to an opioid-user population. Abuse prevalence among opioid users in commercial databases using all three terms of ICD codes varied depending on the opioid: 21 per 1000 persons (reformulated extended-release oxymorphone; 2011-2012) to 113 per 1000 persons (immediate-release opioids; 2010-2011). Abuse prevalence in general populations using all three ICD code terms ranged from 1.15 per 1000 persons (commercial; 6 months of 2010) to 8.7 per 1000 persons (Medicaid; 2002-2003). Prevalence increased over time. When similar ICD codes are used, the highest prevalence is in US government-insured populations. Limiting the population to continuous opioid users increases prevalence. Prevalence varies depending on the ICD codes used, population, time frame, and years studied. Researchers using ICD codes to determine opioid abuse prevalence need to be aware of these cautions and limitations.
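The prevalence figures above all reduce to the same calculation: count members with at least one qualifying ICD-coded claim and divide by the population of interest. A minimal pandas sketch, with invented data and an illustrative (deliberately incomplete) ICD-9 prefix list:

```python
import pandas as pd

# Toy prevalence calculation of the kind compared across the reviewed
# studies: flag members with any claim carrying an ICD-9 code for opioid
# dependence (304.0x), abuse (305.5x), or poisoning (965.0x), then report
# prevalence per 1000 persons. Data and code list are illustrative only.

ABUSE_PREFIXES = ("304.0", "305.5", "965.0")

claims = pd.DataFrame({
    "member_id": [1, 1, 2, 3, 3, 4],
    "icd9":      ["304.01", "401.9", "305.50", "V58.69", "965.09", "724.2"],
})
enrolled = pd.Series([1, 2, 3, 4, 5, 6], name="member_id")  # full population

flagged = claims[claims["icd9"].apply(lambda c: c.startswith(ABUSE_PREFIXES))]
cases = flagged["member_id"].nunique()
prevalence_per_1000 = 1000.0 * cases / enrolled.nunique()
print(f"{prevalence_per_1000:.1f} per 1000 persons")  # toy numbers, not real rates
```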
NASA Astrophysics Data System (ADS)
Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.
2012-12-01
Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing uncertainties to be quantified and bounded. This powerful capability will yield new insights into scientific predictions (e.g. climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To address several of these UQ grand challenges, I will focus in this talk on the following three research areas of our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and development of an advanced UQ computational pipeline to enable a complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale), with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Putt, Charles W.
1997-01-01
The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
Potential markets for a satellite-based mobile communications system
NASA Technical Reports Server (NTRS)
Jamieson, W. M.; Peet, C. S.; Bengston, R. J.
1976-01-01
The objective of the study was to define the market needs for improved land mobile communications systems. Within the context of this objective, the following goals were set: (1) characterize the present mobile communications industry; (2) determine the market for an improved system for mobile communications; and (3) define the system requirements as seen from the potential customer's viewpoint. The scope of the study was defined by the following parameters: (1) markets were confined to U.S. and Canada; (2) range of operation generally exceeded 20 miles, but this was not restrictive; (3) the classes of potential users considered included all private sector users, and non-military public sector users; (4) the time span examined was 1975 to 1985; and (5) highly localized users were generally excluded - e.g., taxicabs, and local paging.
The social disutility of software ownership.
Douglas, David M
2011-09-01
Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.
iTOUGH2 Universal Optimization Using the PEST Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finsterle, S.A.
2010-07-01
iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2's capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented in iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that, for the application model: (1) input is provided on one or more ASCII text input files; (2) output is returned to one or more ASCII text output files; (3) the model is run using a system command (executable or script/batch file); and (4) the model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
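The protocol's contract (ASCII input written from a template, a system command, observations scraped from ASCII output) is easy to sketch. Below is a hypothetical, minimal Python version of that loop; the @name@ marker convention, file names, and regex are invented for illustration, and the real PEST/iTOUGH2 template and instruction file formats are considerably richer.

```python
import re
import subprocess
from pathlib import Path

# Minimal sketch of model-independent coupling in the PEST-protocol spirit
# (template -> run -> extract). File names, the @marker@ convention, and the
# output pattern below are hypothetical, not PEST's actual formats.

def run_model(params, template="model.tpl", infile="model.in",
              outfile="model.out", exe="./model"):
    text = Path(template).read_text()
    for name, value in params.items():           # substitute parameter values
        text = text.replace(f"@{name}@", f"{value:.6e}")
    Path(infile).write_text(text)
    subprocess.run([exe, infile], check=True)    # runs without intervention
    # extract observations, e.g. lines like "pressure  1.2345e+05"
    out = Path(outfile).read_text()
    return [float(m) for m in re.findall(r"pressure\s+([0-9.eE+-]+)", out)]

# obs = run_model({"permeability": 1e-14, "porosity": 0.1})
```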
Application of a single-flicker online SSVEP BCI for spatial navigation.
Chen, Jingjing; Zhang, Dan; Engel, Andreas K; Gong, Qin; Maye, Alexander
2017-01-01
A promising approach for brain-computer interfaces (BCIs) employs the steady-state visual evoked potential (SSVEP) for extracting control information. Main advantages of these SSVEP BCIs are a simple and low-cost setup, little effort to adjust the system parameters to the user and comparatively high information transfer rates (ITR). However, traditional frequency-coded SSVEP BCIs require the user to gaze directly at the selected flicker stimulus, which is liable to cause fatigue or even photic epileptic seizures. The spatially coded SSVEP BCI we present in this article addresses this issue. It uses a single flicker stimulus that appears always in the extrafoveal field of view, yet it allows the user to control four control channels. We demonstrate the embedding of this novel SSVEP stimulation paradigm in the user interface of an online BCI for navigating a 2-dimensional computer game. Offline analysis of the training data reveals an average classification accuracy of 96.9±1.64%, corresponding to an information transfer rate of 30.1±1.8 bits/min. In online mode, the average classification accuracy reached 87.9±11.4%, which resulted in an ITR of 23.8±6.75 bits/min. We did not observe a strong relation between a subject's offline and online performance. Analysis of the online performance over time shows that users can reliably control the new BCI paradigm with stable performance over at least 30 minutes of continuous operation.
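For reference, bits/min figures like those above are conventionally computed with the Wolpaw information-transfer-rate formula from the number of classes, the classification accuracy, and the time per selection. The 3.5 s selection time in the sketch is an assumed value for illustration; the paper's actual trial timing may differ.

```python
import math

# Standard Wolpaw ITR: bits per selection scaled to bits per minute.
def itr_bits_per_min(n_classes, accuracy, t_selection_s):
    p, n = accuracy, n_classes
    bits = math.log2(n)
    if 0 < p < 1:                       # p == 1 gives exactly log2(n) bits
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / t_selection_s

# 4 control channels at 96.9% accuracy, assumed 3.5 s per selection:
print(itr_bits_per_min(4, 0.969, 3.5))  # ~30 bits/min, cf. the offline result
```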
w4CSeq: software and web application to analyze 4C-seq data.
Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai
2016-11-01
Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing the genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites. Users can establish their own web server by downloading the source code at https://github.com/WGLab/w4CSeq . Additionally, a demo web server is available at http://w4cseq.wglab.org . Contact: kaiwang@usc.edu or wangelu@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
Ferret: a user-friendly Java tool to extract data from the 1000 Genomes Project.
Limou, Sophie; Taverner, Andrew M; Winkler, Cheryl A
2016-07-15
The 1000 Genomes (1KG) Project provides a near-comprehensive resource on human genetic variation in worldwide reference populations. 1KG variants can be accessed through a browser and through the raw and annotated data that are regularly released on an ftp server. We developed Ferret, a user-friendly Java tool, to easily extract genetic variation information from these large and complex data files. From a locus, gene(s) or SNP(s) of interest, Ferret retrieves genotype data for 1KG SNPs and indels, and computes allelic frequencies for 1KG populations and, optionally, for the Exome Sequencing Project populations. By converting the 1KG data into files that can be imported into popular pre-existing tools (e.g. PLINK and HaploView), Ferret offers a straightforward way, even for non-bioinformatics specialists, to manipulate, explore and merge 1KG data with the user's dataset, as well as visualize linkage disequilibrium patterns, infer haplotypes and design tagSNPs. The Ferret tool and source code are publicly available at http://limousophie35.github.io/Ferret/ . Contact: ferret@nih.gov. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
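The allelic-frequency computation such a tool performs per population is worth making concrete. A toy sketch with invented 1KG-style phased diploid genotypes ("0|1", etc.), where allele 1 is the alternate allele:

```python
# Toy illustration of per-population alternate-allele frequency (not
# Ferret's Java implementation); genotype strings are invented examples.

def alt_allele_frequency(genotypes):
    alleles = [a for g in genotypes for a in g.replace("|", "/").split("/")]
    return alleles.count("1") / len(alleles)

eur = ["0|0", "0|1", "1|1", "0|0", "0|1"]   # 5 diploid samples
afr = ["0|0", "0|0", "0|1", "0|0", "0|0"]
print(f"EUR ALT freq: {alt_allele_frequency(eur):.2f}")  # 0.40
print(f"AFR ALT freq: {alt_allele_frequency(afr):.2f}")  # 0.10
```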
Code of Federal Regulations, 2011 CFR
2011-04-01
... ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE REPORTING User... device; (10) Event problem codes—patient code and device code (refer to the “MEDWATCH Medical Device... device was involved, nature of the problem, patient followup or required treatment, and any environmental...
Code of Federal Regulations, 2010 CFR
2010-04-01
... ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE REPORTING User... device; (10) Event problem codes—patient code and device code (refer to the “MEDWATCH Medical Device... device was involved, nature of the problem, patient followup or required treatment, and any environmental...
User's manual for the FLORA equilibrium and stability code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freis, R.P.; Cohen, B.I.
1985-04-01
This document provides a user's guide to the content and use of the two-dimensional axisymmetric equilibrium and stability code FLORA. FLORA addresses the low-frequency MHD stability of long-thin axisymmetric tandem mirror systems with finite pressure and finite-Larmor-radius effects. FLORA solves an initial-value problem for interchange, rotational, and ballooning stability.
An installed nacelle design code using a multiblock Euler solver. Volume 2: User guide
NASA Technical Reports Server (NTRS)
Chen, H. C.
1992-01-01
This is a user manual for the general multiblock Euler design (GMBEDS) code. The code is for the design of a nacelle installed on a geometrically complex configuration such as a complete airplane with wing/body/nacelle/pylon. It consists of two major building blocks: a design module developed by LaRC using directive iterative surface curvature (DISC); and a general multiblock Euler (GMBE) flow solver. The flow field surrounding a complex configuration is divided into a number of topologically simple blocks to facilitate surface-fitted grid generation and improve flow solution efficiency. This user guide provides input data formats along with examples of input files and a Unix script for program execution in the UNICOS environment.
The FORTRAN static source code analyzer program (SAP) user's guide, revision 1
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Eslinger, S.
1982-01-01
The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
Investigating the use of quick response codes in the gross anatomy laboratory.
Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B
2015-01-01
The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. © 2014 American Association of Anatomists.
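Generating tags like those used in the study is a one-liner with a standard library: each QR code carries either embedded answer text or a hyperlink. A minimal sketch with the Python qrcode package; the content strings and file names are invented examples.

```python
import qrcode  # pip install qrcode[pil]

# Each tag encodes either an embedded text answer or a hyperlinked web page,
# mirroring the two kinds of codes described above. Content is hypothetical.

text_tag = qrcode.make("Answer: the tagged structure is the musculocutaneous nerve.")
text_tag.save("station3_answer.png")

link_tag = qrcode.make("https://example.edu/anatomy/brachial-plexus-review")
link_tag.save("station3_link.png")
```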
Impact of Preoperative Opioid Use After Emergency General Surgery.
Kim, Young; Cortez, Alexander R; Wima, Koffi; Dhar, Vikrom K; Athota, Krishna P; Schrager, Jason J; Pritts, Timothy A; Edwards, Michael J; Shah, Shimul A
2018-01-16
Preoperative exposure to narcotics has recently been associated with poor outcomes after elective major surgery, but little is known as to how preoperative opioid use impacts outcomes after common, emergency general surgical procedures (EGS). A high-volume, single-center analysis was performed on patients who underwent EGS from 2012 to 2013. EGS was defined as the seven emergent operations that account for 80% of the national burden. Preoperative opioid use was defined as having an active opioid prescription within 7 days prior to surgery. Chronic opioid use was defined as having an opioid prescription concurrent with 90 days after discharge. A total of 377 patients underwent EGS during the study period. Preoperative opioid use was present in 84 patients (22.3%). Preoperative opioid users had a longer hospital length of stay (LOS; 10.5 vs 6 days), higher costs of care ($25,331 vs $11,454), and higher 30-day readmission rates (22.6 vs 8.2%) compared with opioid-naïve patients (p < 0.001 each). After covariate adjustment, preoperative opioid use was predictive of LOS (RR 1.19 [1.01-1.41]) and 30-day hospital readmission (OR 2.69 [1.25-5.75]) (p < 0.05 each). Total direct cost was not different after modeling. Preoperative opioid users required more narcotic refills compared with opioid-naïve patients (5 vs 0 refills, p < 0.001). After discharge, 15.4% of opioid-naïve patients met criteria for chronic opioid use, vs 77.4% of preoperative opioid users (p < 0.001). Preoperative opioid use is associated with greater resource utilization after emergency general surgery, as well as vastly different postoperative opioid prescription patterns. These findings may help to inform the impact of preoperative opioid use on patient care, and its implications for hospital and societal cost.
Boyd, Carol J; Young, Amy; McCabe, Sean E
2014-01-01
Approximately 18% of US adolescents engaged in prescription opioid abuse in 2013. However, this estimate may be misleading because it includes both medical misusers and nonmedical users, and there is evidence that these are 2 groups that differ relative to substance abuse and criminal risk. Thus, this study does not combine medical and nonmedical users; rather, it seeks to better understand the characteristics of nonmedical users. This was a school-based, cross-sectional study that was conducted during 2009-2010 in southeastern Michigan with a sample of 2627 adolescents using a Web-based survey. Three mutually exclusive groups were created based on responses regarding medical and nonmedical use of opioid analgesics. Group 1 had never used an opioid analgesic, Group 2 used an opioid analgesic only as prescribed, and Group 3 nonmedically used an opioid analgesic. In addition, Group 3 was divided into 2 mutually exclusive subgroups (self-treaters and sensation-seekers) based on reasons for nonmedical use. A series of multinomial logistic regressions were conducted to determine if the groups differed on the presence of pain, psychological symptoms (e.g., affective disorder, conduct disorder, attention-deficit/hyperactivity disorder [ADHD]), and drug abuse. Sixty-five percent (65.0%) of the sample was white/Caucasian and 29.5% was African American. The average age was 14.8 years (SD = 1.9). Seventy percent (70.4%; n = 1850) reported no lifetime opioid use, 24.5% (n = 644) were medical users, 3.5% (n = 92) were nonmedical users who used for pain relief only, and 1.6% (n = 41) were classified as nonmedical users for reasons other than for pain relief (e.g., to get high). Both medical users and nonmedical users reported more pain and substance abuse symptoms compared with never users. Those nonmedical users who used opioids for sensation-seeking motivations had greater odds of having psychological symptoms. These data support the need to further consider subgroups of nonmedical users of opioid analgesics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, John C; Peplow, Douglas E.; Mosher, Scott W
2010-01-01
This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally O(10^2-10^4), have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.
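The essence of CADIS is that a single approximate adjoint (importance) solve fixes both the biased source and the weight-window centers consistently, so every source particle is born inside its window. A schematic 1-D illustration of that bookkeeping (not the MAVRIC/ADVANTG implementation; the fluxes are invented numbers):

```python
import numpy as np

# Schematic CADIS bookkeeping on a 1-D mesh: an approximate adjoint flux
# from a fast deterministic solve biases the source and sets consistent
# weight-window centers. Values below are illustrative only.

q = np.array([1.0, 1.0, 0.0, 0.0, 0.0])            # forward source per cell
adjoint = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0])  # importance toward tally

C = np.sum(q * adjoint)                    # response estimate (normalization)
q_biased = q * adjoint / C                 # sample source cells from this pdf
ww_centers = np.where(adjoint > 0, C / adjoint, np.inf)

# a particle born in cell i carries statistical weight C / adjoint[i],
# which by construction equals its weight-window center there:
print(q_biased, ww_centers)
```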
A Semantic Analysis Method for Scientific and Engineering Code
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
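A toy version of such a semantic check makes the idea concrete: declare the physical dimensions of primitive variables, propagate them through expressions, and flag dimensionally inconsistent additions. The mini-checker below works on hand-built expression trees rather than parsed source, and is purely illustrative of the technique, not the paper's parser system.

```python
# Declared dimensions of primitive variables as base-unit exponents.
DECLS = {
    "u":  {"m": 1, "s": -1},   # velocity
    "dt": {"s": 1},            # time step
    "dx": {"m": 1},            # grid spacing
}

def mul_dims(d1, d2):
    """Dimension of a product: add base-unit exponents."""
    out = dict(d1)
    for unit, exp in d2.items():
        out[unit] = out.get(unit, 0) + exp
        if out[unit] == 0:
            del out[unit]
    return out

def dims(expr):
    """Dimension of a variable name or a ('*', a, b) product node."""
    if isinstance(expr, str):
        return DECLS[expr]
    op, a, b = expr
    assert op == "*"
    return mul_dims(dims(a), dims(b))

def check_add(a, b):
    """Flag additions whose operands are dimensionally incompatible."""
    if dims(a) != dims(b):
        print(f"semantic error: adding {dims(a)} to {dims(b)}")

check_add("dx", ("*", "u", "dt"))   # consistent: m == (m/s)*s, no message
check_add("dx", "dt")               # flagged: m vs s
```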
Flowers, Natalie L
2010-01-01
CodeSlinger is a desktop application that was developed to aid medical professionals in the intertranslation, exploration, and use of biomedical coding schemes. The application was designed to provide a highly intuitive, easy-to-use interface that simplifies a complex business problem: a set of time-consuming, laborious tasks that were regularly performed by a group of medical professionals involving manually searching coding books, searching the Internet, and checking documentation references. A workplace observation session with a target user revealed the details of the current process and a clear understanding of the business goals of the target user group. These goals drove the design of the application's interface, which centers on searches for medical conditions and displays the codes found in the application's database that represent those conditions. The interface also allows the exploration of complex conceptual relationships across multiple coding schemes.
NASA Technical Reports Server (NTRS)
1975-01-01
A system is presented which processes FORTRAN-based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely correction of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.
GSE, data management system programmers/User' manual
NASA Technical Reports Server (NTRS)
Schlagheck, R. A.; Dolerhie, B. D., Jr.; Ghiglieri, F. J.
1974-01-01
The GSE data management system is a computerized program which provides for a central storage source for key data associated with the mechanical ground support equipment (MGSE). Eight major sort modes can be requested by the user. Attributes that are printed automatically with each sort include the GSE end item number, description, class code, functional code, fluid media, use location, design responsibility, weight, cost, quantity, dimensions, and applicable documents. Multiple subsorts are available for the class code, functional code, fluid media, use location, design responsibility, and applicable document categories. These sorts and how to use them are described. The program and GSE data bank may be easily updated and expanded.
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Heidegger, Nathan J.; Delaney, Robert A.
1999-01-01
The overall objective of this study was to evaluate the effects of turbulence models in a 3-D numerical analysis on the wake prediction capability. The current version of the computer code resulting from this study is referred to as ADPAC v7 (Advanced Ducted Propfan Analysis Codes -Version 7). This report is intended to serve as a computer program user's manual for the ADPAC code used and modified under Task 15 of NASA Contract NAS3-27394. The ADPAC program is based on a flexible multiple-block and discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Turbulence models now available in the ADPAC code are: a simple mixing-length model, the algebraic Baldwin-Lomax model with user defined coefficients, the one-equation Spalart-Allmaras model, and a two-equation k-R model. The consolidated ADPAC code is capable of executing in either a serial or parallel computing mode from a single source code.
Layered Wyner-Ziv video coding.
Xu, Qian; Xiong, Zixiang
2006-12-01
Following recent theoretical work on successive Wyner-Ziv coding (WZC), we propose a practical layered Wyner-Ziv video coder using the DCT, nested scalar quantization, and irregular LDPC-code-based Slepian-Wolf coding (or lossless source coding with side information at the decoder). Our main novelty is to use the base layer of a standard scalable video coder (e.g., MPEG-4/H.26L FGS or H.263+) as the decoder side information and to perform layered WZC for quality enhancement. Similar to FGS coding, there is no performance difference between layered and monolithic WZC when the enhancement bitstream is generated in our proposed coder. Using an H.26L coded version as the base layer, experiments indicate that WZC gives slightly worse performance than FGS coding when the channel (for both the base and enhancement layers) is noiseless. However, when the channel is noisy, extensive simulations of video transmission over wireless networks conforming to the CDMA2000 1X standard show that H.26L base-layer coding plus Wyner-Ziv enhancement-layer coding is more robust against channel errors than H.26L FGS coding. These results demonstrate that layered Wyner-Ziv video coding is a promising new technique for video streaming over wireless networks.
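The nested scalar quantization at the heart of such a coder can be illustrated with a toy Python sketch (the step size, coset count, and search window are assumed values, not the paper's coder): the encoder fine-quantizes a coefficient and transmits only the coset index; the decoder resolves the ambiguity using the correlated side information.

    import numpy as np

    def wz_encode(x, q=1.0, n_cosets=4):
        """Fine-quantize x with step q; send only the coset index (2 bits here)."""
        return int(np.round(x / q)) % n_cosets

    def wz_decode(coset, y, q=1.0, n_cosets=4, search=8):
        """Pick the reconstruction in the received coset nearest to the
        side information y (e.g., the base-layer value)."""
        base = int(np.round(y / q))
        candidates = [k for k in range(base - search, base + search + 1)
                      if k % n_cosets == coset]
        best = min(candidates, key=lambda k: abs(k * q - y))
        return best * q

    x, y = 3.3, 3.6                     # source and correlated side information
    print(wz_decode(wz_encode(x), y))   # -> 3.0, recovered from 2 bits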
Shaping electromagnetic waves using software-automatically-designed metasurfaces.
Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie
2017-06-15
We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g., two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm with commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase differences for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of the automatic software design. The proposed method provides a smart tool to realize various functional devices and systems automatically.
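The coding-matrix idea can be previewed with a short Python sketch (the aperture size and the chessboard sequence are illustrative assumptions, not the paper's optimized patterns): a 1-bit metasurface is a matrix of digits 0/1 standing for reflection phases 0 and pi, and a 2-D FFT of the aperture field approximates the far-field array factor.

    import numpy as np

    M = N = 16
    coding = np.indices((M, N)).sum(axis=0) % 2   # chessboard 0/1 sequence
    aperture = np.exp(1j * np.pi * coding)        # 1-bit phases: 0 or pi
    # Zero-padded 2-D FFT as a quick preview of the scattering pattern;
    # a chessboard sequence splits the reflection into four symmetric beams.
    pattern = np.fft.fftshift(np.fft.fft2(aperture, s=(256, 256)))
    peak = np.unravel_index(np.abs(pattern).argmax(), pattern.shape)
    print(peak)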
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
2016-06-01
RAVEN is a software framework able to perform parametric and stochastic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermohydraulic code RELAP-7, currently under development at Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose stochastic and uncertainty quantification platform, capable of communicating with any system code. In fact, the provided Application Programming Interfaces (APIs) allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via Python interfaces. RAVEN is capable of investigating the system response and exploring the input space using various sampling schemes such as Monte Carlo, grid, or Latin hypercube. However, RAVEN's strength lies in its system feature discovery capabilities, such as constructing limit surfaces, separating regions of the input space leading to system failure, and using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework arose. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just to identify the frequency of an event potentially leading to a system failure, but the proximity (or lack thereof) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g., peak pressure in a pipe) is exceeded under certain conditions. Most of the capabilities, implemented having RELAP-7 as a principal focus, are easily deployable to other system codes. For this reason, several side activities have been completed (e.g., coupling with RELAP5-3D and with any MOOSE-based application) or are currently ongoing for coupling RAVEN with several different software packages. The aim of this document is to provide a set of commented examples that can help the user to become familiar with the RAVEN code usage.
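A minimal sketch of the sampling workflow (not RAVEN's actual API; the parameter names and the stand-in system code are assumptions) shows the idea of perturbing code inputs with Latin hypercube sampling and estimating an exceedance probability from the responses:

    import numpy as np

    def latin_hypercube(n_samples, n_dims, rng):
        """One stratified sample per interval, independently permuted per dimension."""
        cut = (np.arange(n_samples) + rng.random((n_dims, n_samples))) / n_samples
        for row in cut:
            rng.shuffle(row)
        return cut.T                    # shape (n_samples, n_dims)

    def system_code(power, flow):
        """Stand-in for a driven code such as RELAP-7: a peak-pressure metric."""
        return 7.0 + 3.0 * power - 2.0 * flow

    rng = np.random.default_rng(42)
    samples = latin_hypercube(100, 2, rng)
    responses = np.array([system_code(p, f) for p, f in samples])
    print(f"P(peak pressure > 8.5) ~ {np.mean(responses > 8.5):.2f}")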
Computation of Reacting Flows in Combustion Processes
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Chen, Kuo-Huey
1997-01-01
The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D. The ALLSPD-3D computer program was developed for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. Also, it is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue-rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code is written in such a way that it can be run on various computer platforms (supercomputers, workstations, and parallel processors), and the GUI (Graphical User Interface) should provide a user-friendly tool for setting up and running the code.
Battisti, Robert A; Roodenrys, Steven; Johnstone, Stuart J; Pesa, Nicole; Hermens, Daniel F; Solowij, Nadia
2010-12-01
Chronic cannabis use has been related to deficits in cognition (particularly memory) and the normal functioning of brain structures sensitive to cannabinoids. There is increasing evidence that conflict monitoring and resolution processes (i.e. the ability to detect and respond to change) may be affected. This study examined the ability to inhibit an automatic reading response in order to activate a more difficult naming response (i.e. conflict resolution) in a variant of the discrete trial Stroop colour-naming task. Event-related brain potentials to neutral, congruent and incongruent trials were compared between 21 cannabis users (mean 16.4 years of near daily use) in the unintoxicated state and 19 non-using controls. Cannabis users showed increased errors on colour-incongruent trials (e.g. "RED" printed in blue ink) but no performance differences from controls on colour congruent (e.g. "RED" printed in red ink) or neutral trials (e.g. "*****" printed in green ink). Poorer incongruent trial performance was predicted by an earlier age of onset of regular cannabis use. Users showed altered expression of a late sustained potential related to conflict resolution, evident by opposite patterns of activity between trial types at midline and central sites, and altered relationships between neurophysiological and behavioural outcome measures not evident in the control group. These findings indicate that chronic use of cannabis may impair the brain's ability to respond optimally in the presence of events that require conflict resolution and hold implications for the ability to refrain from substance misuse and/or maintain substance abstention behaviours.
Automatic removal of cosmic ray signatures in Deep Impact images
NASA Astrophysics Data System (ADS)
Ipatov, S. I.; A'Hearn, M. F.; Klaasen, K. P.
The results of recognition of cosmic ray (CR) signatures on single images taken during the Deep Impact mission were analyzed for several codes written by several authors. For automatic removal of CR signatures on many images, we suggest using the code imgclean ( http://pdssbn.astro.umd.edu/volume/didoc_0001/document/calibration_software/dical_v5/) written by E. Deutsch, as the other codes considered do not work properly automatically with a large number of images and do not run to completion for some images; however, other codes can be better for analysis of certain specific images. Sometimes imgclean detects false CR signatures near the edge of a comet nucleus, and it often does not recognize all pixels of long CR signatures. Our code rmcr is the only code among those considered that allows one to work with raw images. For most visual images taken during low solar activity at exposure time t > 4 s, the number of clusters of bright pixels on an image per second per sq. cm of CCD was about 2-4, both for dark and normal sky images. At high solar activity, it sometimes exceeded 10. The ratio of the number of CR signatures consisting of n pixels obtained at high solar activity to that at low solar activity was greater for greater n. The number of clusters detected as CR signatures on a single infrared image is greater, by at least a factor of several, than the actual number of CR signatures; the number of clusters based on analysis of two successive dark infrared frames is in agreement with the expected number of CR signatures. Some glitches falsely detected as CR signatures consist of bright pixels repeatedly present on different infrared images. Our interactive code imr allows a user to choose the regions of an image in which glitches detected by imgclean as CR signatures are ignored. In the other regions chosen by the user, the brightness of some pixels is replaced by the local median brightness if the brightness of these pixels is greater by some factor than the median brightness. The interactive code allows one to delete long CR signatures and prevents removal of false CR signatures near the edge of the nucleus of the comet. The interactive code can be applied to editing any digital images. The results obtained can be used for other missions to comets.
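The replacement rule described above can be sketched in a few lines of Python (the window size and threshold factor are illustrative assumptions, not the published codes): a pixel whose brightness exceeds the local median by some factor is treated as a CR hit and replaced by that median.

    import numpy as np
    from scipy.ndimage import median_filter

    def remove_cr(image, size=5, factor=3.0):
        """Replace pixels much brighter than their local median."""
        med = median_filter(image, size=size)
        hits = image > factor * np.maximum(med, 1e-6)   # guard zero medians
        cleaned = image.copy()
        cleaned[hits] = med[hits]
        return cleaned, int(hits.sum())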
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia
In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates to support a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data. SECPOP90 was released in 1997 to use 1990 population and economic data. SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop 4.3.0, which uses 2010 population data and both 2007 and 2012 economic data; it is also compatible with 2000 census and 2002 economic data. The report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. Appendices describe the development of the 2010 census file, the 2007 county file, and the 2012 county file. Finally, an appendix is included that describes the validation assessments performed.
NASA Astrophysics Data System (ADS)
Umansky, Moti; Weihs, Daphne
2012-08-01
In many physical and biophysical studies, single-particle tracking is utilized to reveal interactions, diffusion coefficients, active modes of driving motion, dynamic local structure, micromechanics, and microrheology. The basic analysis applied to those data is to determine the time-dependent mean-square displacement (MSD) of particle trajectories and perform time- and ensemble-averaging of similar motions. The motion of particles typically exhibits time-dependent power-law scaling, and only trajectories with qualitatively and quantitatively comparable MSD should be ensembled. Ensemble-averaging trajectories that arise from different mechanisms, e.g., actively driven and diffusive, is incorrect and can result in inaccurate correlations between structure, mechanics, and activity. We have developed an algorithm to automatically and accurately determine the power-law scaling of experimentally measured single-particle MSD. Trajectories can then be categorized and grouped according to user-defined cutoffs of time, amplitudes, scaling-exponent values, or combinations thereof. Power-law fits are then provided for each trajectory alongside categorized groups of trajectories, histograms of power laws, and the ensemble-averaged MSD of each group. The codes are designed to be easily incorporated into existing user codes. We expect that this algorithm and program will be invaluable to anyone performing single-particle tracking, be it in physical or biophysical systems. Catalogue identifier: AEMD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMD_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 25 892 No. of bytes in distributed program, including test data, etc.: 5 572 780 Distribution format: tar.gz Programming language: MATLAB (MathWorks Inc.) version 7.11 (2010b) or higher; the program should also be backwards compatible. The Symbolic Math Toolbox (5.5) is required. The Curve Fitting Toolbox (3.0) is recommended. Computer: Tested on Windows only, yet should work on any computer running MATLAB. In Windows 7, the program should be run as administrator; if the user is not the administrator, the program may not be able to save outputs and temporary outputs to all locations. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.11 / 2010b or higher. Supplementary material: Sample output files (approx. 30 MBytes) are available. Classification: 12 External routines: Several MATLAB subfunctions (m-files), freely available on the web, were used as part of, and included in, this code: count, NaN suite, parseArgs, roundsd, subaxis, wcov, wmean, and the executable pdfTK.exe. Nature of problem: In many physical and biophysical areas employing single-particle tracking, knowing the time-dependent power laws governing the time-averaged mean-square displacement (MSD) of a single particle is crucial. Those power laws determine the mode of motion and hint at the underlying mechanisms driving motion. Accurate determination of the power laws that describe each trajectory allows categorization into groups for further analysis of single trajectories or ensemble analysis, e.g., ensemble- and time-averaged MSD. Solution method: The algorithm in the provided program automatically analyzes and fits time-dependent power laws to single-particle trajectories, then groups particles according to user-defined cutoffs.
It accepts time-dependent trajectories of several particles; each trajectory is run through the program, its time-averaged MSD is calculated, and power laws are determined in regions where the MSD is linear on a log-log scale. Our algorithm searches for high-curvature points in experimental data, here the time-dependent MSD. Those serve as anchor points for determining the ranges of the power-law fits. Power-law scaling is then accurately determined, and error estimations of the parameters and quality of fit are provided. After all single-trajectory time-averaged MSDs are fit, we obtain cutoffs from the user to categorize and segment the power laws into groups; cutoffs are either in the exponents of the power laws, the time of appearance of the fits, or both together. The trajectories are sorted according to the cutoffs, and the time- and ensemble-averaged MSD of each group is provided, with histograms of the distributions of the exponents in each group. The program then allows the user to generate new trajectory files with trajectories segmented according to the determined groups, for any further required analysis. Additional comments: A README file giving the names and a brief description of all the files that make up the package, and clear instructions on the installation and execution of the program, is included in the distribution package. Running time: On an i5 Windows 7 machine with 4 GB RAM, the automated parts of the run (excluding data loading and user input) take less than 45 minutes to analyze and save all stages for an 844-trajectory file, including the optional PDF save. Trajectory length did not affect run time (tested up to 3600 frames/trajectory), which was on average 3.2±0.4 seconds per trajectory.
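The core fitting step can be illustrated with a short Python sketch (a simplified stand-in for the published MATLAB package; the lag range is an assumed cutoff): on a log-log scale, MSD(t) = A t^alpha is a straight line, so a linear fit of log(MSD) against log(lag) yields the scaling exponent used to classify each trajectory.

    import numpy as np

    def msd(track):
        """Time-averaged MSD of a (n_frames, 2) trajectory."""
        n = len(track)
        lags = np.arange(1, n)
        vals = [np.mean(np.sum((track[m:] - track[:-m]) ** 2, axis=1)) for m in lags]
        return lags, np.array(vals)

    def power_law_exponent(lags, msd_vals):
        """Slope of log(MSD) vs. log(lag) is the scaling exponent alpha."""
        alpha, log_a = np.polyfit(np.log(lags), np.log(msd_vals), 1)
        return alpha, np.exp(log_a)

    rng = np.random.default_rng(0)
    walk = np.cumsum(rng.normal(size=(1000, 2)), axis=0)   # diffusive track
    lags, m = msd(walk)
    alpha, _ = power_law_exponent(lags[:50], m[:50])
    print(f"alpha ~ {alpha:.2f} (pure diffusion gives alpha ~ 1)")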
Development of PIMAL: Mathematical Phantom with Moving Arms and Legs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akkurt, Hatice; Eckerman, Keith F.
2007-05-01
The computational model of the human anatomy (phantom) has gone through many revisions since its initial development in the 1970s. The computational phantom model currently used by the Nuclear Regulatory Commission (NRC) is based on a model published in 1974. Hence, the phantom model used by the NRC staff was missing some organs (e.g., neck, esophagus) and tissues, and the locations of some organs (e.g., thyroid) were inappropriate. Moreover, all the computational phantoms were assumed to be in the vertical-upright position, whereas many occupational radiation exposures occur with the worker in other positions. In the first phase of this work, updates to the computational phantom models were reviewed, and a revised phantom model, which includes the updates for the relevant organs and compositions, was identified. This revised model was adopted as the starting point for this development work, and hence a series of radiation transport computations, using the Monte Carlo code MCNP5, was performed. The computational results were compared against values reported by the International Commission on Radiological Protection (ICRP) in Publication 74. For some of the organs (e.g., thyroid), there were discrepancies between the computed values and the results reported in ICRP-74. The reasons behind these discrepancies have been investigated and are discussed in this report. Additionally, sensitivity computations were performed to determine the sensitivity of the organ doses to certain parameters, including the compositions and cross sections used in the simulations. To assess the dose for more realistic exposure configurations, the phantom model was revised to enable flexible positioning of the arms and legs. Furthermore, to reduce the user time for analyses, a graphical user interface (GUI) was developed. The GUI can be used to visualize the positioning of the arms and legs as the desired posture is achieved, to generate the input file, invoke the computations, and extract the organ dose values from the MCNP5 output file. In this report, the main features of the phantom model with moving arms and legs and the user interface are described.
Parallel evolutionary computation in bioinformatics applications.
Pinho, Jorge; Sobral, João Luis; Rocha, Miguel
2013-05-01
A large number of optimization problems within the field of Bioinformatics require methods able to handle their inherent complexity (e.g., NP-hard problems) and also demand increased computational effort. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java-based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of their efficient execution on a wide range of parallel architectures. The proposed approach focuses on ease of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of the parallelism-related modules allows the user to easily configure the environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Gil, Yolanda; Michel, Felix; Ratnakar, Varun; Read, Jordan S.; Hauder, Matheus; Duffy, Christopher; Hanson, Paul C.; Dugan, Hilary
2015-01-01
The Web was originally developed to support collaboration in science. Although scientists benefit from many forms of collaboration on the Web (e.g., blogs, wikis, forums, code sharing, etc.), most collaborative projects are coordinated over email, phone calls, and in-person meetings. Our goal is to develop a collaborative infrastructure for scientists to work on complex science questions that require multi-disciplinary contributions to gather and analyze data, that cannot proceed without significant coordination to synthesize findings, and that grow organically to accommodate new contributors as needed as the work evolves over time. Our approach is to develop an organic data science framework that is based on a task-centered organization of the collaboration, includes principles from the social sciences for successful on-line communities, and exposes an open science process. Our approach is implemented as an extension of a semantic wiki platform, and captures formal representations of task decomposition structures, relations between tasks and users, and other properties of tasks, data, and other relevant science objects. All these entities are captured through the semantic wiki user interface, represented as semantic web objects, and exported as linked data.
Status and plans for the future of the Vienna VLBI Software
NASA Astrophysics Data System (ADS)
Madzak, Matthias; Böhm, Johannes; Böhm, Sigrid; Girdiuk, Anastasiia; Hellerschmied, Andreas; Hofmeister, Armin; Krasna, Hana; Kwak, Younghee; Landskron, Daniel; Mayer, David; McCallum, Jamie; Plank, Lucia; Schönberger, Caroline; Shabala, Stanislav; Sun, Jing; Teke, Kamil
2016-04-01
The Vienna VLBI Software (VieVS) is a VLBI analysis software package developed and maintained at Technische Universität Wien (TU Wien) since 2008, with contributions from groups all over the world. It is used both for academic purposes in university courses and for providing VLBI analysis results to the geodetic community. Written in a modular structure in Matlab, VieVS offers easy access to the source code and the possibility to adapt the programs for particular purposes. The new version 2.3, released in December 2015, includes several new parameters to be estimated in the global solution, such as tidal ERP variation coefficients. The graphical user interface was slightly modified for improved user functionality, including, e.g., the possibility of deriving baseline length repeatabilities. The scheduling of satellite observations was refined, and the simulator now includes the effect of source structure, which can also be corrected for in the analysis. This poster gives an overview of all VLBI-related activities in Vienna and provides an outlook on future plans concerning the Vienna VLBI Software.
AMPS/PC - AUTOMATIC MANUFACTURING PROGRAMMING SYSTEM
NASA Technical Reports Server (NTRS)
Schroer, B. J.
1994-01-01
The AMPS/PC system is a simulation tool designed to aid the user in defining the specifications of a manufacturing environment and then automatically writing code for the target simulation language, GPSS/PC. The domain of problems that AMPS/PC can simulate is manufacturing assembly lines with subassembly lines and manufacturing cells. The user defines the problem domain by responding to questions from the interface program. Based on the responses, the interface program creates an internal problem-specification file. This file includes the manufacturing process network flow and the attributes for all stations, cells, and stock points. AMPS then uses the problem-specification file as input to the automatic code generator program to produce a simulation program in the target language GPSS, as sketched below. The output of the generator program is the source code of the corresponding GPSS/PC simulation program. The system runs entirely on an IBM PC running PC DOS Version 2.0 or higher and is written in Turbo Pascal Version 4, requiring 640K of memory and one 360K disk drive. To execute the GPSS program, the PC must have resident the GPSS/PC System Version 2.0 from Minuteman Software. The AMPS/PC program was developed in 1988.
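A toy generator in the same spirit (the station names, service times, and the emitted GPSS fragment are illustrative assumptions, not AMPS/PC output) walks a problem-specification structure and emits simulation source for each station in the process flow:

    # Toy spec-to-GPSS generator: one QUEUE/SEIZE/DEPART/ADVANCE/RELEASE
    # block sequence per station in the manufacturing flow.
    spec = {
        "stations": [
            {"name": "DRILL", "service_time": 4.5},
            {"name": "ASSEMBLE", "service_time": 7.0},
        ]
    }

    def generate_gpss(spec):
        lines = ["SIMULATE"]
        for st in spec["stations"]:
            lines += [
                f"    QUEUE   Q_{st['name']}",
                f"    SEIZE   {st['name']}",
                f"    DEPART  Q_{st['name']}",
                f"    ADVANCE {st['service_time']}",
                f"    RELEASE {st['name']}",
            ]
        lines.append("END")
        return "\n".join(lines)

    print(generate_gpss(spec))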
Cunningham, Charles E; Niccols, Alison; Rimas, Heather; Robicheau, Randi; Anderson, Colleen; DeVries, Bart
2017-10-01
To engage users in the design of a regional child and youth health center. The perspective of users should be an integral component of a patient-centered, evidence-based approach to the design of health facilities. We conducted a discrete choice conjoint experiment (DCE), a method from marketing research and health economics, as a component of a strategy to engage users in the preconstruction planning process. A sample of 467 participants (290 staff and 177 clients or community stakeholders) completed the DCE. Latent class analysis identified three segments with different design preferences. A group we termed the enhanced-design segment (57%) preferred a fully featured facility with personal contacts at the start of visits (in-person check-in, personal waiting-room notification, volunteer-assisted wayfinding, and visible security), a family resource center with a health librarian, and an outdoor playground equipped with covered heated pathways. The self-guided design segment (11%), in contrast, preferred a design allowing more independent use of the facility (e.g., self-check-in at computer kiosks, color-coded wayfinding, and a self-guided family resource center). Designs affording privacy and personal contact with staff were important to the private-design segment (32%). The theme and decor of the building were less important than interactive features and personal contacts. A DCE allowed us to engage users in the planning process by estimating the value of individual design elements, identifying segments with differing views, informing decisions regarding design trade-offs, and simulating user response to design options.
Avidan, Alexander; Weissman, Charles; Levin, Phillip D
2015-04-01
Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case-logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction among residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
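The mechanism is simple to reproduce; a minimal Python sketch (the record fields are hypothetical, and the third-party qrcode package stands in for whatever the information management system actually used) serializes a case record and renders it as a scannable image:

    import json
    import qrcode  # third-party package: pip install qrcode[pil]

    case = {"date": "2015-04-01", "procedure": "appendectomy",
            "anesthesia": "general", "asa": 2}
    img = qrcode.make(json.dumps(case))   # encode the case record as a QR code
    img.save("case_log.png")              # displayed at the end of the case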
MAC/GMC Code Enhanced for Coupled Electromagnetothermoelastic Analysis of Smart Composites
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.; Aboudi, Jacob
2002-01-01
Intelligent materials are those that exhibit coupling between their electromagnetic response and their thermomechanical response. This coupling allows smart materials to react mechanically (e.g., with an induced displacement) to applied electric or magnetic fields, for instance. These materials find many important applications in sensors, actuators, and transducers. Recently, interest has arisen in the development of smart composites that are formed via the combination of two or more phases, one or more of which is a smart material. To design with and utilize smart composites, designers need theories that predict the coupled smart behavior of these materials from the electromagnetothermoelastic properties of the individual phases. The micromechanics model known as the generalized method of cells (GMC) has recently been extended to provide this important capability. This coupled electromagnetothermoelastic theory has recently been incorporated within NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC). This software package is user friendly and has many additional features that render it useful as a design and analysis tool for composite materials in general and, with its new capabilities, for smart composites as well.
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.
1998-01-01
This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High-resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise-prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A user's job preparation guide and listings of variables and namelists are given.
Henneberg, M.F.; Strause, J.L.
2002-01-01
This report presents the instructions required to use the Scour Critical Bridge Indicator (SCBI) Code and Scour Assessment Rating (SAR) calculator developed by the Pennsylvania Department of Transportation (PennDOT) and the U.S. Geological Survey to identify Pennsylvania bridges with excessive scour conditions or a high potential for scour. Use of the calculator will enable PennDOT bridge personnel to quickly recalculate these scour indices if site conditions change, new bridges are constructed, or new information needs to be included. Both indices are calculated for a bridge simultaneously because they must be used together to be interpreted accurately. The SCBI Code and SAR calculator program is run through a World Wide Web browser from a remote computer. The user can (1) add additional scenarios for bridges in the SCBI Code and SAR calculator database or (2) enter data for new bridges, then run the program to calculate the SCBI Code and the SAR. The calculator program allows the user to print the results and to save multiple scenarios for a bridge.
Geospatial Visualization of Scientific Data Through Keyhole Markup Language
NASA Astrophysics Data System (ADS)
Wernecke, J.; Bailey, J. E.
2008-12-01
The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it is the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. Originally created by Keyhole Inc., which was acquired by Google in 2004, KML was handed over to the Open Geospatial Consortium (OGC) in 2007. It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and the development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, have been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML with little or no knowledge of computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
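Because KML is plain XML, publishing data into a geobrowser takes only a few lines; the following Python sketch (the event name, magnitude, and coordinates are invented sample values) writes a single Placemark:

    # Emit a minimal KML file with one Placemark, loadable in any geobrowser.
    KML = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>{name}</name>
        <description>M{mag} earthquake</description>
        <Point><coordinates>{lon},{lat},0</coordinates></Point>
      </Placemark>
    </kml>"""

    with open("quake.kml", "w") as f:
        f.write(KML.format(name="Sample event", mag=4.6, lon=-122.8, lat=38.8))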
NASA Astrophysics Data System (ADS)
Chang, Chao-Hsi; Wang, Jian-Xiong; Wu, Xing-Gang
2006-11-01
An upgraded version of the package BCVEGPY2.0 [C.-H. Chang, J.-X. Wang, X.-G. Wu, Comput. Phys. Commun. 174 (2006) 241] is presented, which works under the LINUX system and is named BCVEGPY2.1. With this version and, additionally, a GNU C compiler, users may simulate Bc events in various experimental environments very conveniently. It has been organized with better modularity and code reusability (less cross communication among the various modules) than BCVEGPY2.0. Furthermore, in the upgraded version a special execution is arranged so that the GNU command make compiles a requested code with the help of a master makefile in the main code directory, and then builds an executable file with the default name run. Finally, this paper may also be considered as an erratum, i.e., typographical errors in BCVEGPY2.0 and the corresponding corrections are listed. New version program (BCVEGPY2.1) summary Title of program: BCVEGPY2.1 Catalogue identifier: ADTJ_v2_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTJ_v2_1 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Reference to original program: BCVEGPY2.0 Reference in CPC: Comput. Phys. Commun. 174 (2006) 241 Does the new version supersede the old program: No Computer: Any LINUX-based PC with FORTRAN 77 or FORTRAN 90 and a GNU C compiler as well Operating systems: LINUX Programming language used: FORTRAN 77/90 Memory required to execute with typical data: About 2.0 MB No. of lines in distributed program, including test data, etc.: 31 521 No. of bytes in distributed program, including test data, etc.: 1 310 179 Distribution format: tar.gz Nature of physical problem: Hadronic production of the Bc meson itself and its excited states Method of solution: The code with option can generate weighted and unweighted events. An interface to PYTHIA is provided to meet the needs of jet hadronization in the production. Restrictions on the complexity of the problem: The hadronic production of (cb¯)-quarkonium in S-wave and P-wave states via the mechanism of gluon-gluon fusion is given by the so-called 'complete calculation' approach. Reasons for new version: Responding to feedback from users, we rearranged the program in a convenient way so that it can be easily adopted by users to do simulations according to their own experimental environment (e.g., detector acceptances and experimental cuts). We have paid much effort to rearrange the program into several modules with less cross communication among the modules; the main program is slimmed down, and all further actions are decoupled from the main program and can be easily called for various purposes. Typical running time: The typical running time is machine and user-parameter dependent. Typically, for production of the S-wave (cb¯)-quarkonium, when IDWTUP = 1, it takes about 20 hours on a 1.8 GHz Intel P4-processor machine to generate 1000 events; however, when IDWTUP = 3, generating 10^6 events takes only about 40 minutes. Of the production, the P-wave (cb¯)-quarkonium will take almost two times longer than its S-wave quarkonium. Summary of the changes (improvements): (1) The structure and organization of the program have been changed considerably. The new version package BCVEGPY2.1 has been divided into several modules with less cross communication among the modules (some old-version source files are divided into several parts for this purpose).
The main program is slimmed down, and all further actions are decoupled from the main program so that they can be easily called for various applications. All of the Fortran codes are organized in the main code directory, named bcvegpy2.1, which contains the main program, all of its prerequisite files, and subsidiary folders (subdirectories of the main code directory). The method for setting the parameters is the same as that of the previous versions [C.-H. Chang, C. Driouich, P. Eerola, X.-G. Wu, Comput. Phys. Commun. 159 (2004) 192, hep-ph/0309120].
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1 for the control module documentation; Volume 2 for functional module documentation; and Volume 3 for documentation of the data libraries and subroutine libraries.
The Role of Ontologies in Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.
2004-01-01
Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A back-end then compiles this further down into a concrete target programming language of choice. A core engine applies schemas to the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem, this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. They are large (in total around 100 kLoC of Prolog), knowledge-intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here, we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the multiple roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked.
Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.
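A minimal sketch of the schema idea in Python (an illustrative data structure only, not the AUTOBAYES/AUTOFILTER internals, which are written in Prolog) pairs an applicability condition with an expansion, in the spirit of the AUTOLINEAR example:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Schema:
        name: str
        applies: Callable[[dict], bool]   # applicability condition
        expand: Callable[[dict], str]     # refine the spec toward code

    gauss = Schema(
        name="gaussian-elimination",
        applies=lambda spec: spec.get("problem") == "linear-system",
        expand=lambda spec: f"solve {spec['matrix']} z = {spec['rhs']} by elimination",
    )

    spec = {"problem": "linear-system", "matrix": "A", "rhs": "b"}
    for schema in (gauss,):               # the engine would recurse over many schemas
        if schema.applies(spec):
            print(schema.expand(spec))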
General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets
NASA Technical Reports Server (NTRS)
Marchen, Luis F.
2011-01-01
The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered-starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction, in which the user can perform trade studies such as changing engineering requirements and identifying and isolating stringent requirements. It contains summary tables and graphics that can be used immediately for reporting results in viewgraphs. The entire process of obtaining a coronagraphic telescope performance error budget has been automated into three stages: conversion of the optical prescription from Zemax or Code V to MACOS (an in-house optical modeling and analysis tool), a linear-models process, and an error-budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package containing user-defined functions that automate the process. The user interfaces with the process through an initialization file in which the user defines the parameters of the linear-model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated using VBA code, form, and ActiveX controls.
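The error-budget arithmetic itself reduces to a matrix-vector product; a small Python sketch (the dimensions, sensitivities, and motion amplitudes are invented sample values) combines a sensitivity matrix with a nominal motion set and sums the contrast terms:

    import numpy as np

    rng = np.random.default_rng(1)
    n_terms, n_motions = 3, 5
    # Illustrative beam-walk/aberration sensitivities: contrast per nm of motion.
    sensitivity = np.abs(rng.normal(size=(n_terms, n_motions))) * 1e-11
    motions_nm = np.array([0.5, 0.2, 0.1, 0.3, 0.05])  # nominal thermal + jitter set

    contrast_terms = sensitivity @ motions_nm           # one term per error source
    print(f"total scattered-light contrast: {contrast_terms.sum():.2e}")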
NASA Astrophysics Data System (ADS)
Tarquini, Simone; Nannipieri, Luca
2017-03-01
The increasing availability of high-resolution digital elevation models (DEMs) is changing our viewpoint towards Earth surface landforms. Nevertheless, large-coverage, intermediate-resolution DEMs are still widely used, and can be the ideal choice in several applications based on the processing of spatially integrated information. In 2012 the Istituto Nazionale di Geofisica e Vulcanologia opened a website for the free download of the "TINITALY" Digital Elevation Model (DEM), which covers the whole Italian territory. Since then, about 700 users from 28 different countries have been accredited for data download, and a report of 4 years of data dissemination and use is presented. The analysis of the intended use reveals that the 10 m-resolution, seamless TINITALY DEM serves an extremely varied research community. Accredited users are working in virtually any branch of the Earth Sciences (e.g., Volcanology, Seismology, and Geomorphology), in spatially integrated humanities (e.g., History and Archaeology), and in other thematic areas such as applied Physics and Zoology. Many users are also working in local administrations (e.g., Regions and Municipalities) for civil protection or land-use planning purposes. In summary, the documented activity shows that the dissemination of seamless, large-coverage elevation datasets can fertilize the technological progress of society as a whole, providing a significant benefit to stakeholders.
Proactive Aging Among Holocaust Survivors: Striving for the Best Possible Life.
Elran-Barak, Roni; Barak, Adi; Lomranz, Jacob; Benyamini, Yael
2016-10-14
To investigate methods that older Holocaust survivors and their age peers use in order to maintain the best possible life and to examine associations between these methods and subjective well-being. Participants were 481 older Israelis (mean age 77.4 ± 6.7 years): Holocaust survivors (n = 164), postwar immigrants (n = 183), and prewar immigrants (n = 134). Measures included sociodemographics and indicators of health and well-being. Respondents were asked to answer an open-ended question: "What are the methods you use to maintain the best possible life?". Answers were coded into eight categories. Holocaust survivors were significantly less likely to mention methods coded as "Enjoyment" (32.3%) relative to postwar (43.7%) and prewar (46.2%) immigrants and significantly more likely to mention methods coded as "Maintaining good health" (39.0%) relative to postwar (27.9%) and prewar (21.6%) immigrants. Controlling for sociodemographics and health status, Holocaust survivors still differed from their peers. Aging Holocaust survivors tended to focus on more essential/fundamental needs (e.g., health), whereas their peers tended to focus on a wider range of needs (e.g., enjoyment) in their effort to maintain the best possible life. Our findings may add to the proactivity model of successful aging by suggesting that aging individuals in Israel use both proactive (e.g., health) and cognitive (e.g., accepting the present) adaptation methods, regardless of their reported history during the war. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
EMPHASIS™/Nevada CABANA User Guide Version 2.1.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, C. David; Bohnhoff, William J.; Powell, Jennifer L.
2017-11-15
The CABle ANAlysis (CABANA) portion of the EMPHASIS™ suite is designed specifically for the simulation of cable SGEMP. The code can be used to evaluate the response of a specific cable design to a threat or to compare and minimize the relative responses of different designs. This document provides user-specific information to facilitate the application of the code to cables of interest.
Code Optimization and Parallelization on the Origins: Looking from Users' Perspective
NASA Technical Reports Server (NTRS)
Chang, Yan-Tyng Sherry; Thigpen, William W. (Technical Monitor)
2002-01-01
Parallel machines are becoming the main compute engines for high performance computing. Despite their increasing popularity, it is still a challenge for most users to learn the basic techniques to optimize/parallelize their codes on such platforms. In this paper, we present some experiences on learning these techniques for the Origin systems at the NASA Advanced Supercomputing Division. Emphasis of this paper will be on a few essential issues (with examples) that general users should master when they work with the Origins as well as other parallel systems.
An arbitrary grid CFD algorithm for configuration aerodynamics analysis. Volume 2: FEMNAS user guide
NASA Technical Reports Server (NTRS)
Manhardt, Paul D.; Orzechowski, J. A.; Baker, A. J.
1992-01-01
This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.
A Risk-Continuum Categorization of Product Use Among US Youth Tobacco Users.
El-Toukhy, Sherine; Choi, Kelvin
2016-07-01
To examine the prevalence and correlates of five mutually exclusive tobacco-use patterns among US youth tobacco users. A nationally representative sample of tobacco users (N = 3202, 9-17 years) was classified into five product-use patterns. Weighted multinomial and multivariate logistic regression models were used to examine the prevalence of product-use patterns by gender, race and ethnicity, and grade level, and the associations between product-use patterns and perceived accessibility of tobacco products, exposure and receptivity to pro-tobacco marketing, social benefits of smoking, and tobacco-associated risks. Dual use (ie, use of two product categories) was the most prevalent pattern (30.5%), followed by non-cigarette combustible only (26.7%), polytobacco (ie, use of three product categories; 17.5%), cigarette only (14.9%), and noncombustible only (10.4%) use. Product-use patterns differed by gender, race, and ethnicity. Compared to cigarette-only users, dual and polytobacco users were more likely to be exposed to and be receptive to pro-tobacco marketing, and were less likely to acknowledge tobacco-use related risks (Ps < .05). Curbing tobacco use warrants research on users of more than one tobacco-product category according to the risk-continuum categorization. We present a risk-continuum categorization of product-use patterns among tobacco users not older than 17 years. We classify tobacco users into five mutually exclusive product-use patterns: cigarette only, non-cigarette combustible only, noncombustible only, dual use, and polytobacco use. This categorization overcomes limitations in the current literature on tobacco-use patterns, which include exclusion of certain products (eg, e-cigarettes) and product-use patterns (eg, exclusive users of non-cigarette products), and inconsistent classification of tobacco users. It is parsimonious yet complex enough to retain differential characteristics of subgroups of tobacco users based on the number (single, dual, polytobacco) and categories (cigarettes, non-cigarette combustibles, noncombustibles) of tobacco products consumed. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Solid Geometric Modeling - The Key to Improved Materiel Acquisition from Concept to Deployment
1984-09-01
M. J. Reisinger, "The GIFT Code User Manual; Volume I, Introduction and Input Requirements (U)," BRL Report No. 1802, July 1975, AD# A078364. G. G. Kuehl, L. W. Bain, Jr., M. J. Reisinger, "The GIFT Code User Manual; Volume II, The Output Options (U)," USA ARRADCOM Report No. 02189, Sep 1979, AD# A078364. These results are plotted by a code called RunShot, written by L. M. Rybak, which takes input from GIFT and plots color shotlines on a
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.
1994-01-01
The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. This user's manual describes how to use the ADPAC code as developed in Task 5, NAS3-25270, including the modifications made to date in Tasks 7 and 8, NAS3-25270.
1987-03-31
Table 1-1 (cont.), agency codes (Department of Defense): DQ, Central Security Service; DR, Defense Contract Audit Agency; DS, Defense ...
Perceiving groups: The people perception of diversity and hierarchy.
Phillips, L Taylor; Slepian, Michael L; Hughes, Brent L
2018-05-01
The visual perception of individuals has received considerable attention (visual person perception), but little social psychological work has examined the processes underlying the visual perception of groups of people (visual people perception). Ensemble-coding is a visual mechanism that automatically extracts summary statistics (e.g., average size) of lower-level sets of stimuli (e.g., geometric figures), and also extends to the visual perception of groups of faces. Here, we consider whether ensemble-coding supports people perception, allowing individuals to form rapid, accurate impressions about groups of people. Across nine studies, we demonstrate that people visually extract high-level properties (e.g., diversity, hierarchy) that are unique to social groups, as opposed to individual persons. Observers rapidly and accurately perceived group diversity and hierarchy, or variance across race, gender, and dominance (Studies 1-3). Further, results persist when observers are given very short display times, backward pattern masks, color- and contrast-controlled stimuli, and absolute versus relative response options (Studies 4a-7b), suggesting robust effects supported specifically by ensemble-coding mechanisms. Together, we show that humans can rapidly and accurately perceive not only individual persons, but also emergent social information unique to groups of people. These people perception findings demonstrate the importance of visual processes for enabling people to perceive social groups and behave effectively in group-based social interactions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Blind information-theoretic multiuser detection algorithms for DS-CDMA and WCDMA downlink systems.
Waheed, Khuram; Salem, Fathi M
2005-07-01
Code division multiple access (CDMA) is based on spread-spectrum technology and is a dominant air interface for 2.5G, 3G, and future wireless networks. For the CDMA downlink, the transmitted CDMA signals from the base station (BS) propagate through a noisy multipath fading communication channel before arriving at the receiver of the user equipment/mobile station (UE/MS). Classical CDMA single-user detection (SUD) algorithms implemented in the UE/MS receiver do not provide the required performance for modern high data-rate applications. In contrast, multi-user detection (MUD) approaches require a great deal of a priori information that is not available to the UE/MS. In this paper, three promising adaptive user detection approaches based on the Riemannian contra-variant (or natural) gradient, capable of handling highly dynamic wireless environments, are proposed. The first approach, blind multiuser detection (BMUD), is the process of simultaneously estimating multiple symbol sequences associated with all the users in the downlink of a CDMA communication system using only the received wireless data, without any knowledge of the user spreading codes. This approach is applicable to CDMA systems with relatively short spreading codes but becomes impractical for systems using long spreading codes. We also propose two other adaptive approaches, namely RAKE-blind source recovery (RAKE-BSR) and RAKE-principal component analysis (RAKE-PCA), which fuse an adaptive stage into a standard RAKE receiver. This adaptation results in robust user detection algorithms with performance exceeding that of linear minimum mean squared error (LMMSE) detectors for both direct-sequence CDMA (DS-CDMA) and wideband CDMA (WCDMA) systems under conditions of congestion, imprecise channel estimation, and unmodeled multiple-access interference (MAI).
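Generically, natural-gradient blind source recovery updates a separating matrix W by the Amari rule, W <- W + mu (I - f(y) y^T) W. The sketch below illustrates that generic rule on synthetic super-Gaussian sources; it is not the authors' receiver, and the sources and channel are fabricated.

```python
# Minimal sketch of the natural-gradient blind source separation update
# (Amari's rule), the generic form of the adaptation the paper builds on.
import numpy as np

rng = np.random.default_rng(0)
n_src, n_samples = 4, 20000

# Super-Gaussian stand-ins for the per-user symbol streams.
s = rng.laplace(size=(n_src, n_samples))
A = rng.standard_normal((n_src, n_src))   # unknown mixing channel
x = A @ s                                 # received mixtures

W = np.eye(n_src)   # separating matrix
mu = 1e-3           # step size
for t in range(n_samples):
    y = W @ x[:, t]
    # Natural-gradient update; tanh suits super-Gaussian sources.
    W += mu * (np.eye(n_src) - np.outer(np.tanh(y), y)) @ W

# If recovery worked, W @ A approaches a scaled permutation matrix.
print(np.round(W @ A, 2))
```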
Remarks on CFD validation: A Boeing Commercial Airplane Company perspective
NASA Technical Reports Server (NTRS)
Rubbert, Paul E.
1987-01-01
Requirements and meaning of validation of computational fluid dynamics codes are discussed. Topics covered include: validating a code, validating a user, and calibrating a code. All results are presented in viewgraph format.
76 FR 20611 - Electronic On-Board Recorders and Hours of Service Supporting Documents
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-13
..., used, and disseminated (e.g., in post- accident litigation or in personal litigation such as divorce proceedings). Based on the factors above, the Agency has determined that the statute requires it to protect... Doc. 2011-8789 Filed 4-12-11; 8:45 am] BILLING CODE 4910-EX-P ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
Everyday listening questionnaire: correlation between subjective hearing and objective performance.
Brendel, Martina; Frohne-Buechner, Carolin; Lesinski-Schiedat, Anke; Lenarz, Thomas; Buechner, Andreas
2014-01-01
Clinical experience has demonstrated that speech understanding by cochlear implant (CI) recipients has improved over recent years with the development of new technology. The Everyday Listening Questionnaire 2 (ELQ 2) was designed to collect information regarding the challenges faced by CI recipients in everyday listening. The aim of this study was to compare self-assessment of CI users using ELQ 2 with objective speech recognition measures and to compare results between users of older and newer coding strategies. During their regular clinical review appointments a group of representative adult CI recipients implanted with the Advanced Bionics implant system were asked to complete the questionnaire. The first 100 patients who agreed to participate in this survey were recruited independent of processor generation and speech coding strategy. Correlations between subjectively scored hearing performance in everyday listening situations and objectively measured speech perception abilities were examined relative to the speech coding strategies used. When subjects were grouped by strategy there were significant differences between users of older 'standard' strategies and users of the newer, currently available strategies (HiRes and HiRes 120), especially in the categories of telephone use and music perception. Significant correlations were found between certain subjective ratings and the objective speech perception data in noise. There is a good correlation between subjective and objective data. Users of more recent speech coding strategies tend to have fewer problems in difficult hearing situations.
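A minimal sketch of the reported subjective-objective correlation analysis, with fabricated ELQ ratings and speech-in-noise scores standing in for the study data:

```python
# Sketch of correlating subjective questionnaire ratings with objective
# speech-in-noise scores; all values are fabricated for illustration.
from scipy.stats import pearsonr

elq_telephone = [3.0, 4.5, 2.0, 5.0, 3.5, 4.0, 2.5, 4.5]  # subjective (1-5)
speech_in_noise = [42, 61, 30, 70, 48, 55, 35, 66]        # % words correct

r, p = pearsonr(elq_telephone, speech_in_noise)
print(f"r = {r:.2f}, p = {p:.3f}")
```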
Programmable ubiquitous telerobotic devices
NASA Astrophysics Data System (ADS)
Doherty, Michael; Greene, Matthew; Keaton, David; Och, Christian; Seidl, Matthew L.; Waite, William; Zorn, Benjamin G.
1997-12-01
We are investigating a field of research that we call ubiquitous telepresence, which involves the design and implementation of low-cost robotic devices that can be programmed and operated from anywhere on the Internet. These devices, which we call ubots, can be used for academic purposes (e.g., a biologist could remotely conduct a population survey), commercial purposes (e.g., a real-estate agent could show a house remotely), and for recreation and education (e.g., someone could tour a museum remotely). We anticipate that such devices will become increasingly common due to recent changes in hardware and software technology. In particular, current hardware technology enables such devices to be constructed very cheaply (less than $500), and current software and network technology allows highly portable code to be written and downloaded across the Internet. In this paper, we present our prototype system architecture and the ubot implementation we have constructed based on it. The hardware platform we use is the Handy Board, a 6811-based controller board with digital and analog inputs and outputs. Our software includes a network layer based on TCP/IP and software layers written in Java. It enables users across the Internet to program the behavior of the vehicle and to receive image feedback from a camera mounted on it.
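As a rough sketch of such a command-driven network layer (the original stack used Java over TCP/IP; the command set below is invented for illustration and the motor actions are stubbed):

```python
# Minimal sketch of a TCP command loop for a remotely programmable robot,
# in the spirit of the ubot architecture. Commands are invented; actions
# are stubbed with acknowledgment replies.
import socket

def handle(conn):
    with conn, conn.makefile("rw") as f:
        for line in f:
            cmd, *args = line.split()
            if cmd == "FWD":      # drive forward n units (stubbed)
                f.write(f"OK moved {args[0]}\n")
            elif cmd == "TURN":   # rotate by n degrees (stubbed)
                f.write(f"OK turned {args[0]}\n")
            elif cmd == "QUIT":
                f.write("BYE\n")
                return
            else:
                f.write("ERR unknown command\n")
            f.flush()

server = socket.create_server(("0.0.0.0", 9000))
while True:
    conn, addr = server.accept()
    handle(conn)
```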
Clinician-patient communication measures: drilling down into assumptions, approaches, and analyses.
Street, Richard L; Mazor, Kathleen M
2017-08-01
To critically examine properties of clinician-patient communication measures and offer suggestions for selecting measures appropriate to the purposes of research or clinical practice assessment. We analyzed different types of communication measures by focusing on their ontological properties. We describe their relative advantages and disadvantages with respect to different types of research questions. Communication measures vary along dimensions of reporter (observer vs. participant), focus of measurement (behavior, meaning, or quality), target, and timing. Observer-coded measures of communication behavior function well as dependent variables (e.g., evaluating communication skill interventions, examining variability related to gender or race), but are less effective as predictors of perceptions and health outcomes. Measures of participants' judgments (e.g., what the communication means or how well it was done) capture patients' or clinicians' experiences (e.g., satisfaction) and can be useful for predicting outcomes, especially in longitudinal designs. In the absence of a theoretically coherent set of measures that could be used across research programs and applied settings, users should take steps to select measures with properties that are optimally matched to their specific questions. Quality assessments of clinician-patient communication should take into account the timing of the assessment and use measures that drill down into specific aspects of patient experience to mitigate ceiling effects. Copyright © 2017 Elsevier B.V. All rights reserved.
Franco, Natália M; Medeiros, Gabriel F; Silva, Edson A; Murta, Angela S; Machado, Aydano P; Fidalgo, Robson N
2015-01-01
This work presents a Modeling Language and its technological infrastructure for customizing the vocabulary of Communication Boards (CB), which are important tools for providing more humanized health care. Using a technological infrastructure based on the Model-Driven Development (MDD) approach, our Modeling Language (ML) creates an abstraction layer between users (e.g., health professionals such as audiologists or speech therapists) and application code. Moreover, the use of a metamodel enables a syntactic corrector that prevents the creation of invalid models. Our ML and metamodel give health professionals more autonomy in creating customized CB because they abstract away complexity and let professionals deal only with domain concepts (e.g., vocabulary and patient needs). Additionally, our infrastructure provides a configuration file that can be used to share and reuse models, so the vocabulary modeling effort should decrease over time as people share vocabulary models. Our study provides an infrastructure that aims to abstract the complexity of CB vocabulary customization, giving health professionals more autonomy when they need to customize, share, and reuse vocabularies for CB.
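The metamodel's role as a syntactic corrector can be illustrated with a toy validation pass; the element kinds and required fields below are invented for illustration and are not the authors' actual metamodel:

```python
# Toy sketch of metamodel-based validation for a communication-board
# vocabulary model. Kinds and required fields are invented.
METAMODEL = {
    "symbol": {"required": {"label", "image", "category"}},
    "category": {"required": {"name"}},
}

def validate(model):
    """Reject models whose elements do not conform to the metamodel."""
    errors = []
    for i, element in enumerate(model):
        kind = element.get("kind")
        spec = METAMODEL.get(kind)
        if spec is None:
            errors.append(f"element {i}: unknown kind {kind!r}")
            continue
        missing = spec["required"] - element.keys()
        if missing:
            errors.append(f"element {i}: missing {sorted(missing)}")
    return errors

board = [
    {"kind": "category", "name": "needs"},
    {"kind": "symbol", "label": "water", "category": "needs"},  # no image
]
print(validate(board))  # -> ["element 1: missing ['image']"]
```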
14 CFR 1215.111 - User postponement of service.
Code of Federal Regulations, 2011 CFR
2011-01-01
... RELAY SATELLITE SYSTEM (TDRSS) Use and Reimbursement Policy for Non-U.S. Government Users § 1215.111 User postponement of service. The user may postpone the initiation of contracted service (e.g., user... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true User postponement of service. 1215.111...
14 CFR 1215.111 - User postponement of service.
Code of Federal Regulations, 2010 CFR
2010-01-01
... RELAY SATELLITE SYSTEM (TDRSS) Use and Reimbursement Policy for Non-U.S. Government Users § 1215.111 User postponement of service. The user may postpone the initiation of contracted service (e.g., user... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false User postponement of service. 1215.111...
Implementation of behavior change techniques in mobile applications for physical activity.
Yang, Chih-Hsiang; Maher, Jaclyn P; Conroy, David E
2015-04-01
Mobile applications (apps) for physical activity are popular and hold promise for promoting behavior change and reducing non-communicable disease risk. App marketing materials describe a limited number of behavior change techniques (BCTs), but apps may also include unmarketed BCTs, which are important as well. The aim was to characterize the extent to which BCTs have been implemented in apps, based on a systematic user inspection of the apps. Top-ranked physical activity apps (N=100) were identified in November 2013 and analyzed in 2014. BCTs were coded using a contemporary taxonomy following a user inspection of each app. Users identified an average of 6.6 BCTs per app, and most BCTs in the taxonomy were not represented in any app. The most common BCTs involved providing social support, information about others' approval, instructions on how to perform a behavior, demonstrations of the behavior, and feedback on the behavior. A latent class analysis of BCT configurations revealed two dominant app classes: those providing support and feedback, and those providing support and education. Contemporary physical activity apps have implemented a limited number of BCTs and have favored BCTs with a modest evidence base over others with more established evidence of efficacy (e.g., social media integration for providing social support versus active self-monitoring by users). Social support is a ubiquitous feature of contemporary physical activity apps, and differences between apps lie primarily in whether the limited BCTs provide education or feedback about physical activity. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
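Latent class analysis over binary BCT indicators is, in essence, a Bernoulli mixture model; a compact EM sketch on simulated data (not the authors' analysis) follows:

```python
# Compact EM sketch of a latent class analysis over binary BCT indicators
# (a Bernoulli mixture). Data are simulated; not the authors' model.
import numpy as np

rng = np.random.default_rng(1)
n_apps, n_bcts, n_classes = 100, 10, 2

# Simulate two app classes with different BCT profiles.
true_profiles = np.array([[0.9] * 5 + [0.1] * 5,
                          [0.1] * 5 + [0.9] * 5])
z = rng.integers(0, n_classes, n_apps)
X = (rng.random((n_apps, n_bcts)) < true_profiles[z]).astype(float)

# EM for the mixture of Bernoullis.
pi = np.full(n_classes, 1 / n_classes)               # class weights
theta = rng.uniform(0.3, 0.7, (n_classes, n_bcts))   # BCT probabilities
for _ in range(200):
    # E-step: posterior class responsibilities (log-space for stability).
    log_lik = (X @ np.log(theta).T
               + (1 - X) @ np.log(1 - theta).T
               + np.log(pi))
    log_lik -= log_lik.max(axis=1, keepdims=True)
    resp = np.exp(log_lik)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update weights and per-class BCT probabilities (smoothed).
    pi = resp.mean(axis=0)
    theta = (resp.T @ X + 1e-6) / (resp.sum(axis=0)[:, None] + 2e-6)

print(np.round(theta, 2))  # recovered class profiles
```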