A test of the validity of the motivational interviewing treatment integrity code.
Forsberg, Lars; Berman, Anne H; Kallmén, Håkan; Hermansson, Ulric; Helgason, Asgeir R
2008-01-01
To evaluate the Swedish version of the Motivational Interviewing Treatment Integrity (MITI) code, MITI coding was applied to tape-recorded counseling sessions. Construct validity was assessed using factor analysis on 120 MITI-coded sessions. Discriminant validity was assessed by comparing MITI coding of motivational interviewing (MI) sessions with information- and advice-giving sessions, as well as by comparing MI-trained practitioners with untrained practitioners. A principal-axis factoring analysis yielded some evidence for MITI construct validity. The MITI differentiated between practitioners with different levels of MI training, as well as between MI practitioners and advice-giving counselors, thus supporting discriminant validity. The MITI may be used as a training tool, together with supervision, to confirm and enhance MI practice in clinical settings, and it can also serve as a tool for evaluating MI integrity in clinical research.
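As a rough illustration of the factor-analytic step described above, the sketch below fits a two-factor model to synthetic session scores. scikit-learn's FactorAnalysis is a maximum-likelihood stand-in for the principal-axis factoring reported in the study, and everything except the 120-session count is invented.

```python
# Illustrative only: ML factor analysis on synthetic stand-ins for MITI
# behavior counts and global ratings; not the study's principal-axis method.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_sessions, n_items = 120, 8                          # mirrors the 120 coded sessions
latent = rng.normal(size=(n_sessions, 2))             # two hypothetical factors
loadings = rng.uniform(0.4, 0.9, size=(2, n_items))
X = latent @ loadings + rng.normal(scale=0.5, size=(n_sessions, n_items))

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)
print(np.round(fa.components_.T, 2))                  # item-by-factor loading matrix
```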
Common Errors in the Calculation of Aircrew Doses from Cosmic Rays
NASA Astrophysics Data System (ADS)
O'Brien, Keran; Felsberger, Ernst; Kindl, Peter
2010-05-01
Radiation doses to aircrew are calculated using flight codes, which integrate dose rates over the aircraft's flight path from takeoff at one airport to landing at another; the dose rates themselves are calculated by transport codes or obtained from measurements. The dose rates are stored in various ways, such as by latitude and longitude or in terms of the geomagnetic vertical cutoff. The transport codes are generally quite satisfactory, but the treatment of the boundary conditions is frequently incorrect. Both the treatment of solar modulation and that of the geomagnetic field are often defective, leading to systematic overestimates of crew doses.
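A minimal sketch of the flight-code integration step, assuming a hypothetical dose-rate profile sampled along the route:

```python
# Minimal sketch: integrate dose rate over flight time. The dose-rate samples
# below are hypothetical placeholders; a real flight code would look them up
# (by latitude/longitude or vertical cutoff rigidity) from tables produced by
# a transport code or from measurements.
import numpy as np
from scipy.integrate import trapezoid

t_hours = np.array([0.0, 0.5, 1.5, 4.0, 7.0, 7.5, 8.0])              # time since takeoff
dose_rate_uSv_per_h = np.array([0.1, 2.0, 5.5, 6.0, 5.8, 3.0, 0.1])  # climb/cruise/descent

route_dose = trapezoid(dose_rate_uSv_per_h, t_hours)
print(f"Route dose: {route_dose:.1f} uSv")
```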
Anonymization of DICOM Electronic Medical Records for Radiation Therapy
Newhauser, Wayne; Jones, Timothy; Swerdloff, Stuart; Newhauser, Warren; Cilia, Mark; Carver, Robert; Halloran, Andy; Zhang, Rui
2014-01-01
Electronic medical records (EMR) and treatment plans are used in research on patient outcomes and radiation effects. In many situations, researchers must remove protected health information (PHI) from EMRs. The literature contains several studies describing the anonymization of generic Digital Imaging and Communication in Medicine (DICOM) files and DICOM image sets, but no publications were found that discuss the anonymization of DICOM radiation therapy plans, a key component of an EMR in a cancer clinic. In addition, we were unable to find a commercial software tool that met the minimum requirements for anonymization and preservation of data integrity for radiation therapy research. The purpose of this study was to develop a prototype software code to meet the requirements for the anonymization of radiation therapy treatment plans and to develop a way to validate that code and demonstrate that it properly anonymized treatment plans and preserved data integrity. We extended an open-source code to process all relevant PHI and to allow for the automatic anonymization of multiple EMRs. The prototype code successfully anonymized multiple treatment plans in less than 1 minute per patient. We also tested commercial optical character recognition (OCR) algorithms for the detection of burned-in text on the images, but they were unable to reliably recognize text. In addition, we developed and tested an image filtering algorithm that allowed us to isolate and redact alphanumeric text from a test radiograph. Validation tests verified that PHI was anonymized and that data integrity, such as the relationships between DICOM unique identifiers (UIDs), was preserved. PMID:25147130
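A minimal sketch of the kind of tag-level de-identification described above, using the pydicom library rather than the authors' extended open-source tool; the tag list is abbreviated and the file names are hypothetical:

```python
# Generic DICOM de-identification sketch with pydicom. A real anonymizer must
# cover the full PHI attribute profile and keep UID relationships intact.
import pydicom

def anonymize(in_path: str, out_path: str) -> None:
    ds = pydicom.dcmread(in_path)
    for keyword in ("PatientName", "PatientID", "PatientBirthDate"):
        if keyword in ds:
            setattr(ds, keyword, "")      # blank out direct identifiers
    ds.remove_private_tags()              # private tags often carry PHI
    ds.save_as(out_path)

anonymize("rtplan.dcm", "rtplan_anon.dcm")  # hypothetical file names
```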
Using QR codes to enable quick access to information in acute cancer care.
Upton, Joanne; Olsson-Brown, Anna; Marshall, Ernie; Sacco, Joseph
2017-05-25
Quick access to toxicity management information ensures timely access to steroids/immunosuppressive treatment for cancer patients experiencing immune-related adverse events, thus reducing length of hospital stays or avoiding hospital admission entirely. This article discusses a project to add a QR (quick response) code to a patient-held immunotherapy alert card. As QR code generation is free and the immunotherapy clinical management algorithms were already publicly available through the trust's clinical network website, the costs of integrating a QR code into the alert card, after printing, were low, while the potential benefits are numerous. Patient-held alert cards are widely used for patients receiving anti-cancer treatment, and this established standard of care has been modified to enable rapid access of information through the incorporation of a QR code.
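A sketch of how such a code could be produced with the Python qrcode package; the URL is a placeholder, not the trust's actual algorithm address:

```python
# Generate a QR code that resolves to a publicly hosted toxicity-management
# algorithm, for printing onto the patient-held alert card.
# Requires the "qrcode" package (pip install qrcode).
import qrcode

img = qrcode.make("https://example.org/immunotherapy-toxicity-algorithm")
img.save("alert_card_qr.png")
```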
ASTEC—the Aarhus STellar Evolution Code
NASA Astrophysics Data System (ADS)
Christensen-Dalsgaard, Jørgen
2008-08-01
The Aarhus code is the result of a long development, starting in 1974, and still ongoing. A novel feature is the integration of the computation of adiabatic oscillations for specified models as part of the code. It offers substantial flexibility in terms of microphysics and has been carefully tested for the computation of solar models. However, considerable development is still required in the treatment of nuclear reactions, diffusion and convective mixing.
NASA Technical Reports Server (NTRS)
Pratt, D. T.; Radhakrishnan, K.
1986-01-01
The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
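For a concrete feel for the stiffness issue, the sketch below integrates the Robertson problem, a standard stiff chemical-kinetics test case, with SciPy's LSODA method, a stiffness-switching descendant of the LSODE code named above:

```python
# Robertson kinetics: three species with rate constants spanning nine orders
# of magnitude. Explicit solvers need prohibitively small steps here; LSODA
# detects the stiffness and switches methods automatically.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    y1, y2, y3 = y
    return [-0.04*y1 + 1.0e4*y2*y3,
             0.04*y1 - 1.0e4*y2*y3 - 3.0e7*y2**2,
             3.0e7*y2**2]

sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="LSODA", rtol=1e-6, atol=1e-10)
print(sol.y[:, -1])   # species concentrations at the final time
```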
YNOGK: A New Public Code for Calculating Null Geodesics in the Kerr Spacetime
NASA Astrophysics Data System (ADS)
Yang, Xiaolin; Wang, Jiancheng
2013-07-01
Following the work of Dexter & Agol, we present a new public code for the fast calculation of null geodesics in the Kerr spacetime. Using Weierstrass's and Jacobi's elliptic functions, we express all coordinates and affine parameters as analytical and numerical functions of a parameter p, an integral value along the geodesic. This is the main difference between our code and previous similar ones. The advantage of this treatment is that the information about the turning points does not need to be specified in advance by the user, and many applications, such as imaging, the calculation of line profiles, and the observer-emitter problem, become root-finding problems. All elliptic integrations are computed by Carlson's elliptic integral method, as in Dexter & Agol, which guarantees the fast computational speed of our code. The formulae to compute the constants of motion given by Cunningham & Bardeen have been extended, allowing one to readily handle situations in which the emitter or the observer has an arbitrary distance from, and motion state with respect to, the central compact object. The code has been extensively validated through applications to toy problems from the literature. The source FORTRAN code is freely available for download on our Web site http://www1.ynao.ac.cn/~yangxl/yxl.html.
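The Carlson symmetric forms are available in SciPy (version 1.8 or later), so the code's core evaluation step can be illustrated and cross-checked against the Legendre form:

```python
# Carlson's symmetric form R_F, used for all elliptic integrations in such
# codes. Cross-check: the complete Legendre integral satisfies
# K(m) = R_F(0, 1-m, 1).
from scipy.special import elliprf, ellipk   # elliprf requires SciPy >= 1.8

m = 0.7
print(elliprf(0.0, 1.0 - m, 1.0))  # Carlson form
print(ellipk(m))                   # Legendre form; the values agree
```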
McCarty, Dennis; Rieckmann, Traci; Baker, Robin L; McConnell, K John
2017-03-01
Title 42 of the Code of Federal Regulations Part 2 (42 CFR Part 2) controls the release of patient information about treatment for substance use disorders. In 2016, the Substance Abuse and Mental Health Services Administration (SAMHSA) released a proposed rule to update the regulations, reduce provider burdens, and facilitate information exchange. Oregon's Medicaid program (Oregon Health Plan) altered the financing and structure of medical, dental, and behavioral care to promote greater integration and coordination. A qualitative analysis examined the perceived impact of 42 CFR Part 2 on care coordination and integration. Interviews with 76 stakeholders (114 interviews) conducted in 2012-2015 probed the processes of integrating behavioral health into primary care settings in Oregon and assessed issues associated with adherence to 42 CFR Part 2. Respondents expressed concerns that the regulations caused legal confusion, inhibited communication and information sharing, and required updating. Addiction treatment directors noted the challenges of obtaining patient consent to share information with primary care providers. The confidentiality regulations were perceived as a barrier to care coordination and integration. The Oregon Health Authority, therefore, requested regulatory changes. SAMHSA's proposed revisions permit a general consent to an entire health care team and allow inclusion of substance use disorder information within health information exchanges, but they mandate data segmentation of diagnostic and procedure codes related to substance use disorders and restrict access only to parties with authorized consent, possibly adding barriers to the coordination and integration of addiction treatment with primary care.
NASA Astrophysics Data System (ADS)
Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.
2014-06-01
Proton therapy has seen a great breakthrough in the new century: several tens of dedicated centers now operate throughout the world, and their number increases every year. An important component of proton therapy is the treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment modality to be realized to the full extent. Predictions by the Monte Carlo (MC) method are the "gold standard" for verifying calculations performed with these systems. At the Institute of Experimental and Theoretical Physics (ITEP), one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.
Owens, Mandy D; Rowell, Lauren N; Moyers, Theresa
2017-10-01
Motivational Interviewing (MI) is an evidence-based approach shown to be helpful for a variety of behaviors across many populations. Treatment fidelity is an important tool for understanding how and with whom MI may be most helpful. The Motivational Interviewing Treatment Integrity coding system was recently updated to incorporate new developments in the research and theory of MI, including the relational and technical hypotheses of MI (MITI 4.2). To date, no studies have examined the MITI 4.2 with forensic populations. In this project, twenty-two brief MI interventions with jail inmates were evaluated to test the reliability of the MITI 4.2. Validity of the instrument was explored using regression models to examine the associations between global scores (Empathy, Partnership, Cultivating Change Talk and Softening Sustain Talk) and outcomes. Reliability of this coding system with these data was strong. We found that therapists had lower ratings of Empathy with participants who had more extensive criminal histories. Both Relational and Technical global scores were associated with criminal histories as well as post-intervention ratings of motivation to decrease drug use. Findings indicate that the MITI 4.2 was reliable for coding sessions with jail inmates. Additionally, results provided information related to the relational and technical hypotheses of MI. Future studies can use the MITI 4.2 to better understand the mechanisms behind how MI works with this high-risk group.
Seng, Elizabeth K; Lovejoy, Travis I
2013-12-01
This study psychometrically evaluates the Motivational Interviewing Treatment Integrity Code (MITI) for assessing fidelity to motivational interviewing to reduce sexual risk behaviors in people living with HIV/AIDS. 74 sessions from a pilot randomized controlled trial of motivational interviewing to reduce sexual risk behaviors in people living with HIV were coded with the MITI. Participants reported sexual behavior at baseline, 3 months, and 6 months. Regarding reliability, excellent inter-rater reliability was achieved for measures of behavior frequency across the 12 sessions coded by both coders; global scales demonstrated poor intraclass correlations but adequate percent agreement. Regarding validity, principal components analyses indicated that a two-factor model accounted for an adequate amount of variance in the data. These factors were associated with decreases in sexual risk behaviors after treatment. The MITI is a reliable and valid measurement of treatment fidelity for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.
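A sketch of two common agreement statistics for coded sessions, computed on fabricated ratings rather than study data:

```python
# Percent agreement and Cohen's kappa for two coders rating the same sessions.
# The two rating vectors are invented examples, not data from the study.
import numpy as np
from sklearn.metrics import cohen_kappa_score

coder_a = np.array([3, 4, 4, 2, 5, 3, 4, 3, 2, 4, 5, 3])  # global scale ratings
coder_b = np.array([3, 4, 3, 2, 5, 3, 4, 4, 2, 4, 5, 3])

percent_agreement = np.mean(coder_a == coder_b)
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"agreement = {percent_agreement:.2f}, kappa = {kappa:.2f}")
```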
The diagnosis related groups enhanced electronic medical record.
Müller, Marcel Lucas; Bürkle, Thomas; Irps, Sebastian; Roeder, Norbert; Prokosch, Hans-Ulrich
2003-07-01
The introduction of Diagnosis Related Groups (DRGs) as a basis for hospital payment in Germany announced essential changes in hospital reimbursement practice. A hospital's economic survival will depend vitally on the accuracy and completeness of the documentation of DRG-relevant data such as diagnosis and procedure codes. In order to enhance physicians' coding compliance, an easy-to-use interface integrating coding tasks seamlessly into the clinical routine had to be developed. A generic approach was required to access coding and clinical guidelines from different information sources. Within the Electronic Medical Record (EMR), a user interface ('DRG Control Center') for all DRG-relevant clinical and administrative data has been built. A comprehensive DRG-related web site gives online access to DRG grouping software and an electronic coding expert. Both components are linked together using an application supporting bi-directional communication. Other web-based services, such as a guideline search engine, can be integrated as well. With the proposed method, the clinician gains quick access to context-sensitive clinical guidelines for appropriate treatment of his/her patient and administrative guidelines for the adequate coding of diagnoses and procedures. This paper describes the design and current implementation and discusses our experiences.
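A loudly hypothetical sketch of the grouping step behind such a 'DRG Control Center': real groupers apply hundreds of rules, and the codes and DRG labels below are invented for illustration only.

```python
# Hypothetical toy grouper: map documented diagnosis/procedure codes to a DRG.
def assign_drg(diagnoses: set[str], procedures: set[str]) -> str:
    if "I21.0" in diagnoses and "8-837" in procedures:  # illustrative codes
        return "F24Z"                                   # illustrative DRG
    if "I21.0" in diagnoses:
        return "F60B"
    return "UNGROUPABLE"

print(assign_drg({"I21.0"}, {"8-837"}))
```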
SCISEAL: A CFD code for analysis of fluid dynamic forces in seals
NASA Technical Reports Server (NTRS)
Athavale, Mahesh; Przekwas, Andrzej
1994-01-01
A viewgraph presentation is made of the objectives, capabilities, and test results of the computer code SCISEAL. Currently, the seal code has: a finite volume, pressure-based integration scheme; colocated variables with strong conservation approach; high-order spatial differencing, up to third-order; up to second-order temporal differencing; a comprehensive set of boundary conditions; a variety of turbulence models and surface roughness treatment; moving grid formulation for arbitrary rotor whirl; rotor dynamic coefficients calculated by the circular whirl and numerical shaker methods; and small perturbation capabilities to handle centered and eccentric seals.
Jelsma, Judith G M; Mertens, Vera-Christina; Forsberg, Lisa; Forsberg, Lars
2015-07-01
Many randomized controlled trials in which motivational interviewing (MI) is a key intervention make no provision for the assessment of treatment fidelity. This methodological shortcoming makes it impossible to distinguish between high- and low-quality MI interventions, and, consequently, to know whether MI provision has contributed to any intervention effects. This article makes some practical recommendations for the collection, selection, coding and reporting of MI fidelity data, as measured using the Motivational Interviewing Treatment Integrity Code. We hope that researchers will consider these recommendations and include MI fidelity measures in future studies.
Lestoquoy, Anna Sophia; Laird, Lance D; Mitchell, Suzanne; Gergen-Barnett, Katherine; Negash, N Lily; McCue, Kelly; Enad, Racquel; Gardiner, Paula
2017-12-01
Little is known about the acceptance of non-pharmacological group strategies delivered to low-income, racially diverse patients with chronic pain and depression. This paper examines how the Integrative Medical Group Visit (IMGV) addresses many of the deficits identified with usual care. Six IMGV cohorts were held at a safety net hospital and two federally funded community health centres. The intervention included ten sessions of Integrative Medical Group Visits with a primary care provider and a meditation instructor. The curriculum uses principles of Mindfulness Based Stress Reduction and evidence-based integrative medicine. The visit is structured similarly to other group medical visits. Data were gathered through four focus groups held after the cohorts were completed, and transcripts were analysed using both a priori codes and inductive coding. Participants (N=20) were largely low-income minority adults with chronic pain and comorbid depression. Six themes emerged from the coding: chronic pain is isolating; group treatment contributes to better coping with pain; loss of control and autonomy because of the unpredictability of pain as well as dependence on medication and frequent medical appointments; groups improve agency and control over one's health condition; navigating the healthcare system and unsatisfactory treatment options; and changes after the IMGV due to non-pharmacological health management. The IMGV is a promising format for delivering integrative care for chronic pain and depression which addresses many of the problems identified by patients in usual care.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-15
[Fragment of the rule's abbreviations list; recoverable entries: I/OCE, Integrated Outpatient Code Editor; IOL, intraocular lens; IOM, Institute of Medicine; IORT, intraoperative radiation treatment; IPF, Inpatient Psychiatric Facility; IPPS, Hospital Inpatient Prospective Payment System. The surrounding text concerns pass-through devices, brachytherapy sources, intraoperative radiation therapy (IORT), and modulated radiation therapy.]
Exhaust plumes and their interaction with missile airframes - A new viewpoint
NASA Technical Reports Server (NTRS)
Dash, S. M.; Sinha, N.
1992-01-01
The present, novel treatment of missile airframe-exhaust plume interactions emphasizes their simulation via a formal solution of the Reynolds-averaged Navier-Stokes (RNS) equation and is accordingly able to address the simulation requirements of novel missiles with nonconventional/integrated propulsion systems. The method is made possible by implicit RNS codes with improved artificial dissipation models, generalized geometric capabilities, and improved two-equation turbulence models, as well as by such codes' recent incorporation of plume thermochemistry and multiphase flow effects.
Zalewski, Maureen; Stepp, Stephanie D.; Whalen, Diana J.; Scott, Lori N.
2014-01-01
There are currently no empirically supported interventions to target parenting among mothers who have Borderline Personality Disorder (BPD). The current study uses Consensual Qualitative Research (CQR) methodology to: I) learn about mothers' experiences of parenting with BPD, and II) identify treatment modifications to Dialectical Behavior Therapy (DBT) as suggested by mothers with BPD who are currently engaged in DBT skills training. Twenty-three mothers were recruited from intensive outpatient and partial hospitalization programs that teach DBT skills. A total of nine focus groups, each meeting once, were conducted, asking women a series of questions about their experiences of parenting with BPD and how they would modify DBT to address parenting issues. Using the CQR approach, we coded domains and categories that were discussed by mothers in the focus groups. Coding revealed that mothers with BPD wished parenting were integrated more fully into their current DBT skills groups. In addition, one of the most prominent themes to emerge was that parenting is particularly stressful to mothers with BPD and is associated with guilt, uncertainty, and worry. Finally, mothers offered many ideas for how to integrate parenting-focused interventions into DBT. The CQR method revealed gaps in current treatment for mothers with BPD and provided useful ideas for how to modify DBT to target parenting and integrate these modifications into other approaches for treating mothers with BPD. PMID:26257507
[Who Benefits from Forensic Psychiatric Treatment? Results of a Catamnestic Study in Swabia].
Dudeck, Manuela; Franke, Irina; Bezzel, Adelheid; Otte, Stefanie; Ormanns, Norbert; Nigel, Stefanie; Segmiller, Felix; Streb, Judith
2018-04-17
Evaluation of treatment outcomes of forensic inpatients in the Bavarian district of Swabia (2010-2014). A total of 130 inpatients were interviewed about their social reintegration, substance use and delinquency immediately after discharge from forensic psychiatry and one year later. One year after discharge, 67% of the patients referred due to a substance use disorder according to § 64 of the German Penal Code were employed, 57% were abstinent and 83% had not reoffended. Patients who were detained due to severe mental illness according to § 63 of the German Penal Code often received disability pensions (57%), 14% were integrated in sheltered workshops and 100% had not reoffended. Forensic-psychiatric treatment contributes to rehabilitation and reduces risk factors of mentally disordered offenders.
Integration of Massage Therapy in Outpatient Cancer Care
Cowen, Virginia S.; Tafuto, Barbara
2018-01-01
Background: Massage therapy can be helpful in alleviating cancer-related symptoms and cancer treatment-related symptoms. While surveys have noted that cancer patients seek out massage as a nonpharmacologic approach during cancer treatment, little is known about the integration of massage in outpatient cancer care. Purpose: The purpose of this study was to examine the extent to which massage is being integrated into outpatient cancer care at NCI-designated Cancer Centers. Setting: This study used descriptive methods to analyze the integration of massage in NCI-designated Cancer Centers providing clinical services to patients (n = 62). Design: Data were collected from 91.1% of the centers (n = 59) using content analysis and a telephone survey. A dataset was developed and coded for analysis. Main Outcome Measure: The integration of massage was assessed by an algorithm that was developed from a set of five variables: 1) acceptance of treatment as therapeutic, 2) institution offers treatment to patients, 3) clinical practice guidelines in place, 4) use of evidence-based resources to inform treatment, and 5) shared knowledge about treatment among health care team. All centers were scored against all five variables using a six-point scale, with all variables rated equally. Results: The integration of massage ranged from not at all (0) to very high (5) with all five levels of integration evident. Only 11 centers (17.7% of total) rated a very high level of integration; nearly one-third of the centers (n = 22) were found to have no integration of massage at all, not even provision of information about massage to patients through the center website. Conclusions: The findings of this analysis suggest that research on massage is not being leveraged to integrate massage into outpatient cancer care. PMID:29593842
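The scoring algorithm described above reduces to summing five equally weighted yes/no variables, as in this sketch (the center profile shown is made up):

```python
# Five binary variables yield an integration score from 0 (none) to 5 (very high).
def integration_score(center: dict[str, bool]) -> int:
    variables = ("accepted_as_therapeutic", "offered_to_patients",
                 "practice_guidelines", "evidence_based_resources",
                 "shared_team_knowledge")
    return sum(int(center[v]) for v in variables)

example_center = {"accepted_as_therapeutic": True, "offered_to_patients": True,
                  "practice_guidelines": False, "evidence_based_resources": True,
                  "shared_team_knowledge": False}
print(integration_score(example_center))   # -> 3
```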
DOE Office of Scientific and Technical Information (OSTI.GOV)
A.A. Bingham; R.M. Ferrer; A.M. Ougouag
2009-09-01
An accurate and computationally efficient two- or three-dimensional neutron diffusion model will be necessary for the development, safety parameter computation, and fuel cycle analysis of a prismatic Very High Temperature Reactor (VHTR) design under the Next Generation Nuclear Plant Project (NGNP). For this purpose, an analytical nodal Green's function solution for the transverse integrated neutron diffusion equation is developed in two- and three-dimensional hexagonal geometry. This scheme is incorporated into HEXPEDITE, a code first developed by Fitzpatrick and Ougouag. HEXPEDITE neglects non-physical discontinuity terms that arise in the transverse leakage due to the application of the transverse integration procedure to hexagonal geometry, and it cannot account for the effects of burnable poisons across nodal boundaries. The test code being developed for this document accounts for these terms by maintaining an inventory of neutrons, using the nodal balance equation as a constraint on the neutron flux equation. The method developed in this report is intended to restore neutron conservation and increase the accuracy of the code by adding these terms to the transverse integrated flux solution and applying the nodal Green's function solution to the resulting equation to derive a semi-analytical solution.
VizieR Online Data Catalog: FARGO_THORIN 1.0 hydrodynamic code (Chrenko+, 2017)
NASA Astrophysics Data System (ADS)
Chrenko, O.; Broz, M.; Lambrechts, M.
2017-07-01
This archive contains the source files, documentation and example simulation setups of the FARGO_THORIN 1.0 hydrodynamic code. The program was introduced, described and used for simulations in the paper. It is built on top of the FARGO code (Masset, 2000A&AS..141..165M; Baruteau & Masset, 2008ApJ...672.1054B) and is also interfaced with the REBOUND integrator package (Rein & Liu, 2012A&A...537A.128R). THORIN stands for Two-fluid HydrOdynamics, the Rebound integrator Interface and Non-isothermal gas physics. The program is designed for self-consistent investigations of protoplanetary systems consisting of a gas disk, a disk of small solid particles (pebbles) and embedded protoplanets. Code features: (I) Non-isothermal gas disk with an implicit numerical solution of the energy equation; the implemented energy source terms are compressional heating, viscous heating, stellar irradiation, vertical escape of radiation, radiative diffusion in the midplane and radiative feedback to accretion heating of protoplanets. (II) Planets evolved in 3D, with close encounters allowed; the orbits are integrated using the IAS15 integrator (Rein & Spiegel, 2015MNRAS.446.1424R), and the code detects collisions among planets and resolves them as mergers. (III) Refined treatment of the planet-disk gravitational interaction; the code uses a vertical averaging of the gravitational potential, as outlined in Muller & Kley (2012A&A...539A..18M). (IV) Pebble disk represented by an Eulerian, pressureless and inviscid fluid; the pebble dynamics is affected by the Epstein gas drag and optionally by diffusive effects, and the drag back-reaction term is also implemented in the Navier-Stokes equation for the gas. Archive summary: /in_relax, setup of the first example simulation; /in_wplanet, setup of the second example simulation; /srcmain, source files of FARGO_THORIN; /src_reb, source files of the REBOUND integrator package to be linked with THORIN; GUNGPL3, GNU General Public License, version 3; LICENSE, license agreement; README, simple user's guide; UserGuide.pdf, extended user's guide; refman.pdf, programmer's guide. (1 data file).
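A minimal example of the REBOUND interface that THORIN links against: set up a star and a protoplanet and advance them with the IAS15 integrator (masses and times in illustrative code units).

```python
# Star-planet system advanced with IAS15, the integrator THORIN uses for
# planetary orbits. Values are illustrative, not from the paper's setups.
import rebound

sim = rebound.Simulation()
sim.integrator = "ias15"
sim.add(m=1.0)                     # central star (G = 1 units)
sim.add(m=1e-3, a=1.0, e=0.05)     # Jupiter-mass protoplanet
sim.integrate(100.0)               # advance 100 code time units
print(sim.particles[1].a)          # semi-major axis after integration
```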
Huang, Sheng-Kang; Lai, Chih-Sung; Chang, Yuan-Shiun; Ho, Yu-Ling
2016-10-01
Patients in Taiwan with allergic rhinitis seek not only Western medicine treatment but also Traditional Chinese Medicine treatment or integrated Chinese-Western medicine treatment. Various studies have conducted pairwise comparisons of Traditional Chinese Medicine, Western medicine, and integrated Chinese-Western medicine treatments, but none has analyzed the three treatments simultaneously. This study analyzed patients with allergic rhinitis receiving the three treatments to identify differences in demographic characteristics and medical use, and thereby to determine the drug use patterns of the different treatments. The National Health Insurance Research Database was the data source, and included patients were those diagnosed with allergic rhinitis (International Classification of Diseases, Ninth Revision, Clinical Modification codes 470-478). The chi-square test and Tukey's studentized range (honestly significant difference) test were conducted to investigate the differences among the three treatments. Visit frequency for allergic rhinitis treatment was higher in female than male patients, regardless of treatment with Traditional Chinese Medicine, Western medicine, or integrated Chinese-Western medicine. Persons aged 0-19 years ranked highest in the proportion of visits for allergic rhinitis. Traditional Chinese Medicine treatment had more medical items per person-time and a higher daily drug cost per person-time, but the lowest total expenditure per person-time. In contrast, Western medicine had the lowest daily drug cost per person-time and the highest total expenditure per person-time. The total expenditure per person-time, daily drug cost per person-time, and medical items per person-time of integrated Chinese-Western medicine treatment lay between those seen with Traditional Chinese Medicine and Western medicine treatments. Although only 6.82% of patients with allergic rhinitis chose integrated Chinese-Western medicine treatment, the visit frequency per person-year of integrated Chinese-Western medicine ranked highest. In addition, multiple-composition medicines were used more frequently than single-composition medicines, and ma huang (Ephedra sinica Stapf) was seldom used, to decrease the risk of combining medications.
Sato, Tatsuhiko; Watanabe, Ritsuko; Sihver, Lembit; Niita, Koji
2012-01-01
Microdosimetric quantities such as lineal energy are generally considered to be better indices than linear energy transfer (LET) for expressing the relative biological effectiveness (RBE) of high charge and energy particles. To calculate their probability densities (PD) in macroscopic matter, it is necessary to integrate microdosimetric tools such as track-structure simulation codes with macroscopic particle transport simulation codes. As an integration approach, the mathematical model for calculating the PD of microdosimetric quantities developed based on track-structure simulations was incorporated into the macroscopic particle transport simulation code PHITS (Particle and Heavy Ion Transport code System). The improved PHITS enables the PD in macroscopic matter to be calculated within a reasonable computation time, while taking their stochastic nature into account. The microdosimetric function of PHITS was applied to biological dose estimation for charged-particle therapy and risk estimation for astronauts. The former application was performed in combination with the microdosimetric kinetic model, while the latter employed the radiation quality factor expressed as a function of lineal energy. Owing to the unique features of the microdosimetric function, the improved PHITS has the potential to establish more sophisticated systems for radiological protection in space as well as for the treatment planning of charged-particle therapy.
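A sketch of the microdosimetric bookkeeping involved: given a probability density f(y) of lineal energy, the frequency-mean and dose-mean lineal energies follow by quadrature (the density below is an arbitrary toy spectrum, not a PHITS output).

```python
# Frequency-mean y_F and dose-mean y_D lineal energies from a sampled density.
import numpy as np
from scipy.integrate import trapezoid

y = np.linspace(0.1, 100.0, 2000)    # lineal energy, keV/um
f = y * np.exp(-y / 10.0)            # toy frequency density f(y)
f /= trapezoid(f, y)                 # normalize to unit area

y_F = trapezoid(y * f, y)            # frequency-mean lineal energy
y_D = trapezoid(y**2 * f, y) / y_F   # dose-mean lineal energy
print(f"y_F = {y_F:.2f} keV/um, y_D = {y_D:.2f} keV/um")
```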
A control system based on field programmable gate array for papermaking sewage treatment
NASA Astrophysics Data System (ADS)
Zhang, Zi Sheng; Xie, Chang; Qing Xiong, Yan; Liu, Zhi Qiang; Li, Qing
2013-03-01
A sewage treatment control system is designed to improve the efficiency of a papermaking wastewater treatment system. The automation control system is based on a Field Programmable Gate Array (FPGA), coded in the Very-High-Speed Integrated Circuit Hardware Description Language (VHDL), and compiled and simulated with Quartus. In order to ensure the stability of the data used in the FPGA, the data are collected through temperature sensors, water level sensors and an online pH measurement system. The automatic control system is more sensitive, and both the treatment efficiency and the processing power are increased. This work provides a new method for sewage treatment control.
Committed to the Honor Code: An Investment Model Analysis of Academic Integrity
ERIC Educational Resources Information Center
Dix, Emily L.; Emery, Lydia F.; Le, Benjamin
2014-01-01
Educators worldwide face challenges surrounding academic integrity. The development of honor codes can promote academic integrity, but understanding how and why honor codes affect behavior is critical to their successful implementation. To date, research has not examined how students' "relationship" to an honor code predicts…
Comparison of the thermal neutron scattering treatment in MCNP6 and GEANT4 codes
NASA Astrophysics Data System (ADS)
Tran, H. N.; Marchix, A.; Letourneau, A.; Darpentigny, J.; Menelle, A.; Ott, F.; Schwindling, J.; Chauvin, N.
2018-06-01
To ensure the reliability of simulation tools, verification and comparison should be performed regularly. This paper describes the work performed to compare the neutron transport treatment in MCNP6.1 and GEANT4-10.3 in the thermal energy range. This work focuses on the thermal neutron scattering processes for several potential materials which would be involved in the neutron source designs of Compact Accelerator-based Neutron Sources (CANS), such as beryllium metal, beryllium oxide, polyethylene, graphite, para-hydrogen, light water, heavy water, aluminium and iron. Both the thermal scattering law and the free gas model, taken from the evaluated data library ENDF/B-VII, were considered. It was observed that the GEANT4.10.03-patch2 version was not able to properly account for the coherent elastic process occurring in crystal lattices. This bug is treated in this work, and the fix should be included in the next release of the code. Cross-section sampling and integral tests have been performed for both simulation codes, showing fair agreement between the two codes for most of the materials, except for iron and aluminium.
Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.
Uzun, Vassilya; Bilgin, Sami
2016-01-01
For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
Leightley, Daniel; Chui, Zoe; Jones, Margaret; Landau, Sabine; McCrone, Paul; Hayes, Richard D; Wessely, Simon; Fear, Nicola T; Goodwin, Laura
2018-05-01
Electronic Healthcare Records (EHRs) are created to capture summaries of care and contacts made with healthcare services. EHRs offer a means to analyse admissions to hospitals for epidemiological research. In the United Kingdom (UK), England, Scotland and Wales maintain separate data stores, which are administered and managed exclusively by the devolved governments. This independence results in harmonisation challenges, not least a lack of uniformity, making it difficult to evaluate care, diagnoses and treatment across the UK. To overcome this lack of uniformity, it is important to develop methods to integrate EHRs to provide a multi-nation dataset of health. The aim was to develop and describe a method which integrates the EHRs of Armed Forces personnel in England, Scotland and Wales based on variable commonality, to produce a multi-nation dataset of secondary health care. An Armed Forces cohort was used to extract and integrate three EHR datasets, using commonality as the linkage point. This was achieved by evaluating and combining variables which shared the same characteristics. EHRs representing Accident and Emergency (A&E), Admitted Patient Care (APC) and Outpatient care were combined to create a patient-level history spanning three nations. Patient-level EHRs were examined to ascertain admission differences, common diagnoses and record completeness. A total of 6,336 Armed Forces personnel were matched, of whom 5,460 had 7,510 A&E visits, 9,316 APC episodes and 45,005 Outpatient appointments. We observed full completeness for diagnoses in APC, whereas Outpatient admissions were sparsely coded, with 88% of diagnoses coded as "Unknown/unspecified cause of morbidity". In addition, A&E records were sporadically coded; we found five coding systems in use for identifying the reason for admission. At present, EHRs are designed to monitor the cost of treatment and enable administrative oversight, and are not currently suited to epidemiological research. However, only small changes may be needed to take advantage of what should be a highly cost-effective means of delivering important research for the benefit of the NHS.
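A sketch of integration on variable commonality with pandas: keep only the columns all three national extracts share, label the source, and stack them. The column names are hypothetical, not the actual NHS field names.

```python
# Harmonize three EHR extracts by their shared columns, then concatenate.
import pandas as pd

england = pd.DataFrame({"person_id": [1, 2], "admit_date": ["2015-01-03", "2015-02-10"],
                        "diagnosis": ["J45", "S52"]})
scotland = pd.DataFrame({"person_id": [3], "admit_date": ["2015-03-01"],
                         "diagnosis": ["J45"], "extra_field": ["x"]})
wales = pd.DataFrame({"person_id": [4], "admit_date": ["2015-04-22"],
                      "diagnosis": ["Unknown/unspecified"]})

common = set(england.columns) & set(scotland.columns) & set(wales.columns)
frames = [df[sorted(common)].assign(nation=n)
          for df, n in [(england, "England"), (scotland, "Scotland"), (wales, "Wales")]]
multi_nation = pd.concat(frames, ignore_index=True)
print(multi_nation)
```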
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material and geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms, along with the addition of Monte Carlo simulation as an alternative. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
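A toy example in the spirit of the Monte Carlo alternative mentioned above: estimating a failure probability by sampling an uncertain load against an uncertain strength (distribution parameters invented).

```python
# Monte Carlo probability-of-failure sketch: P(load > strength).
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
load = rng.normal(100.0, 15.0, n)       # applied stress, arbitrary units
strength = rng.normal(150.0, 10.0, n)   # component capacity

p_fail = np.mean(load > strength)
print(f"Estimated failure probability: {p_fail:.2e}")
```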
Treatment motivation among caregivers and adolescents with substance use disorders
Cornelius, T.; Earnshaw, V. A.; Menino, D.; Bogart, L. M.; Levy, S.
2017-01-01
Substance use disorders (SUDs) in adolescence have negative long-term health effects, which can be mitigated through successful treatment. Caregivers play a central role in adolescent treatment involvement; however, studies have not examined treatment motivation and pressures to enter treatment in caregiver/adolescent dyads. Research suggests that internally motivated treatment (in contrast to coerced treatment) tends to lead to better outcomes. We used Self-Determination Theory (SDT) to examine intersecting motivational narratives among caregivers and adolescents in SUD treatment. Relationships between motivation, interpretation of caregiver pressures, adolescent autonomy, and relatedness were also explored. Adolescents in SUD treatment and their caregivers (N = 15 dyads) were interviewed about treatment experiences. Interviews were coded for treatment motivation, including extrinsic (e.g., motivated by punishment), introjected (e.g., motivated by guilt), and identified/integrated motivation (e.g., seeing a behavior as integral to the self). Internalization of treatment motivation, autonomy support/competence (e.g., caregiver support for adolescent decisions), and relatedness (e.g., acceptance and support) were also coded. Four dyadic categories were identified: agreement that treatment was motivated by the adolescent (intrinsic); agreement that treatment was motivated by the caregiver (extrinsic); agreement that treatment was motivated by both, or a shift towards adolescent control (mixed/transitional); and disagreement (adolescents and caregivers each claimed they motivated treatment; conflicting). Autonomy support and relatedness were most prominent in intrinsic dyads, and least prominent in extrinsic dyads. The mixed/transitional group was also high in autonomy support and relatedness. The extrinsic group characterized caregiver rules as an unwelcome mechanism for behavioral control; caregivers in the other groups saw rules as a way to build adolescent competence and repair relationships, and adolescents saw rules as indicating care rather than control. Adolescents with intrinsic motivations were the most engaged in treatment. Results suggest the importance of intrinsically motivated treatment, and highlight autonomy support and relatedness as mechanisms that might facilitate treatment engagement. PMID:28237049
Neutronic calculation of fast reactors by the EUCLID/V1 integrated code
NASA Astrophysics Data System (ADS)
Koltashev, D. A.; Stakhanova, A. A.
2017-01-01
This article considers the neutronic calculation of the fast-neutron lead-cooled reactor BREST-OD-300 with the EUCLID/V1 integrated code. The main goal of the development and application of integrated codes is the safety justification of nuclear power plants. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical and thermohydraulic calculations of fast reactors under normal and abnormal operating conditions. The EUCLID/V1 code is being developed at the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT and the neutronic module DN3D. In addition, the integrated code includes databases of fuel, coolant and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors: heat source distributions, control rod movements, reactivity changes and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper presents some calculations performed as part of EUCLID/V1 code validation: a transient simulation of the BREST-OD-300 reactor (fuel assembly floating, decompression of a passive feedback system channel) and a cross-validation against MCU-FR code results. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
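As a pedagogical analogue of the multigroup diffusion solve mentioned above (not the DN3D algorithm), the sketch below finds the k-effective eigenvalue of a one-group, one-dimensional bare slab by power iteration:

```python
# Power iteration for -D phi'' + Sig_a phi = (1/k) nu*Sig_f phi, zero-flux edges.
import numpy as np

D, sig_a, nu_sig_f, L, n = 1.0, 0.07, 0.08, 100.0, 200
h = L / (n + 1)

A = (np.diag(np.full(n, 2*D/h**2 + sig_a))
     + np.diag(np.full(n-1, -D/h**2), 1)
     + np.diag(np.full(n-1, -D/h**2), -1))

phi, k = np.ones(n), 1.0
for _ in range(200):                       # power (source) iteration
    src = nu_sig_f * phi
    phi_new = np.linalg.solve(A, src / k)
    k *= (nu_sig_f * phi_new).sum() / src.sum()
    phi = phi_new / np.linalg.norm(phi_new)

print(f"k-eff = {k:.5f}")   # analytic check: nu_sig_f / (sig_a + D*(pi/L)**2)
```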
Code comparison for accelerator design and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parsa, Z.
1988-01-01
We present a comparison between results obtained from standard accelerator physics codes used for the design and analysis of synchrotrons and storage rings, with the programs SYNCH, MAD, HARMON, PATRICIA, PATPET, BETA, DIMAD, MARYLIE and RACE-TRACK. In our analysis we have considered 5 lattices of various sizes with large and small angles, including the AGS Booster (10° bend), RHIC (2.24°), SXLS, XLS (XUV ring with 45° bend) and X-RAY rings. The differences in the integration methods used and in the treatment of the fringe fields in these codes could lead to different results. The inclusion of nonlinear (e.g., dipole) terms may be necessary in these calculations, especially for a small ring. 12 refs., 6 figs., 10 tabs.
NASA Astrophysics Data System (ADS)
Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun
2017-09-01
The dose distributions from proton pencil beam scanning were calculated with FLUKA, GEANT4, MCNP, and PHITS in order to investigate their applicability to proton radiotherapy. The first case studied was the integrated depth dose curves (IDDCs) from 100- and 226-MeV proton pencil beams impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered a condition similar to the first, but with proton energies in a Gaussian distribution. The comparison to measurement indicates that the inter-code differences may be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time is also discussed. In the third case, the applicability of each code to pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan; the results showed general agreement between the codes, the treatment plan, and the measurement, except for some deviations in the penumbra region. This study demonstrates that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams, given proper physics settings.
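For context on the energies involved, a back-of-envelope range estimate using the empirical Bragg-Kleeman rule; the fit constants are typical literature values, not taken from the paper.

```python
# Bragg-Kleeman rule R = alpha * E**p for the CSDA range of protons in water
# (alpha ~ 0.0022 cm/MeV**p, p ~ 1.77 are approximate empirical fit values).
def proton_range_cm(energy_mev: float, alpha: float = 0.0022, p: float = 1.77) -> float:
    return alpha * energy_mev**p

for e in (100.0, 226.0):
    print(f"{e:5.0f} MeV -> ~{proton_range_cm(e):5.1f} cm in water")
```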
A qualitative analysis of aspects of treatment that adolescents with anorexia identify as helpful.
Zaitsoff, Shannon; Pullmer, Rachelle; Menna, Rosanne; Geller, Josie
2016-04-30
This study aimed to identify aspects of treatment that adolescents with anorexia nervosa (AN) believe are helpful or unhelpful. Adolescent females receiving treatment for AN or subthreshold AN (n=21) were prompted during semi-structured interviews to generate responses to open-ended questions on what they felt would be most helpful or unhelpful in treating adolescents with eating disorders. Eight codes were developed, and the two most frequently endorsed categories were (1) Alliance, where the therapist demonstrates clinical expertise and also expresses interest in the patient (n=21, 100.0%), and (2) Client Involvement in treatment (n=16, 76.2%). These top two categories were shared by participants with AN versus subthreshold AN and by participants with high versus low readiness to change their dietary restriction behaviours. Development of the coding scheme and sample participant responses are discussed, and the integration of the identified factors into empirically supported treatments for adolescent AN, such as Family-Based Treatment, is considered. This study provides initial information regarding the aspects of treatment that adolescents identify as most helpful or unhelpful.
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2017-01-01
This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.
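A sketch of a bias-corrected percentile bootstrap interval of the kind referenced above, in Python rather than Mplus syntax; the statistic is a simple mean difference on fabricated group scores.

```python
# Bias-corrected (BC) percentile bootstrap confidence interval.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
group_a, group_b = rng.normal(0.0, 1.0, 80), rng.normal(0.4, 1.0, 80)
theta_hat = group_b.mean() - group_a.mean()

boot = np.array([rng.choice(group_b, 80).mean() - rng.choice(group_a, 80).mean()
                 for _ in range(5000)])

z0 = norm.ppf(np.mean(boot < theta_hat))        # bias-correction constant
lo, hi = norm.cdf(2*z0 + norm.ppf([0.025, 0.975]))
print(np.quantile(boot, [lo, hi]))              # BC 95% interval
```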
ERIC Educational Resources Information Center
Smith, Jane Ellen; Gianini, Loren M.; Garner, Bryan R.; Malek, Karen L.; Godley, Susan H.
2014-01-01
This study evaluated a process for training raters to reliably rate clinicians delivering the Adolescent Community Reinforcement Approach (A-CRA) in a national dissemination project. The unique A-CRA coding system uses specific behavioral anchors throughout its 73 procedure components. Five randomly selected raters each rated "passing"…
Case Presentations Demonstrating Periodontal Treatment Variation: PEARL Network.
Curro, Frederick A; Grill, Ashley C; Matthews, Abigail G; Martin, John; Kalenderian, Elisabeth; Craig, Ronald G; Naftolin, Frederick; Thompson, Van P
2015-06-01
Variation in periodontal terminology can affect the diagnosis and treatment plan as assessed by practicing general dentists in the Practitioners Engaged in Applied Research and Learning (PEARL) Network. General dentists participating in the PEARL Network are highly screened, credentialed, and qualified and may not be representative of the general population of dentists. Ten case presentations, ranging from periodontal health to gingivitis and mild, moderate, and severe periodontitis, were randomly presented to respondents. Descriptive comparisons were made between these diagnosis groups in terms of the treatment recommendations following diagnosis. Some PEARL practitioners were found to over- or under-diagnose the case presentations, which affected treatment planning, while the remaining responses concurred on the diagnosis. The predominant diagnosis was compared with that assigned by two practicing periodontists. There was variation in treatment based on the diagnosis for gingivitis and the lesser forms of periodontitis. The data suggest that a lack of clarity in periodontal terminology affects both diagnosis and treatment planning, and that terminology may be improved by adopting diagnosis codes, which could be used to assess treatment outcomes. This article provides data to support best practice in the use of diagnosis coding and the integration of dentistry with medicine using ICD-10 terminology.
Definite Integrals, Some Involving Residue Theory Evaluated by Maple Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko O.
2010-01-01
The calculus of residues is applied to evaluate certain integrals over the range (-∞, ∞) using the Maple symbolic code. These integrals are of the form ∫_{-∞}^{∞} cos(x)/[(x² + a²)(x² + b²)(x² + c²)] dx and similar extensions. The Maple code is also applied to expressions in maximum likelihood estimator moments when sampling from the negative binomial distribution. In general the Maple code approach to the integrals gives correct answers to specified decimal places, but the symbolic result may be extremely long and complex.
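For context, the standard residue-theory evaluation underlying integrals of this form can be written out (a supplied derivation assuming distinct a, b, c > 0, not text from the report):

\[
\int_{-\infty}^{\infty}\frac{\cos x}{x^{2}+a^{2}}\,dx
 = \operatorname{Re}\Bigl[2\pi i\,\operatorname{Res}_{z=ia}\frac{e^{iz}}{z^{2}+a^{2}}\Bigr]
 = \frac{\pi e^{-a}}{a},
\]
and partial fractions in x² extend this to the triple-product form:
\[
\int_{-\infty}^{\infty}\frac{\cos x\;dx}{(x^{2}+a^{2})(x^{2}+b^{2})(x^{2}+c^{2})}
 = \pi\Bigl[\frac{e^{-a}}{a(b^{2}-a^{2})(c^{2}-a^{2})}
 + \frac{e^{-b}}{b(a^{2}-b^{2})(c^{2}-b^{2})}
 + \frac{e^{-c}}{c(a^{2}-c^{2})(b^{2}-c^{2})}\Bigr].
\]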
An Overview of the Greyscales Lethality Assessment Methodology
2011-01-01
The code has already been integrated into the Weapon Systems Division MECA and DUEL missile engagement simulations and is capable of being incorporated into a variety of other simulations.
Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R
2018-01-01
Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.
Pre-Service Teachers' Perception of Quick Response (QR) Code Integration in Classroom Activities
ERIC Educational Resources Information Center
Ali, Nagla; Santos, Ieda M.; Areepattamannil, Shaljan
2017-01-01
Quick Response (QR) codes have been discussed in the literature as adding value to teaching and learning. Despite their potential in education, more research is needed to inform practice and advance knowledge in this field. This paper investigated the integration of the QR code in classroom activities and the perceptions of the integration by…
Mathibe, Maphuthego D; Hendricks, Stephen J H; Bergh, Anne-Marie
2015-10-02
Primary Health Care (PHC) clinicians and patients are major role players in the South African antiretroviral treatment programme. Understanding their perceptions and experiences of integrated care and the management of people living with HIV and AIDS in PHC facilities is necessary for successful implementation and sustainability of integration. This study explored clinician perceptions and patient experiences of integration of antiretroviral treatment in PHC clinics. An exploratory, qualitative study was conducted in four city of Tshwane PHC facilities. Two urban and two rural facilities following different models of integration were included. A self-administered questionnaire with open-ended items was completed by 35 clinicians and four focus group interviews were conducted with HIV-positive patients. The data were coded and categories were grouped into sub-themes and themes. Workload, staff development and support for integration affected clinicians' performance and viewpoints. They perceived promotion of privacy, reduced discrimination and increased access to comprehensive care as benefits of service integration. Delays, poor patient care and patient dissatisfaction were viewed as negative aspects of integration. In three facilities patients were satisfied with integration or semi-integration and felt common queues prevented stigma and discrimination, whilst the reverse was true in the facility with separate services. Single-month issuance of antiretroviral drugs and clinic schedule organisation was viewed negatively, as well as poor staff attitudes, poor communication and long waiting times. Although a fully integrated service model is preferable, aspects that need further attention are management support from health authorities for health facilities, improved working conditions and appropriate staff development opportunities.
The Fireball integrated code package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobranich, D.; Powers, D.A.; Harper, F.T.
1997-07-01
Many deep-space satellites contain a plutonium heat source. An explosion, during launch, of a rocket carrying such a satellite offers the potential for the release of some of the plutonium. The fireball following such an explosion exposes any released plutonium to a high-temperature chemically-reactive environment. Vaporization, condensation, and agglomeration processes can alter the distribution of plutonium-bearing particles. The Fireball code package simulates the integrated response of the physical and chemical processes occurring in a fireball and the effect these processes have on the plutonium-bearing particle distribution. This integrated treatment of multiple phenomena represents a significant improvement in the state of the art for fireball simulations. Preliminary simulations of launch-second scenarios indicate: (1) most plutonium vaporization occurs within the first second of the fireball; (2) large non-aerosol-sized particles contribute very little to plutonium vapor production; (3) vaporization and both homogeneous and heterogeneous condensation occur simultaneously; (4) homogeneous condensation transports plutonium down to the smallest-particle sizes; (5) heterogeneous condensation precludes homogeneous condensation if sufficient condensation sites are available; and (6) agglomeration produces larger-sized particles but slows rapidly as the fireball grows.
NASA Astrophysics Data System (ADS)
de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.
2010-10-01
In the [eV; MeV] energy range, modelling of neutron-induced reactions is based on nuclear reaction models with adjustable parameters. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation, and nuclear reactor physicists have called for proper uncertainties to be used in applications. In this paper, mathematical methods developed in the CONRAD code [2] will be presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and to propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure will be presented using analytical or Monte-Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account early enough in the evaluation process to remove discrepancies. In this paper, we will describe a mathematical framework to take this kind of information into account properly.
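As a schematic of the Monte-Carlo propagation route mentioned above (not CONRAD itself), the following Python sketch samples nuclear-model parameters from an assumed covariance matrix and accumulates the induced covariance of a toy resonance cross section; the model shape and every number are invented for illustration:

import numpy as np

rng = np.random.default_rng(1)

def model_xs(energy_ev, params):
    # Toy single-resonance (Breit-Wigner-like) shape; not a CONRAD model.
    e0, gamma, amp = params
    return amp * (gamma / 2)**2 / ((energy_ev - e0)**2 + (gamma / 2)**2)

p_mean = np.array([6.7, 0.15, 40.0])      # resonance energy, width, peak
p_cov = np.diag([0.02, 0.005, 2.0])**2    # assumed parameter covariance
energies = np.linspace(5.5, 8.0, 60)

samples = rng.multivariate_normal(p_mean, p_cov, size=5000)
xs = np.array([model_xs(energies, p) for p in samples])

xs_cov = np.cov(xs, rowvar=False)         # propagated cross-section covariance
print(xs.mean(axis=0)[:3], np.sqrt(np.diag(xs_cov))[:3])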
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonen, F.A.; Johnson, K.I.; Liebetrau, A.M.
The VISA-II (Vessel Integrity Simulation Analysis) code was originally developed as part of the NRC staff evaluation of pressurized thermal shock. VISA-II uses Monte Carlo simulation to evaluate the failure probability of a pressurized water reactor (PWR) pressure vessel subjected to a pressure and thermal transient specified by the user. Linear elastic fracture mechanics methods are used to model crack initiation and propagation. Parameters for initial crack size and location, copper content, initial reference temperature of the nil-ductility transition, fluence, crack-initiation fracture toughness, and arrest fracture toughness are treated as random variables. This report documents an upgraded version of the original VISA code as described in NUREG/CR-3384. Improvements include a treatment of cladding effects, a more general simulation of flaw size, shape and location, a simulation of inservice inspection, an updated simulation of the reference temperature of the nil-ductility transition, and treatment of vessels with multiple welds and initial flaws. The code has been extensively tested and verified and is written in FORTRAN for ease of installation on different computers. 38 refs., 25 figs.
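A minimal sketch of the Monte Carlo idea (not VISA-II's actual models or distributions): sample the random variables, apply a failure criterion, and estimate the failure probability as the failing fraction. All distributions and constants below are invented placeholders:

import numpy as np

rng = np.random.default_rng(2)
n = 200_000

flaw_depth_mm = rng.exponential(scale=3.0, size=n)     # assumed flaw-depth law
toughness = rng.normal(loc=120.0, scale=20.0, size=n)  # toy K_Ic spread

# Toy applied stress intensity growing with flaw depth (illustrative only).
k_applied = 40.0 * np.sqrt(flaw_depth_mm)

p_fail = np.mean(k_applied > toughness)                # crack-initiation fraction
print(f"estimated initiation probability: {p_fail:.3e}")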
Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim
2016-01-01
Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (e.g., overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provide complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
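A compact Python sketch of the two fusion strategies, using random stand-ins for the structured and text features (scikit-learn style; not the authors' code):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
n = 1200
X_struct = rng.normal(size=(n, 20))   # stand-in for structured fields
X_text = rng.normal(size=(n, 50))     # stand-in for vectorized note text
y = (X_struct[:, 0] + X_text[:, 0] + rng.normal(size=n) > 0).astype(int)

# Early integration: a single model over the concatenated features.
early = LogisticRegression(max_iter=1000).fit(np.hstack([X_struct, X_text]), y)

# Late integration: one model per source; a meta-learner combines their
# out-of-fold probability estimates.
p_s = cross_val_predict(LogisticRegression(max_iter=1000), X_struct, y,
                        cv=5, method="predict_proba")[:, 1]
p_t = cross_val_predict(LogisticRegression(max_iter=1000), X_text, y,
                        cv=5, method="predict_proba")[:, 1]
meta = LogisticRegression().fit(np.column_stack([p_s, p_t]), y)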
DOE Office of Scientific and Technical Information (OSTI.GOV)
Page, R.; Jones, J.R.
1997-07-01
Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell B loss of offsite power fault transient.
Riley, Andrew R; Grennan, Allison; Menousek, Kathryn; Hoffses, Kathryn W
2018-03-01
Integration of psychological services into pediatric primary care is increasingly common, but models of integration vary with regard to their level of coordination, colocation, and integration. High-integration models may provide some distinct advantages, such as preventative care and brief consultation for subclinical behavior concerns; however, psychologists face barriers to seeking reimbursement for these services. Alternatives to traditional psychotherapy and psychological testing codes, specifically Health & Behavior (H&B) codes, have been proposed as 1 method for supporting integrated care. The aim of this study was to investigate the relationships between psychologists' reported billing practices, reimbursement rates, and model of integration in pediatric primary care. As part of a larger survey study, 55 psychologists working in pediatric primary care reported on characteristics of their practice's model of integration, billing practices, and frequency of reimbursement for consultative services. Compared with those who categorized their integrated care model as colocated, psychologists who endorsed working in integrated models reported a significantly higher usage of H&B codes and more frequent reimbursement for consultations. Overall, use of H&B codes was associated with higher reported levels of coordination and integration. Survey results showed a clear pattern of higher integration being associated with greater utilization of H&B codes and better reimbursement for consultation activities. These results underscore the importance of establishing and maintaining billing and reimbursement systems that adequately support integrated care. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Biodegradation of paint stripper solvents in a modified gas lift loop bioreactor.
Vanderberg-Twary, L; Steenhoudt, K; Travis, B J; Hanners, J L; Foreman, T M; Brainard, J R
1997-07-05
Paint stripping wastes generated during the decontamination and decommissioning of former nuclear facilities contain paint stripping organics (dichloromethane, 2-propanol, and methanol) and bulk materials containing paint pigments. It is desirable to degrade the organic residues as part of an integrated chemical-biological treatment system. We have developed a modified gas lift loop bioreactor employing a defined consortium of Rhodococcus rhodochrous strain OFS and Hyphomicrobium sp. DM-2 that degrades paint stripper organics. Mass transfer coefficients and kinetic constants for biodegradation in the system were determined. Transfer of organic substrates from the surrogate waste into the air, and further into the liquid medium in the bioreactor, was found to be a rapid process, occurring within minutes. Monod kinetics was employed to model the biodegradation of paint stripping organics. Analysis of the bioreactor process was accomplished with BIOLAB, a mathematical code that simulates coupled mass transfer and biodegradation processes. This code was used to fit experimental data to Monod kinetics and to determine kinetic parameters. The BIOLAB code was also employed to compare the activities of individual microbial cultures in the bioreactor with the activities of combined cultures. This code is of benefit for further optimization and scale-up of the bioreactor for treatment of paint stripping and other volatile organic wastes in bulk materials.
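For readers unfamiliar with Monod kinetics, a minimal Python sketch of the substrate/biomass balance it implies (all parameter values are invented, not BIOLAB's):

from scipy.integrate import solve_ivp

mu_max, K_s, Y = 0.4, 25.0, 0.5     # 1/h, mg/L, yield (assumed values)

def monod(t, z):
    s, x = z                        # substrate and biomass concentrations
    mu = mu_max * s / (K_s + s)     # Monod specific growth rate
    return [-mu * x / Y, mu * x]    # substrate consumed, biomass grown

sol = solve_ivp(monod, (0.0, 48.0), [200.0, 5.0])
print(sol.y[:, -1])                 # final substrate and biomass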
Bakerly, Nawar Diar; Davies, C; Dyer, M; Dhillon, P
2009-01-01
Home treatment models for acute exacerbations of chronic obstructive pulmonary disease (AECOPD) have proved to be a safe alternative to hospitalization. These models have the potential to free up resources; however, in the United Kingdom, it remains unclear whether they provide cost savings compared with hospital treatment. Over a 12-month period from August 2003, 130 patients were selected for the integrated care group (total admissions with AECOPD = 546). These patients were compared with 95 retrospective controls in the hospital treatment group. Controls were selected from admissions during the previous 12 months (total of 662 admissions) to match the integrated care group in age, sex, and postal code. Resource use data were collected for both groups and compared from a National Health Service (NHS) perspective for cost-minimization analysis. In the integrated care group (130 patients), 107 (82%) patients received home support, with an average length of stay of 3.3 (SD 3.9) days compared with 10.4 (SD 7.7) in the hospital group (95 patients). The average number of visits per patient in the integrated care group was 3.08 (SD = 0.95; 95% CI = 2.9-3.2). Cost per patient in the integrated care group was £1653 (95% CI, £1521-1802) compared with £2256 (95% CI, £2126-2407) in the hospital group. The integrated care group thus showed a cost saving of approximately £600 (P < 0.001) per patient. This integrated care model for the management of patients with AECOPD offered cost savings of £600 per patient over the conventional hospital treatment model using the new NHS tariff from an acute trust provider perspective.
Development of a new lattice physics code robin for PWR application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Chen, G.
2013-07-01
This paper presents a description of the methodologies and preliminary verification results of a new lattice physics code ROBIN, being developed for PWR application at Shanghai NuStar Nuclear Power Technology Co., Ltd. The methods used in ROBIN to fulfill the various tasks of lattice physics analysis are an integration of historical methods and new methods that have emerged very recently. Not only are methods like equivalence theory for resonance treatment and the method of characteristics for neutron transport calculation adopted, as they are applied in many of today's production-level LWR lattice codes, but very useful new methods such as the enhanced neutron current method for Dancoff correction in large and complicated geometry and the log-linear rate constant power depletion method for Gd-bearing fuel are also implemented in the code. A small sample of verification results is provided to illustrate the type of accuracy achievable using ROBIN. It is demonstrated that ROBIN is capable of satisfying most of the needs for PWR lattice analysis and has the potential to become a production quality code in the future. (authors)
Toward a first-principles integrated simulation of tokamak edge plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, C S; Klasky, Scott A; Cummings, Julian
2008-01-01
Performance of the ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.
NASA Astrophysics Data System (ADS)
Mosunova, N. A.
2018-05-01
The article describes the basic models included in the EUCLID/V1 integrated code intended for safety analysis of liquid metal (sodium, lead, and lead-bismuth) cooled fast reactors using fuel rods with a gas gap and pellet-type dioxide, mixed oxide or nitride uranium-plutonium fuel under normal operation, under anticipated operational occurrences and accident conditions, by carrying out interconnected thermal-hydraulic, neutronics, and thermal-mechanical calculations. Information about the Russian and foreign analogs of the EUCLID/V1 integrated code is given. Modeled objects, the equation systems in differential form solved in each module of the EUCLID/V1 integrated code (the thermal-hydraulic, neutronics, and fuel rod analysis modules, and the burnup and decay heat calculation modules), the main calculated quantities, and the limitations on application of the code are presented. The article also gives data on the scope of functions performed by the integrated code's thermal-hydraulic module, with which it is possible to describe both one- and two-phase processes occurring in the coolant. It is shown that, owing to the availability of the fuel rod analysis module in the integrated code, it becomes possible to estimate the performance of fuel rods in different regimes of reactor operation. It is also shown that the models implemented in the code for calculating neutron-physical processes make it possible to take into account the neutron field distribution over the fuel assembly cross section as well as other features important for the safety assessment of fast reactors.
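Schematically, interconnected calculations of this kind are often organized as a fixed-point (Picard) iteration between modules; the toy Python loop below illustrates only the coupling structure, with placeholder algebra rather than EUCLID/V1 physics:

def neutronics(t_fuel):
    return 1.0 - 2e-4 * (t_fuel - 900.0)    # toy Doppler power feedback

def thermal_hydraulics(power):
    return 400.0 + 5.0 * power              # toy coolant temperature

def fuel_mechanics(power, t_cool):
    return t_cool + 400.0 * power           # toy fuel temperature

t_fuel = 900.0
for _ in range(20):                         # Picard iteration to a fixed point
    power = neutronics(t_fuel)
    t_cool = thermal_hydraulics(power)
    t_fuel = fuel_mechanics(power, t_cool)
print(power, t_cool, t_fuel)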
Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana
2015-01-01
Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities.
An address geocoding method for improving rural spatial information infrastructure
NASA Astrophysics Data System (ADS)
Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing
2010-11-01
The transition of rural and agricultural management from divisional to integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision, even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, the paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model, and address collection and update models. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on postal codes is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code enhances the extensibility and flexibility of address geocoding.
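A hypothetical Python sketch of how such a three-part code (absolute postal part, direction/distance relative part, extended suffix) might be composed; the field widths and layout are invented for illustration, not taken from the paper's standard:

import math

def rural_address_code(postal, origin, point, extension="00"):
    # postal: absolute part; origin/point: planar (x, y) in metres.
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    bearing = (math.degrees(math.atan2(dx, dy)) + 360.0) % 360.0
    distance = math.hypot(dx, dy)
    relative = f"{int(bearing // 45)}{int(distance // 100):03d}"  # octant + 100 m ring
    return f"{postal}-{relative}-{extension}"

print(rural_address_code("102200", (0.0, 0.0), (350.0, 480.0)))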
Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stan; Huff, Stanley M; Huber, Diane
2008-11-06
This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.
Huang, Sheng-Kang; Ho, Yu-Ling; Chang, Yuan-Shiun
2015-09-15
Allergic rhinitis has long been a worldwide health problem with a global growth trend. The use of traditional Chinese medicines alone or integrated Chinese-Western medicines for its treatment is quite common in Taiwan. Respiratory diseases account for the majority of outpatient traditional Chinese medicine treatment, while allergic rhinitis accounts for the majority of respiratory diseases. We hereby conduct a comparative analysis between traditional Chinese medicine treatments and western medicine treatments for allergic rhinitis in Taiwan. The results of the analysis of prescription differences between traditional Chinese medicine and western medicine treatments should be helpful for clinical guidance and for health policy decision-making on ethnopharmacological therapy. Patients diagnosed with allergic rhinitis under diagnostic codes 470-478 (ICD-9-CM) were selected as subjects from the 2009-2010 National Health Insurance Research Database, based on the claim data from the nationwide National Health Insurance in Taiwan. This retrospective study used the Chi-square test to examine the effects of gender and age on visits for traditional Chinese medicine, western medicine, and integrated Chinese-Western medicine treatments. A total of 45,804 patients diagnosed with allergic rhinitis (ICD-9-CM 470-478) were identified from the 2009-2010 NHIRD. There were 36,874 subjects receiving western medicine treatment alone, 5829 subjects receiving traditional Chinese medicine treatment alone, and 3101 subjects receiving integrated Chinese-Western medicine treatment. Female patients outnumbered male patients in all three treatment groups. Children aged 0-9 years had the highest visit frequency in the western medicine and integrated Chinese-Western medicine groups, while 10-19-year-olds ranked highest in the traditional Chinese medicine group. The Chi-square test of independence showed that the effects of gender and age on visits for the three treatments were significant. The prescription drugs of western medicine treatment alone were almost all for relieving the symptoms of allergic rhinitis, which explains the small difference between 2009 and 2010; the same phenomenon occurs in integrated Chinese-Western medicine. However, the prescription drugs of traditional Chinese medicine treatment vary considerably. The replacement of multiple-composition medicines by single-composition medicines implied that syndrome differentiation and treatment were used, and that the synergistic effects of multiple-composition medicines were no longer suitable for most patients in 2010. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
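The Chi-square test of independence used here is straightforward to reproduce; a Python sketch on a made-up gender-by-treatment contingency table (all counts invented, not the study's data):

import numpy as np
from scipy.stats import chi2_contingency

#                  WM only  TCM only  integrated
table = np.array([[20500,    3400,     1800],      # female
                  [16374,    2429,     1301]])     # male
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")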
Angel, Vini M; Friedman, Marvin H; Friedman, Andrea L
This article describes an innovative project involving the integration of bar-code medication administration technology competencies in the nursing curriculum through interprofessional collaboration among nursing, pharmacy, and computer science disciplines. A description of the bar-code medication administration technology project and lessons learned are presented.
ISSYS: An integrated synergistic Synthesis System
NASA Technical Reports Server (NTRS)
Dovi, A. R.
1980-01-01
Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.
FLiT: a field line trace code for magnetic confinement devices
NASA Astrophysics Data System (ADS)
Innocente, P.; Lorenzini, R.; Terranova, D.; Zanca, P.
2017-04-01
This paper presents a field line tracing code (FLiT) developed to study particle and energy transport as well as other phenomena related to magnetic topology in reversed-field pinch (RFP) and tokamak experiments. The code computes magnetic field lines in toroidal geometry using curvilinear coordinates (r, ϑ, ϕ) and calculates the intersections of these field lines with specified planes. The code also computes the magnetic and thermal diffusivity due to stochastic magnetic field in the collisionless limit. Compared to Hamiltonian codes, there are no constraints on the magnetic field functional formulation, which allows the integration of whichever magnetic field is required. The code uses the magnetic field computed by solving the zeroth-order axisymmetric equilibrium and the Newcomb equation for the first-order helical perturbation matching the edge magnetic field measurements in toroidal geometry. Two algorithms are developed to integrate the field lines: one is a dedicated implementation of a first-order semi-implicit volume-preserving integration method, and the other is based on the Adams-Moulton predictor-corrector method. As expected, the volume-preserving algorithm is accurate in conserving divergence, but slow because the low integration order requires small amplitude steps. The second algorithm proves to be quite fast and it is able to integrate the field lines in many partially and fully stochastic configurations accurately. The code has already been used to study the core and edge magnetic topology of the RFX-mod device in both the reversed-field pinch and tokamak magnetic configurations.
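As an illustration of the integration task (not FLiT's algorithms), a generic Python field-line tracer integrates dx/ds = B/|B| with SciPy's LSODA solver, which switches between Adams and BDF formulas; the field model is a crude stand-in:

import numpy as np
from scipy.integrate import solve_ivp

def b_field(x):
    # Toy axisymmetric field: toroidal part falling off as 1/R plus a
    # weak vertical component (placeholder, not an equilibrium solution).
    r = np.hypot(x[0], x[1]) + 1e-12
    return np.array([-x[1], x[0], 0.0]) / r**2 + np.array([0.0, 0.0, 0.2])

def rhs(s, x):                      # s is arc length along the field line
    b = b_field(x)
    return b / np.linalg.norm(b)

sol = solve_ivp(rhs, (0.0, 200.0), [2.0, 0.0, 0.0], method="LSODA",
                rtol=1e-8, atol=1e-10)
print(sol.y[:, -1])                 # end point of the traced field line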
Integration of Dakota into the NEAMS Workbench
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Lefebvre, Robert A.; Langley, Brandon R.
2017-07-01
This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on integrating Dakota into the NEAMS Workbench. The NEAMS Workbench, developed at Oak Ridge National Laboratory, is a new software framework that provides a graphical user interface, input file creation, parsing, validation, job execution, workflow management, and output processing for a variety of nuclear codes. Dakota is a tool developed at Sandia National Laboratories that provides a suite of uncertainty quantification and optimization algorithms. Providing Dakota within the NEAMS Workbench allows users of nuclear simulation codes to perform uncertainty and optimization studies on their nuclear codes from within a common, integrated environment. Details of the integration and parsing are provided, along with an example of Dakota running a sampling study on the fuels performance code, BISON, from within the NEAMS Workbench.
High-fidelity large eddy simulation for supersonic jet noise prediction
NASA Astrophysics Data System (ADS)
Aikens, Kurt M.
The problem of intense sound radiation from supersonic jets is a concern for both civil and military applications. As a result, many experimental and computational efforts are focused at evaluating possible noise suppression techniques. Large-eddy simulation (LES) is utilized in many computational studies to simulate the turbulent jet flowfield. Integral methods such as the Ffowcs Williams-Hawkings (FWH) method are then used for propagation of the sound waves to the farfield. Improving the accuracy of this two-step methodology and evaluating beveled converging-diverging nozzles for noise suppression are the main tasks of this work. First, a series of numerical experiments are undertaken to ensure adequate numerical accuracy of the FWH methodology. This includes an analysis of different treatments for the downstream integration surface: with or without including an end-cap, averaging over multiple end-caps, and including an approximate surface integral correction term. Secondly, shock-capturing methods based on characteristic filtering and adaptive spatial filtering are used to extend a highly-parallelizable multiblock subsonic LES code to enable simulations of supersonic jets. The code is based on high-order numerical methods for accurate prediction of the acoustic sources and propagation of the sound waves. Furthermore, this new code is more efficient than the legacy version, allows cylindrical multiblock topologies, and is capable of simulating nozzles with resolved turbulent boundary layers when coupled with an approximate turbulent inflow boundary condition. Even though such wall-resolved simulations are more physically accurate, their expense is often prohibitive. To make simulations more economical, a wall model is developed and implemented. The wall modeling methodology is validated for turbulent quasi-incompressible and compressible zero pressure gradient flat plate boundary layers, and for subsonic and supersonic jets. The supersonic code additions and the wall model treatment are then utilized to simulate military-style nozzles with and without beveling of the nozzle exit plane. Experiments of beveled converging-diverging nozzles have found reduced noise levels for some observer locations. Predicting the noise for these geometries provides a good initial test of the overall methodology for a more complex nozzle. The jet flowfield and acoustic data are analyzed and compared to similar experiments and excellent agreement is found. Potential areas of improvement are discussed for future research.
VizieR Online Data Catalog: ynogkm: code for calculating time-like geodesics (Yang+, 2014)
NASA Astrophysics Data System (ADS)
Yang, X.-L.; Wang, J.-C.
2013-11-01
Here we present the source file for a new public code named ynogkm, aimed at fast calculation of time-like geodesics in a Kerr-Newman spacetime. In the code the four Boyer-Lindquist coordinates and the proper time are expressed as functions of a parameter p semi-analytically, i.e., r(p), μ(p), φ(p), t(p), and σ(p), by using Weierstrass' and Jacobi's elliptic functions and integrals. All of the elliptic integrals are computed by Carlson's elliptic integral method, which guarantees the fast speed of the code. The source Fortran file ynogkm.f90 contains the modules constants, rootfind, ellfunction, and blcoordinates. (3 data files).
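Carlson's symmetric elliptic integrals are available in SciPy, and the Legendre forms reduce to them via standard identities; the Python snippet below shows the generic reductions for K(m) and F(φ, m) (standard usage, not ynogkm code):

import numpy as np
from scipy.special import elliprf, ellipk

m = 0.7
print(elliprf(0.0, 1.0 - m, 1.0), ellipk(m))    # K(m) = R_F(0, 1-m, 1)

phi = 0.9
F = np.sin(phi) * elliprf(np.cos(phi)**2, 1.0 - m * np.sin(phi)**2, 1.0)
print(F)                                         # incomplete integral F(phi, m)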
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamberg, L.D.
1998-02-23
This document serves as a notice of construction (NOC), pursuant to the requirements of Washington Administrative Code (WAC) 246-247-060, and as a request for approval to construct, pursuant to 40 Code of Federal Regulations (CFR) 61.07, for the Integrated Water Treatment System (IWTS) Filter Vessel Sparging Vent at 105-KW Basin. Additionally, the following description and references are provided as the notices of startup, pursuant to 40 CFR 61.09(a)(1) and (2) in accordance with Title 40 Code of Federal Regulations, Part 61, National Emission Standards for Hazardous Air Pollutants. The 105-K West Reactor and its associated spent nuclear fuel (SNF) storage basin were constructed in the early 1950s and are located on the Hanford Site in the 100-K Area about 1,400 feet from the Columbia River. The 105-KW Basin contains 964 Metric Tons of SNF stored under water in approximately 3,800 closed canisters. This SNF has been stored for varying periods of time ranging from 8 to 17 years. The 105-KW Basin is constructed of concrete with an epoxy coating and contains approximately 1.3 million gallons of water with an asphaltic membrane beneath the pool. The IWTS, which has been described in the Radioactive Air Emissions NOC for Fuel Removal for 105-KW Basin (DOE/RL-97-28 and page changes per US Department of Energy, Richland Operations Office letter 97-EAP-814) will be used to remove radionuclides from the basin water during fuel removal operations. The purpose of the modification described herein is to provide operational flexibility for the IWTS at the 105-KW basin. The proposed modification is scheduled to begin in calendar year 1998.
75 FR 39472 - Airworthiness Directives; Eclipse Aerospace, Inc. Model EA500 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-09
..., altitude preselect, and/or transponder codes. We are proposing this AD to correct faulty integration of... determined to be a software communication integration issue between the EFIS display interface and associated... transponder codes. We are issuing this AD to correct faulty integration of hardware and software, which could...
CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.; Flach, G.; BROWN, K.
2013-06-01
This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: (1) improved graphical display of model results; (2) improved error analysis and reporting; (3) an increase in the default maximum model mesh size from 301 to 501 nodes; and (4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.
Monte Carlo treatment planning for molecular targeted radiotherapy within the MINERVA system
NASA Astrophysics Data System (ADS)
Lehmann, Joerg; Hartmann Siantar, Christine; Wessol, Daniel E.; Wemple, Charles A.; Nigg, David; Cogliati, Josh; Daly, Tom; Descalle, Marie-Anne; Flickinger, Terry; Pletcher, David; DeNardo, Gerald
2005-03-01
The aim of this project is to extend accurate and patient-specific treatment planning to new treatment modalities, such as molecular targeted radiation therapy, incorporating previously crafted and proven Monte Carlo and deterministic computation methods. A flexible software environment is being created that allows planning radiation treatment for these new modalities and combining different forms of radiation treatment with consideration of biological effects. The system uses common input interfaces, medical image sets for definition of patient geometry and dose reporting protocols. Previously, the Idaho National Engineering and Environmental Laboratory (INEEL), Montana State University (MSU) and Lawrence Livermore National Laboratory (LLNL) had accrued experience in the development and application of Monte Carlo based, three-dimensional, computational dosimetry and treatment planning tools for radiotherapy in several specialized areas. In particular, INEEL and MSU have developed computational dosimetry systems for neutron radiotherapy and neutron capture therapy, while LLNL has developed the PEREGRINE computational system for external beam photon-electron therapy. Building on that experience, the INEEL and MSU are developing the MINERVA (modality inclusive environment for radiotherapeutic variable analysis) software system as a general framework for computational dosimetry and treatment planning for a variety of emerging forms of radiotherapy. In collaboration with this development, LLNL has extended its PEREGRINE code to accommodate internal sources for molecular targeted radiotherapy (MTR), and has interfaced it with the plugin architecture of MINERVA. Results from the extended PEREGRINE code have been compared to published data from other codes, and found to be in general agreement (EGS4—2%, MCNP—10%) (Descalle et al 2003 Cancer Biother. Radiopharm. 18 71-9). The code is currently being benchmarked against experimental data. The interpatient variability of the drug pharmacokinetics in MTR can only be properly accounted for by image-based, patient-specific treatment planning, as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo-based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and boron neutron capture therapy (BNCT). Brachytherapy and proton therapy are planned. Through the open application programming interface (API), other groups can add their own modules and share them with the community.
A Code of Ethics and Integrity for HRD Research and Practice.
ERIC Educational Resources Information Center
Hatcher, Tim; Aragon, Steven R.
2000-01-01
Describes the rationale for a code of ethics and integrity in human resource development (HRD). Outlines the Academy of Human Resource Development's standards. Reviews ethical issues faced by the HRD profession. (SK)
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
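The linear-to-binary search swap is easy to picture in miniature; a Python sketch with an invented lookup grid (the actual ITS subroutines are FORTRAN):

import bisect
import numpy as np

grid = np.sort(np.random.default_rng(4).uniform(0.0, 20.0, 512))

def linear_lookup(x):               # O(n) scan, as in the pre-acceleration path
    i = 0
    while i < len(grid) and grid[i] < x:
        i += 1
    return i

def binary_lookup(x):               # O(log n) drop-in replacement
    return bisect.bisect_left(grid, x)

assert linear_lookup(13.7) == binary_lookup(13.7)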
Use of whole exome sequencing for the identification of Ito-based arrhythmia mechanism and therapy.
Sturm, Amy C; Kline, Crystal F; Glynn, Patric; Johnson, Benjamin L; Curran, Jerry; Kilic, Ahmet; Higgins, Robert S D; Binkley, Philip F; Janssen, Paul M L; Weiss, Raul; Raman, Subha V; Fowler, Steven J; Priori, Silvia G; Hund, Thomas J; Carnes, Cynthia A; Mohler, Peter J
2015-05-26
Identified genetic variants are insufficient to explain all cases of inherited arrhythmia. We tested whether the integration of whole exome sequencing with well-established clinical, translational, and basic science platforms could provide rapid and novel insight into human arrhythmia pathophysiology and disease treatment. We report a proband with recurrent ventricular fibrillation, resistant to standard therapeutic interventions. Using whole-exome sequencing, we identified a variant in a previously unidentified exon of the dipeptidyl aminopeptidase-like protein-6 (DPP6) gene. This variant is the first identified coding mutation in DPP6 and augments cardiac repolarizing current (Ito) causing pathological changes in Ito and action potential morphology. We designed a therapeutic regimen incorporating dalfampridine to target Ito. Dalfampridine, approved for multiple sclerosis, normalized the ECG and reduced arrhythmia burden in the proband by >90-fold. This was combined with cilostazol to accelerate the heart rate to minimize the reverse-rate dependence of augmented Ito. We describe a novel arrhythmia mechanism and therapeutic approach to ameliorate the disease. Specifically, we identify the first coding variant of DPP6 in human ventricular fibrillation. These findings illustrate the power of genetic approaches for the elucidation and treatment of disease when carefully integrated with clinical and basic/translational research teams. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Open-Source, Distributed Computational Environment for Virtual Materials Exploration
2015-01-01
compromising structural integrity. For example, advanced designs could specify advanced materials processing techniques such as heat treatments in specific... orchestration of execution of multiple standalone codes at varying length scales will need advanced high-performance computing (HPC) integration in... possible hooks that could be used to coordinate larger workflows spanning tools developed by different groups. The high level approach explored
Fostering integrity in postgraduate research: an evidence-based policy and support framework.
Mahmud, Saadia; Bretag, Tracey
2014-01-01
Postgraduate research students have a unique position in the debate on integrity in research as students and novice researchers. To assess how far policies for integrity in postgraduate research meet the needs of students as "research trainees," we reviewed online policies for integrity in postgraduate research at nine particular Australian universities against the Australian Code for Responsible Conduct of Research (the Code) and the five core elements of exemplary academic integrity policy identified by Bretag et al. (2011), i.e., access, approach, responsibility, detail, and support. We found inconsistency with the Code in the definition of research misconduct and a lack of adequate detail and support. Based on our analysis, previous research, and the literature, we propose a framework for policy and support for postgraduate research that encompasses a consistent and educative approach to integrity maintained across the university at all levels of scholarship and for all stakeholders.
MCNP-based computational model for the Leksell gamma knife.
Trnka, Jiri; Novotny, Josef; Kluson, Jaroslav
2007-01-01
We have focused on the use of the MCNP code for calculation of Gamma Knife radiation field parameters with a homogeneous polystyrene phantom. We have investigated several parameters of the Leksell Gamma Knife radiation field and compared the results with other studies based on the EGS4 and PENELOPE codes as well as the Leksell Gamma Knife treatment planning system Leksell GammaPlan (LGP). The current model describes all 201 radiation beams together and simulates all the sources at the same time. Within each beam, it considers the technical construction of the source, the source holder, collimator system, the spherical phantom, and surrounding material. We have calculated output factors for various sizes of scoring volumes, relative dose distributions along basic planes including linear dose profiles, integral doses in various volumes, and differential dose volume histograms. All the parameters have been calculated for each collimator size and for the isocentric configuration of the phantom. We have found the calculated output factors to be in agreement with other authors' works except the case of 4 mm collimator size, where averaging over the scoring volume and statistical uncertainties strongly influence the calculated results. In general, all the results are dependent on the choice of the scoring volume. The calculated linear dose profiles and relative dose distributions also match independent studies and the Leksell GammaPlan, but care must be taken about the fluctuations within the plateau, which can influence the normalization, and accuracy in determining the isocenter position, which is important for comparing different dose profiles. The calculated differential dose volume histograms and integral doses have been compared with data provided by the Leksell GammaPlan. The dose volume histograms are in good agreement, as are integral doses calculated in small calculation matrix volumes. However, deviations in integral doses of up to 50% can be observed for large volumes such as the total skull volume. The differences in the treatment of scattered radiation between the MC method and the LGP may be important in this case. We have also studied the influence of differential direction sampling of primary photons and have found that, due to the anisotropic sampling, doses around the isocenter deviate from each other by up to 6%. With caution about the details of the calculation settings, it is possible to employ the MCNP Monte Carlo code for independent verification of the Leksell Gamma Knife radiation field properties.
MODTRAN6: a major upgrade of the MODTRAN radiative transfer code
NASA Astrophysics Data System (ADS)
Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette
2014-06-01
The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.
ERIC Educational Resources Information Center
Biswas, Ann E.
2013-01-01
Although most colleges strive to nurture a culture of integrity, incidents of dishonest behavior are on the rise. This article examines the role student development plays in students' perceptions of academic dishonesty and in their willingness to adhere to a code of conduct that may be in sharp contrast to traditional integrity policies.
Computer codes developed and under development at Lewis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1992-01-01
The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.
Study of SOL in DIII-D tokamak with SOLPS suite of codes.
NASA Astrophysics Data System (ADS)
Pankin, Alexei; Bateman, Glenn; Brennan, Dylan; Coster, David; Hogan, John; Kritz, Arnold; Kukushkin, Andrey; Schnack, Dalton; Snyder, Phil
2005-10-01
The scrape-off-layer (SOL) region in the DIII-D tokamak is studied with the SOLPS integrated suite of codes. The SOLPS package includes the 3D multi-species Monte Carlo neutral code EIRENE and the 2D multi-fluid code B2. The EIRENE and B2 codes are cross-coupled through the B2-EIRENE interface. The results of SOLPS simulations are used in the integrated modeling of the plasma edge in the DIII-D tokamak with the ASTRA transport code. Parameterized dependences for neutral particle fluxes that are computed with the SOLPS code are implemented in a model for the H-mode pedestal and ELMs [1] in the ASTRA code. The effects of neutrals on the H-mode pedestal and ELMs are studied in this report. [1] A. Y. Pankin, I. Voitsekhovitch, G. Bateman, et al., Plasma Phys. Control. Fusion 47, 483 (2005).
NASA Technical Reports Server (NTRS)
Radhakrishnan, K.
1984-01-01
The efficiency and accuracy of several algorithms recently developed for the numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = AT^N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and is shown to result in increased efficiency.
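The rate-constant caching strategy described above is easy to sketch. The following is a generic illustration, not the paper's implementation: a fixed temperature threshold stands in for the problem-dependent delta T the paper derives automatically, and the reaction parameters are made-up values.

```python
import numpy as np

def make_rate_updater(A, N, E, R=8.314, dT_threshold=10.0):
    """Return a function that recomputes modified-Arrhenius rate constants
    k = A * T**N * exp(-E / (R*T)) only when the temperature has drifted
    more than dT_threshold since the last update."""
    state = {"T_last": None, "k": None}

    def rates(T):
        if state["T_last"] is None or abs(T - state["T_last"]) > dT_threshold:
            state["k"] = A * T**N * np.exp(-E / (R * T))
            state["T_last"] = T
        return state["k"]

    return rates

# Two hypothetical reactions (units illustrative only)
A = np.array([1.0e10, 5.0e8])
N = np.array([0.0, 0.5])
E = np.array([8.0e4, 1.2e5])  # activation energies, J/mol
rates = make_rate_updater(A, N, E)
for T in (1000.0, 1004.0, 1020.0):  # the middle call reuses the cached k
    print(T, rates(T))
```

Skipping the exponential evaluations on small temperature changes is where the reported savings come from.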
Toward an integrated knowledge environment to support modern oncology.
Blake, Patrick M; Decker, David A; Glennon, Timothy M; Liang, Yong Michael; Losko, Sascha; Navin, Nicholas; Suh, K Stephen
2011-01-01
Around the world, teams of researchers continue to develop a wide range of systems to capture, store, and analyze data including treatment, patient outcomes, tumor registries, next-generation sequencing, single-nucleotide polymorphism, copy number, gene expression, drug chemistry, drug safety, and toxicity. Scientists mine, curate, and manually annotate growing mountains of data to produce high-quality databases, while clinical information is aggregated in distant systems. Databases are currently scattered, and relationships between variables coded in disparate datasets are frequently invisible. The challenge is to evolve oncology informatics from a "systems" orientation of standalone platforms and silos into an "integrated knowledge environment" that will connect "knowable" research data with patient clinical information. The aim of this article is to review progress toward an integrated knowledge environment to support modern oncology with a focus on supporting scientific discovery and improving cancer care.
An integrity measure to benchmark quantum error correcting memories
NASA Astrophysics Data System (ADS)
Xu, Xiaosi; de Beaudrap, Niel; O'Gorman, Joe; Benjamin, Simon C.
2018-02-01
Rapidly developing experiments across multiple platforms now aim to realise small quantum codes, and so demonstrate a memory within which a logical qubit can be protected from noise. There is a need to benchmark the achievements in these diverse systems, and to compare the inherent power of the codes they rely upon. We describe a recently introduced performance measure called integrity, which relates to the probability that an ideal agent will successfully ‘guess’ the state of a logical qubit after a period of storage in the memory. Integrity is straightforward to evaluate experimentally without state tomography and it can be related to various established metrics such as the logical fidelity and the pseudo-threshold. We offer a set of experimental milestones that are steps towards demonstrating unconditionally superior encoded memories. Using intensive numerical simulations we compare memories based on the five-qubit code, the seven-qubit Steane code, and a nine-qubit code which is the smallest instance of a surface code; we assess both the simple and fault-tolerant implementations of each. While the ‘best’ code upon which to base a memory does vary according to the nature and severity of the noise, nevertheless certain trends emerge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebetrau, A.M.
Work is underway at Pacific Northwest Laboratory (PNL) to improve the probabilistic analysis used to model pressurized thermal shock (PTS) incidents in reactor pressure vessels, and, further, to incorporate these improvements into the existing Vessel Integrity Simulation Analysis (VISA) code. Two topics related to work on input distributions in VISA are discussed in this paper. The first involves the treatment of flaw size distributions and the second concerns errors in the parameters in the (Guthrie) equation which is used to compute ΔRT_NDT, the shift in reference temperature for nil ductility transition.
Comparison of the CENTRM resonance processor to the NITAWL resonance processor in SCALE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollenbach, D.F.; Petrie, L.M.
1998-01-01
This report compares the NITAWL and CENTRM resonance processors in the SCALE code system. The cases examined consist of the International OECD/NEA Criticality Working Group Benchmark 20 problem. These cases represent fuel pellets partially dissolved in a borated solution. The assumptions inherent to the Nordheim Integral Treatment, used in NITAWL, are not valid for these problems. CENTRM resolves this limitation by explicitly calculating a problem-dependent point flux from point cross sections, which is then used to create group cross sections.
Computer modeling of test particle acceleration at oblique shocks
NASA Technical Reports Server (NTRS)
Decker, Robert B.
1988-01-01
The present evaluation of the basic techniques and illustrative results of numerical codes suitable for modeling charged-particle acceleration at oblique, fast-mode collisionless shocks emphasizes the treatment of ions as test particles, calculating particle dynamics through numerical integration along exact phase-space orbits. Attention is given to the acceleration of particles at planar, infinitesimally thin shocks, as well as to plasma simulations in which low-energy ions are injected and accelerated at quasi-perpendicular shocks with internal structure.
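As an illustration of the test-particle approach, the sketch below integrates one ion through prescribed electromagnetic fields with the standard Boris scheme. The uniform oblique magnetic field and motional electric field are simplifying stand-ins for the paper's planar shock geometry; the shock discontinuity itself is not modeled here.

```python
import numpy as np

def boris_push(x, v, E, B, q_m, dt, nsteps):
    """Advance a test particle through given E(x) and B(x) fields with the
    Boris scheme; the fields are prescribed, so the ion exerts no feedback
    on the plasma (the test-particle approximation)."""
    traj = [x.copy()]
    for _ in range(nsteps):
        e, b = E(x), B(x)
        v_minus = v + 0.5 * q_m * dt * e            # first half electric kick
        t = 0.5 * q_m * dt * b                      # magnetic rotation vector
        s = 2.0 * t / (1.0 + t @ t)
        v_prime = v_minus + np.cross(v_minus, t)
        v = v_minus + np.cross(v_prime, s)          # rotated velocity
        v = v + 0.5 * q_m * dt * e                  # second half electric kick
        x = x + v * dt
        traj.append(x.copy())
    return np.array(traj), v

# Oblique uniform B (70 degrees to x) and a motional E field, all toy values
Bfun = lambda x: np.array([np.cos(0.7), 0.0, np.sin(0.7)])
Efun = lambda x: np.array([0.0, 0.1, 0.0])
traj, v = boris_push(np.zeros(3), np.array([1.0, 0.0, 0.0]),
                     Efun, Bfun, q_m=1.0, dt=0.05, nsteps=2000)
```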
Nelson, Judith E.; Weissman, David E.; Hays, Ross M.; Mosenthal, Anne C.; Mulkerin, Colleen; Puntillo, Kathleen A.; Ray, Daniel E.; Bassett, Rick; Boss, Renee D.; Brasel, Karen J.; Campbell, Margaret L.; Cortez, Therese B.; Curtis, J. Randall
2012-01-01
Patients with advanced illness often spend time in an ICU, while nearly one-third of patients with advanced cancer who receive Medicare die in hospitals, often with failed ICU care. For most, death occurs following the withdrawal or withholding of life-sustaining treatments. The integration of palliative care is essential for high-quality critical care. Although palliative care specialists are becoming increasingly available, intensivists and other physicians are also expected to provide basic palliative care, including symptom treatment and communication about goals of care. Patients who are critically ill are often unable to make decisions about their care. In these situations, physicians must meet with family members or other surrogates to determine appropriate medical treatments. These meetings require clinical expertise to ensure that patient values are explored for medical decision making about therapeutic options, including palliative care. Meetings with families take time. Issues related to the disease process, prognosis, and treatment plan are complex, and decisions about the use or limitation of intensive care therapies have life-or-death implications. Inadequate reimbursement for physician services may be a barrier to the optimal delivery of high-quality palliative care, including effective communication. Appropriate documentation of time spent integrating palliative and critical care for patients who are critically ill can be consistent with the Current Procedural Terminology codes (99291 and 99292) for critical care services. The purpose of this article is to help intensivists and other providers understand the circumstances in which integration of palliative and critical care meets the definition of critical care services for billing purposes. PMID:22396564
Return-to-work intervention during cancer treatment - The providers' experiences.
Petersen, K S; Momsen, A H; Stapelfeldt, C M; Olsen, P R; Nielsen, C V
2018-03-01
To explore in-depth understanding of providers' experiences when involved in a return-to-work (RTW) intervention offered during cancer treatment. Semi-structured individual interviews and participant observations at a hospital department and two municipal job centers were carried out, including ten providers (physicians, nurses and social workers). A phenomenological-hermeneutic approach was applied, involving coding, identification of themes and interpretation of findings. Three major themes were identified: Treatment first, Work as an integrated component in cancer rehabilitation, and Challenges in bringing up work issues. Differences in providers' experiences of the RTW intervention offered to cancer patients were found: in the hospital setting RTW was a second priority, whereas in the municipality job centers it was an integrated component. Further studies are needed to investigate how and when occupational rehabilitation services can be implemented across sectors to support cancer patients' RTW. In the future, work issues ought to be systematically presented by providers across sectors as early as possible to support cancer patients' RTW. Cancer patients' individual needs and thoughts about RTW are to be identified by both health care providers during treatment and social workers at the municipality level and shared across sectors. © 2017 John Wiley & Sons Ltd.
Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy
ERIC Educational Resources Information Center
Hutchison, Amy; Nadolny, Larysa; Estapa, Anne
2016-01-01
In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…
Asymmetric Memory Circuit Would Resist Soft Errors
NASA Technical Reports Server (NTRS)
Buehler, Martin G.; Perlman, Marvin
1990-01-01
Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets", due to ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and the associated error-correcting code is adapted to exploit this asymmetry in error probabilities.
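The brief does not specify its codes, so as a hedged illustration of exploiting asymmetric error probabilities, here is a classic Berger code: it detects every unidirectional error pattern (e.g., only 1-to-0 upsets), which is exactly the regime an asymmetric memory cell is designed to favor.

```python
def berger_encode(data_bits):
    """Append a Berger check: the count of 0-bits in the data word, in
    binary. Unidirectional errors can never go undetected, because 1->0
    flips in the data raise the true zero count while 1->0 flips in the
    check can only lower the stored value."""
    k = len(data_bits)
    check_len = max(1, k.bit_length())
    zeros = data_bits.count(0)
    check = [int(b) for b in format(zeros, f"0{check_len}b")]
    return data_bits + check, check_len

def berger_check(codeword, check_len):
    data, check = codeword[:-check_len], codeword[-check_len:]
    return data.count(0) == int("".join(map(str, check)), 2)

word, clen = berger_encode([1, 0, 1, 1, 0, 1, 0, 1])
assert berger_check(word, clen)
word[0] = 0                      # a single 1->0 upset in the data bits
assert not berger_check(word, clen)
```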
NASA Technical Reports Server (NTRS)
Cartier, D. E.
1976-01-01
This concise paper considers the effect on the autocorrelation function of a pseudonoise (PN) code when the acquisition scheme only integrates coherently over part of the code and then noncoherently combines these results. The peak-to-null ratio of the effective PN autocorrelation function is shown to degrade to the square root of n, where n is the number of PN symbols over which coherent integration takes place.
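A small numeric experiment (an illustration, not the paper's derivation) shows the effect: correlate an m-sequence coherently over segments, combine the segment magnitudes noncoherently, and watch the peak-to-null ratio collapse as the number of segments grows. The degree-5 LFSR taps are a standard textbook choice.

```python
import numpy as np

def m_sequence(taps, length):
    """Generate a +/-1 m-sequence from a Fibonacci LFSR with given taps."""
    reg = [1] * max(taps)
    out = []
    for _ in range(length):
        out.append(reg[-1])
        fb = 0
        for t in taps:
            fb ^= reg[t - 1]
        reg = [fb] + reg[:-1]
    return np.array([1 - 2 * b for b in out])  # map 0/1 -> +1/-1

code = m_sequence([5, 3], 31)  # degree-5 m-sequence, period 31

def partial_coherent_corr(code, shift, n_segments):
    """Coherent correlation over each of n_segments pieces of the code,
    with the segment results combined noncoherently (magnitudes)."""
    prod = code * np.roll(code, shift)
    segs = np.array_split(prod, n_segments)
    return sum(abs(s.sum()) for s in segs)

for n in (1, 4, 8):
    peak = partial_coherent_corr(code, 0, n)
    null = max(partial_coherent_corr(code, s, n) for s in range(1, 31))
    print(f"{n} segments: peak/null = {peak / null:.2f}")
```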
The Integration of COTS/GOTS within NASA's HST Command and Control System
NASA Technical Reports Server (NTRS)
Pfarr, Thomas; Reis, James E.; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
NASA's mission critical Hubble Space Telescope (HST) command and control system has been re-engineered with COTS/GOTS and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes 31 COTS/GOTS products with an additional 12 million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while significantly reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection process, evaluation process, and integration process. The advantages, disadvantages, issues, concerns, and lessons learned from integrating COTS/GOTS into NASA's mission critical HST CCS will be examined in detail. Command and control systems designed with traditional custom code development efforts will be compared with command and control systems designed with new development techniques relying heavily on COTS/GOTS integration. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.
Dong, Lu; Zhao, Xin; Ong, Stacie L; Harvey, Allison G
2017-10-01
The current study examined whether and which specific contents of patients' memory for cognitive therapy (CT) were associated with treatment adherence and outcome. Data were drawn from a pilot RCT of forty-eight depressed adults, who received either CT plus Memory Support Intervention (CT + Memory Support) or CT-as-usual. Patients' memory for treatment was measured using the Patient Recall Task and responses were coded into cognitive behavioral therapy (CBT) codes, such as CBT Model and Cognitive Restructuring, and non-CBT codes, such as individual coping strategies and no code. Treatment adherence was measured using therapist and patient ratings during treatment. Depression outcomes included treatment response, remission, and recurrence. Total number of CBT codes recalled was not significantly different comparing CT + Memory Support to CT-as-usual. Total CBT codes recalled were positively associated with adherence, while non-CBT codes recalled were negatively associated with adherence. Treatment responders (vs. non-responders) exhibited a significant increase in their recall of Cognitive Restructuring from session 7 to posttreatment. Greater recall of Cognitive Restructuring was marginally significantly associated with remission. Greater total number of CBT codes recalled (particularly CBT Model) was associated with non-recurrence of depression. Results highlight the important relationships between patients' memory for treatment and treatment adherence and outcome. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Andre, R.; Carlsson, J.; Gorelenkova, M.; Jardin, S.; Kaye, S.; Poli, F.; Yuan, X.
2016-10-01
TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free-boundary equilibrium solution, while the PT-SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP incorporates high fidelity heating and current drive source models, such as NUBEAM for neutral beam injection, the beam tracing code TORBEAM for EC, TORIC for ICRF, and the ray tracing codes TORAY and GENRAY, also for EC. The implementation of selected components makes efficient use of MPI to speed up code calculations. Recently the GENRAY-CQL3D solver for modeling of LH heating and current drive has been implemented and is currently being extended to multiple antennas, to allow modeling of EAST discharges. Also, GENRAY+CQL3D is being extended to the use of EC/EBW and of HHFW for NSTX-U. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Work supported by the US Department of Energy under DE-AC02-CH0911466.
An Implementation Of Elias Delta Code And ElGamal Algorithm In Image Compression And Security
NASA Astrophysics Data System (ADS)
Rachmawati, Dian; Andri Budiman, Mohammad; Saffiera, Cut Amalia
2018-01-01
In data transmission, such as transferring an image, confidentiality, integrity, and efficiency of data storage are highly needed. To maintain the confidentiality and integrity of data, one of the techniques used is ElGamal. The strength of this algorithm lies in the difficulty of calculating discrete logarithms in a large prime modulus. ElGamal belongs to the class of asymmetric key algorithms and results in an enlargement of the file size; therefore data compression is required. Elias Delta Code is one of the compression algorithms that use a delta code table. The image was first compressed using the Elias Delta Code algorithm, then the result of the compression was encrypted using the ElGamal algorithm. Primality testing was implemented using the Agrawal-Biswas algorithm. The results showed that the ElGamal method could maintain the confidentiality and integrity of data with MSE and PSNR values of 0 and infinity, respectively. The Elias Delta Code method generated a compression ratio and space saving with average values of 62.49% and 37.51%, respectively.
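Elias delta coding itself is standard and compact enough to sketch. The following is a generic textbook implementation, not the authors' code: the codeword is the Elias gamma code of the bit-length of n, followed by all bits of n except its leading 1.

```python
def elias_delta_encode(n: int) -> str:
    """Elias delta codeword for a positive integer n."""
    assert n >= 1
    length = n.bit_length()               # L = floor(log2 n) + 1
    ll = length.bit_length()              # bits needed to write L itself
    gamma_of_length = "0" * (ll - 1) + bin(length)[2:]
    return gamma_of_length + bin(n)[2:][1:]

def elias_delta_decode(bits: str) -> int:
    zeros = 0
    while bits[zeros] == "0":             # count the gamma prefix zeros
        zeros += 1
    length = int(bits[zeros:2 * zeros + 1], 2)
    rest = bits[2 * zeros + 1:2 * zeros + length]
    return int("1" + rest, 2)

for n in (1, 2, 17, 1000):
    cw = elias_delta_encode(n)
    assert elias_delta_decode(cw) == n
    print(n, "->", cw)
```

Small integers get short codewords, which is why such universal codes pay off on the skewed symbol statistics left after image decorrelation.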
Cellular Response to Ionizing Radiation: A MicroRNA Story
Halimi, Mohammad; Asghari, S. Mohsen; Sariri, Reyhaneh; Moslemi, Dariush; Parsian, Hadi
2012-01-01
MicroRNAs (miRNAs) represent a class of small non-coding RNA molecules that regulate gene expression at the post-transcriptional level. They play a crucial role in diverse cellular pathways. Ionizing radiation (IR) is one of the most important treatment protocols for patients suffering from cancer and directly or indirectly affects cellular integrity. Recently it has been discovered that microRNA-mediated gene regulation interferes with radiation-related pathways in the response to IR. Here, we review the recent discoveries about miRNAs in the cellular response to IR. By thoroughly understanding the mechanism of miRNAs in the radiation response, it will be possible to design new strategies for improving radiotherapy efficiency and ultimately cancer treatment. PMID:24551775
Computation of Sound Propagation by Boundary Element Method
NASA Technical Reports Server (NTRS)
Guo, Yueping
2005-01-01
This report documents the development of a Boundary Element Method (BEM) code for the computation of sound propagation in uniform mean flows. The basic formulation and implementation follow the standard BEM methodology; the convective wave equation and the boundary conditions on the surfaces of the bodies in the flow are formulated into an integral equation, and the method of collocation is used to discretize this equation into a matrix equation to be solved numerically. New features discussed here include the formulation of the additional terms due to the effects of the mean flow and the treatment of the numerical singularities in the implementation by the method of collocation. The effects of mean flows introduce terms in the integral equation that contain the gradients of the unknown, which is undesirable if the gradients are treated as additional unknowns, greatly increasing the size of the matrix equation, or if numerical differentiation is used to approximate the gradients, introducing numerical error in the computation. It is shown that these terms can be reformulated in terms of the unknown itself, making the integral equation very similar to the case without mean flows and simple for numerical implementation. To avoid asymptotic analysis in the treatment of numerical singularities in the method of collocation, as is conventionally done, we perform the surface integrations in the integral equation by using sub-triangles so that the field point never coincides with the evaluation points on the surfaces. This simplifies the formulation and greatly facilitates the implementation. To validate the method and the code, three canonical problems are studied: the sound scattering by a sphere, the sound reflection by a plate in uniform mean flows, and the sound propagation over a hump of irregular shape in uniform flows. The first two have analytical solutions and the third is solved by the method of Computational Aeroacoustics (CAA), all of which are used for comparison with the BEM solutions. The comparisons show very good agreement and validate the accuracy of the BEM approach implemented here.
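The sub-triangle idea can be sketched generically. The recursive centroid splitting and one-point rule below are simplifying assumptions chosen for brevity, not the report's quadrature; the point is only that no evaluation node ever lands on a panel vertex, so a collocation point placed there never coincides with a quadrature point.

```python
import numpy as np

def subtriangle_points(v0, v1, v2, levels=2):
    """Recursively split a triangle at its centroid into three sub-triangles
    and return one-point (centroid) quadrature nodes and weights. Nodes
    never lie on the original vertices."""
    if levels == 0:
        area = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0))
        return [((v0 + v1 + v2) / 3.0, area)]
    c = (v0 + v1 + v2) / 3.0
    pts = []
    for a, b in ((v0, v1), (v1, v2), (v2, v0)):
        pts += subtriangle_points(a, b, c, levels - 1)
    return pts

# Integrate the weakly singular kernel 1/|x - x0| over a panel whose vertex
# v0 is the collocation point x0; subdivision refines toward the singularity.
v0 = np.array([0.0, 0.0, 0.0])
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
for lv in (1, 2, 4, 6):
    val = sum(w / np.linalg.norm(p - v0)
              for p, w in subtriangle_points(v0, v1, v2, lv))
    print(lv, val)
```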
Integrated modelling framework for short pulse high energy density physics experiments
NASA Astrophysics Data System (ADS)
Sircombe, N. J.; Hughes, S. J.; Ramsay, M. G.
2016-03-01
Modelling experimental campaigns on the Orion laser at AWE, and developing a viable point design for fast ignition (FI), calls for a multi-scale approach; a complete description of the problem would require an extensive range of physics which cannot realistically be included in a single code. For modelling the laser-plasma interaction (LPI) we need a fine mesh which can capture the dispersion of electromagnetic waves, and a kinetic model for each plasma species. In the dense material of the bulk target, away from the LPI region, collisional physics dominates. The transport of hot particles generated by the action of the laser depends on their slowing and stopping in the dense material and on their need to draw a return current. These effects will heat the target, which in turn influences transport. On longer timescales, the hydrodynamic response of the target will begin to play a role as the pressure generated by isochoric heating begins to take effect. Recent effort at AWE [1] has focussed on the development of an integrated code suite based on the particle-in-cell code EPOCH, to model LPI; the Monte Carlo electron transport code THOR, to model the onward transport of hot electrons; and the radiation hydrodynamics code CORVUS, to model the hydrodynamic response of the target. We outline the methodology adopted, elucidate the advantages of a robustly integrated code suite compared to a single-code approach, demonstrate the integrated code suite's application to modelling the heating of buried layers on Orion, and assess the potential of such experiments for the validation of modelling capability in advance of more ambitious HEDP experiments, as a step towards a predictive modelling capability for FI.
Beyond Honour Codes: Bringing Students into the Academic Integrity Equation
ERIC Educational Resources Information Center
Richards, Deborah; Saddiqui, Sonia; McGuigan, Nicholas; Homewood, Judi
2016-01-01
Honour codes represent a successful and unique, student-led, "bottom-up" approach to the promotion of academic integrity (AI). With increased flexibility, globalisation and distance or blended education options, most institutions operate in very different climates and cultures from the US institutions that have a long-established culture…
Incorporating Manual and Autonomous Code Generation
NASA Technical Reports Server (NTRS)
McComas, David
1998-01-01
Code can be generated manually or by using code-generation software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with automatic code generation for attitude control flight software. Recent improvements in spaceflight computers are allowing software engineers to spend more time engineering the application software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-automatic code interface; (2) applying object-oriented design to the manual flight code; and (3) implementing the object-oriented design in C.
NASA Astrophysics Data System (ADS)
Nitadori, Keigo; Makino, Junichiro; Hut, Piet
2006-12-01
The main performance bottleneck of gravitational N-body codes is the force calculation between two particles. We have succeeded in speeding up this pair-wise force calculation by factors between 2 and 10, depending on the code and the processor on which the code is run. These speed-ups were obtained by writing highly fine-tuned code for x86_64 microprocessors. Any existing N-body code, running on these chips, can easily incorporate our assembly code programs. In the current paper, we present an outline of our overall approach, which we illustrate with one specific example: the use of a Hermite scheme for a direct N^2 type integration on a single 2.0 GHz Athlon 64 processor, for which we obtain an effective performance of 4.05 Gflops, for double-precision accuracy. In subsequent papers, we will discuss other variations, including the combinations of N log N codes, single-precision implementations, and performance on other microprocessors.
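For orientation, the pair-wise kernel in question looks like the sketch below, written in Python for clarity; the paper's speed-ups come from hand-tuning exactly this inner loop in x86_64 assembly. The acceleration-plus-jerk form is what a Hermite scheme consumes; the softening parameter is an illustrative addition.

```python
import numpy as np

def acc_jerk(pos, vel, mass, eps2=0.0):
    """Direct-summation accelerations and jerks: the O(N^2) pair kernel
    that dominates the cost of a Hermite-scheme N-body integrator."""
    n = len(mass)
    acc = np.zeros((n, 3))
    jerk = np.zeros((n, 3))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dr = pos[j] - pos[i]          # relative position
            dv = vel[j] - vel[i]          # relative velocity
            r2 = dr @ dr + eps2           # softened squared distance
            r3 = r2 * np.sqrt(r2)
            rv = 3.0 * (dr @ dv) / r2
            acc[i] += mass[j] * dr / r3
            jerk[i] += mass[j] * (dv - rv * dr) / r3
    return acc, jerk
```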
Greenham, Stuart; Manley, Stephen; Turnbull, Kirsty; Hoffmann, Matthew; Fonseca, Amara; Westhuyzen, Justin; Last, Andrew; Aherne, Noel J; Shakespeare, Thomas P
2018-01-01
To develop and apply a clinical incident taxonomy for radiation therapy. Capturing clinical incident information that focuses on near-miss events is critical for achieving higher levels of safety and reliability. A clinical incident taxonomy for radiation therapy was established; coding categories were prescription, consent, simulation, voluming, dosimetry, treatment, bolus, shielding, imaging, quality assurance and coordination of care. The taxonomy was applied to all clinical incidents occurring at three integrated cancer centres for the years 2011-2015. Incidents were managed locally, audited and feedback disseminated to all centres. Across the five years the total incident rate (per 100 courses) was 8.54; the radiotherapy-specific coded rate was 6.71. The rate of true adverse events (unintended treatment and potential patient harm) was 1.06. Adverse events, where no harm was identified, occurred at a rate of 2.76 per 100 courses. Despite workload increases, overall and actual rates both exhibited downward trends over the 5-year period. The taxonomy captured previously unidentified quality assurance failures; centre-specific issues that contributed to variations in incident trends were also identified. The application of a taxonomy developed for radiation therapy enhances incident investigation and facilitates strategic interventions. The practice appears to be effective in our institution and contributes to the safety culture. The ratio of near miss to actual incidents could serve as a possible measure of incident reporting culture and could be incorporated into large scale incident reporting systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
Jouhet, Vianney; Mougin, Fleur; Bréchat, Bérénice; Thiessard, Frantz
2017-02-07
Identifying incident cancer cases within a population remains essential for scientific research in oncology. Data produced within electronic health records can be useful for this purpose. Due to the multiplicity of providers, heterogeneous terminologies such as ICD-10 and ICD-O-3 are used for oncology diagnosis recording purposes. To enable disease identification based on these diagnoses, there is a need for integrating disease classifications in oncology. Our aim was to build a model integrating concepts involved in two disease classifications, namely ICD-10 (diagnosis) and ICD-O-3 (topography and morphology), despite their structural heterogeneity. Based on the NCIt, a "derivative" model for linking diagnoses and topography-morphology combinations was defined and built. ICD-O-3 and ICD-10 codes were then used to instantiate classes of the "derivative" model. Links between terminologies obtained through the model were then compared to mappings provided by the Surveillance, Epidemiology, and End Results (SEER) program. The model integrated 42% of neoplasm ICD-10 codes (excluding metastasis), 98% of ICD-O-3 morphology codes (excluding metastasis) and 68% of ICD-O-3 topography codes. For every code instantiating at least one class in the "derivative" model, comparison with SEER mappings reveals that all mappings were actually available in the model as links between the corresponding codes. We have proposed a method to automatically build a model for integrating ICD-10 and ICD-O-3 based on the NCIt. The resulting "derivative" model is a machine-understandable resource that enables an integrated view of these heterogeneous terminologies. The NCIt structure and the available relationships can help to bridge disease classifications while taking into account their structural and granular heterogeneities. However, (i) inconsistencies exist within the NCIt, leading to misclassifications in the "derivative" model, and (ii) the "derivative" model only integrates part of ICD-10 and ICD-O-3. The NCIt is not sufficient for integration purposes and further work based on other termino-ontological resources is needed in order to enrich the model and avoid the identified inconsistencies.
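A toy version of such a linkage model can make the idea concrete. The three example codes below are real (ICD-10 C50 breast, ICD-O-3 topography C50.9, morphology 8500/3 infiltrating duct carcinoma, etc.), but the table, its contents, and the helper names are hypothetical, not the paper's derivative model.

```python
from collections import defaultdict

links = [
    # (ICD-10 diagnosis, ICD-O-3 topography, ICD-O-3 morphology)
    ("C50", "C50.9", "8500/3"),   # breast / infiltrating duct carcinoma
    ("C50", "C50.9", "8520/3"),   # breast / lobular carcinoma
    ("C34", "C34.1", "8070/3"),   # lung, upper lobe / squamous cell carcinoma
]

by_diagnosis = defaultdict(set)
by_pair = defaultdict(set)
for dx, topo, morpho in links:
    by_diagnosis[dx].add((topo, morpho))
    by_pair[(topo, morpho)].add(dx)

# An incident-case query can now move between the two terminologies:
print(by_pair[("C50.9", "8500/3")])   # -> {'C50'}
print(sorted(by_diagnosis["C50"]))
```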
Health facility challenges to the provision of Option B+ in western Kenya: a qualitative study.
Helova, Anna; Akama, Eliud; Bukusi, Elizabeth A; Musoke, Pamela; Nalwa, Wafula Z; Odeny, Thomas A; Onono, Maricianah; Spangler, Sydney A; Turan, Janet M; Wanga, Iris; Abuogi, Lisa L
2017-03-01
Current WHO guidelines recommend lifelong antiretroviral therapy (ART) for all HIV-positive individuals, including pregnant and breastfeeding women (Option B+) in settings with generalized HIV epidemics. While Option B+ is scaled-up in Kenya, insufficient adherence and retention to care could undermine the expected positive impact of Option B+. To explore challenges to the provision of Option B+ at the health facility level, we conducted forty individual gender-matched in-depth interviews with HIV-positive pregnant/postpartum women and their male partners, and four focus groups with thirty health care providers at four health facilities in western Kenya between September-November 2014. Transcripts were coded with the Dedoose software using a coding framework based on the literature, topics from interview guides, and emerging themes from transcripts. Excerpts from broad codes were then fine-coded using an inductive approach. Three major themes emerged: 1) Option B+ specific challenges (same-day initiation into treatment, health care providers unconvinced of the benefits of Option B+, insufficient training); 2) facility resource constraints (staff and drug shortages, long queues, space limitations); and 3) lack of client-friendly services (scolding of patients, inconvenient operating hours, lack of integration of services, administrative requirements). This study highlights important challenges at the health facility level related to Option B+ rollout in western Kenya. Addressing these specific challenges may increase linkage, retention and adherence to life-long ART treatment for pregnant HIV-positive women in Kenya, contribute towards elimination of mother-to-child HIV transmission, and improve maternal and child outcomes.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... Treatment (Code 521D), Pond Sealing or Lining--Soil Dispersant Treatment (Code 521B), Salinity and Sodic Soil Management (Code 610), Stream Habitat Improvement and Management (Code 395), Vertical Drain (Code... the criteria section; an expansion of the considerations section to include fish and wildlife and soil...
Scherzinger, William M.
2016-05-01
The numerical integration of constitutive models in computational solid mechanics codes allows for the solution of boundary value problems involving complex material behavior. Metal plasticity models, in particular, have been instrumental in the development of these codes. Here, most plasticity models implemented in computational codes use an isotropic von Mises yield surface. The von Mises, or J2, yield surface has a simple predictor-corrector algorithm - the radial return algorithm - to integrate the model.
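A minimal radial-return sketch for J2 plasticity with linear isotropic hardening follows; the material constants and the specific hardening law are illustrative assumptions, not the report's model.

```python
import numpy as np

def radial_return(strain_inc, stress_old, eps_p_old, E=200e9, nu=0.3,
                  sigma_y=250e6, H=1e9):
    """One step of J2 plasticity via the radial-return (elastic predictor,
    plastic corrector) algorithm with linear isotropic hardening."""
    G = E / (2 * (1 + nu))                                # shear modulus
    K = E / (3 * (1 - 2 * nu))                            # bulk modulus
    I = np.eye(3)
    # Elastic trial stress from the strain increment
    de_vol = np.trace(strain_inc)
    de_dev = strain_inc - de_vol / 3.0 * I
    stress_trial = stress_old + K * de_vol * I + 2 * G * de_dev
    s = stress_trial - np.trace(stress_trial) / 3.0 * I   # deviator
    q = np.sqrt(1.5 * np.tensordot(s, s))                 # von Mises stress
    f = q - (sigma_y + H * eps_p_old)                     # yield function
    if f <= 0.0:
        return stress_trial, eps_p_old                    # elastic step
    dgamma = f / (3 * G + H)                              # plastic multiplier
    # Return radially to the updated yield surface along s/q
    stress_new = stress_trial - 3 * G * dgamma * s / q
    return stress_new, eps_p_old + dgamma

stress, eps_p = radial_return(np.diag([2e-3, -6e-4, -6e-4]),
                              np.zeros((3, 3)), 0.0)
```

The closed-form multiplier dgamma = f / (3G + H) is what makes the von Mises case so cheap compared with general yield surfaces.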
ARC integration into the NEAMS Workbench
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stauff, N.; Gaughan, N.; Kim, T.
2017-01-01
One of the objectives of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Integration Product Line (IPL) is to facilitate the deployment of the high-fidelity codes developed within the program. The Workbench initiative was launched in FY-2017 by the IPL to facilitate the transition from conventional tools to high fidelity tools. The Workbench provides a common user interface for model creation, real-time validation, execution, output processing, and visualization for integrated codes.
Spohr, Stephanie A; Taxman, Faye S; Rodriguez, Mayra; Walters, Scott T
2016-06-01
Although substance use is common among people in the U.S. criminal justice system, treatment initiation remains an ongoing problem. This study assessed the reliability and predictive validity of the Motivational Interviewing Treatment Integrity 3.1.1 (MITI) coding instrument in a community corrections sample. We used data from 80 substance-using clients who were participating in a clinical trial of MI in a probation setting. We analyzed 124 MI counseling sessions using the MITI, a coding system for documenting MI fidelity. Bivariate associations and logistic regression modeling were used to determine if MI-consistent behaviors predicted substance use or treatment initiation at a 2-month follow-up. We found a high level of agreement between coders on behavioral utterance counts. Counselors met at least beginning proficiency on most MITI summary scores. Probationers who initiated treatment at 2-month follow-up had significantly higher ratings of clinician empathy and MI spirit than clients who did not initiate treatment. Other MITI summary scores were not significantly different between clients who had initiated treatment and those who did not. MI spirit and empathy ratings were entered into a forward logistic regression in which MI spirit significantly predicted 2-month treatment initiation (χ²(1) = 4.10, p < .05, R² = .05) but counselor empathy did not. MITI summary scores did not predict substance use at 2-month follow-up. Counselor MI-consistent relational skills were an important predictor of client treatment initiation. Counselor behaviors such as empathy and MI spirit may be important for developing client rapport with people in a probation setting. Copyright © 2015. Published by Elsevier Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoekstra, Robert J.; Hammond, Simon David; Richards, David
2017-09-01
This milestone is a tri-lab deliverable supporting ongoing co-design efforts impacting applications in the Integrated Codes (IC) and Advanced Technology Development and Mitigation (ATDM) program elements. In FY14, the tri-labs looked at porting proxy applications to technologies of interest for ATS procurements. In FY15, a milestone was completed evaluating proxy applications in multiple programming models, and in FY16, a milestone was completed focusing on the migration of lessons learned back into production code development. This year, the co-design milestone focuses on migrating the knowledge gained and/or code revisions back into production applications.
Computational tools and lattice design for the PEP-II B-Factory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Y.; Irwin, J.; Nosochkov, Y.
1997-02-01
Several accelerator codes were used to design the PEP-II lattices, ranging from matrix-based codes, such as MAD and DIMAD, to symplectic-integrator codes, such as TRACY and DESPOT. In addition to element-by-element tracking, we constructed maps to determine aberration strengths. Furthermore, we have developed a fast and reliable method (nPB tracking) to track particles with a one-turn map. This new technique allows us to evaluate performance of the lattices on the entire tune-plane. Recently, we designed and implemented an object-oriented code in C++ called LEGO which integrates and expands upon TRACY and DESPOT.
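One-turn-map tracking, in its simplest linear single-plane form, can be sketched as follows. This is only a generic stand-in for map-based techniques such as nPB tracking, which use higher-order maps; the Courant-Snyder parameters here are toy values.

```python
import numpy as np

def one_turn_map(tune, beta=1.0):
    """Linear one-turn matrix for a single transverse plane (alpha = 0)."""
    mu = 2 * np.pi * tune
    return np.array([[np.cos(mu), beta * np.sin(mu)],
                     [-np.sin(mu) / beta, np.cos(mu)]])

def track(x0, xp0, tune, n_turns=1024):
    """Track (x, x') turn by turn by repeated application of the map."""
    M = one_turn_map(tune)
    z = np.array([x0, xp0])
    orbit = np.empty((n_turns, 2))
    for i in range(n_turns):
        z = M @ z
        orbit[i] = z
    return orbit

orbit = track(1e-3, 0.0, tune=0.28)
# The FFT of the turn-by-turn data recovers the fractional tune:
spectrum = np.abs(np.fft.rfft(orbit[:, 0]))
print(np.argmax(spectrum[1:]) + 1, "/ 1024, i.e. ~0.28")
```

Replacing thousands of element-by-element kicks with a single map per turn is what makes whole tune-plane scans affordable.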
RAY-RAMSES: a code for ray tracing on the fly in N-body simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barreira, Alexandre; Llinares, Claudio; Bose, Sownak
2016-05-01
We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
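The cell-by-cell accumulation can be sketched generically. The snippet below is a toy illustration, not the RAY-RAMSES implementation: the lensing-kernel prefactors are absorbed into the weight, and all names and numbers are hypothetical.

```python
import numpy as np

def accumulate_kappa(kappa_map, cells, chi_source):
    """Add each crossed cell's contribution to a convergence map as the
    run steps through it: kappa = integral of W(chi) * delta dchi, summed
    cell by cell instead of over stored snapshots."""
    for pix, delta, chi, dchi in cells:   # one entry per crossed cell
        w = chi * (chi_source - chi) / chi_source  # lensing kernel, prefactors absorbed
        kappa_map[pix] += w * delta * dchi
    return kappa_map

# Toy example: three cells pierced by two lines of sight
kappa = np.zeros(2)
cells = [(0, 0.2, 100.0, 5.0), (0, -0.1, 400.0, 5.0), (1, 0.5, 250.0, 5.0)]
accumulate_kappa(kappa, cells, chi_source=500.0)
print(kappa)
```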
Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic
NASA Astrophysics Data System (ADS)
Li, Yan; Dai, Shifang; Wu, Weiwei
2016-12-01
Recently, with the soaring of traffic among optical network units (ONUs), network coding (NC) is becoming an appealing technique for improving the performance of passive optical networks (PONs) with such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the establishment of the coding condition; such passive, uncertain waiting severely limits the effect of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of codable inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for the integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed for adapting to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
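The coding opportunity being scheduled for is the classic two-way XOR exchange. The toy sketch below (equal-length packets and all names assumed, not the paper's protocol) shows why pairing opposite-direction inter-ONU packets at the OLT saves a downstream transmission.

```python
def try_code(queue_a_to_b, queue_b_to_a):
    """Toy NC condition at the OLT: when two inter-ONU packets travel in
    opposite directions between ONU A and ONU B, broadcast their XOR once;
    each ONU recovers the other's packet by XOR-ing with its own copy."""
    if queue_a_to_b and queue_b_to_a:
        pa, pb = queue_a_to_b.pop(0), queue_b_to_a.pop(0)
        coded = bytes(x ^ y for x, y in zip(pa, pb))
        return coded           # one broadcast replaces two unicasts
    return None                # coding condition not met: send natively

a_to_b = [b"\x10\x22\x33"]
b_to_a = [b"\x0f\x00\xff"]
coded = try_code(a_to_b, b_to_a)
# ONU A decodes B's packet using the copy of its own packet it sent:
assert bytes(x ^ y for x, y in zip(coded, b"\x10\x22\x33")) == b"\x0f\x00\xff"
```

Coding-aware scheduling, as studied in the paper, amounts to arranging upstream grants so that such pairs arrive at the OLT together instead of by chance.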
Setiawan, B B
2002-01-01
The settlement along the bank of the Code River in Yogyakarta, Indonesia provides housing for a large mass of the city's poor. Its strategic location and the fact that most urban poor do not have access to land attract people to "illegally" settle along the bank of the river. This brings negative consequences for the environment, particularly the increasing domestic waste along the river and the annual flooding in the rainy season. While the public controversies regarding the existence of the settlement along the Code River were still not resolved, at the end of the 1980s a group of architects, academics and community members proposed the idea of constructing a dike along the River as part of a broader settlement improvement program. From 1991 to 1998, thousands of local people mobilized their resources and were able to construct 6,000 metres of riverside dike along the Code River. The construction of the riverside dike along the River has become an important "stimulant" that generated not only settlement improvement, but also better treatment of river water. As all housing units located along the River are now facing the River, the River itself is considered the "front-yard". Before the dike was constructed, the inhabitants used to treat the River as the "backyard" and therefore just threw waste into it. They now really want to have a cleaner river, since the River is an important part of their settlement. The settlement along the Code River presents a complex range of persistent problems with informal settlements in Indonesia; such problems are related to the issues of how to provide more affordable and adequate housing for the poor while, at the same time, improving the water quality of the river. The project represents a good case, which shows that through a mutual partnership among stakeholders, it is possible to integrate environmental goals into urban redevelopment schemes.
Synnot, Anneliese J; Hill, Sophie J; Garner, Kerryn A; Summers, Michael P; Filippini, Graziella; Osborne, Richard H; Shapland, Sue D P; Colombo, Cinzia; Mosconi, Paola
2016-06-01
The Internet is increasingly prominent as a source of health information for people with multiple sclerosis (MS). But there has been little exploration of the needs, experiences and preferences of people with MS for integrating treatment information into decision making, in the context of searching on the Internet. This was the aim of our study. Sixty participants (51 people with MS; nine family members) took part in a focus group or online forum. They were asked to describe how they find and assess reliable treatment information (particularly online) and how this changes over time. Thematic analysis was underpinned by a coding frame. Participants described that there was both too much information online and too little that applied to them. They spoke of wariness and scepticism but also empowerment. The availability of up-to-date and unbiased treatment information, including practical and lifestyle-related information, was important to many. Many participants were keen to engage in a 'research partnership' with health professionals and developed a range of strategies to enhance the trustworthiness of online information. We use the term 'self-regulation' to capture the variations in information seeking behaviour that participants described over time, as they responded to their changing information needs, their emotional state and growing expertise about MS. People with MS have developed a number of strategies to both find and integrate treatment information from a range of sources. Their reflections informed the development of an evidence-based consumer web site based on summaries of MS Cochrane reviews. © 2014 John Wiley & Sons Ltd.
Legha, Rupinder Kaur; Novins, Douglas
2012-07-01
Culture figures prominently in discussions regarding the etiology of alcohol and substance abuse in American Indian and Alaska Native (AI/AN) communities, and a substantial body of literature suggests that it is critical to developing meaningful treatment interventions. However, no study has characterized how programs integrate culture into their services. Furthermore, reports regarding the associated challenges are limited. Twenty key informant interviews with administrators and 15 focus groups with clinicians were conducted in 18 alcohol and substance abuse treatment programs serving AI/AN communities. Transcripts were coded to identify relevant themes. Substance abuse treatment programs for AI/AN communities are integrating culture into their services in two discrete ways: by implementing specific cultural practices and by adapting Western treatment models. More important, however, are the fundamental principles that shape these programs and their interactions with the people and communities they serve. These foundational beliefs and values, defined in this study as the core cultural constructs that validate and incorporate AI/AN experience and world view, include an emphasis on community and family, meaningful relationships with and respect for clients, a homelike atmosphere within the program setting, and an “open door” policy for clients. The primary challenges for integrating these cultural practices include AI/AN communities' cultural diversity and limited socioeconomic resources to design and implement these practices. The prominence of foundational beliefs and values is striking and suggests a broader definition of culture when designing services. This definition of foundational beliefs and values should help other diverse communities culturally adapt their substance abuse interventions in more meaningful ways.
Boritz, Tali Z; Bryntwick, Emily; Angus, Lynne; Greenberg, Leslie S; Constantino, Michael J
2014-01-01
While the individual contributions of narrative and emotion processes to psychotherapy outcome have been the focus of recent interest in psychotherapy research literature, the empirical analysis of narrative and emotion integration has rarely been addressed. The Narrative-Emotion Processes Coding System (NEPCS) was developed to provide researchers with a systematic method for identifying specific narrative and emotion process markers, for application to therapy session videos. The present study examined the relationship between NEPCS-derived problem markers (same old storytelling, empty storytelling, unstoried emotion, abstract storytelling) and change markers (competing plotlines storytelling, inchoate storytelling, unexpected outcome storytelling, and discovery storytelling), and treatment outcome (recovered versus unchanged at therapy termination) and stage of therapy (early, middle, late) in brief emotion-focused (EFT), client-centred (CCT), and cognitive (CT) therapies for depression. Hierarchical linear modelling analyses demonstrated a significant Outcome effect for inchoate storytelling (p = .037) and discovery storytelling (p = .002), a Stage × Outcome effect for abstract storytelling (p = .05), and a Stage × Outcome × Treatment effect for competing plotlines storytelling (p = .001). There was also a significant Stage × Outcome effect for NEPCS problem markers (p = .007) and change markers (p = .03). The results provide preliminary support for the importance of assessing the contribution of narrative-emotion processes to efficacious treatment outcomes in EFT, CCT, and CT treatments of depression.
Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation
NASA Technical Reports Server (NTRS)
Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.
2000-01-01
An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.
Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu
2016-10-01
This paper extends our previous work on silver-enhancement-based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smartphone. The scanned information is first decoded to obtain the location of a web server, which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene, and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidic channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in the presence of immobilized gold nanorods. In this paper we demonstrate proof-of-concept detection using prototypes of QR-encoded FEC biosensors.
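As a sketch of the symbol-level FEC being relied on, the snippet below generates a QR code at the highest Reed-Solomon error-correction level, at which roughly 30% of the symbol can be damaged or missing and still decode. It assumes the third-party Python `qrcode` package (with its Pillow image backend) and a hypothetical server URL; it is an illustration, not the authors' fabrication pipeline.

```python
import qrcode

# Level H reserves the most Reed-Solomon redundancy, so a partially
# self-assembled symbol can still be read by a standard smartphone scanner.
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
qr.add_data("https://example.org/assay?id=123")  # hypothetical web-server URL
qr.make(fit=True)
img = qr.make_image()
img.save("assay_qr.png")
```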
Lindqvist, Helena; Forsberg, Lars; Enebrink, Pia; Andersson, Gerhard; Rosendahl, Ingvar
2017-06-01
The technical component of Motivational Interviewing (MI) posits that client language mediates the relationship between counselor techniques and subsequent client behavioral outcomes. The purpose of this study was to examine this hypothesized technical component of MI in smoking cessation treatment in more depth. We performed a secondary analysis of 106 first treatment sessions, derived from the Swedish National Tobacco Quitline and previously rated using the Motivational Interviewing Sequential Code for Observing Process Exchanges (MI-SCOPE) Coder's Manual and the Motivational Interviewing Treatment Integrity code (MITI) Manual, version 3.1. The outcome measure was self-reported 6-month continuous abstinence at 12-month follow-up. Sequential analyses indicated that clients were significantly more likely than expected by chance to argue for change (change talk) following MI-consistent behaviors and questions and reflections favoring change. Conversely, clients were more likely to argue against change (sustain talk) following questions and reflections favoring the status quo. Parallel mediation analysis revealed that a counselor technique (reflections of client sustain talk) had an indirect effect on smoking outcome at follow-up through client language mediators. The study makes a significant contribution to our understanding of how MI works in smoking cessation treatment and adds further empirical support for the hypothesized technical component of MI. The results emphasize the importance of counselors avoiding unintentional reinforcement of sustain talk and underline the need for a greater emphasis on the direction of questions and reflections in MI trainings and fidelity measures.
On the limits of numerical astronomical solutions used in paleoclimate studies
NASA Astrophysics Data System (ADS)
Zeebe, Richard E.
2017-04-01
Numerical solutions of the equations of the Solar System estimate Earth's orbital parameters in the past and represent the backbone of cyclostratigraphy and astrochronology, now widely applied in geology and paleoclimatology. Given one numerical realization of a Solar System model (i.e., obtained using one code or integrator package), various parameters determine the properties of the solution and usually limit its validity to a certain time period. Such limitations are denoted here as "internal" and include limitations due to (i) the underlying physics/physical model and (ii) numerics. The physics include initial coordinates and velocities of Solar System bodies, treatment of the Moon and asteroids, the Sun's quadrupole moment, and the intrinsic dynamics of the Solar System itself, i.e., its chaotic nature. Numerical issues include the solver algorithm, numerical accuracy (e.g., time step), and round-off errors. At present, internal limitations seem to restrict the validity of astronomical solutions to perhaps the past 50 or 60 myr. However, little is currently known about "external" limitations, that is, how different numerical realizations compare, say, between different investigators using different codes and integrators. Hitherto only two solutions for Earth's eccentricity appear to be used in paleoclimate studies, provided by two different groups that integrated the full Solar System equations over the past >100 myr (Laskar and coworkers and Varadi et al. 2003). In this contribution, I will present results from new Solar System integrations for Earth's eccentricity obtained using the integrator package HNBody (Rauch and Hamilton 2002). I will discuss the various internal limitations listed above within the framework of the present simulations. I will also compare the results to the existing solutions; at the time of writing, several simulations were still running, so the details of this comparison were still being finalized.
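As a minimal illustration of the "numerics" limitations discussed above, the sketch below integrates a circular Kepler orbit with a kick-drift-kick leapfrog at several time steps and measures the relative energy error; full astronomical solutions use complete Solar System models and dedicated integrators such as HNBody, so the numbers here only illustrate how the time step choice bounds the attainable accuracy.

```python
import numpy as np

GM = 4 * np.pi**2                      # AU^3/yr^2, Sun-centred two-body problem

def accel(r):
    return -GM * r / np.linalg.norm(r)**3

def energy_error(dt, years=100.0):
    r = np.array([1.0, 0.0])           # 1 AU
    v = np.array([0.0, 2 * np.pi])     # circular-orbit speed
    e0 = 0.5 * v @ v - GM / np.linalg.norm(r)
    for _ in range(int(years / dt)):
        v += 0.5 * dt * accel(r)       # kick
        r += dt * v                    # drift
        v += 0.5 * dt * accel(r)       # kick
    e = 0.5 * v @ v - GM / np.linalg.norm(r)
    return abs((e - e0) / e0)

for dt in (1e-2, 1e-3, 1e-4):          # time step in years
    print(f"dt = {dt:g} yr -> relative energy error {energy_error(dt):.2e}")
```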
NASA Astrophysics Data System (ADS)
Williams, C. A.; Dicaprio, C.; Simons, M.
2003-12-01
With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T; Lin, H; Xu, X
Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore software micro-optimization methods. Methods: The patient-specific source of the Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014, Medical Physics 41(7)). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA's database, partial geometry information of the jaw and MLC, as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by the ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, focusing on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double precision, the total wall time of the multithreaded CPU code on an X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam case, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC- and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.
R3D: Reduction Package for Integral Field Spectroscopy
NASA Astrophysics Data System (ADS)
Sánchez, Sebastián. F.
2011-06-01
R3D was developed to reduce fiber-based integral field spectroscopy (IFS) data. The package comprises a set of command-line routines adapted for each of these steps, suitable for creating pipelines. The routines have been tested against simulations, and against real data from various integral field spectrographs (PMAS, PPAK, GMOS, VIMOS and INTEGRAL). Particular attention is paid to the treatment of cross-talk. R3D unifies the reduction techniques for the different IFS instruments into a single one, in order to allow the general public to reduce data from different instruments in a homogeneous, consistent and simple way. Although still in its prototyping phase, it has proved useful for reducing PMAS (in both the Larr and the PPAK modes), VIMOS and INTEGRAL data. The current version has been coded in Perl, using PDL, in order to speed up the algorithm testing phase. Most of the time-critical algorithms have been translated to C, and it is our intention to translate all of them. However, even in this phase R3D is fast enough to produce valuable science frames in reasonable time.
Xiao, W; Rank, G H
1989-03-15
The yeast SMR1 gene was used as a dominant resistance-selectable marker for industrial yeast transformation and for targeting integration of an economically important gene at the homologous ILV2 locus. A MEL1 gene, which codes for alpha-galactosidase, was inserted into a dispensable upstream region of SMR1 in vitro; different treatments of the plasmid (pWX813) prior to transformation resulted in 3' end, 5' end and replacement integrations that exhibited distinct integrant structures. One-step replacement within a nonessential region of the host genome generated a stable integration of MEL1 devoid of bacterial plasmid DNA. Using this method, we have constructed several alpha-galactosidase positive industrial Saccharomyces strains. Our study provides a general method for stable gene transfer in most industrial Saccharomyces yeasts, including those used in the baking, brewing (ale and lager), distilling, wine and sake industries, with solely nucleotide sequences of interest. The absence of bacterial DNA in the integrant structure facilitates the commercial application of recombinant DNA technology in the food and beverage industry.
Health information system strengthening and malaria elimination in Papua New Guinea.
Rosewell, Alexander; Makita, Leo; Muscatello, David; John, Lucy Ninmongo; Bieb, Sibauk; Hutton, Ross; Ramamurthy, Sundar; Shearman, Phil
2017-07-05
The objective of the study was to describe an m-health initiative to strengthen malaria surveillance in a 184-health-facility, multi-province project aimed at strengthening the National Health Information System (NHIS) in a country with fragmented malaria surveillance, striving towards enhanced control and pre-elimination. A remote-loading mobile application and secure online platform for health professionals was created to interface with the new system (eNHIS). A case-based malaria testing register was developed and integrated with geo-coded households, villages and health facilities. A malaria programme management dashboard was created, with village-level malaria mapping tools and statistical algorithms to identify malaria outbreaks. Since its inception in 2015, 160,750 malaria testing records, including village of residence, have been reported to the eNHIS. These case-based, geo-coded malaria data are 100% complete, with a median data-entry delay of 9 days from the date of testing. The system maps malaria to the village level in near real-time, as well as the availability of treatment and diagnostics to the health-facility level. Data aggregation, analysis, outbreak detection, and reporting are automated. The study demonstrates that using mobile technologies and GIS in the capture and reporting of NHIS data in Papua New Guinea provides the timely, high-quality, geo-coded, case-based malaria data required for malaria elimination. The health systems strengthening approach of integrating malaria information management into the eNHIS optimizes sustainability and provides enormous flexibility to cater for future malaria programme needs.
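The abstract does not specify the outbreak-detection algorithm; a common choice in routine surveillance is to flag a reporting unit whose weekly case count exceeds the mean plus a multiple of the standard deviation of recent weeks. The sketch below implements that simple threshold rule under those assumptions; `window` and `z` are illustrative parameters, not eNHIS settings.

```python
import numpy as np

def flag_outbreak_weeks(weekly_cases, window=8, z=2.0):
    # Flag week t if its count exceeds mean + z*SD of the preceding `window` weeks.
    cases = np.asarray(weekly_cases, dtype=float)
    flagged = []
    for t in range(window, len(cases)):
        baseline = cases[t - window:t]
        if cases[t] > baseline.mean() + z * baseline.std(ddof=1):
            flagged.append(t)
    return flagged

print(flag_outbreak_weeks([3, 4, 2, 5, 3, 4, 3, 4, 18, 4]))  # -> [8]
```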
Morley, Katherine I; Wallace, Joshua; Denaxas, Spiros C; Hunter, Ross J; Patel, Riyaz S; Perel, Pablo; Shah, Anoop D; Timmis, Adam D; Schilling, Richard J; Hemingway, Harry
2014-01-01
National electronic health records (EHR) are increasingly used for research, but identifying disease cases is challenging due to differences in the information captured between sources (e.g. primary and secondary care). Our objective was to provide a transparent, reproducible model for integrating these data using atrial fibrillation (AF), a chronic condition diagnosed and managed in multiple ways in different healthcare settings, as a case study. Potentially relevant codes for AF screening, diagnosis, and management were identified in four coding systems: Read (primary care diagnoses and procedures), British National Formulary (BNF; primary care prescriptions), ICD-10 (secondary care diagnoses) and OPCS-4 (secondary care procedures). From these we developed a phenotype algorithm via expert review and analysis of linked EHR data from 1998 to 2010 for a cohort of 2.14 million UK patients aged ≥ 30 years. The cohort was also used to evaluate the phenotype by examining associations between incident AF and known risk factors. The phenotype algorithm incorporated 286 codes: 201 Read, 63 BNF, 18 ICD-10, and four OPCS-4. Incident AF diagnoses were recorded for 72,793 patients, but only 39.6% (N = 28,795) were recorded in both primary and secondary care. An additional 7,468 potential cases were inferred from data on treatment and pre-existing conditions. The proportion of cases identified from each source differed by diagnosis age; inferred diagnoses contributed a greater proportion of younger cases (≤ 60 years), while older patients (≥ 80 years) were mainly diagnosed in secondary care. Associations of risk factors (hypertension, myocardial infarction, heart failure) with incident AF defined using different EHR sources were comparable in magnitude to those from traditional consented cohorts. A single EHR source is not sufficient to identify all patients, nor will it provide a representative sample. Combining multiple data sources and integrating information on treatment and comorbid conditions can substantially improve case identification.
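A toy sketch of the case-identification logic: a patient is a case if any of their codes appears in the relevant code list, with prescriptions used to infer additional cases from treatment. The code values below are illustrative placeholders, not the published 286-code algorithm.

```python
# Illustrative placeholder code lists -- not the published phenotype.
READ_AF  = {"G573."}          # primary care diagnoses (Read)
ICD10_AF = {"I48"}            # secondary care diagnoses (ICD-10)
OPCS4_AF = {"K571"}           # secondary care procedures (OPCS-4, placeholder)
BNF_AF   = {"0208020V0"}      # prescriptions suggesting AF management (BNF, placeholder)

def af_case(read, icd10, opcs4, bnf):
    direct   = bool(read & READ_AF) or bool(icd10 & ICD10_AF) or bool(opcs4 & OPCS4_AF)
    inferred = bool(bnf & BNF_AF)          # treatment-based inference
    return direct or inferred

print(af_case(read=set(), icd10={"I48"}, opcs4=set(), bnf=set()))        # True (direct)
print(af_case(read=set(), icd10=set(), opcs4=set(), bnf={"0208020V0"}))  # True (inferred)
```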
Coding for Single-Line Transmission
NASA Technical Reports Server (NTRS)
Madison, L. G.
1983-01-01
Digital transmission code combines data and clock signals into a single waveform. MADCODE needs four standard integrated circuits in the generator and converter plus five small discrete components. MADCODE allows simple coding and decoding for transmission of digital signals over a single line.
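The exact MADCODE waveform is not given in this summary; Manchester coding is the textbook example of the same idea, guaranteeing a mid-cell transition in every bit so the receiver can recover the clock from the data line itself. A sketch under that assumption, using one common polarity convention:

```python
def manchester_encode(bits):
    # Each bit becomes two half-cells; the mid-cell transition carries the clock.
    # Convention used here: 1 -> high-low, 0 -> low-high.
    wave = []
    for b in bits:
        wave.extend([1, 0] if b else [0, 1])
    return wave

print(manchester_encode([1, 0, 1, 1]))  # [1,0, 0,1, 1,0, 1,0]
```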
Lossless compression of VLSI layout image data.
Dai, Vito; Zakhor, Avideh
2006-09-01
We present a novel lossless compression algorithm called Context Copy Combinatorial Code (C4), which integrates the advantages of two very disparate compression techniques: context-based modeling and Lempel-Ziv (LZ) style copying. While the algorithm can be applied to many lossless compression applications, such as document image compression, our primary target application has been lossless compression of integrated circuit layout image data. These images contain a heterogeneous mix of data: dense repetitive data better suited to LZ-style coding, and less dense structured data, better suited to context-based encoding. As part of C4, we have developed a novel binary entropy coding technique called combinatorial coding which is simultaneously as efficient as arithmetic coding, and as fast as Huffman coding. Compression results show C4 outperforms JBIG, ZIP, BZIP2, and two-dimensional LZ, and achieves lossless compression ratios greater than 22 for binary layout image data, and greater than 14 for gray-pixel image data.
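Combinatorial (enumerative) coding assigns each binary block its rank among all blocks of the same length and weight; transmitting the weight plus the rank costs about log2 C(n,k) bits, which is optimal for that model and consistent with the "as efficient as arithmetic coding" claim. The sketch below implements textbook enumerative coding, not C4's production variant.

```python
from math import comb

def enumerative_encode(bits):
    # Rank `bits` among all sequences with the same length n and weight k.
    n, k = len(bits), sum(bits)
    rank, ones_left = 0, k
    for i, b in enumerate(bits):
        if b:
            rank += comb(n - i - 1, ones_left)  # sequences that place a 0 here first
            ones_left -= 1
    return n, k, rank

def enumerative_decode(n, k, rank):
    bits, ones_left = [], k
    for i in range(n):
        c = comb(n - i - 1, ones_left)
        if rank >= c:
            bits.append(1); rank -= c; ones_left -= 1
        else:
            bits.append(0)
    return bits

bits = [0, 1, 1, 0, 1, 0, 0, 1]
assert enumerative_decode(*enumerative_encode(bits)) == bits
```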
Mean Line Pump Flow Model in Rocket Engine System Simulation
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Lavelle, Thomas M.
2000-01-01
A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
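At the heart of any mean line pump model is the Euler turbomachinery equation, which gives the ideal head from the velocity triangles at the impeller inlet and exit; a code like PUMPA then applies loss models on top. A minimal sketch of the ideal-head calculation only (all variable names and numbers are illustrative):

```python
def euler_head(omega, r1, r2, c_theta1, c_theta2, g=9.81):
    # Ideal (Euler) head: H = (U2*Ctheta2 - U1*Ctheta1) / g,
    # with U = omega*r the blade speed and Ctheta the tangential flow velocity.
    u1, u2 = omega * r1, omega * r2
    return (u2 * c_theta2 - u1 * c_theta1) / g

# Example: 3000 rad/s shaft, 5 cm exit radius, axial (swirl-free) inflow.
print(euler_head(omega=3000.0, r1=0.02, r2=0.05, c_theta1=0.0, c_theta2=120.0))
```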
DOE Office of Scientific and Technical Information (OSTI.GOV)
Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.
2011-02-01
This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.
Suitability of point kernel dose calculation techniques in brachytherapy treatment planning
Lakshminarayanan, Thilagam; Subbaiah, K. V.; Thayalan, K.; Kannan, S. E.
2010-01-01
Brachytherapy treatment planning systems (TPS) are necessary to estimate the dose to the target volume and organs at risk (OAR). A TPS is always recommended to account for the effects of the tissue, applicator and shielding-material heterogeneities that exist in applicators. However, most brachytherapy TPS software packages estimate the absorbed dose at a point from only the contributions of the individual sources and the source distribution, neglecting the dose perturbations arising from the applicator design and construction. There is thus some degree of uncertainty in dose rate estimation under realistic clinical conditions. In this regard, an attempt is made to explore the suitability of point kernels for brachytherapy dose rate calculations and to develop a new interactive brachytherapy package, named BrachyTPS, suited to clinical conditions. BrachyTPS is an interactive point-kernel code package developed to perform independent dose rate calculations that take into account the effects of these heterogeneities, using the two-region build-up factors proposed by Kalos. The primary aim of this study is to validate the developed point-kernel code package, integrated with treatment planning computational systems, against Monte Carlo (MC) results. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely (i) the Board of Radiation Isotope and Technology (BRIT) low dose rate (LDR) applicator, (ii) the Fletcher Green type LDR applicator and (iii) the Fletcher Williamson high dose rate (HDR) applicator, are studied to test the accuracy of the software. Dose rates computed using the developed code are compared with the relevant results of MC simulations. Further, attempts are also made to study the dose rate distribution around the commercially available shielded vaginal applicator set (Nucletron). The percentage deviations of BrachyTPS-computed dose rate values from the MC results are within ±5.5% for the BRIT LDR applicator, vary from 2.6 to 5.1% for the Fletcher Green type LDR applicator and reach −4.7% for the Fletcher Williamson HDR applicator. The isodose distribution plots also show good agreement with previously published results. The isodose distributions around the shielded vaginal cylinder computed using the BrachyTPS code show better agreement with MC results (less than two per cent deviation) in the unshielded region than in the shielded region, where deviations of up to five per cent are observed. The present study implies that accurate and fast validation of complicated treatment planning calculations is possible with the point-kernel code package. PMID:20589118
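A minimal sketch of the point-kernel idea: the dose rate from an isotropic point source falls off with inverse-square geometry and exponential attenuation, multiplied by a build-up factor for scattered radiation, and an applicator is handled by superposing kernels over its source positions. The linear build-up factor and all numbers below are placeholders, not Kalos' two-region factors.

```python
import numpy as np

def dose_rate(strength, r_cm, mu, buildup=lambda mux: 1.0 + 0.5 * mux):
    # Point kernel: geometry 1/(4*pi*r^2) x attenuation exp(-mu*r) x build-up B(mu*r).
    return strength * buildup(mu * r_cm) * np.exp(-mu * r_cm) / (4 * np.pi * r_cm**2)

# Superpose kernels over three source dwell positions along an applicator axis.
dwell = np.array([[0.0, 0.0, -0.5], [0.0, 0.0, 0.0], [0.0, 0.0, 0.5]])
point = np.array([2.0, 0.0, 0.0])
total = sum(dose_rate(1.0, np.linalg.norm(point - p), mu=0.11) for p in dwell)
print(total)
```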
Wolff, Sebastian; Bucher, Christian
2013-01-01
This article presents asynchronous collision integrators and a simple asynchronous method for treating nodal restraints. Asynchronous discretizations allow individual time step sizes for each spatial region, improving the efficiency of explicit time stepping for finite element meshes with heterogeneous element sizes. The article first introduces asynchronous variational integration expressed by drift and kick operators. Linear nodal restraint conditions are solved by a simple projection of the forces that is shown to be equivalent to RATTLE. Unilateral contact is solved by an asynchronous variant of decomposition contact response, in which velocities are modified to avoid penetrations. Although decomposition contact response solves a large system of linear equations (critical for the numerical efficiency of explicit time-stepping schemes) and needs special treatment regarding overconstraint and linear dependency of the contact constraints (for example, from double-sided node-to-surface contact or self-contact), the asynchronous strategy handles these situations efficiently and robustly. Only a single constraint involving a very small number of degrees of freedom is considered at once, leading to a very efficient solution. The treatment of friction is exemplified for the Coulomb model. The contact of nodes that are subject to restraints needs special care; together with the aforementioned projection for restraints, a novel efficient solution scheme is presented. The collision integrator does not influence the critical time step; hence, the time step can be chosen independently of the underlying time-stepping scheme and may be fixed or time-adaptive. New demands on global collision detection are discussed, exemplified by position codes and node-to-segment integration. Numerical examples illustrate the convergence and efficiency of the new contact algorithm. PMID:23970806
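A minimal sketch of the drift/kick structure with a force projection enforcing a nodal restraint, for a 1-D spring chain whose first node is held fixed; this illustrates the operators named above, not the paper's asynchronous scheme itself, and all parameters are illustrative.

```python
import numpy as np

n, k, m, dt = 5, 100.0, 1.0, 1e-3
x = np.linspace(0.0, 1.0, n)
x[1:] += 0.01 * np.random.default_rng(1).standard_normal(n - 1)
v = np.zeros(n)
fixed = [0]                                # restrained node

def forces(x):
    f = np.zeros_like(x)
    stretch = np.diff(x) - 1.0 / (n - 1)   # linear springs between neighbours
    f[:-1] += k * stretch                  # a stretched spring pulls its nodes together
    f[1:] -= k * stretch
    return f

for _ in range(1000):                      # kick-drift-kick time stepping
    f = forces(x); f[fixed] = 0.0          # projection: zero force on restrained DOFs
    v += 0.5 * dt * f / m                  # kick (half step)
    x += dt * v                            # drift
    f = forces(x); f[fixed] = 0.0
    v += 0.5 * dt * f / m                  # kick (half step)

print(x[0], x[-1])                         # node 0 has not moved
```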
New GOES satellite synchronized time code generation
NASA Technical Reports Server (NTRS)
Fossler, D. E.; Olson, R. K.
1984-01-01
The TRAK Systems' GOES Satellite Synchronized Time Code Generator is described. TRAK Systems has developed this timing instrument to supply improved accuracy over most existing GOES receiver clocks. A classical time code generator is integrated with a GOES receiver.
Roanoke College Student Conduct Code 1990-91.
ERIC Educational Resources Information Center
Roanoke Coll., VA.
This Roanoke College (Virginia) 1990-91 conduct code manual is intended for distribution to students. A reproduction of the Academic Integrity and Student Conduct Code Form which all students must sign leads off the document. A section detailing the student conduct code explains the delegation of authority within the institution and describes the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to the different VERA codes.
How Object-Specific Are Object Files? Evidence for Integration by Location
ERIC Educational Resources Information Center
van Dam, Wessel O.; Hommel, Bernhard
2010-01-01
Given the distributed representation of visual features in the human brain, binding mechanisms are necessary to integrate visual information about the same perceptual event. It has been assumed that feature codes are bound into object files--pointers to the neural codes of the features of a given event. The present study investigated the…
ERIC Educational Resources Information Center
Clinton, Virginia; Morsanyi, Kinga; Alibali, Martha W.; Nathan, Mitchell J.
2016-01-01
Learning from visual representations is enhanced when learners appropriately integrate corresponding visual and verbal information. This study examined the effects of two methods of promoting integration, color coding and labeling, on learning about probabilistic reasoning from a table and text. Undergraduate students (N = 98) were randomly…
Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system
NASA Technical Reports Server (NTRS)
Appa, Kari; Smith, Michael J. C.
1987-01-01
A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.
High Speed Solution of Spacecraft Trajectory Problems Using Taylor Series Integration
NASA Technical Reports Server (NTRS)
Scott, James R.; Martini, Michael C.
2008-01-01
Taylor series integration is implemented in a spacecraft trajectory analysis code - the Spacecraft N-body Analysis Program (SNAP) - and compared with the code's existing eighth-order Runge-Kutta Fehlberg time integration scheme. Nine trajectory problems, including near-Earth, lunar, Mars and Europa missions, are analyzed. Head-to-head comparison at five different error tolerances shows that, on average, Taylor series is faster than Runge-Kutta Fehlberg by a factor of 15.8. Results further show that Taylor series has superior convergence properties. Taylor series integration thus proves capable of providing rapid, highly accurate solutions to spacecraft trajectory problems.
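A sketch of how a Taylor series step works for a single ODE: for y' = y², the Taylor coefficients obey a simple Cauchy-product recurrence, so one can take large, high-order steps. This illustrates the method generically, not SNAP's implementation.

```python
def taylor_step(y0, h, order=15):
    # y' = y**2, y(0) = y0; exact solution y(t) = y0 / (1 - y0*t).
    c = [y0]                               # Taylor coefficients about t = 0
    for j in range(order):
        # Matching powers of t: (j+1)*c[j+1] = coeff. of t^j in y^2 (Cauchy product).
        conv = sum(c[i] * c[j - i] for i in range(j + 1))
        c.append(conv / (j + 1))
    y = 0.0
    for ck in reversed(c):                 # evaluate the polynomial at t = h (Horner)
        y = y * h + ck
    return y

print(taylor_step(1.0, 0.1), 1.0 / (1.0 - 0.1))   # numerical vs exact
```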
Family Planning in Substance Use Disorder Treatment Centers: Opportunities and Challenges.
Robinowitz, Natanya; Muqueeth, Sadiya; Scheibler, Jill; Salisbury-Afshar, Elizabeth; Terplan, Mishka
2016-09-18
Alcohol, tobacco, and drug use during pregnancy can cause a range of adverse birth outcomes. Promoting family planning among women with substance use disorders (SUD) can help reduce substance-exposed pregnancies. We conducted qualitative research to determine the acceptability and feasibility of offering family planning education and services in SUD treatment centers. Focus groups and in-depth interviews were conducted with clients, staff and medical providers at three treatment centers. Interviews were transcribed and the data were analyzed using a flexible coding scheme. Clients reported being interested in family planning services while they were in treatment. Most preferred to receive these services onsite. Providers also felt that services should be offered onsite, though they cited several barriers to implementation, including time constraints and staff levels of comfort with the subject. Women in SUD treatment are open to the integration of family planning services into treatment. Treatment centers have the opportunity to serve as models of client-centered health homes that offer a variety of educational, preventive, and medical services for women in both treatment and recovery.
Solving differential equations for Feynman integrals by expansions near singular points
NASA Astrophysics Data System (ADS)
Lee, Roman N.; Smirnov, Alexander V.; Smirnov, Vladimir A.
2018-03-01
We describe a strategy to solve differential equations for Feynman integrals by power series expansions near singular points and to obtain high-precision results for the corresponding master integrals. We consider Feynman integrals with two scales, i.e. non-trivially depending on one variable. The corresponding algorithm is oriented towards situations where a canonical form of the differential equations is impossible. We provide a computer code constructed with the help of our algorithm for a simple example of four-loop generalized sunset integrals with three equal non-zero masses and two zero masses. Our code gives values of the master integrals at any given point on the real axis with a required accuracy and a given order of expansion in the regularization parameter ɛ.
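A toy version of the strategy: for y' = y/(1-x), the expansion coefficients about x0 satisfy c_{m+1} = c_m/(1-x0), and the series converges only within the distance to the singular point x = 1, so a long integration marches by re-expanding at intermediate points. The equation is chosen for transparency (exact solution y = 1/(1-x)); it stands in for the Feynman-integral systems and is not the paper's code.

```python
def series_step(y0, x0, x1, terms=40):
    # Advance y' = y/(1-x) from x0 to x1 via a Taylor expansion about x0.
    # Recurrence c_{m+1} = c_m/(1-x0); convergence requires |x1-x0| < |1-x0|.
    coeffs, c = [], y0
    for _ in range(terms):
        coeffs.append(c)
        c /= (1.0 - x0)
    u, y = x1 - x0, 0.0
    for cm in reversed(coeffs):            # Horner evaluation at u = x1 - x0
        y = y * u + cm
    return y

x, y, target = 0.0, 1.0, 0.9
while x < target:                          # re-expand half-way to the singularity
    x_next = min(x + 0.5 * (1.0 - x), target)
    y = series_step(y, x, x_next)
    x = x_next

print(y, 1.0 / (1.0 - target))             # numerical vs exact
```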
Status and Plans for the TRANSP Interpretive and Predictive Simulation Code
NASA Astrophysics Data System (ADS)
Kaye, Stanley; Andre, Robert; Gorelenkova, Marina; Yuan, Xingqui; Hawryluk, Richard; Jardin, Steven; Poli, Francesca
2015-11-01
TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state of the art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free boundary equilibrium solution, while the PT_SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP also incorporates such source models as NUBEAM for neutral beam injection, GENRAY, TORAY, TORBEAM, TORIC and CQL3D for ICRH, LHCD, ECH and HHFW. The implementation of selected components makes efficient use of MPI for speed up of code calculations. TRANSP has a wide international user-base, and it is run on the FusionGrid to allow for timely support and quick turnaround by the PPPL Computational Plasma Physics Group. It is being used as a basis for both analysis and development of control algorithms and discharge operational scenarios, including simulation of ITER plasmas. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Progress on implementing TRANSP as a component in the ITER IMAS will also be described. This research was supported by the U.S. Department of Energy under contracts DE-AC02-09CH11466.
NASA Astrophysics Data System (ADS)
Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi
2018-01-01
Cybercrime is one of the most serious threats. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing data-security sciences, and their combination is one effort to improve data integrity. New techniques are created by combining several algorithms; one of these is the incorporation of the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field and consists of dots and dashes. Combining this classic code with modern ciphers is a new concept for maintaining data integrity. The combination of these methods is expected to yield new algorithms that improve the security of data, especially images.
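A compact sketch of the two classical ingredients named above: a Hill cipher (block matrix multiplication modulo 26) whose output is then rendered in Morse code. The key matrix and the truncated Morse table are illustrative; this is not the paper's combined algorithm, which also involves steganographic embedding.

```python
import numpy as np

KEY = np.array([[3, 3], [2, 5]])   # invertible mod 26 (det = 9, gcd(9, 26) = 1)
MORSE = {'A': '.-', 'B': '-...', 'C': '-.-.', 'D': '-..', 'E': '.', 'F': '..-.'}

def hill_encrypt(plaintext, key=KEY):
    nums = [ord(ch) - ord('A') for ch in plaintext.upper() if ch.isalpha()]
    if len(nums) % 2:
        nums.append(ord('X') - ord('A'))       # pad to the block size
    out = []
    for i in range(0, len(nums), 2):
        block = np.array(nums[i:i + 2])
        out.extend((key @ block) % 26)         # C = K * P (mod 26)
    return ''.join(chr(int(n) + ord('A')) for n in out)

def to_morse(text):
    return ' '.join(MORSE.get(ch, '?') for ch in text)

cipher = hill_encrypt("ABBA")                  # -> 'DFDC'
print(cipher, '|', to_morse(cipher))
```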
An active inference theory of allostasis and interoception in depression
Quigley, Karen S.; Hamilton, Paul
2016-01-01
In this paper, we integrate recent theoretical and empirical developments in predictive coding and active inference accounts of interoception (including the Embodied Predictive Interoception Coding model) with working hypotheses from the theory of constructed emotion to propose a biologically plausible unified theory of the mind that places metabolism and energy regulation (i.e. allostasis), as well as the sensory consequences of that regulation (i.e. interoception), at its core. We then consider the implications of this approach for understanding depression. We speculate that depression is a disorder of allostasis, whose myriad symptoms result from a ‘locked in’ brain that is relatively insensitive to its sensory context. We conclude with a brief discussion of the ways our approach might reveal new insights for the treatment of depression. This article is part of the themed issue ‘Interoception beyond homeostasis: affect, cognition and mental health’. PMID:28080969
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakhai, B.
A new method for solving radiation transport problems is presented. The heart of the technique is a new cross section processing procedure for the calculation of group-to-point and point-to-group cross section sets. The method is ideally suited for problems which involve media with highly fluctuating cross sections, where the results of traditional multigroup calculations are beclouded by the group averaging procedures employed. The extensive computational effort that would be required to evaluate the double integrals in the multigroup treatment numerically prohibits iteration to optimize the energy boundaries. On the other hand, use of point-to-point techniques (as in the stochastic technique) is often prohibitively expensive due to the large computer storage requirement. The pseudo-point code is a hybrid of the two aforementioned methods (group-to-group and point-to-point) - hence the name pseudo-point - that reduces the computational effort of the former and the large core requirements of the latter. The pseudo-point code generates the group-to-point or point-to-group transfer matrices, and can be coupled with existing transport codes to calculate pointwise energy-dependent fluxes. This approach yields much more detail than is available from conventional energy-group treatments. Due to the speed of this code, several iterations can be performed at affordable computing cost to optimize the energy boundaries and the weighting functions. The pseudo-point technique is demonstrated by solving six problems, each depicting a certain aspect of the technique. The results are presented as flux vs energy at various spatial intervals. The sensitivity of the technique to the energy grid and the savings in computational effort are clearly demonstrated.
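For contrast, the group averaging that the pseudo-point method seeks to improve upon: a multigroup cross section is the flux-weighted average of the pointwise cross section over each group, which washes out sharp resonances. A minimal numpy sketch with illustrative data, not the code described above:

```python
import numpy as np

def group_average(E, sigma, phi, edges):
    # Flux-weighted multigroup cross sections: sig_g = <sigma*phi>_g / <phi>_g.
    sig_g = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (E >= lo) & (E < hi)
        sig_g.append(np.trapz(sigma[m] * phi[m], E[m]) / np.trapz(phi[m], E[m]))
    return np.array(sig_g)

E = np.linspace(1.0, 100.0, 2000)                   # energy grid (arbitrary units)
sigma = 1.0 + 50.0 / (1.0 + ((E - 36.7) / 0.5)**2)  # one sharp resonance
phi = 1.0 / E                                       # 1/E weighting spectrum
print(group_average(E, sigma, phi, edges=[1.0, 10.0, 50.0, 100.0]))
```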
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2012 CFR
2012-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2014 CFR
2014-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2013 CFR
2013-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2011 CFR
2011-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2010 CFR
2010-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Bernhard; Janka, Hans-Thomas; Dimmelmeier, Harald, E-mail: bjmuellr@mpa-garching.mpg.d, E-mail: thj@mpa-garching.mpg.d, E-mail: harrydee@mpa-garching.mpg.d
We present a new general relativistic code for hydrodynamical supernova simulations with neutrino transport in spherical and azimuthal symmetry (one dimension and two dimensions, respectively). The code is a combination of the COCONUT hydro module, which is a Riemann-solver-based, high-resolution shock-capturing method, and the three-flavor, fully energy-dependent VERTEX scheme for the transport of massless neutrinos. VERTEX integrates the coupled neutrino energy and momentum equations with a variable Eddington factor closure computed from a model Boltzmann equation and uses the 'ray-by-ray plus' approximation in two dimensions, assuming the neutrino distribution to be axially symmetric around the radial direction at every point in space, and thus the neutrino flux to be radial. Our spacetime treatment employs the Arnowitt-Deser-Misner 3+1 formalism with the conformal flatness condition for the spatial three-metric. This approach is exact for the one-dimensional case and has previously been shown to yield very accurate results for spherical and rotational stellar core collapse. We introduce new formulations of the energy equation to improve total energy conservation in relativistic and Newtonian hydro simulations with grid-based Eulerian finite-volume codes. Moreover, a modified version of the VERTEX scheme is developed that simultaneously conserves energy and lepton number in the neutrino transport with better accuracy and higher numerical stability in the high-energy tail of the spectrum. To verify our code, we conduct a series of tests in spherical symmetry, including a detailed comparison with published results of the collapse, shock formation, shock breakout, and accretion phases. Long-time simulations of proto-neutron star cooling until several seconds after core bounce both demonstrate the robustness of the new COCONUT-VERTEX code and show the approximate treatment of relativistic effects by means of an effective relativistic gravitational potential, as in PROMETHEUS-VERTEX, to be remarkably accurate in spherical symmetry.
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan
1993-01-01
A detailed analysis of the accuracy of several techniques recently developed for integrating stiff ordinary differential equations is presented. The techniques include two general-purpose codes EPISODE and LSODE developed for an arbitrary system of ordinary differential equations, and three specialized codes CHEMEQ, CREK1D, and GCKP4 developed specifically to solve chemical kinetic rate equations. The accuracy study is made by application of these codes to two practical combustion kinetics problems. Both problems describe adiabatic, homogeneous, gas-phase chemical reactions at constant pressure, and include all three combustion regimes: induction, heat release, and equilibration. To illustrate the error variation in the different combustion regimes the species are divided into three types (reactants, intermediates, and products), and error versus time plots are presented for each species type and the temperature. These plots show that CHEMEQ is the most accurate code during induction and early heat release. During late heat release and equilibration, however, the other codes are more accurate. A single global quantity, a mean integrated root-mean-square error, that measures the average error incurred in solving the complete problem is used to compare the accuracy of the codes. Among the codes examined, LSODE is the most accurate for solving chemical kinetics problems. It is also the most efficient code, in the sense that it requires the least computational work to attain a specified accuracy level. An important finding is that use of the algebraic enthalpy conservation equation to compute the temperature can be more accurate and efficient than integrating the temperature differential equation.
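One plausible form of such a single global accuracy measure is the root-mean-square of relative errors over all species and output times; the sketch below is an assumption about the report's "mean integrated root-mean-square error", not a transcription of it.

```python
import numpy as np

def mean_rms_relative_error(y, y_ref, floor=1e-30):
    # y, y_ref: arrays of shape (n_times, n_species) from the candidate solver
    # and a tight-tolerance reference; `floor` guards near-zero mole fractions.
    rel = (y - y_ref) / np.maximum(np.abs(y_ref), floor)
    return np.sqrt(np.mean(rel**2))
```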
Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components
NASA Technical Reports Server (NTRS)
Shivakumar, Kunigal; Challa, Preeli; Sree, Dave; Reddy, D.
1999-01-01
This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, the CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal, and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles; to set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public-domain codes for all three types of analysis; (2) evaluate the codes for accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user-friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two, and six are being pursued. We selected and evaluated NPARC for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow-field analyses of various inlet geometries were conducted. Integration of the codes is being continued. The codes developed are being applied to a candidate example of the trailblazer engine proposed for space transportation. Successful development of the code will provide a simpler, faster and more user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable to high-speed civil transport and space missions.
Yang, Yiwei; Xu, Yuejin; Miu, Jichang; Zhou, Linghong; Xiao, Zhongju
2012-10-01
To apply the classic leaky integrate-and-fire models, based on the mechanism of physiological auditory stimulation generation, to the information-processing coding of cochlear implants to improve the auditory result. The results of algorithm simulation in a digital signal processor (DSP) were imported into Matlab for comparative analysis. Compared with CIS coding, the membrane potential integrate-and-fire (MPIF) algorithm allowed more natural pulse discharge in a pseudo-random manner, better fitting the physiological structures. The MPIF algorithm can effectively solve the problem of the dynamic structure of the delivered auditory information sequence issued in the auditory center, allowing integration of the stimulating pulses and time coding to ensure the coherence and relevance of the stimulating pulse timing.
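A minimal leaky integrate-and-fire sketch: the membrane potential low-pass filters the stimulus and emits a spike whenever it crosses threshold, which is the mechanism the MPIF coding builds on. All parameter values here are illustrative, not the paper's DSP settings.

```python
def lif_spike_times(stimulus, dt=1e-4, tau=0.01, v_th=1.0, v_reset=0.0):
    # Leaky integrate-and-fire: tau * dv/dt = -v + I(t); reset after each spike.
    v, spikes = 0.0, []
    for i, current in enumerate(stimulus):
        v += dt * (-v + current) / tau
        if v >= v_th:
            spikes.append(i * dt)
            v = v_reset
    return spikes

print(lif_spike_times([1.5] * 2000))   # constant drive -> regular spike train
```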
NASA Technical Reports Server (NTRS)
Logan, Terry G.
1994-01-01
The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray Y-MP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes, along with those on the Cray Y-MP with a single processor. The comparison indicates that the parallel CM-FORTRAN code matches or outperforms the equivalent serial FORTRAN code for some cases.
The kinetics of aerosol particle formation and removal in NPP severe accidents
NASA Astrophysics Data System (ADS)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.
2016-06-01
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
openQ*D simulation code for QCD+QED
NASA Astrophysics Data System (ADS)
Campos, Isabel; Fritzsch, Patrick; Hansen, Martin; Krstić Marinković, Marina; Patella, Agostino; Ramos, Alberto; Tantalo, Nazario
2018-03-01
The openQ*D code for the simulation of QCD+QED with C* boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion action, the multilevel integrator, the 4th order OMF integrator, the SSE/AVX intrinsics, etc. The photon field is treated as fully dynamical and C* boundary conditions can be chosen in the spatial directions. We discuss the main features of openQ*D, and we show basic test results and performance analysis. An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/.
The integration of laser communication and ranging
NASA Astrophysics Data System (ADS)
Xu, Mengmeng; Sun, Jianfeng; Zhou, Yu; Zhang, Bo; Zhang, Guo; Li, Guangyuan; He, Hongyu; Lao, Chenzhe
2017-08-01
The method to realize the integration of laser communication and ranging is proposed in this paper. At the transmitters of the two terminals, ranging codes with uniqueness and good autocorrelation and cross-correlation properties are embedded in the communication data and encoded together with it to realize serial communication. The encoded data are then modulated and sent to the other terminal, realizing two one-way high-speed laser communication links. At the receiver, the received ranging code is obtained after demodulation, decoding and clock recovery. The received ranging code is correlated with the local ranging code to obtain a coarse range, while the phase difference between the local clock and the recovered clock yields the fine range.
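A minimal sketch of the coarse-ranging step: the receiver cross-correlates the recovered ranging code with its local replica, and the correlation peak gives the delay in chips. The chip rate and code length below are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
chip_rate = 10e6                              # chips per second (assumed)
code = rng.integers(0, 2, 1023) * 2 - 1       # +/-1 pseudo-random ranging code

true_delay = 137                              # unknown to the receiver (in chips)
received = np.roll(code, true_delay)

# Circular cross-correlation via FFT peaks at the code delay.
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))).real
est = int(np.argmax(corr))
print(est, est * 3e8 / chip_rate, "m")        # one chip of delay = c/chip_rate metres
```

The fine range would then come from the clock phase difference, which resolves the distance within a single chip.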
Modisenyane, Simon Moeketsi; Hendricks, Stephen James Heinrich; Fineberg, Harvey
2017-01-01
South Africa, as an emerging middle-income country, is becoming increasingly influential in global health diplomacy (GHD). However, little empirical research has been conducted to inform arguments for the integration of domestic health into foreign policy by state and non-state actors. This study seeks to address this knowledge gap. It takes the form of an empirical case study which analyses how South Africa integrates domestic health into its foreign policy, using the lens of access to antiretroviral (ARV) medicines. To explore state and non-state actors' perceptions regarding how domestic health policy is integrated into foreign policy. The ultimate goal of this study was to achieve better insights into the health and foreign policy processes at the national level. Employing qualitative approaches, we examined changes in the South African and global AIDS policy environment. Purposive sampling was used to select key informants, a sample of state and non-state actors who participated in in-depth interviews. Secondary data were collected through a systematic literature review of documents retrieved from five electronic databases, including review of key policy documents. Qualitative data were analysed for content. This content was coded, and the codes were collated into tentative categories and sub-categories using Atlas.ti v.7 software. The findings of this work illustrate the interplay among social, political, economic and institutional conditions in determining the success of this integration process. Our study shows that a series of national and external developments, stakeholders, and advocacy efforts and collaboration created these integrative processes. South Africa's domestic HIV/AIDS constituencies, in partnership with the global advocacy movement, catalysed the mobilization of support for universal access to ARV treatment nationally and globally, and the promotion of access to healthcare as a human right. Transnational networks may influence government's decision making by providing information and moving issues up the agenda.
Saunders, D G
1996-01-01
At a community-based domestic violence program, 218 men with a history of partner abuse were randomly assigned to either feminist-cognitive-behavioral or process-psychodynamic group treatments. The treatments were not hypothesized to differ in outcome. However, men with particular characteristics were expected to have lower recidivism rates depending on the type of treatment received. Treatment integrity was verified through audio-taped codings of each session. The partners of 79% of the 136 treatment completers gave reports of the men's behavior an average of 2 years post-treatment. These reports were supplemented with arrest records and self-reports. Rates of violence did not differ significantly between the two types of treatment nor did reports from the women of their fear level, general changes perceived in the men, and conflict resolution methods. However, interaction effects were found between some offender traits and the two treatments. As predicted, men with dependent personalities had better outcomes in the process-psychodynamic groups and those with antisocial traits had better outcomes in the cognitive-behavioral groups. The results suggest that more effective treatment may occur if it is tailored to specific characteristics of offenders.
Psychometric challenges and proposed solutions when scoring facial emotion expression codes.
Olderbak, Sally; Hildebrandt, Andrea; Pinkpank, Thomas; Sommer, Werner; Wilhelm, Oliver
2014-12-01
Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion on how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing scoring procedures of these codes. Then, on the basis of a time series data set collected to assess individual differences in facial emotion expression ability, we derive, apply, and evaluate several statistical procedures, including four scoring methods and four data treatments, to score software-coded emotion expression data. These scoring procedures are illustrated to inform analysis decisions pertaining to the scoring and data treatment of other emotion expression questions and under different experimental circumstances. Overall, we found applying loess smoothing and controlling for baseline facial emotion expression and facial plasticity are recommended methods of data treatment. When scoring facial emotion expression ability, maximum score is preferred. Finally, we discuss the scoring methods and data treatments in the larger context of emotion expression research.
A Comprehensive Validation Approach Using The RAVEN Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J
2015-06-01
The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework for performing parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor a comprehensive variety of metrics for interpreting the code responses with respect to experimental data; the RAVEN code addresses both of these gaps. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
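As a hedged illustration of the kind of workflow the abstract describes (not RAVEN itself), the sketch below samples an uncertain input, runs a stand-in "system code", and scores each run against an experimental trace with a simple RMS metric; every name and number here is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)
t_exp = np.linspace(0.0, 10.0, 25)
y_exp = np.exp(-0.3 * t_exp)                 # stand-in experimental trace

def system_code(k):
    """Stand-in for a full system-code run (e.g., one RELAP-7 transient)."""
    return np.exp(-k * t_exp)

samples = rng.normal(0.3, 0.05, size=200)    # sampled uncertain input
rms = np.array([np.sqrt(np.mean((system_code(k) - y_exp) ** 2))
                for k in samples])           # one metric value per run
print(f"best RMS {rms.min():.3g}, mean {rms.mean():.3g}, spread {rms.std():.3g}")
```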
The Magnetic Reconnection Code: an AMR-based fully implicit simulation suite
NASA Astrophysics Data System (ADS)
Germaschewski, K.; Bhattacharjee, A.; Ng, C.-S.
2006-12-01
Extended MHD models, which incorporate two-fluid effects, are promising candidates for enhancing understanding of collisionless reconnection phenomena in laboratory, space and astrophysical plasma physics. In this paper, we introduce two simulation codes in the Magnetic Reconnection Code suite which integrate reduced and full extended MHD models. Numerical integration of these models comes with two challenges. First, small-scale spatial structures, e.g. thin current sheets, develop and must be well resolved by the code. Adaptive mesh refinement (AMR) is employed to provide high resolution where needed while maintaining good performance. Second, the two-fluid effects in extended MHD give rise to dispersive waves, which lead to a very stringent CFL condition for explicit codes, while reconnection happens on a much slower time scale. We use a fully implicit Crank-Nicolson time-stepping algorithm. Since no efficient preconditioners are available for our system of equations, we instead use a direct solver to handle the inner linear solves. This requires us to actually compute the Jacobian matrix, which is handled by a code generator that calculates the derivatives symbolically and then outputs code to evaluate them.
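The time-stepping idea generalizes well beyond MHD, so a minimal sketch may help: the snippet below advances a 1-D diffusion equation with a Crank-Nicolson step and hands the inner linear solve to a sparse direct solver, mirroring (in a drastically simplified setting) the implicit-plus-direct-solver strategy described above.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n, dt, nu = 200, 1e-3, 0.1
dx = 1.0 / n
lap = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / dx**2
A = (sp.identity(n) - 0.5 * dt * nu * lap).tocsc()   # implicit half-step
B = sp.identity(n) + 0.5 * dt * nu * lap             # explicit half-step

x = np.linspace(0.0, 1.0, n)
u = np.exp(-((x - 0.5) ** 2) / 0.01)                 # narrow initial profile
for _ in range(100):
    u = spsolve(A, B @ u)                            # direct inner solve per step
print(u.max())
```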
Shadowfax: Moving mesh hydrodynamical integration code
NASA Astrophysics Data System (ADS)
Vandenbroucke, Bert
2016-05-01
Shadowfax simulates galaxy evolution. Written in object-oriented modular C++, it evolves a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. For the hydrodynamical integration, it makes use of a (co-) moving Lagrangian mesh. The code has a 2D and 3D version, contains utility programs to generate initial conditions and visualize simulation snapshots, and its input/output is compatible with a number of other simulation codes, e.g. Gadget2 (ascl:0003.001) and GIZMO (ascl:1410.003).
Mun, Eluned; Umbarger, Lillian; Ceria-Ulep, Clementina; Nakatsuka, Craig
2018-01-01
Palliative care teams have been shown to be instrumental in the early identification of multiple aspects of advance care planning. Despite an increased number of services to meet the rising consultation demand, it is conceivable that the number of palliative care consultations generated from an ICU alone could become overwhelming for an existing palliative care team. The objective was to improve end-of-life care in the ICU by incorporating basic palliative care processes into the daily routine ICU workflow, thereby reserving the palliative care team for refractory situations. A structured palliative care quality-improvement program was implemented and evaluated in the ICU at Kaiser Permanente Medical Center in Hawaii. This included selecting trigger criteria and a care model, forming guidelines, and developing evaluation criteria, which covered the early identification of the multiple features of advance care planning, the numbers of proactive ICU and palliative care family meetings, and changes in code status and treatment upon completion of either meeting. Early identification of goals of care, advance directives, and code status by the ICU staff led to a proactive ICU family meeting, with resultant increases in changes in code status and treatment. The number of palliative care consultations also rose, but not significantly. Palliative care processes could be incorporated into a daily ICU workflow, allowing aspects of advance care planning to be identified in a systematic and proactive manner. This reserved the palliative care team for situations in which palliative care efforts performed by the ICU staff were ineffective.
Fast transform decoding of nonsystematic Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Truong, T. K.; Cheung, K.-M.; Reed, I. S.; Shiozaki, A.
1989-01-01
A Reed-Solomon (RS) code is considered to be a special case of a redundant residue polynomial (RRP) code, and a fast transform decoding algorithm to correct both errors and erasures is presented. This decoding scheme is an improvement of the decoding algorithm for the RRP code suggested by Shiozaki and Nishida, and can be realized readily on very large scale integration chips.
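A toy example can make the nonsystematic view concrete. Below, the codeword is the message polynomial evaluated at n points, and erased symbols are repaired by Lagrange interpolation from any k intact positions. This uses a small prime field for readability; the paper's transform-domain decoder over GF(2^m), which also corrects errors, is considerably more involved.

```python
P = 929  # small prime field for illustration (real RS codes use GF(2^m))

def rs_encode(msg, n):
    """Nonsystematic RS: evaluate the message polynomial at x = 1..n."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P
            for x in range(1, n + 1)]

def rs_fill_erasures(word, k):
    """Repair erased symbols (None) by interpolating from k survivors."""
    pts = [(i + 1, y) for i, y in enumerate(word) if y is not None][:k]
    def interp(x):
        acc = 0
        for xi, yi in pts:
            num = den = 1
            for xj, _ in pts:
                if xj != xi:
                    num = num * (x - xj) % P
                    den = den * (xi - xj) % P
            acc = (acc + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
        return acc
    return [y if y is not None else interp(i + 1) for i, y in enumerate(word)]

word = rs_encode([3, 1, 4], n=7)     # k = 3, so up to 4 erasures are tolerable
word[1] = word[5] = None             # erase two symbols
assert rs_fill_erasures(word, k=3) == rs_encode([3, 1, 4], n=7)
```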
Novel Integration of Frame Rate Up Conversion and HEVC Coding Based on Rate-Distortion Optimization.
Guo Lu; Xiaoyun Zhang; Li Chen; Zhiyong Gao
2018-02-01
Frame rate up conversion (FRUC) can improve visual quality by interpolating new intermediate frames. However, high-frame-rate videos produced by FRUC suffer either increased bitrate consumption or annoying artifacts in the interpolated frames. In this paper, a novel integration framework of FRUC and high efficiency video coding (HEVC) is proposed based on rate-distortion optimization, such that the interpolated frames can be reconstructed at the encoder side with low bitrate cost and high visual quality. First, a joint motion estimation (JME) algorithm is proposed to obtain robust motion vectors, which are shared between FRUC and video coding. Moreover, JME is embedded into the coding loop and employs the original motion search strategy of HEVC coding. Then, frame interpolation is formulated as a rate-distortion optimization problem in which both the coding bitrate consumption and the visual quality are taken into account. Because the original frames are absent, the distortion model for interpolated frames is established from the motion vector reliability and the coding quantization error. Experimental results demonstrate that the proposed framework can achieve a 21%-42% reduction in BDBR compared with the traditional approach of FRUC cascaded with coding.
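The underlying decision rule is the standard Lagrangian cost J = D + lambda * R. The fragment below shows that selection logic on made-up numbers; it is only a schematic of the rate-distortion optimization named above, not the paper's encoder.

```python
# Hypothetical per-frame candidates: distortion (e.g., SSE) and bit cost.
candidates = {
    "code_residual": {"distortion": 120.0, "bits": 5400},
    "pure_fruc":     {"distortion": 310.0, "bits": 90},
    "skip_mode":     {"distortion": 450.0, "bits": 12},
}
lam = 0.05  # Lagrange multiplier tying bitrate to distortion

best = min(candidates,
           key=lambda m: candidates[m]["distortion"] + lam * candidates[m]["bits"])
print("chosen mode:", best)  # minimizes J = D + lambda * R
```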
NASA Technical Reports Server (NTRS)
Hicks, Raymond M.; Cliff, Susan E.
1991-01-01
Full-potential, Euler, and Navier-Stokes computational fluid dynamics (CFD) codes were evaluated for use in analyzing the flow field about airfoil sections operating at Mach numbers from 0.20 to 0.60 and Reynolds numbers from 500,000 to 2,000,000. The potential code (LBAUER) includes weakly coupled integral boundary layer equations for laminar and turbulent flow with simple transition and separation models. The Navier-Stokes code (ARC2D) uses the thin-layer formulation of the Reynolds-averaged equations with an algebraic turbulence model. The Euler code (ISES) includes strongly coupled integral boundary layer equations and advanced transition and separation calculations, with the capability to model laminar separation bubbles and limited zones of turbulent separation. The best experiment/CFD correlation was obtained with the Euler code because its boundary layer equations model the physics of the flow better than those of the other two codes. An unusual reversal of boundary layer separation with increasing angle of attack, following initial shock formation on the upper surface of the airfoil, was found in the experimental data. This phenomenon was not predicted by the CFD codes evaluated.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 7 2010-04-01 2010-04-01 true Treatment under the Internal Revenue Code of 1939..., in the case of oil and gas wells. 1.614-4 Section 1.614-4 Internal Revenue INTERNAL REVENUE SERVICE....614-4 Treatment under the Internal Revenue Code of 1939 with respect to separate operating mineral...
NASA Astrophysics Data System (ADS)
Michalsky, Tova
2013-07-01
This study investigated the effectiveness of cognitive-metacognitive versus motivational components of the IMPROVE self-regulatory model, used while reading scientific texts, for 10th graders' scientific literacy and self-regulated learning (SRL). Three treatment groups (N = 198) each received one type of self-addressed question while reading scientific texts: cognitive-metacognitive (CogMet), motivational (Mot), or combined (CogMetMot). A control group received no self-addressed questions (noSRL). One measure assessed scientific literacy, and two measures assessed SRL: (a) as an aptitude, through pre/post questionnaires assessing self-perceived SRL, and (b) as an event, through audiotaping of participants' think-aloud SRL behaviors in real-time learning experiences, with data coding illustrating SRL changes. Findings indicated that the treatment groups significantly outperformed the non-treatment group. No differences emerged between CogMet and Mot, whereas the fully combined SRL support (CogMetMot) was most effective. Theoretical and practical implications of this preliminary study are discussed.
Bichutskiy, Vadim Y.; Colman, Richard; Brachmann, Rainer K.; Lathrop, Richard H.
2006-01-01
Complex problems in life science research give rise to multidisciplinary collaboration, and hence, to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and a small drug-like molecule with the ability to restore native function to cancerous p53 mutants is a long-held medical goal of cancer treatment. The Cancer Research DataBase (CRDB) was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity, and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable, http://www.igb.uci.edu/research/research.html.) PMID:19458771
Using Quick Response Codes in the Classroom: Quality Outcomes.
Zurmehly, Joyce; Adams, Kellie
2017-10-01
With smart device technology emerging, educators are challenged with redesigning teaching strategies, using technology to allow students to participate dynamically and receive immediate answers. To facilitate the integration of technology and to actively engage students, quick response codes were included in a medical-surgical lecture. Quick response codes are two-dimensional square patterns that can encode and store more than 7,000 characters and can be read via a quick response code scanning application. The aim of this quasi-experimental study was to explore quick response code use in a lecture and measure students' satisfaction (met expectations, increased interest, helped understanding, and provided practice and prompt feedback) and engagement (liked most, liked least, wanted changed, and kept involved), assessed using an investigator-developed instrument. Although there was no statistically significant correlation of quick response code use with examination scores, satisfaction scores were high, and there was a small yet positive association between how students perceived their learning with quick response codes and overall examination scores. Furthermore, in open-ended survey questions, students responded that they were satisfied with the use of quick response codes, appreciated the immediate feedback, and planned to use them in the clinical setting. Quick response codes offer a way to integrate technology into the classroom to provide students with instant positive feedback.
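Generating such a code takes only a couple of lines. The sketch below assumes the open-source Python `qrcode` package and a hypothetical course URL; any QR library would serve equally well.

```python
import qrcode

# Encode a (hypothetical) link to a lecture quiz; students scan it in class.
img = qrcode.make("https://example.edu/medsurg/quiz1")
img.save("quiz1_qr.png")  # drop the image onto the lecture slide
```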
Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam
2012-02-01
To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. Paediatric residency program at BC Children's Hospital, Vancouver, British Columbia. The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes.
Liu, Yangyang; Han, Xiao; Yuan, Junting; Geng, Tuoyu; Chen, Shihao; Hu, Xuming; Cui, Isabelle H; Cui, Hengmi
2017-04-07
The type II bacterial CRISPR/Cas9 system is a simple, convenient, and powerful tool for targeted gene editing. Here, we describe a CRISPR/Cas9-based approach for inserting a poly(A) transcriptional terminator into both alleles of a targeted gene to silence protein-coding and non-protein-coding genes, which often play key roles in gene regulation but are difficult to silence via insertion or deletion of short DNA fragments. The integration of 225 bp of bovine growth hormone poly(A) signals into either the first intron or the first exon or behind the promoter of target genes caused efficient termination of expression of PPP1R12C, NSUN2 (protein-coding genes), and MALAT1 (non-protein-coding gene). Both NeoR and PuroR were used as markers in the selection of clonal cell lines with biallelic integration of a poly(A) signal. Genotyping analysis indicated that the cell lines displayed the desired biallelic silencing after a brief selection period. These combined results indicate that this CRISPR/Cas9-based approach offers an easy, convenient, and efficient novel technique for gene silencing in cell lines, especially for those in which gene integration is difficult because of a low efficiency of homology-directed repair. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Sexual health and religion: a primer for the sexual health clinician (CME).
Kellogg Spadt, Susan; Rosenbaum, Talli Y; Dweck, Alyssa; Millheiser, Leah; Pillai-Friedman, Sabitha; Krychman, Michael
2014-07-01
Sexual health is an integral part of the multifaceted human experience that is driven both by biological factors and psychological facets. Religion may provide a moral code of conduct or a sexual compass as to sexual norms and behaviors. The aim of this study was to summarize the integration of sexuality and religion. A review of published literature and religious texts was conducted. The integration of religion with country or state politics and laws is a complicated dilemma and will not be discussed in the scope of this article. The extent to which an individual incorporates their religious doctrine into their sexual life is a personal and individualized choice. The sexual medicine health professional will likely encounter a diverse patient population of distinct religious backgrounds, and a primer on religion and sexuality is a much needed adjunctive tool for the clinician. Because religion can influence sexuality and dictate, in part, the behavioral and medical treatments for sexual complaints, the clinician should be familiar with religious guidelines regarding sexuality, and treatment should be customized and individualized. Failure to do so can impact compliance with the therapeutic interventions. Religious awareness also solidifies the therapeutic alliance between clinician and patient as it demonstrates respect and acknowledgment for patient's beliefs and autonomy. © 2014 International Society for Sexual Medicine.
Kolehmainen, Christine; Brennan, Meghan; Filut, Amarette; Isaac, Carol; Carnes, Molly
2014-09-01
Ineffective leadership during cardiopulmonary resuscitation ("code") can negatively affect a patient's likelihood of survival. In most teaching hospitals, internal medicine residents lead codes. In this study, the authors explored internal medicine residents' experiences leading codes, with a particular focus on how gender influences the code leadership experience. The authors conducted individual, semistructured telephone or in-person interviews with 25 residents (May 2012 to February 2013) from 9 U.S. internal medicine residency programs. They audio recorded and transcribed the interviews and then thematically analyzed the transcribed text. Participants viewed a successful code as one with effective leadership. They agreed that the ideal code leader was an authoritative presence; spoke with a deep, loud voice; used clear, direct communication; and appeared calm. Although equally able to lead codes as their male colleagues, female participants described feeling stress from having to violate gender behavioral norms in the role of code leader. In response, some female participants adopted rituals to signal the suspension of gender norms while leading a code. Others apologized afterwards for their counternormative behavior. Ideal code leadership embodies highly agentic, stereotypical male behaviors. Female residents employed strategies to better integrate the competing identities of code leader and female gender. In the future, residency training should acknowledge how female gender stereotypes may conflict with the behaviors required to enact code leadership and offer some strategies, such as those used by the female residents in this study, to help women integrate these dual identities.
Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY
2018-01-01
A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853
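The comparison logic, stripped to its essentials, is a representational similarity analysis: correlate the off-diagonal entries of a region's neural similarity matrix with each behavior-based model matrix. The sketch below uses random stand-ins for all three matrices, so only the mechanics, not the data, reflect the study.

```python
import numpy as np
from scipy.stats import spearmanr

n_obj = 40
rng = np.random.default_rng(1)
neural = rng.random((n_obj, n_obj))         # e.g. perirhinal pattern similarity
visual_model = rng.random((n_obj, n_obj))   # behavior-based visual similarity
concept_model = rng.random((n_obj, n_obj))  # behavior-based conceptual similarity

iu = np.triu_indices(n_obj, k=1)            # use each object pair once
for name, model in [("visual", visual_model), ("conceptual", concept_model)]:
    rho, p = spearmanr(neural[iu], model[iu])
    print(f"{name} model: rho={rho:.2f}, p={p:.3f}")
```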
Shinan-Altman, Shiri; Ayalon, Liat
2017-02-01
Hospitalization is a major risk for older adults; therefore, it is crucial to provide the appropriate treatment during hospitalization. This study examined hospitalized older adults' perceptions regarding three groups of treatment providers: nursing staff, family members, migrant home care workers. Qualitative interviews were conducted with 17 hospitalized older adults. Data were gathered by in-depth interviews. Content analysis included open coding, axial coding and integration of the main findings using constant comparisons. Three themes emerged: (1) 'What is my worth?' This theme was focused on the participants' perceptions of themselves as helpless and dependent on others. (2) 'What would I do without them?' This theme referred to the perception of the migrant home care workers and nursing staff as essential. It meant immense gratitude, but also a sense of dependency on paid caregivers. (3) 'They have their own busy life.' This theme concerned participants' low treatment expectations from their family members due to their perception of their family members as having multiple obligations. Hospitalized older adults prefer to turn to paid caregivers rather than to their families. Findings are discussed in light of the tension between formal and informal care in countries that are transitioning from traditional family values to modern values, placing the care of older adults by paid caregivers.
NASA Astrophysics Data System (ADS)
Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong
2013-05-01
Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users to choose either the comparator method, the k0-method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal to epithermal neutron flux ratio (f). Calculations involved in determining the concentration were the net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal to epithermal neutron flux ratio (f), epithermal neutron flux distribution parameter (α) and detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRMs and SRMs were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing sample concentrations between the code-system and the experiment. The experimental concentration values obtained with the ECC-UKM database code-system showed good accuracy.
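For orientation, the comparator (relative) method that the ECC-UKM system offers can be reduced to a ratio of decay-corrected specific count rates, as in the hedged sketch below; the function name, the single-nuclide decay correction, and all numbers are illustrative simplifications rather than the code-system's actual formulas.

```python
import math

def comparator_conc(Np_s, m_s, td_s, Np_std, m_std, td_std, C_std, lam):
    """C_sample from the ratio of decay-corrected specific count rates."""
    a_s = Np_s / (m_s * math.exp(-lam * td_s))        # sample, corrected for decay
    a_std = Np_std / (m_std * math.exp(-lam * td_std))  # standard, likewise
    return C_std * a_s / a_std

lam = math.log(2) / 3600.0  # hypothetical nuclide with a 1 h half-life
# 10 mg sample vs 15 mg standard of known 50 mg/kg content (invented values)
print(comparator_conc(12000, 0.010, 600, 20000, 0.015, 900, 50.0, lam))
```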
NASA Astrophysics Data System (ADS)
Frerichs, Heinke; Effenberg, Florian; Schmitz, Oliver; Stephey, Laurie; W7-X Team
2016-10-01
Interpretation of spectroscopic measurements in the edge region of high-temperature plasmas can be a challenge due to line of sight integration effects. The EMC3-EIRENE code - a 3D fluid edge plasma and kinetic neutral gas transport code - is a suitable tool for full 3D reconstruction of such signals. A versatile synthetic diagnostic module has been developed recently which allows the realistic three dimensional setup of various plasma edge diagnostics to be captured. We present an analysis of recycling on the inboard limiter of W7-X during its startup phase in terms of a synthetic camera for Hα light observations and reconstruct the particle flux from these synthetic images based on ionization per photon coefficients (S/XB). We find that line of sight integration effects can lead to misinterpretation of data (redistribution of particle flux due to neutral gas diffusion), and that local plasma effects are important for the correct treatment of photon emissions. This work was supported by the U.S. Department of Energy (DOE) under Grant DE-SC0014210, by startup funds of the Department of Engineering Physics at the University of Wisconsin - Madison, and by the EUROfusion Consortium under Euratom Grant No 633053.
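The S/XB conversion itself is a one-liner: the recycling particle flux is the ionizations-per-photon coefficient times the measured photon flux, applied pixel by pixel to the synthetic camera image. The toy values below are placeholders, not W7-X calibration data.

```python
def particle_flux(photon_flux, s_xb):
    """Gamma [m^-2 s^-1] = (S/XB) * photon flux [photons m^-2 s^-1]."""
    return s_xb * photon_flux

# Hypothetical H-alpha photon flux and S/XB coefficient for one pixel:
print(f"{particle_flux(photon_flux=4.0e20, s_xb=15.0):.2e}")  # 6.00e+21
```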
Software Considerations for Subscale Flight Testing of Experimental Control Laws
NASA Technical Reports Server (NTRS)
Murch, Austin M.; Cox, David E.; Cunningham, Kevin
2009-01-01
The NASA AirSTAR system has been designed to address the challenges associated with safe and efficient subscale flight testing of research control laws in adverse flight conditions. In this paper, software elements of this system are described, with an emphasis on components which allow for rapid prototyping and deployment of aircraft control laws. Through model-based design and automatic coding a common code-base is used for desktop analysis, piloted simulation and real-time flight control. The flight control system provides the ability to rapidly integrate and test multiple research control laws and to emulate component or sensor failures. Integrated integrity monitoring systems provide aircraft structural load protection, isolate the system from control algorithm failures, and monitor the health of telemetry streams. Finally, issues associated with software configuration management and code modularity are briefly discussed.
Some Practical Universal Noiseless Coding Techniques
NASA Technical Reports Server (NTRS)
Rice, Robert F.
1994-01-01
Report discusses noiseless data-compression-coding algorithms, their performance characteristics, and practical considerations in the implementation of the algorithms in coding modules composed of very-large-scale integrated circuits. Report also has value as a tutorial document on data-compression-coding concepts. The coding techniques and concepts in question are "universal" in the sense that, in principle, they are applicable to streams of data from a variety of sources. However, the discussion is oriented toward compression of high-rate data generated by spaceborne sensors for lower-rate transmission back to earth.
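A minimal encoder for one member of this family, the Rice (Golomb power-of-two) code, is sketched below: each sample's quotient goes out in unary and its remainder in k binary bits. This illustrates the concept only, not the flight-module implementation the report covers.

```python
def rice_encode(values, k):
    """Rice-code nonnegative integers with parameter k (bits as a string)."""
    out = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        out.append("1" * q + "0" + format(r, f"0{k}b"))  # unary quotient, k-bit rest
    return "".join(out)

print(rice_encode([0, 3, 9, 2], k=2))  # low-entropy data packs into few bits
```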
The association between patient-therapist MATRIX congruence and treatment outcome.
Mendlovic, Shlomo; Saad, Amit; Roll, Uri; Ben Yehuda, Ariel; Tuval-Mashiah, Rivka; Atzil-Slonim, Dana
2018-03-14
The present study aimed to examine the association between the patient-therapist micro-level congruence/incongruence ratio and psychotherapeutic outcome. Nine good-outcome and nine poor-outcome psychodynamic treatments (classified by comparing pre- and post-treatment BDI-II) were analyzed (N = 18) moment by moment using the MATRIX (total number of MATRIX codes analyzed = 11,125). MATRIX congruence was defined as similar adjacent MATRIX codes. The congruence/incongruence ratio tended to increase as treatment progressed only in good-outcome treatments. Progression of the MATRIX codes' congruence/incongruence ratio is thus associated with good psychotherapy outcome.
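As a hypothetical illustration of the moment-by-moment measure, the helper below treats identical adjacent codes as congruent and differing ones as incongruent; the study's own definition ("similar" adjacent MATRIX codes) is richer than this identity check.

```python
def congruence_ratio(codes):
    """Congruent adjacent pairs divided by incongruent adjacent pairs."""
    pairs = list(zip(codes, codes[1:]))
    congruent = sum(a == b for a, b in pairs)
    incongruent = len(pairs) - congruent
    return congruent / incongruent if incongruent else float("inf")

print(congruence_ratio(["A", "A", "B", "B", "B", "C"]))  # 3 congruent / 2 -> 1.5
```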
A comparison of cosmological hydrodynamic codes
NASA Technical Reports Server (NTRS)
Kang, Hyesung; Ostriker, Jeremiah P.; Cen, Renyue; Ryu, Dongsu; Hernquist, Lars; Evrard, August E.; Bryan, Greg L.; Norman, Michael L.
1994-01-01
We present a detailed comparison of the simulation results of various hydrodynamic codes. Starting with identical initial conditions based on the cold dark matter scenario for the growth of structure, with parameters h = 0.5, Omega = Omega_b = 1, and sigma_8 = 1, we integrate from redshift z = 20 to z = 0 to determine the physical state within a representative volume of size L^3, where L = 64 h^-1 Mpc. Five independent codes are compared: three of them Eulerian mesh-based and two variants of the smooth particle hydrodynamics 'SPH' Lagrangian approach. The Eulerian codes were run at N^3 = (32^3, 64^3, 128^3, and 256^3) cells, the SPH codes at N^3 = 32^3 and 64^3 particles. Results were then rebinned to a 16^3 grid, with the expectation that the rebinned data should converge, by all techniques, to a common and correct result as N approaches infinity. We find that global averages of various physical quantities do, as expected, tend to converge in the rebinned model, but that uncertainties in even primitive quantities such as <T> and <rho^2>^(1/2) persist at the 3%-17% level. The five codes achieve comparable and satisfactory accuracy for comparable computer time in their treatment of the high-density, high-temperature regions as measured in the rebinned data; the variance among the five codes (at highest resolution) for the mean temperature (as weighted by rho^2) is only 4.5%. Examined at high resolution, we suspect that the density resolution is better in the SPH codes and the thermal accuracy in low-density regions better in the Eulerian codes. In the low-density, low-temperature regions the SPH codes have poor accuracy due to statistical effects, and the Jameson code gives temperatures which are too high, due to overuse of artificial viscosity in these high Mach number regions. Overall the comparison allows us to better estimate errors; it points to ways of improving the current generation of hydrodynamic codes and of suiting their use to problems which exploit their best individual features.
Batagov, Arsen O; Yarmishyn, Aliaksandr A; Jenjaroenpun, Piroon; Tan, Jovina Z; Nishida, Yuichiro; Kurochkin, Igor V
2013-10-16
Mammalian genomes are extensively transcribed, producing thousands of long non-protein-coding RNAs (lncRNAs). The biological significance and function of the vast majority of lncRNAs remain unclear. Recent studies have implicated several lncRNAs as playing important roles in embryonic development and cancer progression. LncRNAs are characterized by different genomic architectures in relationship with their associated protein-coding genes. Our study aimed to bridge lncRNA architecture with the dynamical patterns of lncRNA expression, using a differentiating human neuroblastoma cell model. LncRNA expression was studied in a 120-hour timecourse of differentiation of human neuroblastoma SH-SY5Y cells into neurons upon treatment with retinoic acid (RA), the compound used for the treatment of neuroblastoma. A custom microarray chip was utilized to interrogate expression levels of 9,267 lncRNAs in the course of differentiation. We categorized lncRNAs into 19 architecture classes according to their position relative to protein-coding genes. For each architecture class, the dynamics of lncRNA expression was studied in association with the protein-coding partners. This allowed us to demonstrate positive correlation of lncRNAs with their associated protein-coding genes at bidirectional promoters and for sense-antisense transcript pairs. In contrast, lncRNAs located in the introns and downstream of the protein-coding genes were characterized by negative correlation modes. We further classified the lncRNAs by the temporal patterns of their expression dynamics. We found that intronic and bidirectional promoter architectures are associated with rapid RA-dependent induction or repression of the corresponding lncRNAs, followed by their constant expression. At the same time, lncRNAs expressed downstream of protein-coding genes are characterized by rapid induction, followed by transcriptional repression. Quantitative RT-PCR analysis confirmed the discovered functional modes for several selected lncRNAs associated with proteins involved in cancer and embryonic development. This is the first report detailing dynamical changes of multiple lncRNAs during RA-induced neuroblastoma differentiation. Integration of genomic and transcriptomic levels of information allowed us to demonstrate specific behavior of lncRNAs organized in different genomic architectures. This study also provides a list of lncRNAs with possible roles in neuroblastoma.
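The correlation analysis can be pictured with a short pandas sketch: for each lncRNA/partner pair, correlate expression across the timecourse and average by architecture class. All data below are random stand-ins and the class labels are hypothetical; only the analysis shape mirrors the study.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
classes = ["bidirectional_promoter", "intronic", "antisense", "downstream"]
rows = []
for _ in range(50):                      # 50 hypothetical lncRNA/mRNA pairs
    lnc = rng.random(6)                  # expression at 0..120 h (6 timepoints)
    mrna = rng.random(6)                 # the protein-coding partner
    rows.append({"arch_class": rng.choice(classes),
                 "corr": np.corrcoef(lnc, mrna)[0, 1]})

df = pd.DataFrame(rows)
print(df.groupby("arch_class")["corr"].mean())   # per-architecture tendency
```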
Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions
2012-07-01
Team Software Process (TSP) and Capability Maturity Model Integration (CMMI) [Davis 2009] are service marks of Carnegie Mellon University. CMU/SEI-2012-TN-016. Tim Morrow (Software Engineering Institute); Robert Seacord (Software Engineering Institute).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Argo, P.E.; DeLapp, D.; Sutherland, C.D.
TRACKER is an extension of a three-dimensional Hamiltonian raytrace code developed some thirty years ago by R. Michael Jones. Subsequent modifications to this code, which is commonly called the "Jones Code," were documented by Jones and Stephensen (1975). TRACKER incorporates an interactive user's interface, modern differential equation integrators, graphical outputs, homing algorithms, and the Ionospheric Conductivity and Electron Density (ICED) ionosphere. TRACKER predicts the three-dimensional paths of radio waves through model ionospheres by numerically integrating Hamilton's equations, which are a differential expression of Fermat's principle of least time. By using continuous models, the Hamiltonian method avoids false caustics and discontinuous raypath properties often encountered in other raytracing methods. In addition to computing the raypath, TRACKER also calculates the group path (or pulse travel time), the phase path, the geometrical (or "real") pathlength, and the Doppler shift (if the time variation of the ionosphere is explicitly included). Computational speed can be traded for accuracy by specifying the maximum allowable integration error per step. Only geometrical optics are included in the main raytrace code; no partial reflections or diffraction effects are taken into account. In addition, TRACKER does not lend itself to statistical descriptions of propagation; it requires a deterministic model of the ionosphere.
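In the same spirit, though far simpler than the Jones/TRACKER machinery, the sketch below integrates Hamilton's equations for a 2-D ray in a toy "ionosphere" n^2(z) with an adaptive ODE solver; the per-step error tolerance plays exactly the speed-versus-accuracy role described above. The refractive-index profile and launch angle are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

def n2(z):
    """Toy squared refractive index: a dip imitating an ionospheric layer."""
    return 1.0 - 0.3 * np.exp(-((z - 200.0) / 50.0) ** 2)

def rhs(t, y):
    x, z, px, pz = y
    h = 1e-3
    dz_n2 = (n2(z + h) - n2(z - h)) / (2 * h)    # d(n^2)/dz, numerically
    return [px, pz, 0.0, 0.5 * dz_n2]            # from H = (p^2 - n^2(z)) / 2

elev = 0.6                                        # launch elevation angle [rad]
sol = solve_ivp(rhs, (0.0, 600.0), [0.0, 0.0, np.cos(elev), np.sin(elev)],
                rtol=1e-8, atol=1e-10)            # tolerance trades speed/accuracy
print(f"ray endpoint: x = {sol.y[0, -1]:.1f}, z = {sol.y[1, -1]:.1f}")
```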
New double-byte error-correcting codes for memory systems
NASA Technical Reports Server (NTRS)
Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.
1996-01-01
Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
NASA Astrophysics Data System (ADS)
Coindreau, O.; Duriez, C.; Ederli, S.
2010-10-01
Progress in the treatment of air oxidation of zirconium in severe accident (SA) codes is required for a reliable analysis of severe accidents involving air ingress. Air oxidation of zirconium can lead to accelerated core degradation and increased fission product release, especially of the highly radiotoxic ruthenium. This paper presents a model to simulate the air oxidation kinetics of Zircaloy-4 in the 600-1000 °C temperature range. It is based on available experimental data, including separate-effect experiments performed at IRSN and at Forschungszentrum Karlsruhe. The kinetic transition, named "breakaway", from a diffusion-controlled regime to an accelerated oxidation is taken into account in the modeling via a critical mass gain parameter. The progressive propagation of the locally initiated breakaway is modeled by a linear increase in oxidation rate with time. Finally, when breakaway propagation is complete, the oxidation rate stabilizes and the kinetics is modeled by a linear law. This new modeling is integrated in the severe accident code ASTEC, jointly developed by IRSN and GRS. Model predictions and thermogravimetric experimental data show good agreement for different air flow rates and for slow temperature transient conditions.
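The three-stage kinetics can be imitated numerically in a few lines: a parabolic (diffusion-limited) law until a critical mass gain, a linear ramp of the rate while breakaway propagates, then a constant linear rate. All constants below are illustrative, not the ASTEC model's coefficients.

```python
kp, m_crit = 8e-7, 0.02        # parabolic constant, critical mass gain
ramp, k_lin = 1e-8, 1e-4       # rate ramp slope, final linear rate
dt, t, m = 1.0, 0.0, 1e-4      # start from a tiny seed mass gain
t_break = r0 = None
while t < 20000.0:
    if m < m_crit:
        rate = kp / (2.0 * m)                          # parabolic: m ~ sqrt(kp*t)
    else:
        if t_break is None:
            t_break, r0 = t, rate                      # breakaway initiation
        rate = min(r0 + ramp * (t - t_break), k_lin)   # ramp, then linear plateau
    m += rate * dt
    t += dt
print(f"breakaway at ~{t_break:.0f} s; mass gain {m:.2f} (arbitrary units)")
```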
NASA Technical Reports Server (NTRS)
Agrawal, Gagan; Sussman, Alan; Saltz, Joel
1993-01-01
Scientific and engineering applications often involve structured meshes. These meshes may be nested (for multigrid codes) and/or irregularly coupled (called multiblock or irregularly coupled regular mesh problems). A combined runtime and compile-time approach for parallelizing these applications on distributed memory parallel machines in an efficient and machine-independent fashion is described. A runtime library which can be used to port these applications to distributed memory machines was designed and implemented. The library is currently implemented on several different systems. To further ease the task of application programmers, methods were developed for integrating this runtime library with compilers for HPF-like parallel programming languages. How this runtime library was integrated with the Fortran 90D compiler being developed at Syracuse University is discussed. Experimental results are presented to demonstrate the efficacy of our approach, using a multiblock Navier-Stokes solver template and a multigrid code. Our experimental results show that our primitives have low runtime communication overheads. Further, the compiler-parallelized codes perform within 20 percent of the codes parallelized by manually inserting calls to the runtime library.
The Integration of COTS/GOTS within NASA's HST Command and Control System
NASA Technical Reports Server (NTRS)
Pfarr, Thomas; Reis, James E.
2001-01-01
NASA's mission-critical Hubble Space Telescope (HST) command and control system has been re-engineered with commercial-off-the-shelf and government-off-the-shelf (COTS/GOTS) products and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes more than 30 COTS/GOTS products with an additional half million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection, evaluation, and integration processes. The advantages, disadvantages, issues, concerns, and lessons learned from integrating COTS/GOTS into NASA's mission-critical HST CCS will be examined in detail. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project, developed at the Southwest Research Institute, integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components for performing probabilistic analysis of structures, including an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. An overview of the NESSUS software system is given. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
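What FPI ultimately delivers is a probability that a limit state is violated. The sketch below estimates such a probability for a toy strength-versus-load limit state by plain Monte Carlo; FPI itself reaches the same quantity with much faster approximate integration schemes, and all distributions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
strength = rng.normal(500.0, 40.0, n)   # resistance, hypothetical units
load = rng.normal(350.0, 60.0, n)       # applied stress
g = strength - load                     # limit state: failure when g < 0
print(f"P_f ~= {np.mean(g < 0.0):.4f}")
```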
Co-simulation coupling spectral/finite elements for 3D soil/structure interaction problems
NASA Astrophysics Data System (ADS)
Zuchowski, Loïc; Brun, Michael; De Martin, Florent
2018-05-01
The coupling between an implicit finite element (FE) code and an explicit spectral element (SE) code has been explored for solving elastic wave propagation in soil/structure interaction problems. The coupling approach is based on domain decomposition methods in transient dynamics. The spatial coupling at the interface is managed by a standard mortar coupling approach, whereas the time integration is handled by a hybrid asynchronous time integrator. An external coupling software component, handling the interface problem, has been set up in order to couple the FE software Code_Aster with the SE software EFISPEC3D.
Integrating advanced practice providers into medical critical care teams.
McCarthy, Christine; O'Rourke, Nancy C; Madison, J Mark
2013-03-01
Because there is increasing demand for critical care providers in the United States, many medical ICUs for adults have begun to integrate nurse practitioners and physician assistants into their medical teams. Studies suggest that such advanced practice providers (APPs), when appropriately trained in acute care, can be highly effective in helping to deliver high-quality medical critical care and can be important elements of teams with multiple providers, including those with medical house staff. One aspect of building an integrated team is a practice model that features appropriate coding and billing of services by all providers. Therefore, it is important to understand an APP's scope of practice, when they are qualified for reimbursement, and how they may appropriately coordinate coding and billing with other team providers. In particular, understanding when and how to appropriately code for critical care services (Current Procedural Terminology [CPT] code 99291, critical care, evaluation and management of the critically ill or critically injured patient, first 30-74 min; CPT code 99292, critical care, each additional 30 min) and procedures is vital for creating a sustainable program. Because APPs will likely play a growing role in medical critical care units in the future, more studies are needed to compare different practice models and to determine the best way to deploy this talent in specific ICU settings.
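The time-based selection between these two codes can be written down directly. The helper below follows common CMS guidance (under 30 minutes is not billable as critical care; 30-74 minutes maps to 99291; each additional 30 minutes adds a 99292) but is a sketch only, to be checked against current payer rules.

```python
import math

def critical_care_codes(total_minutes):
    """Map aggregated critical care time to CPT codes (sketch, verify rules)."""
    if total_minutes < 30:
        return []                        # bill an ordinary E/M code instead
    codes = ["99291"]                    # first 30-74 minutes
    extra = total_minutes - 74
    if extra > 0:
        codes += ["99292"] * math.ceil(extra / 30)  # each additional 30 min
    return codes

print(critical_care_codes(25), critical_care_codes(70), critical_care_codes(110))
# []  ['99291']  ['99291', '99292', '99292']
```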
The agents of natural genome editing.
Witzany, Guenther
2011-06-01
The DNA serves as a stable information storage medium, and every protein needed by the cell is produced from this blueprint via an RNA intermediate code. More recently it was found that an abundance of various RNA elements cooperate, in a variety of steps and substeps, as regulatory and catalytic units with multiple competencies to act on RNA transcripts. Natural genome editing on one side is the competent, agent-driven generation and integration of meaningful DNA nucleotide sequences into pre-existing genomic content arrangements, and the ability to (re-)combine and (re-)regulate them according to the context-dependent (i.e. adaptational) purposes of the host organism. Natural genome editing on the other side designates the integration of all RNA activities acting on RNA transcripts without altering DNA-encoded genes. If we take the genetic code seriously as a natural code, there must be agents competent to act on this code, because no natural code codes itself, just as no natural language speaks itself. As code-editing agents, viral and subviral agents have been suggested, because several indicators demonstrate that viruses are competent in both RNA and DNA natural genome editing.
Multidisciplinary Modeling Software for Analysis, Design, and Optimization of HRRLS Vehicles
NASA Technical Reports Server (NTRS)
Spradley, Lawrence W.; Lohner, Rainald; Hunt, James L.
2011-01-01
The concept for Highly Reliable Reusable Launch Systems (HRRLS) under the NASA Hypersonics project is a two-stage-to-orbit, horizontal-take-off/horizontal-landing (HTHL) architecture with an air-breathing first stage. The first stage vehicle is a slender body with an air-breathing propulsion system that is highly integrated with the airframe. The lightweight slender body will deflect significantly during flight. This global deflection affects the flow over the vehicle and into the engine, and thus the loads and moments on the vehicle. High-fidelity multi-disciplinary analyses that account for these fluid-structure-thermal interactions are required to accurately predict the vehicle loads and resultant response. These predictions of vehicle response to multi-physics loads, calculated with fluid-structural-thermal interaction, are required in order to optimize the vehicle design over its full operating range. This contract with ResearchSouth addresses one of the primary objectives of the Vehicle Technology Integration (VTI) discipline: the development of high-fidelity multi-disciplinary analysis and optimization methods and tools for HRRLS vehicles. The primary goal of this effort is the development of an integrated software system that can be used for full-vehicle optimization. This goal was accomplished by: 1) integrating the master code, FEMAP, into the multidiscipline software network to direct the coupling and assure accurate fluid-structure-thermal interaction solutions; 2) loosely coupling the Euler flow solver FEFLO to the available and proven aeroelasticity and large deformation (FEAP) code; 3) providing a coupled Euler-boundary layer capability for rapid viscous flow simulation; 4) developing and implementing improved Euler/RANS algorithms in the FEFLO CFD code to provide accurate shock capturing, skin friction, and heat-transfer predictions for HRRLS vehicles in hypersonic flow; 5) performing a Reynolds-averaged Navier-Stokes computation on an HRRLS configuration; 6) integrating the RANS solver with the FEAP code for coupled fluid-structure-thermal capability; 7) integrating the existing NASA SRGULL propulsion flow path prediction software with the FEFLO software for quasi-3D propulsion flow path predictions; and 8) improving and integrating into the network an existing adjoint-based design optimization code.
Muñoz, P; García-Olcina, R; Habib, C; Chen, L R; Leijtens, X J M; de Vries, T; Robbins, D; Capmany, J
2011-07-04
In this paper the design, fabrication and experimental characterization of a spectral amplitude coded (SAC) optical label swapper monolithically integrated on Indium Phosphide (InP) is presented. The device has a footprint of 4.8 x 1.5 mm^2 and is able to perform the label swapping operations required in SAC at a speed of 155 Mbps. The device was manufactured in InP using a multiple-purpose generic integration scheme. Compared to previous SAC label swapper demonstrations using discrete component assembly, this label swapper chip operates two orders of magnitude faster.
Vhuromu, E N; Davhana-Maselesele, M
2009-09-01
Treatment of children under five years of age is a national priority in an attempt to curb deaths and deformities affecting children. Primary health care was implemented in the clinics in order to help in the treatment of illnesses affecting the community, including children. In response to the burden of childhood illnesses, the World Health Organization (WHO) and the United Nations Children's Fund (UNICEF) developed the Integrated Management of Childhood Illness (IMCI) strategy to enhance the treatment of such illnesses in developing countries. Primary health care nurses (PHCNs) in Limpopo Province were also trained to implement the strategy. This study is intended to explore and describe the experiences of PHCNs in implementing the IMCI strategy at selected clinics in Vhembe District in the Limpopo Province. A qualitative, explorative, descriptive and contextual design was used. In-depth interviews were conducted with PHCNs who were IMCI trained and had implemented the strategy for a period of not less than two years. Data analysis was done using Tesch's method of open coding for qualitative analysis. Findings revealed that PHCNs had difficulty in rendering IMCI services due to lack of resources and poor working conditions. Recommendations address the difficulties experienced by PHCNs when implementing the IMCI strategy.
How Effective Is Honor Code Reporting over Instructor- Implemented Measures? A Pilot Study
ERIC Educational Resources Information Center
Barnard-Brak, Lucy; Schmidt, Marcelo; Wei, Tianlan
2013-01-01
Honor codes have become increasingly popular at institutions of higher education as a means of reducing violations of academic integrity such as cheating and other academically dishonest acts. Previous research on honor code effectiveness has been limited to correlational research designs that preclude the examination of cause-and-effect…
Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses
ERIC Educational Resources Information Center
Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan
2013-01-01
Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…
Automatic Coding of Short Text Responses via Clustering in Educational Assessment
ERIC Educational Resources Information Center
Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank
2016-01-01
Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
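A baseline in the spirit of that pipeline can be assembled from open-source components: vectorize the short responses with TF-IDF, cluster them, and let a human assign a code per cluster. The snippet below uses scikit-learn with made-up responses; the Programme's actual methods and tuning are not reproduced here.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

responses = [
    "plants need sunlight to make food",
    "photosynthesis uses light energy",
    "the moon orbits the earth",
    "earth goes around the sun",
]
X = TfidfVectorizer().fit_transform(responses)            # sparse TF-IDF matrix
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for text, lab in zip(responses, labels):
    print(lab, text)   # responses sharing a cluster get the same code
```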
40 CFR 265.340 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...
40 CFR 265.340 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...
40 CFR 265.340 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...
ERIC Educational Resources Information Center
Guillen-Diaz, Carmen
1990-01-01
A classroom approach that brings oral and written language learning closer together is outlined. The strategy focuses on proper pronunciation using minimal pairs and uses exercises designed for listening and visualization, production, discrimination, re-use and reinforcement, and computer-assisted instruction. (MSE)
Development of the US3D Code for Advanced Compressible and Reacting Flow Simulations
NASA Technical Reports Server (NTRS)
Candler, Graham V.; Johnson, Heath B.; Nompelis, Ioannis; Subbareddy, Pramod K.; Drayna, Travis W.; Gidzak, Vladimyr; Barnhardt, Michael D.
2015-01-01
Aerothermodynamics and hypersonic flows involve complex multi-disciplinary physics, including finite-rate gas-phase kinetics, finite-rate internal energy relaxation, gas-surface interactions with finite-rate oxidation and sublimation, transition to turbulence, large-scale unsteadiness, shock-boundary layer interactions, fluid-structure interactions, and thermal protection system ablation and thermal response. Many of the flows have a large range of length and time scales, requiring large computational grids, implicit time integration, and large solution run times. The University of Minnesota NASA US3D code was designed for the simulation of these complex, highly-coupled flows. It has many of the features of the well-established DPLR code, but uses unstructured grids and has many advanced numerical capabilities and physical models for multi-physics problems. The main capabilities of the code are described, the physical modeling approaches are discussed, the different types of numerical flux functions and time integration approaches are outlined, and the parallelization strategy is overviewed. Comparisons between US3D and the NASA DPLR code are presented, and several advanced simulations are presented to illustrate some of novel features of the code.
A long-term, integrated impact assessment of alternative building energy code scenarios in China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Sha; Eom, Jiyong; Evans, Meredydd
2014-04-01
China is the second largest building energy user in the world, ranking first and third in residential and commercial energy consumption. Beginning in the early 1980s, the Chinese government has developed a variety of building energy codes to improve building energy efficiency and reduce total energy demand. This paper studies the impact of building energy codes on energy use and CO2 emissions by using a detailed building energy model that represents four distinct climate zones, each with three building types, nested in the long-term integrated assessment framework GCAM. An advanced building stock module, coupled with the building energy model, is developed to reflect the characteristics of future building stock and its interaction with the development of building energy codes in China. This paper also evaluates the impacts of building codes on building energy demand in the presence of an economy-wide carbon policy. We find that building energy codes would reduce Chinese building energy use by 13%-22% depending on the building code scenario, with a similar effect preserved even under the carbon policy. The impact of building energy codes shows regional and sectoral variation due to regionally differentiated responses of heating and cooling services to shell efficiency improvement.
Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.
2012-01-01
An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
Dual CRISPR-Cas9 Cleavage Mediated Gene Excision and Targeted Integration in Yarrowia lipolytica.
Gao, Difeng; Smith, Spencer; Spagnuolo, Michael; Rodriguez, Gabriel; Blenner, Mark
2018-05-29
CRISPR-Cas9 technology has been successfully applied in Yarrowia lipolytica for targeted genomic editing, including gene disruption and integration; however, disruptions by existing methods typically result from small frameshift mutations caused by indels within the coding region, which usually result in an unnatural protein. In this study, a dual cleavage strategy directed by paired sgRNAs is developed for gene knockout. This method allows fast and robust gene excision, demonstrated on six genes of interest. The targeted regions for excision vary in length from 0.3 kb up to 3.5 kb and contain both non-coding and coding regions. The majority of the gene excisions are repaired by perfect nonhomologous end-joining without indels. Based on this dual cleavage system, two targeted markerless integration methods are developed by providing repair templates. While both strategies are effective, the homology-mediated end joining (HMEJ)-based method is twice as efficient as the homologous recombination (HR)-based method. In both cases, dual cleavage leads to similar or improved gene integration efficiencies compared to gene excision without integration. This dual cleavage strategy will be useful not only for generating more predictable and robust gene knockouts, but also for efficient targeted markerless integration, and simultaneous knockout and integration in Y. lipolytica. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model to be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and the other models are given.
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1) parallelizing tools and compiler evaluation; 2) code cleanup and serial optimization using automated scripts; 3) development of a code generator for performance prediction; 4) automated partitioning; 5) automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible lightweight component model that streamlines the integration of stand-alone codes into coupled simulations. Stand-alone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task, and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
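As an illustration of the thin wrapping-layer idea, here is a minimal Python sketch of a component that adapts a standalone executable to a step-oriented framework interface. The class and method names, the command-line flags, and the driver loop are hypothetical, not the actual IPS API.

```python
import subprocess

class ComponentBase:
    """Hypothetical framework-facing interface (init/step/finalize)."""
    def init(self, t0): ...
    def step(self, t): ...
    def finalize(self): ...

class LegacyCodeWrapper(ComponentBase):
    """Adapts a stand-alone physics executable to the component interface."""
    def __init__(self, exe, state_file):
        self.exe = exe                # path to the standalone binary
        self.state_file = state_file  # shared plasma-state style file

    def init(self, t0):
        self.t = t0

    def step(self, t):
        # Launch the legacy code to advance the shared state file to time t.
        subprocess.run([self.exe, "--state", self.state_file,
                        "--t-end", str(t)], check=True)
        self.t = t

    def finalize(self):
        pass

# A driver would then sequence components through the common state file:
# for t in schedule: core.step(t); rf_heating.step(t); transport.step(t)
```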
NASA Astrophysics Data System (ADS)
Caplan, R. M.
2013-04-01
We present a simple to use, yet powerful code package called NLSEmagic to numerically integrate the nonlinear Schrödinger equation in one, two, and three dimensions. NLSEmagic is a high-order finite-difference code package which utilizes graphics processing unit (GPU) parallel architectures. The codes running on the GPU are many times faster than their serial counterparts, and are much cheaper to run than on standard parallel clusters. The codes are developed with usability and portability in mind, and therefore are written to interface with MATLAB utilizing custom GPU-enabled C codes with the MEX-compiler interface. The packages are freely distributed, including user manuals and set-up files. Catalogue identifier: AEOJ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 124453 No. of bytes in distributed program, including test data, etc.: 4728604 Distribution format: tar.gz Programming language: C, CUDA, MATLAB. Computer: PC, MAC. Operating system: Windows, MacOS, Linux. Has the code been vectorized or parallelized?: Yes. Number of processors used: Single CPU, number of GPU processors dependent on chosen GPU card (max is currently 3072 cores on GeForce GTX 690). Supplementary material: Setup guide, Installation guide. RAM: Highly dependent on dimensionality and grid size. For typical medium-large problem size in three dimensions, 4GB is sufficient. Keywords: Nonlinear Schrödinger Equation, GPU, high-order finite difference, Bose-Einstein condensates. Classification: 4.3, 7.7. Nature of problem: Integrate solutions of the time-dependent one-, two-, and three-dimensional cubic nonlinear Schrödinger equation. Solution method: The integrators utilize a fully-explicit fourth-order Runge-Kutta scheme in time and both second- and fourth-order differencing in space. The integrators are written to run on NVIDIA GPUs and are interfaced with MATLAB including built-in visualization and analysis tools. Restrictions: The main restriction for the GPU integrators is the amount of RAM on the GPU as the code is currently only designed for running on a single GPU. Unusual features: Ability to visualize real-time simulations through the interaction of MATLAB and the compiled GPU integrators. Additional comments: Setup guide and Installation guide provided. Program has a dedicated web site at www.nlsemagic.com. Running time: A three-dimensional run with a grid dimension of 87×87×203 for 3360 time steps (100 non-dimensional time units) takes about one and a half minutes on a GeForce GTX 580 GPU card.
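For orientation, here is a minimal NumPy sketch of the scheme the abstract describes: explicit fourth-order Runge-Kutta in time with second-order central differencing in space, for the 1D focusing cubic NLSE i psi_t = -(1/2) psi_xx - |psi|^2 psi. It mirrors the method, not NLSEmagic's GPU implementation; the grid, step sizes, and soliton test are illustrative.

```python
import numpy as np

def nlse_rhs(psi, dx, s=1.0):
    # Second-order central-difference Laplacian, periodic boundaries
    lap = (np.roll(psi, -1) - 2 * psi + np.roll(psi, 1)) / dx**2
    # i psi_t = -(1/2) psi_xx - s |psi|^2 psi  =>  psi_t = i((1/2) lap + s|psi|^2 psi)
    return 1j * (0.5 * lap + s * np.abs(psi)**2 * psi)

def rk4_step(psi, dt, dx):
    k1 = nlse_rhs(psi, dx)
    k2 = nlse_rhs(psi + 0.5 * dt * k1, dx)
    k3 = nlse_rhs(psi + 0.5 * dt * k2, dx)
    k4 = nlse_rhs(psi + dt * k3, dx)
    return psi + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Example: evolve a bright soliton on a periodic grid
x = np.linspace(-20, 20, 512, endpoint=False)
dx = x[1] - x[0]
psi = 1.0 / np.cosh(x)   # sech soliton initial condition
dt = 0.1 * dx**2         # explicit scheme: stable dt scales with dx^2
for _ in range(2000):
    psi = rk4_step(psi, dt, dx)
```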
Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam
2012-01-01
OBJECTIVE: To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. DESIGN: Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. SETTING: Paediatric residency program at BC Children’s Hospital, Vancouver, British Columbia. INTERVENTIONS: The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. RESULTS: A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. CONCLUSIONS: A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes. PMID:23372405
DOE Office of Scientific and Technical Information (OSTI.GOV)
BRISC is a developmental prototype for a next-generation systems-level integrated performance and safety code (IPSC) for nuclear reactors. Its development served to demonstrate how a lightweight multi-physics coupling approach can be used to tightly couple the physics models in several different physics codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled burner nuclear reactor. For example, the RIO Fluid Flow and Heat Transfer code developed at Sandia (SNL: Chris Moen, Dept. 08005) is used in BRISC to model fluid flow and heat transfer, as well as conduction heat transfer in solids. Because BRISC is a prototype, its most practical application is as a foundation or starting point for developing a true production code. The sub-codes and the associated models and correlations currently employed within BRISC were chosen to cover the required application space and demonstrate feasibility, but were not optimized or validated against experimental data within the context of their use in BRISC.
Ferlaino, Michael; Rogers, Mark F.; Shihab, Hashem A.; Mort, Matthew; Cooper, David N.; Gaunt, Tom R.; Campbell, Colin
2018-01-01
Background Small insertions and deletions (indels) have a significant influence in human disease and, in terms of frequency, they are second only to single nucleotide variants as pathogenic mutations. As the majority of mutations associated with complex traits are located outside the exome, it is crucial to investigate the potential pathogenic impact of indels in non-coding regions of the human genome. Results We present FATHMM-indel, an integrative approach to predict the functional effect, pathogenic or neutral, of indels in non-coding regions of the human genome. Our method exploits various genomic annotations in addition to sequence data. When validated on benchmark data, FATHMM-indel significantly outperforms CADD and GAVIN, state-of-the-art models in assessing the pathogenic impact of non-coding variants. FATHMM-indel is available via a web server at indels.biocompute.org.uk. Conclusions FATHMM-indel can accurately predict the functional impact and prioritise small indels throughout the whole non-coding genome. PMID:28985712
Farina, Giuseppina; Menditto, Enrica; Corea, Gabriella; Manna, Sonia; Pagliaro, Claudia; Troncone, Chiara; Linguiti, Claudio; Orlando, Valentina; Putignano, Daria; Tari, Daniele Ugo; Buffardi, Gianfranco
2015-01-01
The aim is to evaluate prescribing patterns of atypical antipsychotic drugs for the treatment of schizophrenia in the LHU Caserta in 2011-2013, to flag potentially inappropriate therapy, and to plan corrective/preventive activities supporting the continuous improvement of health services. Retrospective cohort study, based on integration of health records and clinical audit. The study was performed in the following steps: data retrieval and analysis; comparison of data with the international literature; editing of the Diagnostic-Therapeutic Path. The analysis was performed by using the administrative database of drug prescriptions and treatment plans in the SANIARP portal, a web platform available to specialist facilities and private and public pharmacies of LHU Caserta. Our analysis sought information about the diagnosis and treatment of users of atypical antipsychotics in the LHU of Caserta in the years 2011-2013. We identified 2,768 patients with at least one prescription of atypical antipsychotics and a coded diagnosis in the study period. Schizophrenia is the most frequent diagnosis (31.1%) and the most common drug in use is olanzapine (29.1%). About 70% of patients with schizophrenia were on monotherapy with no change in drug, 23.6% were under polytherapy, and 7.9% made a switch. Our findings were a starting point for editing Diagnostic and Therapeutic Paths aimed at raising the awareness of the scientific community about the appropriateness of diagnosis and treatment in schizophrenia. Pharmacological treatment of schizophrenia should be focused on improving overall quality of life, aiming at remission and possible recovery, although this is difficult.
NASA Astrophysics Data System (ADS)
Markman, A.; Javidi, B.
2016-06-01
Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. The QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanning, thanks to the error correction built into the QR code design. Integral imaging is an imaging technique used to generate a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of a scene. Transferring these 2D images in a secure manner can be difficult. In this work, we overview two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and the double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon-counting on the EI prior to compression. Photon-counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data is encrypted using the DRPE. Once information is stored in the QR codes, it is scanned using a smartphone device. The scanned information is decompressed and decrypted, and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.
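A rough Python sketch of the pipeline described above: photon-counting, double-random-phase encryption, compression, then QR storage. The ordering and byte-level details are simplified relative to the paper's scheme; zlib's Huffman-based DEFLATE stands in for the run-length/Huffman coder, and the third-party `qrcode` package is assumed available. The elemental image is kept toy-sized because a single QR code holds only a few kilobytes, which is exactly why the paper spreads a real EI across multiple QR codes.

```python
import zlib, base64
import numpy as np
import qrcode  # third-party package (pip install qrcode); assumed available

rng = np.random.default_rng(1)

def photon_count(ei, n_photons=2000):
    """Non-linear photon-counting transform: Poisson-limited image."""
    lam = n_photons * ei / ei.sum()
    return rng.poisson(lam).astype(np.uint8)

def drpe(img):
    """Double-random-phase encryption of a 2D array (complex output)."""
    p1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane phase mask
    p2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane phase mask
    return np.fft.ifft2(np.fft.fft2(img * p1) * p2)

# Toy elemental image, deliberately tiny so the payload fits one QR code
ei = rng.random((12, 12))
encrypted = drpe(photon_count(ei))
payload = zlib.compress(encrypted.astype(np.complex64).tobytes())
qrcode.make(base64.b64encode(payload).decode("ascii")).save("ei_qr.png")
```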
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenthal, Andrew
The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007-2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stovepipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.
NASA Astrophysics Data System (ADS)
Werkheiser, W. H.
2016-12-01
10 Years of Scientific Integrity Policy at the U.S. Geological Survey The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.
Treatment provider's knowledge of the Health and Disability Commissioner's Code of Consumer Rights.
Townshend, Philip L; Sellman, J Douglas
2002-06-01
The Health and Disability Commissioner's (HDC) Code of Health and Disability Consumers' Rights (the Code) defines in law the rights of consumers of health and disability services in New Zealand. In the first few years after its publication, health educators, service providers and the HDC extensively promoted the Code. Providers of health and disability services would be expected to be knowledgeable about the areas covered by the Code if it is routinely used in the development and monitoring of treatment plans. In this study, knowledge of the Code was tested in a random sample of 217 clinical staff that included medical staff, psychologists and counsellors working in Alcohol and Drug Treatment (A&D) centres in New Zealand. Any response showing awareness of a right, regardless of wording, was taken as a positive response, as it was the areas covered by the rights, rather than their actual wording, that were considered the important knowledge for providers. The main finding of this research was that 23% of staff surveyed were aware of none of the ten rights in the Code, and only 6% were aware of more than five of the ten rights. Relating these data to results from a wider sample of treatment providers raises the possibility that A&D treatment providers are slightly more aware of the content of the Code than a general sample of health and disability service providers; however, overall awareness of the content of the Code by health providers is very low. These results imply that consumer rights issues are not prominent in the minds of providers, perhaps indicating an ethical blind spot on their part. Ignorance of the content of the Code may indicate that the treatment community does not find it a useful working document, or alternatively that clinicians are content to rely on their own good intentions to preserve the rights of their patients. Further research will be required to explain this lack of knowledge; for now, the current situation is that consumers cannot rely on clinicians being aware of consumers' rights in health and disability services.
Fidelity of the Integrated Force Method Solution
NASA Technical Reports Server (NTRS)
Hopkins, Dale; Halford, Gary; Coroneos, Rula; Patnaik, Surya
2002-01-01
The theory of strain compatibility in the solid mechanics discipline had remained incomplete since St. Venant's 'strain formulation' in 1876. We have addressed the compatibility condition both in the continuum and in the discrete system. This has led to the formulation of the Integrated Force Method (IFM). A dual Integrated Force Method with displacement as the primal variable has also been formulated. A modest finite element code (IFM/Analyzers) based on the IFM theory has been developed. For a set of standard test problems, the IFM results were compared with the stiffness method solutions and the MSC/Nastran code. For these problems, IFM outperformed the existing methods. Superior IFM performance is attributed to simultaneous compliance with the equilibrium equations and the compatibility conditions. The MSC/Nastran organization expressed reluctance to accept the high fidelity IFM solutions. This report discusses the solutions to the examples. No inaccuracy was detected in the IFM solutions. A stiffness method code can, with a small programming effort, be improved to reap the many IFM benefits when implemented with the IFMD elements. Dr. Halford conducted a peer review of the Integrated Force Method; the reviewers' responses are included.
Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cipiti, Benjamin B.; Shoman, Nathan
The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.
An Integrated Model of Cognitive Control in Task Switching
ERIC Educational Resources Information Center
Altmann, Erik M.; Gray, Wayne D.
2008-01-01
A model of cognitive control in task switching is developed in which controlled performance depends on the system maintaining access to a code in episodic memory representing the most recently cued task. The main constraint on access to the current task code is proactive interference from old task codes. This interference and the mechanisms that…
Telemetry advances in data compression and channel coding
NASA Technical Reports Server (NTRS)
Miller, Warner H.; Morakis, James C.; Yeh, Pen-Shu
1990-01-01
This paper addresses the dependence of telecommunication channel coding, forward error-correcting coding, and source data compression coding on integrated circuit technology. Emphasis is placed on real-time, high-speed Reed-Solomon (RS) decoding using full-custom VLSI technology. Performance curves of NASA's standard channel coder and a proposed standard lossless data compression coder are presented.
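For a concrete feel of the channel-coding side, the sketch below encodes a frame with a (255,223) Reed-Solomon code, the parameters of NASA's standard RS outer code, using the third-party reedsolo package. The package API (RSCodec, and decode returning the message as the first element of a tuple in recent versions) is an assumption to verify against its documentation.

```python
from reedsolo import RSCodec  # third-party package; API assumed per its docs

# (255, 223) code: 32 parity bytes, corrects up to 16 byte errors per block
rsc = RSCodec(32)

frame = bytes(range(223))      # one 223-byte data block
codeword = rsc.encode(frame)   # 255-byte codeword
assert len(codeword) == 255

# Corrupt a few bytes (up to 16 are correctable)...
corrupted = bytearray(codeword)
for i in (3, 50, 100, 200):
    corrupted[i] ^= 0xFF

# ...and the decoder still recovers the original frame.
decoded = rsc.decode(bytes(corrupted))[0]  # recent versions return a tuple
assert bytes(decoded) == frame
```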
NASA Technical Reports Server (NTRS)
Walton, J. T.
1994-01-01
The development of a single-stage-to-orbit aerospace vehicle intended to be launched horizontally into low Earth orbit, such as the National Aero-Space Plane (NASP), has concentrated on the use of the supersonic combustion ramjet (scramjet) propulsion cycle. SRGULL, a scramjet cycle analysis code, is an engineer's tool capable of nose-to-tail, hydrogen-fueled, airframe-integrated scramjet simulation in a real gas flow with equilibrium thermodynamic properties. This program facilitates initial estimates of scramjet cycle performance by linking a two-dimensional forebody, inlet and nozzle code with a one-dimensional combustor code. Five computer codes (SCRAM, SEAGUL, INLET, Program HUD, and GASH) originally developed at NASA Langley Research Center in support of hypersonic technology are integrated in this program to analyze changing flow conditions. The one-dimensional combustor code is based on the combustor subroutine from SCRAM and the two-dimensional coding is based on an inviscid Euler program (SEAGUL). Kinetic energy efficiency input for sidewall area variation modeling can be calculated by the INLET program code. At the completion of inviscid component analysis, Program HUD, an integral boundary layer code based on the Spalding-Chi method, is applied to determine the friction coefficient, which is then used in a modified Reynolds analogy to calculate heat transfer. Real gas flow properties such as flow composition, enthalpy, entropy, and density are calculated by the subroutine GASH. Combustor input conditions are obtained by one-dimensionalizing the two-dimensional inlet exit flow. The SEAGUL portions of this program are limited to supersonic flows, but the combustor (SCRAM) section can handle supersonic and dual-mode operation. SRGULL has been compared to scramjet engine tests with excellent results. SRGULL was written in FORTRAN 77 on an IBM PC compatible using IBM's FORTRAN/2 or Microway's NDP386 F77 compiler. The program is fully user interactive, but can also run in batch mode. It operates under the UNIX, VMS, NOS, and DOS operating systems. The source code is not directly compatible with all PC compilers (e.g., Lahey or Microsoft FORTRAN) due to block and segment size requirements. The SRGULL executable code requires about 490K RAM and a math coprocessor on PCs. The SRGULL program was developed in 1989, although the component programs originated in the 1960's and 1970's. IBM, IBM PC, and DOS are registered trademarks of International Business Machines. VMS is a registered trademark of Digital Equipment Corporation. UNIX is a registered trademark of Bell Laboratories. NOS is a registered trademark of Control Data Corporation.
Kolehmainen, Christine; Brennan, Meghan; Filut, Amarette; Isaac, Carol; Carnes, Molly
2014-01-01
Purpose Ineffective leadership during cardiopulmonary resuscitation (“code”) can negatively affect a patient’s likelihood of survival. In most teaching hospitals, internal medicine residents lead codes. In this study, the authors explored internal medicine residents’ experiences leading codes, with a particular focus on how gender influences the code leadership experience. Method The authors conducted individual, semi-structured telephone or in-person interviews with 25 residents (May 2012 to February 2013) from 9 U.S. internal medicine residency programs. They audio-recorded and transcribed the interviews, then thematically analyzed the transcribed text. Results Participants viewed a successful code as one with effective leadership. They agreed that the ideal code leader was an authoritative presence; spoke with a deep, loud voice; used clear, direct communication; and appeared calm. Although they were as able as their male colleagues to lead codes, female participants described feeling stress from having to violate gender behavioral norms in the role of code leader. In response, some female participants adopted rituals to signal the suspension of gender norms while leading a code. Others apologized afterwards for their counter-normative behavior. Conclusions Ideal code leadership embodies highly agentic, stereotypically male behaviors. Female residents employed strategies to better integrate the competing identities of code leader and female gender. In the future, residency training should acknowledge how female gender stereotypes may conflict with the behaviors required to enact code leadership and offer strategies, such as those used by the female residents in this study, to help women integrate these dual identities. PMID:24979289
RNA Systems Biology for Cancer: From Diagnosis to Therapy.
Amirkhah, Raheleh; Farazmand, Ali; Wolkenhauer, Olaf; Schmitz, Ulf
2016-01-01
It is due to the advances in high-throughput omics data generation that RNA species have re-entered the focus of biomedical research. International collaborative efforts, like the ENCODE and GENCODE projects, have uncovered thousands of previously unknown functional non-coding RNAs (ncRNAs) with various, but primarily regulatory, roles. Many of these are linked to the emergence and progression of human diseases. In particular, interdisciplinary studies integrating bioinformatics, systems biology, and biotechnological approaches have successfully characterized the role of ncRNAs in different human cancers. These efforts led to the identification of a new tool-kit for cancer diagnosis, monitoring, and treatment, which is now starting to enter and impact on clinical practice. This chapter elaborates on the state of the art in RNA systems biology, including a review and perspective on clinical applications toward an integrative RNA systems medicine approach. The focus is on the role of ncRNAs in cancer.
Nonequilibrium chemistry boundary layer integral matrix procedure
NASA Technical Reports Server (NTRS)
Tong, H.; Buckingham, A. C.; Morse, H. L.
1973-01-01
The development of an analytic procedure for the calculation of nonequilibrium boundary layer flows over surfaces of arbitrary catalycities is described. An existing equilibrium boundary layer integral matrix code was extended to include nonequilibrium chemistry while retaining all of the general boundary condition features built into the original code. For particular application to the pitch-plane of shuttle type vehicles, an approximate procedure was developed to estimate the nonequilibrium and nonisentropic state at the edge of the boundary layer.
Parzeller, Markus; Zedler, Barbara
2013-01-01
The article deals with the new regulations in the German Civil Code (BGB) which came into effect in Germany on 26 Feb 2013 as the Patient Rights Act (PatRG). In Part I, the legislative procedure, the treatment contract and the contracting parties (Section 630a Civil Code), the applicable regulations (Section 630b Civil Code) and the obligations to cooperate and inform (Section 630c Civil Code) are discussed and critically analysed.
Identification and integration of Picorna-like viruses in multiple insect taxa
USDA-ARS?s Scientific Manuscript database
Virus infection often leads to incorporation of a piece of the virus genetic code into the genome of the host organism, referred to as integration. Determining if the virus has integrated into the host genome provides valuable information needed to monitor disease spread. Detection of integrated vir...
NASA Astrophysics Data System (ADS)
Bruton, Samuel V.
2003-05-01
While the usefulness of the case study method in teaching research ethics is frequently emphasized, less often noted is the educational value of professional codes of ethics. Much can be gained by having students examine codes and reflect on their significance. This paper argues that codes such as the American Chemical Society's The Chemist's Code of Conduct are an important supplement to the use of cases and describes one way in which they can be integrated profitably into a class discussion of research ethics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheverton, R.D.; Dickson, T.L.; Merkle, J.G.
1992-03-01
The Yankee Atomic Electric Company has performed an Integrated Pressurized Thermal Shock (IPTS)-type evaluation of the Yankee Rowe reactor pressure vessel in accordance with the PTS Rule (10 CFR 50.61) and US Regulatory Guide 1.154. The Oak Ridge National Laboratory (ORNL) reviewed the YAEC document and performed an independent probabilistic fracture-mechanics analysis. The review included a comparison of the Pacific Northwest Laboratory (PNL) and ORNL probabilistic fracture-mechanics codes (VISA-II and OCA-P, respectively). The review identified minor errors and one significant difference in philosophy. Also, the two codes have a few dissimilar peripheral features. Aside from these differences, VISA-II and OCA-P are very similar and, with errors corrected and when adjusted for the difference in the treatment of the fracture-toughness distribution through the wall, yield essentially the same value of the conditional probability of failure. The ORNL independent evaluation indicated RT_NDT values considerably greater than those corresponding to the PTS-Rule screening criteria and a frequency of failure substantially greater than that corresponding to the "primary acceptance criterion" in US Regulatory Guide 1.154. Time constraints, however, prevented as rigorous a treatment as the situation deserves. Thus, these results are very preliminary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, G; Yin, F; Ren, L
Purpose: In order to track tumor movement for patient positioning verification during arc treatment delivery, or in between 3D/IMRT beams for stereotactic body radiation therapy (SBRT), limited-angle kV projection acquisition simultaneously with arc treatment delivery, or in between static treatment beams as the gantry moves to the next beam angle, was proposed. The purpose of this study is to estimate the additional imaging dose resulting from multiple tomosynthesis acquisitions in between static treatment beams and to compare it with that of a conventional kV-CBCT acquisition. Methods: The kV imaging system integrated into Varian TrueBeam accelerators was modeled using the EGSnrc Monte Carlo user code BEAMnrc, and the DOSXYZnrc code was used in dose calculations. The simulated realistic kV beams from the Varian TrueBeam OBI 1.5 system were used to calculate dose to the patient based on CT images. Organ doses were analyzed using DVHs. The imaging dose to the patient resulting from realistic multiple tomosynthesis acquisitions, each with a 25-30 degree kV source rotation, between 6 treatment beam gantry angles was studied. Results: For a typical lung SBRT treatment delivery, much lower (20-50%) kV imaging doses were observed from the sum of six realistic tomosynthesis acquisitions, each with a 25-30 degree x-ray source rotation between the six treatment beam gantry angles, compared to that from a single CBCT image acquisition. Conclusion: This work indicates that the kV imaging in the proposed Limited-angle Intra-fractional Verification (LIVE) System for SBRT treatments adds a negligible imaging dose. It is worth noting that the MV imaging dose caused by MV projection acquisition in between static beams in LIVE can be minimized by restricting the imaging to the target region and reducing the number of projections acquired. For arc treatments, MV imaging acquisition in LIVE does not add imaging dose, as the MV images are acquired from the treatment beams directly during treatment.
Jacobson, Julie; Mosher, Aryc W.; Walson, Judd L.
2016-01-01
Background While some evidence supports the beneficial effects of integrating neglected tropical disease (NTD) programs to optimize coverage and reduce costs, there is minimal information regarding when or how to effectively operationalize program integration. The lack of systematic analyses of integration experiences and of integration processes may act as an impediment to achieving more effective NTD programming. We aimed to learn about the experiences of NTD stakeholders and their perceptions of integration. Methodology We evaluated differences in the definitions, roles, perceived effectiveness, and implementation experiences of integrated NTD programs among a variety of NTD stakeholder groups, including multilateral organizations, funding partners, implementation partners, national Ministry of Health (MOH) teams, district MOH teams, volunteer rural health workers, and community members participating in NTD campaigns. Semi-structured key informant interviews were conducted. Coding of themes involved a mix of applying in-vivo open coding and a priori thematic coding from a start list. Findings In total, 41 interviews were conducted. Salient themes varied by stakeholder, however dominant themes on integration included: significant variations in definitions, differential effectiveness of specific integrated NTD activities, community member perceptions of NTD programs, the influence of funders, perceived facilitators, perceived barriers, and the effects of integration on health system strength. In general, stakeholder groups provided unique perspectives, rather than contrarian points of view, on the same topics. The stakeholders identified more advantages to integration than disadvantages, however there are a number of both unique facilitators and challenges to integration from the perspective of each stakeholder group. Conclusions Qualitative data suggest several structural, process, and technical opportunities that could be addressed to promote more effective and efficient integrated NTD elimination programs. We highlight a set of ten recommendations that may address stakeholder concerns and perceptions regarding these key opportunities. For example, public health stakeholders should embrace a broader perspective of community-based health needs, including and beyond NTDs, and available platforms for addressing those needs. PMID:27776127
pySecDec: A toolbox for the numerical evaluation of multi-scale integrals
NASA Astrophysics Data System (ADS)
Borowka, S.; Heinrich, G.; Jahn, S.; Jones, S. P.; Kerner, M.; Schlenk, J.; Zirke, T.
2018-01-01
We present pySecDec, a new version of the program SecDec, which performs the factorization of dimensionally regulated poles in parametric integrals and the subsequent numerical evaluation of the finite coefficients. The algebraic part of the program is now written in the form of Python modules, which allow a very flexible usage. The optimization of the C++ code, generated using FORM, is improved, leading to faster numerical convergence. The new version also creates a library of the integrand functions, such that it can be linked to user-specific codes for the evaluation of matrix elements in a way similar to analytic integral libraries.
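To illustrate the underlying idea rather than pySecDec's API: after sector decomposition, an endpoint singularity takes the generic form I(eps) = int_0^1 x^(-1+eps) f(x) dx, whose pole is factored out by subtracting f(0), leaving a finite remainder to evaluate numerically. The sketch below does this in Python for a sample integrand; the integrand and the single-pole structure are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad

def f(x):
    # Sample smooth integrand after sector decomposition (illustrative)
    return 1.0 / (1.0 + x)

# I(eps) = int_0^1 x**(-1+eps) f(x) dx
#        = f(0)/eps + int_0^1 (f(x) - f(0))/x dx + O(eps)
pole_coefficient = f(0.0)  # residue of the 1/eps pole
finite_part, err = quad(lambda x: (f(x) - f(0.0)) / x, 0.0, 1.0)

print(f"I(eps) ~ {pole_coefficient}/eps + ({finite_part:.6f}) + O(eps)")
# For f(x) = 1/(1+x) the finite part is -ln(2), approximately -0.693147
```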
ART/Ada design project, phase 1. Task 3 report: Test plan
NASA Technical Reports Server (NTRS)
Allen, Bradley P.
1988-01-01
The plan is described for the integrated testing and benchmarking of Phase 1 of the Ada-based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX mini-computers and VAX stations running the VMS operating system.
SOFIP: A Short Orbital Flux Integration Program
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.; Hebert, J. J.; Butler, E. L.; Barth, J. L.
1979-01-01
A computer code was developed to evaluate the space radiation environment encountered by geocentric satellites. The Short Orbital Flux Integration Program (SOFIP) is a compact routine of modular composition, designed mostly with structured programming techniques in order to provide core and time economy and ease of use. In its simplest form, the program produces, for a given input trajectory, a composite integral orbital spectrum of either protons or electrons. Additional features are available separately or in combination with the inclusion of the corresponding (optional) modules. The code is described in detail, and the function and usage of the various modules are explained. A program listing and sample outputs are attached.
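A minimal sketch of what "integrating flux over a trajectory" means in practice: given trajectory samples and an environment model that returns a spectrum at each point, accumulate a time-weighted (orbit-averaged) spectrum. The trapped-flux model below is a made-up placeholder, not AP-8/AE-8 or anything SOFIP actually uses.

```python
import numpy as np

def flux_model(position_km, energies_mev):
    """Placeholder trapped-particle model: integral flux vs. energy.
    A real code would interpolate map data (e.g., in B/L coordinates)."""
    r = np.linalg.norm(position_km) / 6371.0  # geocentric distance, Earth radii
    return 1e4 * np.exp(-energies_mev / 30.0) * np.exp(-((r - 1.5) ** 2))

def orbital_spectrum(times_s, positions_km, energies_mev):
    """Time-averaged integral spectrum along a sampled trajectory."""
    fluxes = np.array([flux_model(p, energies_mev) for p in positions_km])
    # Trapezoidal time average over the sampled orbit
    return np.trapz(fluxes, times_s, axis=0) / (times_s[-1] - times_s[0])
```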
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
Extending software repository hosting to code review and testing
NASA Astrophysics Data System (ADS)
Gonzalez Alvarez, A.; Aparicio Cotarelo, B.; Lossent, A.; Andersen, T.; Trzcinska, A.; Asbury, D.; Høimyr, N.; Meinhard, H.
2015-12-01
We will describe how CERN's services around Issue Tracking and Version Control have evolved, and what the plans for the future are. We will describe the services' main design, integration and structure, giving special attention to the new requirements from the community of users in terms of collaboration and integration tools, and how we address this challenge when defining new services: GitLab for collaboration and code review, replacing our current Gitolite service, and Jenkins for Continuous Integration. These new services complement the existing ones to create a new global "development tool stack" where each working group can place its particular development work-flow.
Programming (Tips) for Physicists & Engineers
Ozcan, Erkcan
2018-02-19
Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building/framework integration, documentation and good visualization, working remotely, not reinventing the wheel.
FERRET adjustment code: status/use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.A.
1982-03-01
The least-squares data analysis code FERRET is reviewed. Recent enhancements are discussed along with illustrative applications. Particular features noted include the use of differential as well as integral data, and additional user options for assigning and storing covariance matrices.
TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childerson, M.T.; Fujita, R.K.
1985-01-01
A computer code (TRAC-PF1/MOD1) developed for predicting the transient thermal and hydraulic response of an integral nuclear steam supply system (NSSS) was benchmarked. Post-small-break loss-of-coolant accident (LOCA) data from a scaled, experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small-break LOCA data set for TRAC verification. The major phases of a small-break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool- and auxiliary-feedwater-initiated boiler-condenser mode heat transfer.
Status Report on NEAMS PROTEUS/ORIGEN Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A
2016-02-18
The US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program has contributed significantly to the development of the PROTEUS neutron transport code at Argonne National Laboratory and to the Oak Ridge Isotope Generation and Depletion Code (ORIGEN) depletion/decay code at Oak Ridge National Laboratory. PROTEUS’s key capability is the efficient and scalable (up to hundreds of thousands of cores) neutron transport solver on general, unstructured, three-dimensional finite-element-type meshes. The scalability and mesh generality enable the transfer of neutron and power distributions to other codes in the NEAMS toolkit for advanced multiphysics analysis. Recently, ORIGEN has received considerable modernization to provide the high-performance depletion/decay capability within the NEAMS toolkit. This work presents a description of the initial integration of ORIGEN in PROTEUS, mainly performed during FY 2015, with minor updates in FY 2016.
Brunette, Mary F; Asher, Dianne; Whitley, Rob; Lutz, Wilma J; Wieder, Barbara L; Jones, Amanda M; McHugo, Gregory J
2008-09-01
Approximately half of the people who have serious mental illnesses experience a co-occurring substance use disorder at some point in their lifetime. Integrated dual disorders treatment, a program to treat persons with co-occurring disorders, improves outcomes but is not widely available in public mental health settings. This report describes the extent to which this intervention was implemented by 11 community mental health centers participating in a large study of practice implementation. Facilitators and barriers to implementation are described. Trained implementation monitors conducted regular site visits over two years. During visits, monitors interviewed key informants, conducted ethnographic observations of implementation efforts, and assessed fidelity to the practice model. These data were coded and used as a basis for detailed site reports summarizing implementation processes. The authors reviewed the reports and distilled the three top facilitators and barriers for each site. The most prominent cross-site facilitators and barriers were identified. Two sites reached high fidelity, six sites reached moderate fidelity, and three sites remained at low fidelity over the two years. Prominent facilitators and barriers to implementation with moderate to high fidelity were administrative leadership, consultation and training, supervisor mastery and supervision, chronic staff turnover, and finances. Common facilitators and barriers to implementation of integrated dual disorders treatment emerged across sites. The results confirmed the importance of the use of the consultant-trainer in the model of implementation, as well as the need for intensive activities at multiple levels to facilitate implementation. Further research on service implementation is needed, including but not limited to clarifying strategies to overcome barriers.
CACTI: free, open-source software for the sequential coding of behavioral interactions.
Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.
A co-designed equalization, modulation, and coding scheme
NASA Technical Reports Server (NTRS)
Peile, Robert E.
1992-01-01
The commercial impact and technical success of Trellis Coded Modulation seem to illustrate that, if Shannon's capacity is to be approached, the modulation and coding of an analogue signal ought to be viewed as an integrated process. More recent work has focused on going beyond the gains obtained for additive white Gaussian noise channels and has tried to combine the coding/modulation with adaptive equalization. The motive is to gain similar advances on less perfect or idealized channels.
Recent Developments in Grid Generation and Force Integration Technology for Overset Grids
NASA Technical Reports Server (NTRS)
Chan, William M.; VanDalsem, William R. (Technical Monitor)
1994-01-01
Recent developments in algorithms and software tools for generating overset grids for complex configurations are described. These include the overset surface grid generation code SURGRD and version 2.0 of the hyperbolic volume grid generation code HYPGEN. The SURGRD code is in beta test mode; the new features include the capability to march over a collection of panel networks, a variety of ways to control the side boundaries and the marching step sizes and distance, a more robust projection scheme, and an interpolation option. New features in version 2.0 of HYPGEN include a wider range of boundary condition types. The code also allows the user to specify different marching step sizes and distances for each point on the surface grid. A scheme that takes into account the overlapped zones on the body surface for the purpose of computing forces and moments is also briefly described. The process involves the following two software modules: MIXSUR, a composite grid generation module to produce a collection of quadrilaterals and triangles on which pressure and viscous stresses are to be integrated, and OVERINT, a forces-and-moments integration module.
Porting plasma physics simulation codes to modern computing architectures using the
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Abbott, Stephen
2015-11-01
Available computing power has continued to grow exponentially even after single-core performance saturated in the last decade. The increase has since been driven by more parallelism, both using more cores and having more parallelism in each core, e.g. in GPUs and Intel Xeon Phi. Adapting existing plasma physics codes is challenging, in particular as there is no single programming model that covers current and future architectures. We will introduce the open-source
Status of BOUT fluid turbulence code: improvements and verification
NASA Astrophysics Data System (ADS)
Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.
2006-10-01
BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases the calculated spectra of turbulent fluctuations compared favorably with experimental data. On the other hand, the desire to better understand the code results and to gain more confidence in them motivated a rigorous verification effort. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.
Dental Faculty Accuracy When Using Diagnostic Codes: A Pilot Study.
Sutton, Jeanne C; Fay, Rose-Marie; Huynh, Carolyn P; Johnson, Cleverick D; Zhu, Liang; Quock, Ryan L
2017-05-01
The aim of this study was to examine the accuracy of dental faculty members' utilization of diagnostic codes and resulting treatment planning based on radiographic interproximal tooth radiolucencies. In 2015, 50 full-time and part-time general dentistry faculty members at one U.S. dental school were shown a sequence of 15 bitewing radiographs; one interproximal radiolucency was highlighted on each bitewing. For each radiographic lesion, participants were asked to choose the most appropriate diagnostic code (from a concise list of five codes, corresponding to lesion progression to outer/inner halves of enamel and outer/middle/pulpal thirds of dentin), acute treatment (attempt to arrest/remineralize non-invasively, operative intervention, or no treatment), and level of confidence in choices. Diagnostic and treatment choices of participants were compared to "gold standard" correct responses, as determined by expert radiology and operative faculty members, respectively. The majority of the participants selected the correct diagnostic code for lesions in the outer one-third of dentin (p<0.0001) and the pulpal one-third of dentin (p<0.0001). For lesions in the outer and inner halves of enamel and the middle one-third of dentin, the correct rates were moderate. However, the majority of the participants chose correct treatments on all types of lesions (correct rate 63.6-100%). Faculty members' confidence in their responses was generally high for all lesions, all above 90%. Diagnostic codes were appropriately assigned by participants for the very deepest lesions, but they were not assigned accurately for more incipient lesions (limited to enamel). Paradoxically, treatment choices were generally correct, regardless of diagnostic choices. Further calibration is needed to improve faculty use and teaching of diagnostic codes.
Evaluating a Dental Diagnostic Terminology in an Electronic Health Record
White, Joel M.; Kalenderian, Elsbeth; Stark, Paul C.; Ramoni, Rachel L.; Vaderhobli, Ram; Walji, Muhammad F.
2011-01-01
Standardized treatment procedure codes and terms are routinely used in dentistry. Utilization of a diagnostic terminology is common in medicine, but there is not a satisfactory or commonly standardized dental diagnostic terminology available at this time. Recent advances in dental informatics have provided an opportunity for inclusion of diagnostic codes and terms as part of treatment planning and documentation in the patient treatment history. This article reports the results of the use of a diagnostic coding system in a large dental school’s predoctoral clinical practice. A list of diagnostic codes and terms, called Z codes, was developed by dental faculty members. The diagnostic codes and terms were implemented into an electronic health record (EHR) for use in a predoctoral dental clinic. The utilization of diagnostic terms was quantified. The validity of Z code entry was evaluated by comparing the diagnostic term entered to the procedure performed, where valid diagnosis-procedure associations were determined by consensus among three calibrated academically based dentists. A total of 115,004 dental procedures were entered into the EHR during the year sampled. Of those, 43,053 were excluded from this analysis because they represent diagnosis or other procedures unrelated to treatments. Among the 71,951 treatment procedures, 27,973 had diagnoses assigned to them with an overall utilization of 38.9 percent. Of the 147 available Z codes, ninety-three were used (63.3 percent). There were 335 unique procedures provided and 2,127 procedure/diagnosis pairs captured in the EHR. Overall, 76.7 percent of the diagnoses entered were valid. We conclude that dental diagnostic terminology can be incorporated within an electronic health record and utilized in an academic clinical environment. Challenges remain in the development of terms and implementation and ease of use that, if resolved, would improve the utilization. PMID:21546594
Primary care and communication in shared cancer care: A Qualitative Study
Sada, Yvonne; Street, Richard L.; Singh, Hardeep; Shada, Rachel; Naik, Aanand D.
2013-01-01
Objective To explore perceptions of primary care physicians’ (PCPs) and oncologists’ roles, responsibilities, and patterns of communication related to shared cancer care in three integrated health systems that used electronic health records (EHRs). Study design Qualitative study. Methods We conducted semi-structured interviews with ten early stage colorectal cancer patients and fourteen oncologists and PCPs. Sample sizes were determined by thematic saturation. Dominant themes and codes were identified and subsequently applied to all transcripts. Results Physicians reported that EHRs improved communication within integrated systems, but communication with physicians outside their system was still difficult. PCPs expressed uncertainty about their role during cancer care, even though medical oncologists emphasized the importance of co-morbidity control during cancer treatment. Both patients and physicians described additional roles for PCPs, including psychological distress support and behavior modification. Conclusions Integrated systems that use EHRs likely facilitate shared cancer care through improved PCP-oncologist communication. However, strategies to facilitate a more active role for PCPs in managing co-morbidities, psychological distress and behavior modification, as well as to overcome communication challenges between physicians not practicing within the same integrated system, are still needed to improve shared cancer care. PMID:21615196
2011-01-01
Background Translational medicine requires the integration of knowledge using heterogeneous data from health care to the life sciences. Here, we describe a collaborative effort to produce a prototype Translational Medicine Knowledge Base (TMKB) capable of answering questions relating to clinical practice and pharmaceutical drug discovery. Results We developed the Translational Medicine Ontology (TMO) as a unifying ontology to integrate chemical, genomic and proteomic data with disease, treatment, and electronic health records. We demonstrate the use of Semantic Web technologies in the integration of patient and biomedical data, and reveal how such a knowledge base can aid physicians in providing tailored patient care and facilitate the recruitment of patients into active clinical trials. Thus, patients, physicians and researchers may explore the knowledge base to better understand therapeutic options, efficacy, and mechanisms of action. Conclusions This work takes an important step in using Semantic Web technologies to facilitate integration of relevant, distributed, external sources and progress towards a computational platform to support personalized medicine. Availability TMO can be downloaded from http://code.google.com/p/translationalmedicineontology and TMKB can be accessed at http://tm.semanticscience.org/sparql. PMID:21624155
Bourdais-Mannone, Claire; Cherikh, Faredj; Gicquel, Nathalie; Gelsi, Eve; Jove, Frédérique; Staccini, Pascal
2011-01-01
The purpose of this study was to conduct a descriptive and comparative analysis of the tools used by healthcare professionals specializing in addictive disorders to promote a rapprochement of information systems. The evaluation guide used to assess the compensation needs of disabled persons treated in "Maisons Départementales des Personnes Handicapées" (centres for disabled people) organizes information in different areas, including a psychological component. The guide includes social and environmental information in the "Recueil Commun sur les Addictions et les Prises en charges" (Joint Report on Drug Addiction and Drug Treatment). While the program for the medicalization of information systems includes care data, the current information about social situations remains inadequate. The international classification of diseases provides synthetic diagnostic codes to describe substance use, etiologic factors and the somatic and psychological complications inherent to addictive disorders. The current system could be radically simplified and harmonized and would benefit from adopting a more individualized approach to non-substance behavioral addictions. The international classification of disabilities provides tools for evaluating the psychological component included in the recent definition of addictive disorders. Legal information should play an integral role in the structure of the information system and in international classifications. The prevalence of episodes of care and treatment of addictive and psychological disorders was assessed at Nice University Hospital in all disciplines. Except in addiction treatment units, very few patients were found to have a RECAP file.
Assessment and Application of the ROSE Code for Reactor Outage Thermal-Hydraulic and Safety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Thomas K.S.; Ko, F.-K.; Dai, L.-C
The currently available tools, such as RELAP5, RETRAN, and others, cannot easily and correctly perform the task of analyzing the system behavior during plant outages. Therefore, a medium-sized program aiming at reactor outage simulation and evaluation, such as midloop operation (MLO) with loss of residual heat removal (RHR), has been developed. Important thermal-hydraulic processes involved during MLO with loss of RHR can be properly simulated by the newly developed reactor outage simulation and evaluation (ROSE) code. The two-region approach with a modified two-fluid model has been adopted to be the theoretical basis of the ROSE code. To verify the analytical model in the first step, posttest calculations against the integral midloop experiments with loss of RHR have been performed. The excellent simulation capacity of the ROSE code against the Institute of Nuclear Energy Research Integral System Test Facility test data is demonstrated. To further mature the ROSE code in simulating a full-sized pressurized water reactor, assessment against the WGOTHIC code and the Maanshan momentary-loss-of-RHR event has been undertaken. The successfully assessed ROSE code is then applied to evaluate the abnormal operation procedure (AOP) with loss of RHR during MLO (AOP 537.4) for the Maanshan plant. The ROSE code also has been successfully transplanted into the Maanshan training simulator to support operator training. How the simulator was upgraded by the ROSE code for MLO will be presented in the future.
NASA Technical Reports Server (NTRS)
Shapiro, Wilbur
1991-01-01
The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user-friendly interaction, context-sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. Among the first codes to be completed, and presently being incorporated into the KBS, are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.
Military Professional Ethics, Code of Conduct, and Military Academies’ Honor Codes,
1985-04-01
1981. Mathews, Jay. Ex-POW teaches that a mind can always be free. Washington Post 1:1+, 10 January 1983. Meade, Henry J. Commitment to integrity. Air... to honor code's review. Air Force Times 45:7, 4 February 1985. Glab, John E. Honor at West Point. (Letter) New York Times 26:5, 13 September 1976
National Combustion Code: Parallel Performance
NASA Technical Reports Server (NTRS)
Babrauckas, Theresa
2001-01-01
This report discusses the National Combustion Code (NCC). The NCC is an integrated system of codes for the design and analysis of combustion systems. The advanced features of the NCC meet designers' requirements for model accuracy and turn-around time. The fundamental features at the inception of the NCC were parallel processing and unstructured mesh. The design and performance of the NCC are discussed.
The Evolution of a Coding Schema in a Paced Program of Research
ERIC Educational Resources Information Center
Winters, Charlene A.; Cudney, Shirley; Sullivan, Therese
2010-01-01
A major task involved in the management, analysis, and integration of qualitative data is the development of a coding schema to facilitate the analytic process. Described in this paper is the evolution of a coding schema that was used in the analysis of qualitative data generated from online forums of middle-aged women with chronic conditions who…
Jeong, Jong Seob; Chang, Jin Ho; Shung, K. Kirk
2009-01-01
For noninvasive treatment of prostate tissue using high intensity focused ultrasound (HIFU), this paper proposes a design of an integrated multi-functional confocal phased array (IMCPA) and a strategy to perform both imaging and therapy simultaneously with this array. IMCPA is composed of triple-row phased arrays: a 6 MHz array in the center row for imaging and two 4 MHz arrays in the outer rows for therapy. Different types of piezoelectric materials and stack configurations may be employed to maximize their respective functionalities, i.e., therapy and imaging. Fabrication complexity of IMCPA may be reduced by assembling already constructed arrays. In IMCPA, reflected therapeutic signals may corrupt the quality of imaging signals received by the center row array. This problem can be overcome by implementing a coded excitation approach and/or a notch filter when B-mode images are formed during therapy. The 13-bit Barker code, which is a binary code with unique autocorrelation properties, is preferred for implementing coded excitation, although other codes may also be used. Both Field II simulations and experimental results verified that these remedial approaches make it feasible to simultaneously carry out imaging and therapy with IMCPA. The results showed that the 13-bit Barker code with 3 cycles per bit provided acceptable performance. The measured −6 dB and −20 dB range mainlobe widths were 0.52 mm and 0.91 mm, respectively, and the range sidelobe level was measured to be −48 dB regardless of whether a notch filter was used. The 13-bit Barker code with 2 cycles per bit yielded −6 dB and −20 dB range mainlobe widths of 0.39 mm and 0.67 mm. Its range sidelobe level was found to be −40 dB after notch filtering. These results indicate the feasibility of the proposed transducer design and system for real-time imaging during therapy. PMID:19811994
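A small NumPy sketch (independent of the authors' Field II setup) can verify the autocorrelation property that motivates the choice of the 13-bit Barker code: a compression peak of 13 at zero lag with sidelobe magnitudes of at most 1.

    import numpy as np

    # 13-bit Barker sequence, the longest known binary Barker code.
    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

    # Aperiodic autocorrelation over all lags.
    acf = np.correlate(barker13, barker13, mode="full")
    peak = acf[len(barker13) - 1]                              # zero-lag value: 13
    sidelobe = np.abs(np.delete(acf, len(barker13) - 1)).max() # 1

    # Peak-to-sidelobe ratio, approximately 22.3 dB.
    print(peak, sidelobe, 20 * np.log10(peak / sidelobe))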
NASA Astrophysics Data System (ADS)
Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf
2016-04-01
A complex software project with high standards for code quality requires automated tools to help developers with repetitive and tedious tasks such as compilation on different platforms and configurations, unit testing as well as end-to-end tests, and generation of distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited. Testing developed code on more than the developer's PC is therefore a task that is often neglected, and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis, and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, analyzing differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request, or by email, on success or failure with detailed reports, possibly requesting improvements to the modifications. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation, and high test code coverage. This workflow keeps entry barriers for getting involved in the project low and permits an agile development process concentrating on feature additions rather than software maintenance procedures.
RNA-based ovarian cancer research from 'a gene to systems biomedicine' perspective.
Gov, Esra; Kori, Medi; Arga, Kazim Yalcin
2017-08-01
Ovarian cancer remains the leading cause of death from a gynecologic malignancy, and treatment of this disease is harder than that of any other type of female reproductive cancer. Improvements in the diagnosis and development of novel and effective treatment strategies for complex pathophysiologies, such as ovarian cancer, require a better understanding of disease emergence and mechanisms of progression through systems medicine approaches. RNA-level analyses generate new information that can help in understanding the mechanisms behind disease pathogenesis, in identifying new biomarkers and therapeutic targets, and in new drug discovery. Whole RNA sequencing and coding and non-coding RNA expression array datasets have shed light on the mechanisms underlying disease progression and have identified mRNAs, miRNAs, and lncRNAs involved in ovarian cancer progression. In addition, the results from these analyses indicate that various signalling pathways and biological processes are associated with ovarian cancer. Here, we present a comprehensive literature review on RNA-based ovarian cancer research and highlight the benefits of integrative approaches within the systems biomedicine concept for future ovarian cancer research. We invite the ovarian cancer and systems biomedicine research fields to join forces to achieve the interdisciplinary caliber and rigor required to find real-life solutions to common, devastating, and complex diseases such as ovarian cancer. CAF: cancer-associated fibroblasts; COG: Cluster of Orthologous Groups; DEA: disease enrichment analysis; EOC: epithelial ovarian carcinoma; ESCC: oesophageal squamous cell carcinoma; GSI: gamma secretase inhibitor; GO: Gene Ontology; GSEA: gene set enrichment analysis; HAS: Hungarian Academy of Sciences; lncRNAs: long non-coding RNAs; MAPK/ERK: mitogen-activated protein kinase/extracellular signal-regulated kinases; NGS: next-generation sequencing; ncRNAs: non-coding RNAs; OvC: ovarian cancer; PI3K/Akt/mTOR: phosphatidylinositol-3-kinase/protein kinase B/mammalian target of rapamycin; RT-PCR: real-time polymerase chain reaction; SNP: single nucleotide polymorphism; TF: transcription factor; TGF-β: transforming growth factor-β.
Global error estimation based on the tolerance proportionality for some adaptive Runge-Kutta codes
NASA Astrophysics Data System (ADS)
Calvo, M.; González-Pinto, S.; Montijano, J. I.
2008-09-01
Modern codes for the numerical solution of Initial Value Problems (IVPs) in ODEs are based on adaptive methods that, for a user-supplied tolerance δ, attempt to advance the integration selecting the size of each step so that some measure of the local error is ≈ δ. Although this policy does not ensure that the global errors are under the prescribed tolerance, after the early studies of Stetter [Considerations concerning a theory for ODE-solvers, in: R. Burlisch, R.D. Grigorieff, J. Schröder (Eds.), Numerical Treatment of Differential Equations, Proceedings of Oberwolfach, 1976, Lecture Notes in Mathematics, vol. 631, Springer, Berlin, 1978, pp. 188-200; Tolerance proportionality in ODE codes, in: R. März (Ed.), Proceedings of the Second Conference on Numerical Treatment of Ordinary Differential Equations, Humbold University, Berlin, 1980, pp. 109-123] and the extensions of Higham [Global error versus tolerance for explicit Runge-Kutta methods, IMA J. Numer. Anal. 11 (1991) 457-480; The tolerance proportionality of adaptive ODE solvers, J. Comput. Appl. Math. 45 (1993) 227-236; The reliability of standard local error control algorithms for initial value ordinary differential equations, in: Proceedings: The Quality of Numerical Software: Assessment and Enhancement, IFIP Series, Springer, Berlin, 1997], it has been proved that in many existing explicit Runge-Kutta codes the global errors behave asymptotically as some rational power of δ. This step-size policy, for a given IVP, determines at each grid point t_n a new step size h_{n+1} = h(t_n; δ) so that h(t; δ) is a continuous function of t. In this paper a study of the tolerance proportionality property is carried out under a discontinuous step-size policy that does not allow the step size to change if the step-size ratio between two consecutive steps is close to unity. This theory is applied to obtain global error estimates for a few problems that have been solved with the code Gauss2 [S. Gonzalez-Pinto, R. Rojas-Bello, Gauss2, a Fortran 90 code for second order initial value problems,
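The tolerance-proportionality idea is easy to probe numerically. The sketch below (using SciPy's RK45 rather than any of the codes studied in the paper, and a deliberately simple test problem) estimates the exponent α in the model "global error ≈ C·δ^α" by regressing log-error against log-tolerance.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Test problem y' = -y, y(0) = 1, with exact solution exp(-t).
    f = lambda t, y: -y
    tols, errs = [1e-4, 1e-6, 1e-8], []
    for tol in tols:
        sol = solve_ivp(f, (0.0, 10.0), [1.0], method="RK45",
                        rtol=tol, atol=tol)
        errs.append(abs(sol.y[0, -1] - np.exp(-10.0)))

    # Slope of log(err) vs log(tol) estimates the proportionality exponent.
    alpha = np.polyfit(np.log(tols), np.log(errs), 1)[0]
    print(f"estimated exponent alpha = {alpha:.2f}")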
A Framework for Modeling Competitive and Cooperative Computation in Retinal Processing
NASA Astrophysics Data System (ADS)
Moreno-Díaz, Roberto; de Blasio, Gabriel; Moreno-Díaz, Arminda
2008-07-01
The structure of the retina suggests that it should be treated (at least from the computational point of view) as a layered computer. Different retinal cells contribute to the coding of the signals down to the ganglion cells. Also, because of the nature of the specialization of some ganglion cells, the structure suggests that all these specialization processes should take place at the inner plexiform layer and should be of a local character, prior to a global integration and frequency-spike coding by the ganglion cells. The framework we propose consists of a layered computational structure, where outer layers essentially provide band-pass space-time filtered signals which are progressively delayed, at least for their formal treatment. Specialization is supposed to take place at the inner plexiform layer by the action of spatio-temporal microkernels (acting very locally) having a center-periphery space-time structure. The resulting signals are then integrated by the ganglion cells through macrokernel structures. Practically all types of specialization found in different vertebrate retinas, as well as the quasilinear behavior in some higher vertebrates, can be modeled and simulated within this framework. Finally, possible feedback from central structures is considered. Though its relevance to retinal processing is not definitive, it is included here for the sake of completeness, since it is a formal requisite for recursiveness.
Medical care providers' perspectives on dental information needs in electronic health records.
Acharya, Amit; Shimpi, Neel; Mahnke, Andrea; Mathias, Richard; Ye, Zhan
2017-05-01
The authors conducted this study to identify the most relevant patient dental information in a medical-dental integrated electronic health record (iEHR) necessary for medical care providers to inform holistic treatment. The authors collected input from a diverse sample of 65 participants from a large, regional health system representing 13 medical specialties and administrative units. The authors collected feedback from participants through 11 focus group sessions. Two independent reviewers analyzed focus group transcripts to identify major and minor themes. The authors identified 336 of 385 annotations that most medical care providers coded as relevant. Annotations strongly supporting relevancy to clinical practice aligned with 18 major thematic categories, with the top 6 categories being communication, appointments, system design, medications, treatment plan, and dental alerts. Study participants identified dental data of highest relevance to medical care providers and recommended implementation of user-friendly access to dental data in iEHRs as crucial to holistic care delivery. Identification of the patients' dental information most relevant to medical care providers will inform strategies for improving the integration of that information into the medical-dental iEHR. Copyright © 2017 American Dental Association. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Lindner, Bernhard Lee; Ackerman, Thomas P.; Pollack, James B.
1990-01-01
CO2 comprises 95 pct. of the composition of the Martian atmosphere. However, the Martian atmosphere also has a high aerosol content. Dust particles vary from less than 0.2 to greater than 3.0. CO2 is an active absorber and emitter at near-IR and IR wavelengths; the near-IR absorption bands of CO2 provide significant heating of the atmosphere, and the 15 micron band provides rapid cooling. Including both CO2 and aerosol radiative transfer simultaneously in a model is difficult. Aerosol radiative transfer requires a multiple scattering code, while CO2 radiative transfer must deal with complex wavelength structure. As an alternative to the pure-atmosphere treatment in most models, which causes inaccuracies, a treatment called the exponential-sum or k-distribution approximation was developed. The chief advantage of the exponential-sum approach is that the integration over k-space of f(k) can be computed more quickly than the integration of k_ν over frequency. The exponential-sum approach is superior to the photon path distribution and emissivity techniques for dusty conditions. This study was the first application of the exponential-sum approach to Martian conditions.
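The computational gain of the exponential-sum (k-distribution) approach can be sketched in a few lines: the band transmission T(u) = (1/Δν) ∫ exp(−k(ν)u) dν is replaced by a short weighted sum Σ w_i exp(−k_i u) taken over the cumulative distribution of k. The absorption spectrum below is synthetic, for illustration only, and is not Martian CO2 data.

    import numpy as np

    nu = np.linspace(0.0, 1.0, 20000)            # normalized frequency grid
    k_nu = 1.0 + 0.9 * np.sin(40 * np.pi * nu)   # synthetic, rapidly varying k(nu)
    u = 2.0                                      # absorber amount (arbitrary units)

    # "Exact" band transmission: brute-force integration over frequency.
    T_exact = np.mean(np.exp(-k_nu * u))

    # k-distribution: sort k, then integrate over the smooth cumulative
    # variable g with only a handful of quadrature points.
    k_sorted = np.sort(k_nu)
    g = (np.arange(k_sorted.size) + 0.5) / k_sorted.size
    g_pts = np.linspace(0.0, 1.0, 9)[1::2]       # 4 midpoints in g-space
    k_pts = np.interp(g_pts, g, k_sorted)
    T_kdist = np.mean(np.exp(-k_pts * u))

    print(T_exact, T_kdist)   # nearly equal, with 4 evaluations vs 20000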
Considerations of MCNP Monte Carlo code to be used as a radiotherapy treatment planning tool.
Juste, B; Miro, R; Gallardo, S; Verdu, G; Santos, A
2005-01-01
The present work has simulated the photon and electron transport in a Theratron 780® (MDS Nordion) 60Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle). This work mainly explains the different methodologies used to speed up calculations in order to apply this code efficiently in radiotherapy treatment planning.
Green, Carla A.; McCarty, Dennis; Mertens, Jennifer; Lynch, Frances L.; Hilde, Anadam; Firemark, Alison; Weisner, Constance M.; Pating, David; Anderson, Bradley M.
2013-01-01
Qualified physicians may prescribe buprenorphine to treat opioid dependence, but medication use remains controversial. We examined adoption of buprenorphine in two not-for-profit integrated health plans, over time, completing 101 semi-structured interviews with clinicians and clinician-administrators from primary and specialty care. Transcripts were reviewed, coded, and analyzed. A strong leader championing the new treatment was critical for adoption in both health plans. Once clinicians began using buprenorphine, patients’ and other clinicians’ experiences affected decisions more than did the champion. With experience, protocols developed to manage unsuccessful patients and changed to support maintenance rather than detoxification. Diffusion outside addiction and mental health settings was nonexistent; primary care clinicians cited scope-of-practice issues and referred patients to specialty care. With greater diffusion came questions about long-term use and safety. Recognizing how implementation processes develop may suggest where, when, and how to best expend resources to increase adoption of such treatments. PMID:24268947
Growth, Yield and Fruit Quality of Grapevines under Organic and Biodynamic Management
Döring, Johanna; Frisch, Matthias; Tittmann, Susanne; Stoll, Manfred; Kauer, Randolf
2015-01-01
The main objective of this study was to determine growth, yield and fruit quality of grapevines under organic and biodynamic management in relation to integrated viticultural practices. Furthermore, the mechanisms for the observed changes in growth, yield and fruit quality were investigated by determining nutrient status, physiological performance of the plants and disease incidence on bunches in three consecutive growing seasons. A field trial (Vitis vinifera L. cv. Riesling) was set up at Hochschule Geisenheim University, Germany. The integrated treatment was managed according to the code of good practice. Organic and biodynamic plots were managed according to Regulation (EC) No 834/2007 and Regulation (EC) No 889/2008 and according to ECOVIN- and Demeter-Standards, respectively. The growth and yield of the grapevines differed strongly among the different management systems, whereas fruit quality was not affected by the management system. The organic and the biodynamic treatments showed significantly lower growth and yield in comparison to the integrated treatment. The physiological performance was significantly lower in the organic and the biodynamic systems, which may account for differences in growth and cluster weight and might therefore induce lower yields of the respective treatments. Soil management and fertilization strategy could be responsible factors for these changes. Yields of the organic and the biodynamic treatments partially decreased due to higher disease incidence of downy mildew. The organic and the biodynamic plant protection strategies that exclude the use of synthetic fungicides are likely to induce higher disease incidence and might partially account for differences in the nutrient status of vines under organic and biodynamic management. Use of the biodynamic preparations had little influence on vine growth and yield. By investigating important parameters that induce changes, especially in the growth and yield, of grapevines under organic and biodynamic management, the study can potentially provide guidance for defining more effective farming systems. PMID:26447762
Cooper, Diane; Mantell, Joanne E; Moodley, Jennifer; Mall, Sumaya
2015-03-04
Integration of sexual and reproductive health (SRH) and HIV policies and services delivered by the same provider is prioritised worldwide, especially in sub-Saharan Africa where HIV prevalence is highest. South Africa has the largest antiretroviral treatment (ART) programme in the world, with an estimated 2.7 million people on ART, elevating South Africa's prominence as a global leader in HIV treatment. In 2011, the Southern African HIV Clinicians Society published safer conception guidelines for people living with HIV (PLWH) and in 2013, the South African government published contraceptive guidelines highlighting the importance of SRH and fertility planning services for people living with HIV. Addressing unintended pregnancies, safer conception and maternal health issues is crucial for improving PLWH's SRH and combatting the global HIV epidemic. This paper explores South African policymakers' perspectives on public sector SRH-HIV policy integration, with a special focus on the need for national and regional policies on safer conception for PLWH and contraceptive guidelines implementation. It draws on 42 in-depth interviews with national, provincial and civil society policymakers conducted between 2008-2009 and 2011-2012, as the number of people on ART escalated. Interviews focused on three key domains: opinions on PLWH's childbearing; the status of SRH-HIV integration policies and services; and thoughts and suggestions on SRH-HIV integration within the restructuring of South African primary care services. Data were coded and analysed according to themes. Participants supported SRH-HIV integrated policy and services. However, integration challenges identified included a lack of policy and guidelines, inadequately trained providers, vertical programming, provider work overload, and a weak health system. Participants acknowledged that SRH-HIV integration policies, particularly for safer conception, contraception and cervical cancer, had been neglected. Policymakers supported public sector adoption of safer conception policy and services. Participants interviewed after expanded ART were more positive about safer conception policies for PLWH than participants interviewed earlier. The past decade's HIV policy changes have increased opportunities for SRH-HIV integration. The findings provide important insights for international, regional and national SRH-HIV policy and service integration initiatives.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-19
... Treatment of Property Used To Acquire Parent Stock or Securities in Certain Triangular Reorganizations... 367 of the Internal Revenue Code (Code) relating to the treatment of property used to acquire parent... subsidiary (S) purchases, in connection with the reorganization, stock of its parent corporation (P) in...
TRAC-PF1/MOD1 support calculations for the MIST/OTIS program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujita, R.K.; Knight, T.D.
1984-01-01
We are using the Transient Reactor Analysis Code (TRAC), specifically version TRAC-PF1/MOD1, to perform analyses in support of the MultiLoop Integral-System Test (MIST) and the Once-Through Integral-System (OTIS) experiment program. We have analyzed Geradrohr Dampferzeuger Anlage (GERDA) Test 1605AA to benchmark the TRAC-PF1/MOD1 code against phenomena expected to occur in a raised-loop B and W plant during a small-break loss-of-coolant accident (SBLOCA). These results show that the code can calculate both single- and two-phase natural circulation, flow interruption, boiler-condenser-mode (BCM) heat transfer, and primary-system refill in a B and W-type geometry with low-elevation auxiliary feedwater. 19 figures, 7 tables.
LEGO: A modular accelerator design code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Y.; Donald, M.; Irwin, J.
1997-08-01
An object-oriented accelerator design code has been designed and implemented in a simple and modular fashion. It contains all major features of its predecessors: TRACY and DESPOT. All physics of single-particle dynamics is implemented based on the Hamiltonian in the local frame of the component. Components can be moved arbitrarily in the three dimensional space. Several symplectic integrators are used to approximate the integration of the Hamiltonian. A differential algebra class is introduced to extract a Taylor map up to arbitrary order. Analysis of optics is done in the same way both for the linear and nonlinear case. Currently, the code is used to design and simulate the lattices of the PEP-II. It will also be used for the commissioning.
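As a generic illustration of the symplectic integration mentioned above (a textbook scheme, not code from LEGO, TRACY, or DESPOT), the second-order leapfrog method applied to a pendulum Hamiltonian H = p^2/2 - cos q keeps the energy error bounded over long integrations:

    import numpy as np

    def leapfrog(q, p, dH_dq, dt, steps):
        """Stormer-Verlet integrator for a separable Hamiltonian."""
        for _ in range(steps):
            p -= 0.5 * dt * dH_dq(q)   # half kick
            q += dt * p                # drift (dH/dp = p)
            p -= 0.5 * dt * dH_dq(q)   # half kick
        return q, p

    q, p = 1.0, 0.0
    E0 = 0.5 * p**2 - np.cos(q)
    q, p = leapfrog(q, p, np.sin, dt=0.05, steps=200000)
    print(abs(0.5 * p**2 - np.cos(q) - E0))   # stays small; no secular drift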
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, D.E.
1977-01-12
A code, SIGMA1, has been designed to Doppler broaden evaluated cross sections in the ENDF/B format. The code can only be applied to tabulated data that vary linearly in energy and cross section between tabulated points. This report describes the methods used in the code and serves as a user's guide to the code.
1981-12-01
file: library-file.library-unit{.subunit}.SYMAP Statement Map: library-file.library-unit{.subunit}.SMAP Type Map: library-file.library-unit{.subunit}.TMAP The library... generator SYMAP Symbol Map code generator SMAP Updated Statement Map code generator TMAP Type Map code generator A.3.5 The PUNIT Command The PUNIT... Core.Stmtmap) NAME Tmap (Core.Typemap) END Example A-3 Compiler Command Stream for the Code Generator Texas Instruments A-5 Ada Optimizing Compiler
Second Generation Integrated Composite Analyzer (ICAN) Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.
1993-01-01
This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
Galactic and solar radiation exposure to aircrew during a solar cycle.
Lewis, B J; Bennett, L G I; Green, A R; McCall, M J; Ellaschuk, B; Butler, A; Pierre, M
2002-01-01
An on-going investigation using a tissue-equivalent proportional counter (TEPC) has been carried out to measure the ambient dose equivalent rate of the cosmic radiation exposure of aircrew during a solar cycle. A semi-empirical model has been derived from these data to allow for the interpolation of the dose rate for any global position. The model has been extended to an altitude of up to 32 km with further measurements made on board aircraft and several balloon flights. The effects of changing solar modulation during the solar cycle are characterised by correlating the dose rate data to different solar potential models. Through integration of the dose-rate function over a great circle flight path or between given waypoints, a Predictive Code for Aircrew Radiation Exposure (PCAIRE) has been further developed for estimation of the route dose from galactic cosmic radiation exposure. This estimate is provided in units of ambient dose equivalent as well as effective dose, based on E/H*(10) scaling functions as determined from transport code calculations with LUIN and FLUKA. This experimentally based treatment has also been compared with the CARI-6 and EPCARD codes that are derived solely from theoretical transport calculations. Using TEPC measurements taken aboard the International Space Station, ground based neutron monitoring, GOES satellite data and transport code analysis, an empirical model has been further proposed for estimation of aircrew exposure during solar particle events. This model has been compared to results obtained during recent solar flare events.
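The route-dose idea reduces to integrating a dose-rate function along the great-circle flight path. A Python sketch follows; the dose-rate function here is an invented placeholder with roughly plausible magnitudes, not PCAIRE's model, and the waypoints are illustrative.

    import numpy as np

    def dose_rate_uSv_per_h(lat_deg, lon_deg, alt_km):
        # Placeholder: real flight codes tabulate this from transport
        # calculations or TEPC measurements.
        return 3.0 * (alt_km / 11.0) * (1.0 + 0.5 * np.abs(lat_deg) / 90.0)

    def great_circle(p1, p2, n):
        """n points interpolated along the great circle between (lat, lon) pairs."""
        lat1, lon1, lat2, lon2 = map(np.radians, (*p1, *p2))
        d = np.arccos(np.sin(lat1) * np.sin(lat2) +
                      np.cos(lat1) * np.cos(lat2) * np.cos(lon2 - lon1))
        f = np.linspace(0.0, 1.0, n)
        a = np.sin((1 - f) * d) / np.sin(d)
        b = np.sin(f * d) / np.sin(d)
        x = a * np.cos(lat1) * np.cos(lon1) + b * np.cos(lat2) * np.cos(lon2)
        y = a * np.cos(lat1) * np.sin(lon1) + b * np.cos(lat2) * np.sin(lon2)
        z = a * np.sin(lat1) + b * np.sin(lat2)
        return np.degrees(np.arctan2(z, np.hypot(x, y))), np.degrees(np.arctan2(y, x))

    lats, lons = great_circle((51.5, -0.5), (40.6, -73.8), 100)  # roughly LHR to JFK
    hours = np.linspace(0.0, 7.5, 100)
    rates = dose_rate_uSv_per_h(lats, lons, 11.0)
    dose = np.sum(0.5 * (rates[1:] + rates[:-1]) * np.diff(hours))  # trapezoid rule
    print(f"route dose ~ {dose:.1f} uSv")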
Parkinson, Rachel H; Little, Jacelyn M; Gray, John R
2017-04-20
Neonicotinoids are known to affect insect navigation and vision; however, the mechanisms of these effects are not fully understood. A visual motion sensitive neuron in the locust, the Descending Contralateral Movement Detector (DCMD), integrates visual information and is involved in eliciting escape behaviours. The DCMD receives coded input from the compound eyes and monosynaptically excites motorneurons involved in flight and jumping. We show that imidacloprid (IMD) impairs neural responses to visual stimuli at sublethal concentrations, and these effects are sustained two and twenty-four hours after treatment. Most significantly, IMD disrupted bursting, a coding property important for motion detection. Specifically, IMD reduced the DCMD peak firing rate within bursts at ecologically relevant doses of 10 ng/g (ng IMD per g locust body weight). Effects on DCMD firing translate to deficits in collision avoidance behaviours: exposure to 10 ng/g IMD attenuates escape manoeuvers while 100 ng/g IMD prevents the ability to fly and walk. We show that, at ecologically relevant doses, IMD causes significant and lasting impairment of an important pathway involved with visual sensory coding and escape behaviours. These results show, for the first time, that a neonicotinoid pesticide directly impairs an important, taxonomically conserved, motion-sensitive visual network.
NASA Astrophysics Data System (ADS)
Endrizzi, S.; Gruber, S.; Dall'Amico, M.; Rigon, R.
2013-12-01
This contribution describes the new version of GEOtop which emerges after almost eight years of development from the original version. GEOtop now integrates the 3D Richards equation with a new numerical method; improvements were made to the treatment of surface waters by using the shallow water equation. The freezing-soil module was greatly improved, and the evapotranspiration-vegetation modelling is now based on a double-layer scheme. Here we discuss the rationale for each choice that was made, and we compare the differences between the current solutions and the old solutions. In doing so, we highlight the issues that we faced during the development, including the trade-off between complexity and simplicity of the code, the requirements of a shared development, the different branches that were opened during the evolution of the code, and why we think that a code like GEOtop is indeed necessary. Models where the hydrological cycle is simplified can be built on the basis of perceptual models that neglect some fundamental aspects of the hydrological processes, of which some examples are presented. At the same time, process-based models like GEOtop can also neglect some fundamental process: but this is made evident by comparison with measurements, especially when data are imposed ex ante rather than calibrated.
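For context, the equation referred to above is the standard mixed form of the Richards equation (stated here from the general literature, not quoted from the paper), with θ the volumetric water content, ψ the pressure head, K(ψ) the hydraulic conductivity, z the elevation, and S a source/sink term:

    \frac{\partial \theta(\psi)}{\partial t}
      = \nabla \cdot \left[ K(\psi)\, \nabla(\psi + z) \right] + S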
Institutional Controls and Educational Research.
ERIC Educational Resources Information Center
Homan, Roger
1990-01-01
Recognizing tendencies toward contract research and possible consequences, advocates creating a conduct code to regulate educational research and protect its integrity. Reports survey responses from 48 British institutions, showing no systematic code. States confidence in supervisory discretion currently guides research. Proposes a specific code…
Energy System Basics and Distribution Integration Video Series | Energy
renewables, particularly solar photovoltaic (PV) technologies, onto the distribution grid. Related pages: Solar Energy Technologies; PV Integration Case Studies; Integrating Photovoltaic Systems onto Secondary Network Distribution Systems; Standards and Codes for U.S. Photovoltaic System Installation; Network-Optimal Control of Photovoltaics on
Finite element analysis of inviscid subsonic boattail flow
NASA Technical Reports Server (NTRS)
Chima, R. V.; Gerhart, P. M.
1981-01-01
A finite element code for analysis of inviscid subsonic flows over arbitrary nonlifting planar or axisymmetric bodies is described. The code solves a novel primitive variable formulation of the coupled irrotationality and compressible continuity equations. Results for flow over a cylinder, a sphere, and a NACA 0012 airfoil verify the code. Computed subcritical flows over an axisymmetric boattailed afterbody compare well with finite difference results and experimental data. Iterative coupling with an integral turbulent boundary layer code shows strong viscous effects on the inviscid flow. Improvements in code efficiency and extensions to transonic flows are discussed.
Structural Integrity of Water Reactor Pressure Boundary Components.
1980-08-01
Boiler and Pressure Vessel Code, Sec. III). Estimates of the upper shelf K level from small-specimen...from Appendix A of Section XI of the ASME Boiler and Pressure Vessel Code [1]. Figure 9 shows this same data set, together with earlier data for...0969, NRL Memorandum Report 4063, Sep. 1979. 1. Section XI - ASME Boiler and Pressure Vessel Code, Rules for Inservice Inspection of Nuclear
Integrated Modeling of the Battlespace Environment
2010-10-01
ESMF: the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and the global assimilation of ionospheric measurements (GAIM) forecast...ground-truth measurements for comparison with the solar wind predictions. Global Assimilation of Ionospheric Measurements: The GAIMv2.3 effort
NASA Technical Reports Server (NTRS)
Wade, Randall S.; Jones, Bailey
2009-01-01
A computer program loads configuration code into a Xilinx field-programmable gate array (FPGA), reads back and verifies that code, reloads the code if an error is detected, and monitors the performance of the FPGA for errors in the presence of radiation. The program consists mainly of a set of VHDL files (wherein "VHDL" signifies "VHSIC Hardware Description Language" and "VHSIC" signifies "very-high-speed integrated circuit").
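The load/read-back/verify/reload loop described above can be sketched as host-side pseudologic in Python; the MockFPGA class and its method names are invented stand-ins for illustration, not a real device API and not the program's actual VHDL.

    import random

    class MockFPGA:
        """Invented stand-in for a configuration interface."""
        def __init__(self):
            self.config = b""
        def load(self, bits):
            self.config = bits
        def read_back(self):
            # Occasionally flip one byte to emulate a radiation-induced upset.
            if self.config and random.random() < 0.1:
                i = random.randrange(len(self.config))
                return (self.config[:i] + bytes([self.config[i] ^ 1]) +
                        self.config[i + 1:])
            return self.config

    def scrub(fpga, golden, cycles=100):
        """Load config, verify by read-back, and reload on any mismatch."""
        fpga.load(golden)
        upsets = 0
        for _ in range(cycles):
            if fpga.read_back() != golden:   # error detected in config memory
                upsets += 1
                fpga.load(golden)            # reload the golden bitstream
        return upsets

    print(scrub(MockFPGA(), b"\x01\x02\x03\x04" * 16), "upsets corrected")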
Glez-Peña, Daniel; Díaz, Fernando; Hernández, Jesús M; Corchado, Juan M; Fdez-Riverola, Florentino
2009-06-18
Bioinformatics and medical informatics are two research fields that serve the needs of different but related communities. Both domains share the common goal of providing new algorithms, methods and technological solutions to biomedical research, and contributing to the treatment and cure of diseases. Although different microarray techniques have been successfully used to investigate useful information for cancer diagnosis at the gene expression level, the true integration of existing methods into day-to-day clinical practice is still a long way off. Within this context, case-based reasoning emerges as a suitable paradigm specially intended for the development of biomedical informatics applications and decision support systems, given the support and collaboration involved in such a translational development. With the goals of removing barriers against multi-disciplinary collaboration and facilitating the dissemination and transfer of knowledge to real practice, case-based reasoning systems have the potential to be applied to translational research mainly because their computational reasoning paradigm is similar to the way clinicians gather, analyze and process information in their own practice of clinical medicine. In addressing the issue of bridging the existing gap between biomedical researchers and clinicians who work in the domain of cancer diagnosis, prognosis and treatment, we have developed and made accessible a common interactive framework. Our geneCBR system implements a freely available software tool that allows the use of combined techniques that can be applied to gene selection, clustering, knowledge extraction and prediction for aiding diagnosis in cancer research. For biomedical researchers, geneCBR expert mode offers a core workbench for designing and testing new techniques and experiments. For pathologists or oncologists, geneCBR diagnostic mode implements an effective and reliable system that can diagnose cancer subtypes based on the analysis of microarray data using a CBR architecture. For programmers, geneCBR programming mode includes an advanced edition module for run-time modification of previously coded techniques. geneCBR is a new translational tool that can effectively support the integrative work of programmers, biomedical researchers and clinicians working together in a common framework. The code is freely available under the GPL license and can be obtained at http://www.genecbr.org.
Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language
NASA Astrophysics Data System (ADS)
Heaphy, R. T.; Burke, M. P.; Love, J. T.
2015-12-01
Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which have left voids in optimization, pre-, and post-processing tools. Advances in technology and our scientific understanding of environmental processes that have occurred over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (hpc) language; and (2) convert model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications and improvement of execution times by incorporating an intelligent network change detection tool are currently underway, and preliminary results will be presented.
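The conversion pattern described above, tight per-timestep loops compiled with Numba and model I/O kept in HDF5, can be sketched as follows. The routing kernel is a generic linear-reservoir example, not actual HSPF science, and the HDF5 layout is invented for illustration.

    import numpy as np
    import h5py
    from numba import njit

    @njit
    def route(inflow, k, dt):
        """Linear-reservoir routing: a stand-in for an HSPF-style kernel."""
        storage = np.zeros(inflow.size)
        for t in range(1, inflow.size):      # inherently sequential state update
            storage[t] = storage[t - 1] + dt * (inflow[t] - k * storage[t - 1])
        return storage

    # Model input and output live in one HDF5 file instead of legacy formats.
    with h5py.File("model_io.h5", "w") as f:
        f["input/inflow"] = np.random.rand(10_000)
    with h5py.File("model_io.h5", "r+") as f:
        f["output/storage"] = route(f["input/inflow"][:], 0.1, 1.0)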
NASA Technical Reports Server (NTRS)
Lin, Shu (Principal Investigator); Uehara, Gregory T.; Nakamura, Eric; Chu, Cecilia W. P.
1996-01-01
The (64, 40, 8) subcode of the third-order Reed-Muller (RM) code for high-speed satellite communications is proposed. The RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. The progress made toward achieving the goal of implementing a decoder system based upon this code is summarized. The development of the integrated circuit prototype sub-trellis IC, particularly focusing on the design methodology, is addressed.
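A short sketch of where such codes come from (the generic textbook construction of Reed-Muller generator matrices, not the decoder hardware described here): rows are the evaluations of all monomials of degree at most r in m binary variables. RM(3,6) gives a (64, 42, 8) code; deleting two rows leaves a (64, 40) subcode whose minimum distance is still at least 8.

    import numpy as np
    from itertools import combinations

    def reed_muller_generator(r, m):
        """Generator matrix of RM(r, m) over GF(2)."""
        pts = np.array([[(i >> j) & 1 for j in range(m)]
                        for i in range(2 ** m)])
        rows = []
        for deg in range(r + 1):
            for vs in combinations(range(m), deg):
                row = np.ones(2 ** m, dtype=int)
                for v in vs:                 # multiply coordinate functions
                    row = row * pts[:, v]
                rows.append(row)
        return np.array(rows)

    G = reed_muller_generator(3, 6)
    print(G.shape)   # (42, 64); any 40-row subset generates a (64, 40) subcode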
STAGS Developments for Residual Strength Analysis Methods for Metallic Fuselage Structures
NASA Technical Reports Server (NTRS)
Young, Richard D.; Rose, Cheryl A.
2014-01-01
A summary of advances in the Structural Analysis of General Shells (STAGS) finite element code for the residual strength analysis of metallic fuselage structures, realized through collaboration between the structures group at NASA Langley and Dr. Charles Rankin, is presented. The majority of the advancements described were made in the 1990's under the NASA Airframe Structural Integrity Program (NASIP). Example results from studies that were conducted using the STAGS code to develop improved understanding of the nonlinear response of cracked fuselage structures subjected to combined loads are presented. An integrated residual strength analysis methodology for metallic structure that models crack growth to predict the effect of cracks on structural integrity is demonstrated.
A Roadmap to Continuous Integration for ATLAS Software Development
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is the powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software, and for migrating to new platforms and compilers. The system evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open-source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.
Code Samples Used for Complexity and Control
NASA Astrophysics Data System (ADS)
Ivancevic, Vladimir G.; Reid, Darryn J.
2015-11-01
The following sections are included:
* Mathematica® Code: Generic Chaotic Simulator; Vector Differential Operators; NLS Explorer
* C++ Code: C++ Lambda Functions for Real Calculus; Accelerometer Data Processor; Simple Predictor-Corrector Integrator; Solving the BVP with the Shooting Method; Linear Hyperbolic PDE Solver; Linear Elliptic PDE Solver; Method of Lines for a Set of the NLS Equations
* C# Code: Iterative Equation Solver; Simulated Annealing: A Function Minimum; Simple Nonlinear Dynamics; Nonlinear Pendulum Simulator; Lagrangian Dynamics Simulator; Complex-Valued Crowd Attractor Dynamics
* Freeform Fortran Code: Lorenz Attractor Simulator; Complex Lorenz Attractor; Simple SGE Soliton; Complex Signal Presentation; Gaussian Wave Packet; Hermitian Matrices; Euclidean L2-Norm; Vector/Matrix Operations
* Plain C-Code: Levenberg-Marquardt Optimizer
* Free Basic Code: 2D Crowd Dynamics with 3000 Agents
Quasi 1D Modeling of Mixed Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Kopasakis, George; Connolly, Joseph W.; Paxson, Daniel E.; Woolwine, Kyle J.
2012-01-01
The AeroServoElasticity task under the NASA Supersonics Project is developing dynamic models of the propulsion system and the vehicle in order to conduct research on integrated vehicle dynamic performance. As part of this effort, a nonlinear quasi-1-dimensional model of the 2-dimensional bifurcated mixed compression supersonic inlet is being developed. The model utilizes computational fluid dynamics for both the supersonic and subsonic diffusers. The oblique shocks are modeled using compressible flow equations. The model also implements the variable geometry required to control the normal shock position, and it is flexible enough to simulate other mixed compression supersonic inlet designs. The model was validated in both the time and frequency domains against the legacy LArge Perturbation INlet code, which has been previously verified using test data. The legacy code, written in FORTRAN, is quite extensive and complex in terms of the amount of software and the number of subroutines; it is not suitable for closed-loop feedback controls design, and its simulation environment is not amenable to systems integration. The solution is therefore to develop a simpler mixed compression inlet model with the same steady-state and dynamic performance as the legacy code that can also be used for controls design. The new nonlinear dynamic model is implemented in MATLAB Simulink. This environment allows easier development of linear models for shock-positioning controls design. The new model is also well suited for integration with a propulsion system model to study inlet/propulsion system performance, and with an aero-servo-elastic system model to study integrated vehicle ride quality, vehicle stability, and efficiency.
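The oblique-shock modelling mentioned above rests on the standard compressible-flow relations. A minimal sketch of solving the theta-beta-Mach relation for the weak-shock angle and the downstream Mach number, assuming a calorically perfect gas (the function names and the SciPy root-finder are our choices, not part of the inlet model):

```python
import numpy as np
from scipy.optimize import brentq

def theta_from_beta(beta, M1, gamma=1.4):
    """Flow deflection angle (rad) from the theta-beta-Mach relation."""
    return np.arctan(2.0 / np.tan(beta) * (M1**2 * np.sin(beta)**2 - 1.0)
                     / (M1**2 * (gamma + np.cos(2.0 * beta)) + 2.0))

def weak_shock_beta(M1, theta, gamma=1.4):
    """Weak-solution shock angle (rad) for deflection angle theta (rad)."""
    mu = np.arcsin(1.0 / M1)   # Mach angle: lower bound for an attached shock
    return brentq(lambda b: theta_from_beta(b, M1, gamma) - theta,
                  mu + 1e-6, np.radians(65.0))

gamma, M1, theta = 1.4, 3.0, np.radians(10.0)
beta = weak_shock_beta(M1, theta, gamma)
Mn1 = M1 * np.sin(beta)                    # normal component upstream
Mn2 = np.sqrt((1 + 0.5 * (gamma - 1) * Mn1**2)
              / (gamma * Mn1**2 - 0.5 * (gamma - 1)))
M2 = Mn2 / np.sin(beta - theta)            # downstream Mach number
print(f"beta = {np.degrees(beta):.2f} deg, M2 = {M2:.3f}")
```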
Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.
Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic
2017-03-01
Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and an inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street Press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) of the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes, with average AUC scores of 0.79 and 0.70, respectively. For fine-grained coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
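The baseline comparison described above, an L1-penalized (lasso-style) logistic regression over text features evaluated by AUC, can be sketched as follows; the toy texts, labels, and feature extraction are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical session-level data: one text per session, one binary code label.
texts = ["i feel anxious about work", "we discussed medication changes",
         "sleep has improved this week", "worried and anxious most days"] * 25
labels = np.array([1, 0, 0, 1] * 25)   # e.g., presence of an anxiety code

X = CountVectorizer().fit_transform(texts)      # bag-of-words features
X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.3, random_state=0, stratify=labels)

# The L1 penalty yields the sparse, lasso-like baseline model.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```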
CACTI: Free, Open-Source Software for the Sequential Coding of Behavioral Interactions
Glynn, Lisa H.; Hallgren, Kevin A.; Houck, Jon M.; Moyers, Theresa B.
2012-01-01
The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713
Academic Integrity: Information Systems Education Perspective
ERIC Educational Resources Information Center
McHaney, Roger; Cronan, Timothy Paul; Douglas, David E.
2016-01-01
Academic integrity receives a great deal of attention in institutions of higher education. Universities and colleges provide specific honor codes or have administrative units to promote good behaviors and resolve dishonesty allegations. Students, faculty, and staff have stakes in maintaining high levels of academic integrity to ensure their…
Navier-Stokes analysis of cold scramjet-afterbody flows
NASA Technical Reports Server (NTRS)
Baysal, Oktay; Engelund, Walter C.; Eleshaky, Mohamed E.
1989-01-01
The progress of two efforts to code solutions of the Navier-Stokes equations is summarized. The first effort concerns a 3-D space-marching parabolized Navier-Stokes (PNS) code being modified to compute the supersonic mixing flow through an internal/external expansion nozzle with multicomponent gases. The 3-D PNS equations, coupled with a set of species continuity equations, are solved using an implicit finite difference scheme. The completed work includes modifying the code for four chemical species, computing the flow upstream of the upper cowl for a theoretical air mixture, developing an initial-plane solution for the inner nozzle region, computing the flow inside the nozzle for both a N2/O2 mixture and a Freon-12/Ar mixture, and plotting density-pressure contours for the inner nozzle region. The second effort concerns a full Navier-Stokes code whose species continuity equations account for the diffusion of multiple gases. This 3-D explicit afterbody code can use high-order numerical integration schemes such as the fourth-order MacCormack and Gottlieb-MacCormack schemes. Changes to the code include, but are not limited to: (1) internal/external flow capability; (2) new treatments of the cowl wall boundary conditions and relaxed computations around the cowl region and cowl tip; (3) entry of the thermodynamic and transport properties of Freon-12, Ar, O, and N; (4) modification of the Baldwin-Lomax turbulence model to account for turbulent eddies generated by cowl walls inside and external to the nozzle; and (5) adoption of a relaxation formula to account for turbulence in the mixing shear layer.
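To make the predictor-corrector integration family named above concrete, here is a minimal sketch of the classic second-order MacCormack scheme applied to 1-D linear advection; the fourth-order variant used in the afterbody code follows the same predictor-corrector pattern, and the grid, wave speed, and initial condition here are illustrative assumptions:

```python
import numpy as np

# 1-D linear advection u_t + a u_x = 0 on a periodic grid.
a, nx, cfl = 1.0, 200, 0.8
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = cfl * dx / a
u = np.exp(-200.0 * (x - 0.5) ** 2)   # initial Gaussian pulse

for _ in range(int(0.5 / dt)):
    # Predictor: forward difference.
    u_pred = u - a * dt / dx * (np.roll(u, -1) - u)
    # Corrector: backward difference on the predicted field, then average.
    u = 0.5 * (u + u_pred - a * dt / dx * (u_pred - np.roll(u_pred, 1)))
```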
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forest, E.; Bengtsson, J.; Reusch, M.F.
1991-04-01
The full power of Yoshida's technique is exploited to produce arbitrary-order implicit symplectic integrators and multi-map explicit integrators. The implicit integrator uses a characteristic function involving the force term alone. We also point out the usefulness of the plain Ruth algorithm for computing Taylor series maps using the techniques first introduced by Berz in his COSY-INFINITY code.
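To make Yoshida's composition technique concrete, here is a minimal sketch of the standard fourth-order explicit symplectic integrator obtained by composing three second-order leapfrog steps with Yoshida's coefficients; the harmonic-oscillator force is an illustrative assumption, not an example from the paper:

```python
def leapfrog(q, p, dt, force):
    """One second-order kick-drift-kick step (unit mass)."""
    p = p + 0.5 * dt * force(q)
    q = q + dt * p
    p = p + 0.5 * dt * force(q)
    return q, p

def yoshida4(q, p, dt, force):
    """Fourth-order step: three leapfrog substeps with Yoshida's weights."""
    w1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    w0 = 1.0 - 2.0 * w1        # = -2**(1/3) / (2 - 2**(1/3))
    for w in (w1, w0, w1):
        q, p = leapfrog(q, p, w * dt, force)
    return q, p

# Example: harmonic oscillator, force(q) = -q; energy error stays bounded.
q, p = 1.0, 0.0
for _ in range(10000):
    q, p = yoshida4(q, p, 0.1, lambda q: -q)
print(q, p, 0.5 * (p**2 + q**2))   # energy remains close to 0.5
```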
NASA Astrophysics Data System (ADS)
Jos, Sujit; Kumar, Preetam; Chakrabarti, Saswat
Orthogonal and quasi-orthogonal codes are an integral part of DS-CDMA-based cellular systems. Orthogonal codes are ideal for use in perfectly synchronous scenarios such as downlink cellular communication. Quasi-orthogonal codes are preferred over orthogonal codes in uplink communication, where perfect synchronization cannot be achieved. In this paper, we compare orthogonal and quasi-orthogonal codes in the presence of timing synchronization error. This gives insight into the synchronization demands of DS-CDMA systems employing the two classes of sequences. The synchronization error considered is smaller than the chip duration. Monte Carlo simulations have been carried out to verify the analytical and numerical results.
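A minimal sketch of the loss of orthogonality under a sub-chip timing offset, using Walsh-Hadamard spreading codes with the misalignment modelled as a sample shift of oversampled chip waveforms (the code length, oversampling factor, and user indices are illustrative assumptions, not the paper's simulation setup):

```python
import numpy as np
from scipy.linalg import hadamard

L, os = 32, 8                      # code length (chips), samples per chip
codes = hadamard(L)                # rows are mutually orthogonal Walsh codes
c0 = np.repeat(codes[3], os).astype(float)   # user 1 chip waveform
c1 = np.repeat(codes[5], os).astype(float)   # user 2 chip waveform

def norm_crosscorr(x, y, shift):
    """Normalized cross-correlation with y circularly shifted by `shift` samples."""
    return np.dot(x, np.roll(y, shift)) / len(x)

# Perfectly aligned Walsh codes are orthogonal (correlation 0); a fraction
# of a chip of misalignment introduces multiple-access interference.
for frac in (0.0, 0.25, 0.5):
    print(frac, norm_crosscorr(c0, c1, int(frac * os)))
```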
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Notably, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point to an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
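For context on the diffusion-based baseline that the transient model extends, a minimal sketch of the classical equivalent-sphere fractional-release solution, assuming a uniform initial gas concentration, a perfect-sink surface, and no continuous gas generation (deliberate simplifications; this is not the BISON or TRANSURANUS model):

```python
import numpy as np

def fractional_release(D, a, t, nterms=200):
    """Fraction of gas released from a sphere of radius a after time t.

    Series solution of the diffusion equation for an initially uniform
    concentration with zero surface concentration (Crank, 1975).
    """
    tau = D * t / a**2                       # dimensionless diffusion time
    n = np.arange(1, nterms + 1)
    return 1.0 - (6.0 / np.pi**2) * np.sum(
        np.exp(-(n * np.pi)**2 * tau) / n**2)

# Illustrative numbers only: D in m^2/s, grain radius a in m, t in s.
print(fractional_release(D=1e-19, a=5e-6, t=3.15e7))
```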
NASA Astrophysics Data System (ADS)
He, Lirong; Cui, Guangmang; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting
2015-03-01
Coded exposure photography makes motion deblurring a well-posed problem. The integration pattern of light is modulated by opening and closing the shutter within the exposure time, changing the traditional shutter frequency spectrum into a wider frequency band in order to preserve more image information in the frequency domain. The method used to search for the optimal code is significant in coded exposure. In this paper, an improved criterion for optimal code searching is proposed by analyzing the relationship between the code length and the number of ones in the code, and by considering the effect of noise on code selection with an affine noise model. The optimal code is then obtained with a genetic search algorithm based on the proposed selection criterion. Experimental results show that the time required to search for the optimal code decreases with the presented method, and that the restored image has better subjective quality and superior objective evaluation values.
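A minimal sketch of searching for a binary exposure code with a genetic-style algorithm, scoring candidates by the minimum DFT magnitude so the flutter-shutter blur kernel stays invertible; the population size, code length, and this exact fitness function are illustrative assumptions, not the paper's criterion:

```python
import numpy as np

rng = np.random.default_rng(0)
L, ones, pop_n, gens = 32, 16, 60, 300

def fitness(code):
    """Larger minimum DFT magnitude -> better-conditioned deblurring."""
    return np.abs(np.fft.fft(code)).min()

def random_code():
    c = np.zeros(L)
    c[rng.choice(L, ones, replace=False)] = 1.0
    return c

pop = [random_code() for _ in range(pop_n)]
for _ in range(gens):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: pop_n // 2]          # keep the fitter half
    children = []
    for p in parents:
        child = p.copy()
        # Mutation that preserves the number of open-shutter chops.
        i = rng.choice(np.where(child == 1)[0])
        j = rng.choice(np.where(child == 0)[0])
        child[i], child[j] = 0.0, 1.0
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(best.astype(int), fitness(best))
```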
Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil
2015-01-01
The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integrated structural gene annotation pipeline (ISGAP), which identified 54,165 protein-coding genes among 165,179 assembled transcripts totalling 203.0 Mb by eliminating the intron sequences. ISGAP performed reliable annotation, recognizing accurate gene structures based on reference proteins, and ab initio gene models of the assembled transcripts. Integrative functional annotation and gene-based SNP analysis revealed a whole biological repertoire of genes and transcriptomic variation in the onion. The method developed in this study provides a powerful tool for the construction of reference gene sets for organisms based solely on de novo transcriptome data. Furthermore, the reference genes and their variation described here for the onion represent essential tools for molecular breeding and gene cloning in Allium spp. PMID:25362073
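As a toy illustration of the kind of coding/non-coding separation the pipeline performs, here is a sketch that extracts the longest forward-strand open reading frame from an assembled transcript; it is a deliberately simplified stand-in for ISGAP's reference-protein and ab initio annotation, not part of that pipeline:

```python
import re

def longest_orf(seq):
    """Longest ATG..stop open reading frame on the forward strand."""
    seq, best = seq.upper(), ""
    # Lookahead allows overlapping candidate ORFs starting at each ATG.
    for m in re.finditer(r"(?=(ATG(?:[ACGT]{3})*?(?:TAA|TAG|TGA)))", seq):
        if len(m.group(1)) > len(best):
            best = m.group(1)
    return best

transcript = "GGGATGGCTTACCGATGAAATTTATGCCCGGGTAACC"   # made-up sequence
print(longest_orf(transcript))
```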
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, P. T.; Dickson, T. L.; Yin, S.
The current regulations to ensure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.
RNAcentral: an international database of ncRNA sequences
Williams, Kelly Porter
2014-10-28
The field of non-coding RNA biology has been hampered by the lack of availability of a comprehensive, up-to-date collection of accessioned RNA sequences. Here we present the first release of RNAcentral, a database that collates and integrates information from an international consortium of established RNA sequence databases. The initial release contains over 8.1 million sequences, including representatives of all major functional classes. A web portal (http://rnacentral.org) provides free access to data, search functionality, cross-references, source code and an integrated genome browser for selected species.
An integrated radiation physics computer code system.
NASA Technical Reports Server (NTRS)
Steyn, J. J.; Harris, D. W.
1972-01-01
An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.
Tamori, Akihiro; Nishiguchi, Shuhei; Shiomi, Susumu; Hayashi, Takehiro; Kobayashi, Sawako; Habu, Daiki; Takeda, Tadashi; Seki, Shuichi; Hirohashi, Kazuhiro; Tanaka, Hiromu; Kubo, Shoji
2005-08-01
Hepatocellular carcinoma (HCC) has been reported in patients in whom hepatitis C virus (HCV) was eliminated by interferon (IFN) therapy. We examined the pathogenesis of HCC in patients with a sustained viral response. Operable HCC developed in 7 of 342 patients cured of HCV infection by IFN monotherapy. No patient abused alcohol or had diabetes mellitus or obesity. Resected specimens of HCC were histologically evaluated. DNA extracted from the HCC was examined by polymerase chain reaction (PCR) to detect hepatitis B virus (HBV) DNA. HBV integration sites in the human genome were identified by cassette-ligation-mediated PCR. HBV DNA was not amplified in serum samples from any of the seven patients with HCC but was found in the liver in four patients. In these four patients, HBV DNA was integrated into the human genome of the HCC. In two of these patients, covalently closed circular HBV (cccHBV) was also detected. The patients with HBV DNA integration were free of HCV for more than 3 yr. In two of the three patients without HBV DNA integration, the surrounding liver showed cirrhosis. In the patients with HBV DNA integration, the liver had not progressed to cirrhosis. Three of the four tumors with HBV integration had one integration site each, located at chromosomes 11q12, 11q22-23, and 22q11, respectively. The other tumor had two integration sites, situated at chromosomes 11q13 and 14q32. At chromosome 11q12, HBV DNA was integrated into a protein-coding region of the genome, the function of which remains unclear. Integrated HBV DNA may play a role in hepatocarcinogenesis after the clearance of HCV by IFN treatment.
Assuring Structural Integrity in Army Systems
1985-02-28
power plants are: 1. American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Power Plant Components; 2. ASME Boiler and Pressure Vessel Code, Section XI, Rules for In-Service Inspection of Nuclear Power Plant Components; and 3.
Code modernization and modularization of APEX and SWAT watershed simulation models
USDA-ARS?s Scientific Manuscript database
SWAT (Soil and Water Assessment Tool) and APEX (Agricultural Policy / Environmental eXtender) are respectively large- and small-watershed simulation models derived from EPIC (Environmental Policy Integrated Climate), a field-scale agroecology simulation model. All three models are coded in FORTRAN an...
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability scheme organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
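A minimal sketch of the attribute-based approach described above, tagging an HDF5 dataset with self-describing metadata via h5py; the attribute names and the "conforms_to" label are illustrative assumptions, not an adopted fusion standard:

```python
import h5py
import numpy as np

with h5py.File("plasma_profile.h5", "w") as f:
    dset = f.create_dataset("Te", data=np.linspace(4.2, 0.1, 64))
    # Attributes make the file self-describing, so several codes can
    # interpret it without sharing a rigid schema.
    dset.attrs["units"] = "keV"
    dset.attrs["coordinate"] = "normalized poloidal flux"
    dset.attrs["conforms_to"] = "example-interop-v0"   # hypothetical tag

with h5py.File("plasma_profile.h5", "r") as f:
    te = f["Te"]
    print(te.attrs["units"], te[...].max())
```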
Parametric color coding of digital subtraction angiography.
Strother, C M; Bender, F; Deuerling-Zheng, Y; Royalty, K; Pulfer, K A; Baumgart, J; Zellerhoff, M; Aagaard-Kienitz, B; Niemann, D B; Lindstrom, M L
2010-05-01
Color has been shown to facilitate both visual search and recognition tasks. It was our purpose to examine the impact of a color-coding algorithm on the interpretation of 2D-DSA acquisitions by experienced and inexperienced observers. Twenty-six 2D-DSA acquisitions obtained as part of routine clinical care from subjects with a variety of cerebrovascular disease processes were selected from an internal database so as to include a variety of disease states (aneurysms, AVMs, fistulas, stenosis, occlusions, dissections, and tumors). Three experienced and 3 less experienced observers were each shown the acquisitions on a prerelease version of a commercially available double-monitor workstation (XWP, Siemens Healthcare). Acquisitions were presented first as a subtracted image series and then as a single composite color-coded image of the entire acquisition. Observers were then asked a series of questions designed to assess the value of the color-coded images for the following purposes: 1) to enhance their ability to make a diagnosis, 2) to have confidence in their diagnosis, 3) to plan a treatment, and 4) to judge the effect of a treatment. The results were analyzed by using 1-sample Wilcoxon tests. Color-coded images enhanced the ease of evaluating treatment success in >40% of cases (P < .0001). They also had a statistically significant impact on treatment planning, making planning easier in >20% of the cases (P = .0069). In >20% of the examples, color-coding made diagnosis and treatment planning easier for all readers (P < .0001). Color-coding also increased the confidence of diagnosis compared with the use of DSA alone (P = .056). The impact of this was greater for the naïve readers than for the expert readers. At no additional cost in x-ray dose or contrast medium, color-coding of DSA enhanced the conspicuity of findings on DSA images. It was particularly useful in situations in which there was a complex flow pattern and in evaluation of pre- and posttreatment acquisitions. Its full potential remains to be defined.
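The kind of parametric color-coding evaluated here can be sketched as mapping each pixel's time-to-peak contrast onto a color scale and compositing the whole acquisition into one image; the array shapes, random test data, and colormap choice below are illustrative assumptions, not the vendor's algorithm:

```python
import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical subtracted DSA series: (frames, rows, cols), higher = more contrast.
rng = np.random.default_rng(1)
frames, rows, cols = 20, 64, 64
series = rng.random((frames, rows, cols))

ttp = series.argmax(axis=0) / (frames - 1)   # time to peak, scaled to 0..1
opacity = series.max(axis=0)                 # peak contrast per pixel

rgba = mpl.colormaps["viridis"](ttp)   # hue encodes contrast arrival time
rgba[..., 3] = opacity                 # opacity encodes vessel conspicuity
plt.imshow(rgba)
plt.title("Composite color-coded DSA (sketch)")
plt.savefig("dsa_color_sketch.png")
```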
Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes
NASA Astrophysics Data System (ADS)
Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana
2013-03-01
The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entrains the challenge of creating a link between the two or more representations of the same trace. To be forensically sound, the two security aspects of integrity and authenticity especially need to be maintained at all times. Ensuring authenticity by technical means proves particularly challenging at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration into the conventional documentation of the collection of items of evidence (the bagging and tagging process). Using the QR code as a particular implementation of a bar code, together with a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al. [1]. We use the example of digital dactyloscopy as a forensic discipline where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace, extending the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability of the bar code and the verification of its contents. We can read the bar code with various devices despite its limited size of 42 x 42 mm and the rather large amount of embedded data. Furthermore, the QR code's error correction features help to recover the contents of damaged codes. Subsequently, our appended digital signature allows for detecting malicious manipulations of the embedded data.
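A minimal sketch of the linking idea: hash the digitized trace, bundle the digest with bagging-and-tagging metadata, and render it as a QR code. It uses the third-party `qrcode` package (assumed installed); the identifiers are hypothetical, and a real chain-of-custody system would append an asymmetric digital signature rather than a bare hash:

```python
import hashlib
import json

import qrcode   # third-party package, assumed installed

# Stand-in for the acquired digital representation of the physical trace;
# in practice this would be the raw bytes of the scan file.
trace_bytes = b"example scan payload"
digest = hashlib.sha256(trace_bytes).hexdigest()

label = {
    "evidence_id": "CASE-042/ITEM-07",       # hypothetical identifier
    "acquired_utc": "2013-01-15T10:32:00Z",  # hypothetical timestamp
    "sha256": digest,
}

# Render the metadata as a QR code to print on the evidence bag ("tagging").
qrcode.make(json.dumps(label)).save("evidence_label.png")
```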
40 CFR 268.40 - Applicability of treatment standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., biodegradation as defined by the technology code BIODG, carbon adsorption as defined by the technology code CARBN....42 Table 1 of this Part, for nonwastewaters; and, biodegradation as defined by the technology code...
40 CFR 268.40 - Applicability of treatment standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., biodegradation as defined by the technology code BIODG, carbon adsorption as defined by the technology code CARBN....42 Table 1 of this Part, for nonwastewaters; and, biodegradation as defined by the technology code...
40 CFR 268.40 - Applicability of treatment standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., biodegradation as defined by the technology code BIODG, carbon adsorption as defined by the technology code CARBN....42 Table 1 of this Part, for nonwastewaters; and, biodegradation as defined by the technology code...
System for loading executable code into volatile memory in a downhole tool
Hall, David R.; Bartholomew, David B.; Johnson, Monte L.
2007-09-25
A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.
1984-06-29
effort that requires hard copy documentation. As a result, there are generally numerous delays in providing current quality information. In the FoF...process have had fixed controls or were based on "hard-coded" information. A template, for example, is hard-coded information defining the shape of a...represents soft-coded control information. (Although manual handling of punch tapes still possesses some of the limitations of "hard-coded" controls
NASA Technical Reports Server (NTRS)
Padovan, J.; Adams, M.; Fertis, J.; Zeid, I.; Lam, P.
1982-01-01
Finite element codes are used to model the rotor-bearing-stator structures common in the turbine industry. Strategies are developed that enable available finite element codes to be used for engine dynamic simulation. The elements developed are benchmarked by incorporation into a general-purpose code (ADINA); the numerical characteristics of finite-element rotor-bearing-stator simulations are evaluated through the use of various explicit/implicit numerical integration operators; and the overall numerical efficiency of the procedure is improved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, D.E.
1978-07-04
The code SIGMA1 Doppler broadens evaluated cross sections in the ENDF/B format. The code can be applied only to data that vary linearly in energy and cross section between tabulated points. This report describes the methods used in the code and serves as a user's guide. 6 figures, 2 tables.
75 FR 61345 - Airworthiness Directives; Eclipse Aerospace, Inc. Model EA500 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-05
... faulty integration of hardware and software, which could result in unannunciated, uncommanded changes in..., altitude preselect, and/or transponder codes. We are issuing this AD to correct faulty integration of...
Astrophysics Source Code Library Enhancements
NASA Astrophysics Data System (ADS)
Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.
2015-09-01
The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and opens new possibilities for collaboration. This paper covers these recent changes to the ASCL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vay, J.-L.; Furman, M.A.; Azevedo, A.W.
2004-04-19
We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.
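The in-process coupling pattern described here can be sketched in a few lines: both "codes" live in one Python process and operate on the same NumPy arrays, so no copying or file exchange is needed. The module structure below is a toy stand-in for the POSINST/WARP roles, not their actual interfaces:

```python
import numpy as np

class ElectronSource:
    """Stand-in for the code that creates and kicks particles (POSINST's role)."""
    def emit(self, pos, vel, n):
        pos[:n] = np.random.rand(n, 3)
        vel[:n] = 0.0

class FieldSolver:
    """Stand-in for the code that advances particles in fields (WARP's role)."""
    def push(self, pos, vel, dt):
        vel += dt * 1e-3           # toy kick standing in for a field solve
        pos += dt * vel

# Key arrays are shared: both components mutate the same memory in place.
npart = 1000
pos = np.empty((npart, 3))
vel = np.empty((npart, 3))
source, solver = ElectronSource(), FieldSolver()
source.emit(pos, vel, npart)
for _ in range(10):
    solver.push(pos, vel, dt=0.01)
print(pos.mean(axis=0))
```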
NEAMS Update. Quarterly Report for October - December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, K.
2012-02-16
The Advanced Modeling and Simulation Office within the DOE Office of Nuclear Energy (NE) has been charged with revolutionizing the design tools used to build nuclear power plants during the next 10 years. To accomplish this, the DOE has brought together the national laboratories, U.S. universities, and the nuclear energy industry to establish the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program. The mission of NEAMS is to modernize computer modeling of nuclear energy systems and improve the fidelity and validity of modeling results using contemporary software environments and high-performance computers. NEAMS will create a set of engineering-level codes aimed at designing and analyzing the performance and safety of nuclear power plants and reactor fuels. The truly predictive nature of these codes will be achieved by modeling the governing phenomena at the spatial and temporal scales that dominate the behavior. These codes will be executed within a simulation environment that orchestrates code integration with respect to spatial meshing, computational resources, and execution to give the user a common 'look and feel' for setting up problems and displaying results. NEAMS is building upon a suite of existing simulation tools, including those developed by the federal Scientific Discovery through Advanced Computing and Advanced Simulation and Computing programs. NEAMS also draws upon existing simulation tools for materials and nuclear systems, although many of these are limited in terms of scale, applicability, and portability (their ability to be integrated into contemporary software and hardware architectures). NEAMS investments have directly and indirectly supported additional NE research and development programs, including those devoted to waste repositories, safeguarded separations systems, and long-term storage of used nuclear fuel. NEAMS is organized into two broad efforts, each comprising four elements. The quarterly highlights for October-December 2011 are:
(1) Version 1.0 of AMP, the fuel assembly performance code, was tested on the JAGUAR supercomputer and released on November 1, 2011; a detailed discussion of this new simulation tool is given.
(2) A coolant sub-channel model and a preliminary UO2 smeared-cracking model were implemented in BISON, the single-pin fuel code; more information on how these models were developed and benchmarked is given.
(3) The Object Kinetic Monte Carlo model was implemented to account for nucleation events in meso-scale simulations, and a discussion of the significance of this advance is given.
(4) The SHARP neutronics module, PROTEUS, was expanded to be applicable to all types of reactors, and a discussion of the importance of PROTEUS is given.
(5) A plan has been finalized for integrating the high-fidelity, three-dimensional reactor code SHARP with both the systems-level code RELAP7 and the fuel assembly code AMP. This is a new initiative.
(6) Work began to evaluate the applicability of AMP to the problem of dry storage of used fuel and to define a relevant problem to test the applicability.
(7) A code to obtain phonon spectra from the force-constant matrix for a crystalline lattice has been completed. This important bridge between subcontinuum and continuum phenomena is discussed.
(8) Benchmarking was begun on the meso-scale, finite-element fuels code MARMOT to validate its new variable splitting algorithm.
(9) A very computationally demanding simulation of diffusion-driven nucleation of new microstructural features has been completed. An explanation of the difficulty of this simulation is given.
(10) Experiments were conducted with deformed steel to validate a crystal plasticity finite-element code for body-centered cubic iron.
(11) The Capability Transfer Roadmap was completed and published as an internal laboratory technical report.
(12) The AMP fuel assembly code input generator was integrated into the NEAMS Integrated Computational Environment (NiCE); more details on the planned NEAMS computing environment are given.
(13) The NEAMS program website (neams.energy.gov) is nearly ready to launch.
The Adult Attachment Projective Picture System: integrating attachment into clinical assessment.
George, Carol; West, Malcolm
2011-01-01
This article summarizes the development and validation of the Adult Attachment Projective System (AAP), a measure we developed from the Bowlby-Ainsworth developmental tradition to assess adult attachment status. The AAP has demonstrated excellent concurrent validity with the Adult Attachment Interview (George, Kaplan, & Main, 1984/1985/1996; Main & Goldwyn, 1985-1994; Main, Goldwyn, & Hesse, 2003), interjudge reliability, and test-retest reliability, with no effects of verbal intelligence or social desirability. The AAP coding and classification system and application in clinical and community samples are summarized. Finally, we introduce the 3 other articles that are part of this Special Section and discuss the use of the AAP in therapeutic assessment and treatment.
High Speed Research Program Structural Acoustics Multi-Year Summary Report
NASA Technical Reports Server (NTRS)
Beier, Theodor H.; Bhat, Waman V.; Rizzi, Stephen A.; Silcox, Richard J.; Simpson, Myles A.
2005-01-01
This report summarizes the work conducted by the Structural Acoustics Integrated Technology Development (ITD) Team under NASA's High Speed Research (HSR) Phase II program from 1993 to 1999. It is intended to serve as a reference for future researchers by documenting the results of the interior noise and sonic fatigue technology development activities conducted during this period. For interior noise, these activities included excitation modeling, structural acoustic response modeling, development of passive treatments and active controls, and prediction of interior noise. For sonic fatigue, these activities included loads prediction, materials characterization, sonic fatigue code development, development of response reduction techniques, and generation of sonic fatigue design requirements. Also included are lessons learned and recommendations for future work.
Ship to Shore Data Communication and Prioritization
2011-12-01
...First Out; FTP - File Transfer Protocol; GCCS-M - Global Command and Control System Maritime; HAIPE - High Assurance Internet Protocol Encryptor; HTTP - Hypertext Transfer Protocol (world wide web protocol); IBS - Integrated Bar Code System; IDEF0 - Integration Definition; IER - Information Exchange Requirements; INTEL - Intelligence; IP - Internet Protocol; IPT - Integrated Product Team; ISEA - In-Service Engineering Agent; ISNS - Integrated Shipboard Network System; IT
ERIC Educational Resources Information Center
Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien
2013-01-01
This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…
Shapiro, Matthew L.
2017-01-01
Memory can inform goal-directed behavior by linking current opportunities to past outcomes. The orbitofrontal cortex (OFC) may guide value-based responses by integrating the history of stimulus–reward associations into expected outcomes, representations of predicted hedonic value and quality. Alternatively, the OFC may rapidly compute flexible “online” reward predictions by associating stimuli with the latest outcome. OFC neurons develop predictive codes when rats learn to associate arbitrary stimuli with outcomes, but the extent to which predictive coding depends on most recent events and the integrated history of rewards is unclear. To investigate how reward history modulates OFC activity, we recorded OFC ensembles as rats performed spatial discriminations that differed only in the number of rewarded trials between goal reversals. The firing rate of single OFC neurons distinguished identical behaviors guided by different goals. When >20 rewarded trials separated goal switches, OFC ensembles developed stable and anticorrelated population vectors that predicted overall choice accuracy and the goal selected in single trials. When <10 rewarded trials separated goal switches, OFC population vectors decorrelated rapidly after each switch, but did not develop anticorrelated firing patterns or predict choice accuracy. The results show that, whereas OFC signals respond rapidly to contingency changes, they predict choices only when reward history is relatively stable, suggesting that consecutive rewarded episodes are needed for OFC computations that integrate reward history into expected outcomes. SIGNIFICANCE STATEMENT Adapting to changing contingencies and making decisions engages the orbitofrontal cortex (OFC). Previous work shows that OFC function can either improve or impair learning depending on reward stability, suggesting that OFC guides behavior optimally when contingencies apply consistently. The mechanisms that link reward history to OFC computations remain obscure. Here, we examined OFC unit activity as rodents performed tasks controlled by contingencies that varied reward history. When contingencies were stable, OFC neurons signaled past, present, and pending events; when contingencies were unstable, past and present coding persisted, but predictive coding diminished. The results suggest that OFC mechanisms require stable contingencies across consecutive episodes to integrate reward history, represent predicted outcomes, and inform goal-directed choices. PMID:28115481
Blake, Margaret Lehman; Tompkins, Connie A.; Scharp, Victoria L.; Meigh, Kimberly M.; Wambaugh, Julie
2014-01-01
Coarse coding is the activation of broad semantic fields that can include multiple word meanings and a variety of features, including those peripheral to a word’s core meaning. It is a partially domain-general process related to general discourse comprehension and contributes to both literal and non-literal language processing. Adults with damage to the right cerebral hemisphere (RHD) and a coarse coding deficit are particularly slow to activate features of words that are relatively distant or peripheral. This manuscript reports a pre-efficacy study of Contextual Constraint Treatment (CCT), a novel, implicit treatment designed to increase the efficiency of coarse coding with the goal of improving narrative comprehension and other language performance that relies on coarse coding. Participants were four adults with RHD. The study used a single-subject controlled experimental design across subjects and behaviors. The treatment involves pre-stimulation, using a hierarchy of strong- and moderately-biased contexts, to prime the intended distantly-related features of critical stimulus words. Three of the four participants exhibited gains in auditory narrative discourse comprehension, the primary outcome measure. All participants exhibited generalization to untreated items. No strong generalization to processing nonliteral language was evident. The results indicate that CCT yields both improved efficiency of the coarse coding process and generalization to narrative comprehension. PMID:24983133
A Dual Origin of the Xist Gene from a Protein-Coding Gene and a Set of Transposable Elements
Elisaphenko, Eugeny A.; Kolesnikov, Nikolay N.; Shevchenko, Alexander I.; Rogozin, Igor B.; Nesterova, Tatyana B.; Brockdorff, Neil; Zakian, Suren M.
2008-01-01
X-chromosome inactivation, which occurs in female eutherian mammals is controlled by a complex X-linked locus termed the X-inactivation center (XIC). Previously it was proposed that genes of the XIC evolved, at least in part, as a result of pseudogenization of protein-coding genes. In this study we show that the key XIC gene Xist, which displays fragmentary homology to a protein-coding gene Lnx3, emerged de novo in early eutherians by integration of mobile elements which gave rise to simple tandem repeats. The Xist gene promoter region and four out of ten exons found in eutherians retain homology to exons of the Lnx3 gene. The remaining six Xist exons including those with simple tandem repeats detectable in their structure have similarity to different transposable elements. Integration of mobile elements into Xist accompanies the overall evolution of the gene and presumably continues in contemporary eutherian species. Additionally we showed that the combination of remnants of protein-coding sequences and mobile elements is not unique to the Xist gene and is found in other XIC genes producing non-coding nuclear RNA. PMID:18575625
Bahadori, Amir A; Sato, Tatsuhiko; Slaba, Tony C; Shavers, Mark R; Semones, Edward J; Van Baalen, Mary; Bolch, Wesley E
2013-10-21
NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID) are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.
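The effective dose computation mentioned here is, at its core, a tissue-weighted sum of organ dose equivalents, E = sum over tissues T of w_T * H_T. A minimal sketch with an illustrative, incomplete subset of ICRP-style tissue weighting factors and made-up organ doses (this is not NASA's actual code chain):

```python
# Effective dose E = sum_T w_T * H_T (dose equivalents in Sv).
# The weights below are an illustrative subset of ICRP tissue weighting
# factors; a real calculation uses the full ICRP set and organ dose
# equivalents produced by radiation transport for each phantom.
organ_dose_equivalents_sv = {   # hypothetical transport-code output
    "lung": 0.012, "stomach": 0.011, "colon": 0.010,
    "bone_marrow": 0.013, "breast": 0.009,
}
tissue_weights = {
    "lung": 0.12, "stomach": 0.12, "colon": 0.12,
    "bone_marrow": 0.12, "breast": 0.12,
}
effective_dose = sum(tissue_weights[t] * h
                     for t, h in organ_dose_equivalents_sv.items())
print(f"effective dose (partial sum): {effective_dose:.4f} Sv")
```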
Technical integration of hippocampus, Basal Ganglia and physical models for spatial navigation.
Fox, Charles; Humphries, Mark; Mitchinson, Ben; Kiss, Tamas; Somogyvari, Zoltan; Prescott, Tony
2009-01-01
Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large-scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.
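A minimal sketch of the leaky-integrator, rate-coded unit mentioned above, advanced with forward-Euler steps; the time constant, step size, and input are illustrative assumptions, not parameters from the integrated model:

```python
import numpy as np

def simulate_leaky_integrator(inputs, dt=0.001, tau=0.02):
    """Forward-Euler integration of tau * da/dt = -a + I(t)."""
    a, rates = 0.0, []
    for current in inputs:
        a += dt / tau * (-a + current)
        rates.append(max(a, 0.0))   # rectified rate-coded output
    return np.array(rates)

t = np.arange(0.0, 0.5, 0.001)
stimulus = ((t > 0.1) & (t < 0.3)).astype(float)   # step input, 100-300 ms
rates = simulate_leaky_integrator(stimulus)
print(rates.max())   # activation approaches the input amplitude
```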
The Italian Code of Medical Deontology: characterizing features of its 2014 edition.
Conti, Andrea Alberto
2015-09-14
The latest edition of the Italian Code of Medical Deontology was released by the Italian Federation of the Registers of Physicians and Dentists in May 2014 (1). The previous edition of the Italian Code dated back to 2006 (2); it has been integrated and updated by a multi-professional and inter-disciplinary panel involving, besides physicians, representatives of scientific societies and trade unions, jurisconsults, and experts in bioethics....
Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code
1979-06-01
dose rate was then integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs)... A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. This device was
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Scullin, V. J.
1984-01-01
A general chemical kinetics code is described for complex, homogeneous ideal-gas reactions in any chemical system. The main features of the GCKP84 code are flexibility, convenience, and speed of computation for many different reaction conditions. The code, which replaces the previously published GCKP code, numerically solves the differential equations for complex reactions in a batch system or one-dimensional inviscid flow. It also numerically solves the nonlinear algebraic equations describing the well-stirred reactor. A new state-of-the-art numerical integration method is used for greatly increased speed in handling systems of stiff differential equations. The theory and the computer program, including details of input preparation and a guide to using the code, are given.
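Stiff chemical kinetics of the kind GCKP84 handles are solved today with implicit integrators. A minimal sketch using SciPy's BDF method on the classic Robertson three-species problem (a textbook stiff mechanism, not one of GCKP84's test cases):

```python
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson kinetics: a classic stiff three-species mechanism."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
            0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2**2,
            3.0e7 * y2**2]

# An implicit multistep (BDF) method handles the widely separated time
# scales; an explicit integrator would need absurdly small steps here.
sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-6, atol=1e-10)
print(sol.y[:, -1])   # species fractions; their sum stays ~1
```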
Barnett, Miya L; Niec, Larissa N; Peer, Samuel O; Jent, Jason F; Weinstein, Allison; Gisbert, Patricia; Simpson, Gregory
2017-01-01
Although behavioral parent training is considered efficacious treatment for childhood conduct problems, not all families benefit equally from treatment. Some parents take longer to change their behaviors and others ultimately drop out. Understanding how therapist behaviors impact parental engagement is necessary to improve treatment utilization. This study investigated how different techniques of therapist in vivo feedback (i.e., coaching) influenced parent attrition and skill acquisition in parent-child interaction therapy (PCIT). Participants included 51 parent-child dyads who participated in PCIT. Children (age: M = 5.03, SD = 1.65) were predominantly minorities (63% White Hispanic, 16% African American or Black). Eight families discontinued treatment prematurely. Therapist coaching techniques during the first session of treatment were coded using the Therapist-Parent Interaction Coding System, and parent behaviors were coded with the Dyadic Parent-Child Interaction Coding System, Third Edition. Parents who received more responsive coaching acquired child-centered parenting skills more quickly. Therapists used fewer responsive techniques and more drills with families who dropped out of treatment. A composite of therapist behaviors accurately predicted treatment completion for 86% of families. Although group membership was correctly classified for the treatment completers, only 1 dropout was accurately predicted. Findings suggest that therapist in vivo feedback techniques may impact parents' success in PCIT and that responsive coaching may be particularly relevant.
A model that integrates eye velocity commands to keep track of smooth eye displacements.
Blohm, Gunnar; Optican, Lance M; Lefèvre, Philippe
2006-08-01
Past results have reported conflicting findings on the oculomotor system's ability to keep track of smooth eye movements in darkness. Whereas some results indicate that saccades cannot compensate for smooth eye displacements, others report that memory-guided saccades during smooth pursuit are spatially correct. Recently, it was shown that the amount of time before the saccade made a difference: short-latency saccades were retinotopically coded, whereas long-latency saccades were spatially coded. Here, we propose a model of the saccadic system that can explain the available experimental data. The novel part of this model consists of a delayed integration of efferent smooth eye velocity commands. Two alternative physiologically realistic neural mechanisms for this integration stage are proposed. Model simulations accurately reproduced prior findings. Thus, this model reconciles the earlier contradictory reports from the literature about compensation for smooth eye movements before saccades because it involves a slow integration process.
van den Berg, Ronald; Roerdink, Jos B T M; Cornelissen, Frans W
2010-01-22
An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
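A minimal sketch of the population-coding intuition behind the model: orientation signals from a target and a nearby flanker are pooled, and decoding the pooled population yields a "compulsorily averaged" orientation. The tuning width and the doubling of angles to handle orientation's 180-degree periodicity are our illustrative choices, not the paper's exact formulation:

```python
import numpy as np

def population_response(theta_deg, pref_deg, kappa=4.0):
    """Von Mises tuning over orientation (period 180 deg -> double angles)."""
    delta = np.radians(2.0 * (theta_deg - pref_deg))
    return np.exp(kappa * (np.cos(delta) - 1.0))

prefs = np.arange(0.0, 180.0, 1.0)   # preferred orientations (deg)
target, flanker = 20.0, 60.0

# Crowding modeled as integration: responses to target and flanker pool.
pooled = (population_response(target, prefs)
          + population_response(flanker, prefs))

# Population-vector decoding in double-angle space.
angles = np.radians(2.0 * prefs)
decoded = np.degrees(np.arctan2((pooled * np.sin(angles)).sum(),
                                (pooled * np.cos(angles)).sum())) / 2.0
print(decoded)   # lies between 20 and 60 deg: compulsory averaging
```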
Concreteness and relational effects on recall of adjective-noun pairs.
Paivio, A; Khan, M; Begg, I
2000-09-01
Extending previous research on the problem, we studied the effects of concreteness and relatedness of adjective-noun pairs on free recall, cued recall, and memory integration. Two experiments varied the attributes in paired associates lists or sentences. Consistent with predictions from dual coding theory and prior results with noun-noun pairs, both experiments showed that the effects of concreteness were strong and independent of relatedness in free recall and cued recall. The generally positive effects of relatedness were absent in the case of free recall of sentences. The two attributes also had independent (additive) effects on integrative memory as measured by conditionalized free recall of pairs. Integration as measured by the increment from free to cued recall occurred consistently only when pairs were high in both concreteness and relatedness. Explanations focused on dual coding and relational-distinctiveness processing theories as well as task variables that affect integration measures.
NASA Astrophysics Data System (ADS)
Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.
2016-02-01
A new IAEA Coordinated Research Project (CRP) aims to test, validate and improve the IRDF library. Among the isotopes of interest, the modelling of the 238U capture and fission cross sections represents a challenging task. A new description of the neutron-induced reactions of 238U in the fast energy range is in progress within the framework of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort, utilizing the 238U spectral indices measurements and post-irradiation experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code used to propagate the uncertainties of the integral data onto the 238U cross sections of interest for dosimetry applications.
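Integral data assimilation of this kind is Bayesian; a generic generalized-least-squares update, the standard scheme behind such assimilation, is sketched below. This is an illustration of the technique, not the CONRAD source, and all variable names are assumptions.

```python
import numpy as np

def gls_update(x0, P0, residual, Vy, S):
    """Generic generalized-least-squares (Bayesian) update used in integral
    data assimilation (illustrative sketch, not CONRAD code).

    x0       : prior cross-section parameters
    P0       : prior parameter covariance
    residual : integral measurements minus calculated values
    Vy       : covariance of the integral data (experiment + modeling)
    S        : sensitivity matrix, d(calculated)/d(parameters)
    """
    K = P0 @ S.T @ np.linalg.inv(S @ P0 @ S.T + Vy)  # gain matrix
    x1 = x0 + K @ residual                           # posterior parameters
    P1 = P0 - K @ S @ P0                             # reduced covariance
    return x1, P1
```

The posterior covariance P1 is never larger than the prior, which is how integral experiments constrain the evaluated cross-section uncertainties.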
Data integration to prioritize drugs using genomics and curated data.
Louhimo, Riku; Laakso, Marko; Belitskin, Denis; Klefström, Juha; Lehtonen, Rainer; Hautaniemi, Sampsa
2016-01-01
Genomic alterations affecting drug target proteins occur in several tumor types and are prime candidates for patient-specific tailored treatments. Increasingly, patients likely to benefit from targeted cancer therapy are selected based on molecular alterations. The selection of a precision therapy benefiting most patients is challenging but can be enhanced with integration of multiple types of molecular data. Data integration approaches for drug prioritization have successfully integrated diverse molecular data but do not take full advantage of existing data and literature. We have built a knowledge-base which connects data from public databases with molecular results from over 2200 tumors, signaling pathways and drug-target databases. Moreover, we have developed a data mining algorithm to effectively utilize this heterogeneous knowledge-base. Our algorithm is designed to facilitate retargeting of existing drugs by stratifying samples and prioritizing drug targets. We analyzed 797 primary tumors from The Cancer Genome Atlas breast and ovarian cancer cohorts using our framework. FGFR, CDK and HER2 inhibitors were prioritized in breast and ovarian data sets. Estrogen receptor positive breast tumors showed potential sensitivity to targeted inhibitors of FGFR due to activation of FGFR3. Our results suggest that computational sample stratification selects potentially sensitive samples for targeted therapies and can aid in precision medicine drug repositioning. Source code is available from http://csblcanges.fimm.fi/GOPredict/.
Parallel CARLOS-3D code development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Putnam, J.M.; Kotulski, J.D.
1996-02-01
CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing new capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations are shown for some complex scatterers.
MacPherson, Hugh; Thomas, Kate
2008-04-01
In the literature on acupuncture research, the active (or specific) component of acupuncture is almost always presented as acupuncture needling alone. However, specific components, by definition, should include all interventions driven by acupuncture theory that are also believed to be causally associated with outcome. In this paper, we explore the delivery of self-help advice as a component of the process of acupuncture care, and discuss the implications for future trial designs. In a nested qualitative study, six acupuncturists were interviewed about the treatments they provided within a pragmatic clinical trial. The acupuncturists practised individualised acupuncture according to traditional principles. Audiotapes were transcribed and coded and the contents analysed by case and by theme. The analysis focuses on a priori and emergent themes associated with the process of delivering self-help advice as described by the practitioners. Individualised self-help advice is seen by practitioners as being an integral part of the acupuncture treatment that they provide for patients with low back pain. Several categories of generic advice were described; all were embedded in the acupuncture diagnosis. These included: movement, exercise and stretching to move 'qi stagnation'; rest in cases of 'qi deficiency'; diet when the digestive system was compromised; and protection from the elements where indicated by the diagnosis, e.g. Bi Syndrome. According to the practitioners, longer-term benefits require the active participation of patients in their self-care. Simplified concepts derived from acupuncture theory, such as 'stagnation' and 'energy', are employed as an integral part of the process of care, in order to engage patients in lifestyle changes, help them to understand their condition, and to see ways in which they can help themselves. Within acupuncture care, self-help advice is not seen as an 'add-on' but rather as an integral and interactive component of a theory-based complex intervention. Studies designed to evaluate the overall effectiveness of traditional acupuncture should accommodate the full range of therapeutic components, strategies and related patient-centred treatment processes. In acupuncture trials, non-needling components, such as self-help advice, when drawn directly from the diagnosis and integral to the process of care, should not be misclassified as incidental, non-specific, or placebo if we are to accurately assess the value of treatment as delivered.
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is a novel approach. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
Theory-based model for the pedestal, edge stability and ELMs in tokamaks
NASA Astrophysics Data System (ADS)
Pankin, A. Y.; Bateman, G.; Brennan, D. P.; Schnack, D. D.; Snyder, P. B.; Voitsekhovitch, I.; Kritz, A. H.; Janeschitz, G.; Kruger, S.; Onjun, T.; Pacher, G. W.; Pacher, H. D.
2006-04-01
An improved model for triggering edge localized mode (ELM) crashes is developed for use within integrated modelling simulations of the pedestal and ELM cycles at the edge of H-mode tokamak plasmas. The new model is developed by using the BALOO, DCON and ELITE ideal MHD stability codes to derive parametric expressions for the ELM triggering threshold. The whole toroidal mode number spectrum is studied with these codes. The DCON code applies to low mode numbers, while the BALOO code applies to only high mode numbers and the ELITE code applies to intermediate and high mode numbers. The variables used in the parametric stability expressions are the normalized pressure gradient and the parallel current density, which drive ballooning and peeling modes. Two equilibria motivated by DIII-D geometry with different plasma triangularities are studied. It is found that the stable region in the high triangularity discharge covers a much larger region of parameter space than the corresponding stability region in the low triangularity discharge. The new ELM trigger model is used together with a previously developed model for pedestal formation and ELM crashes in the ASTRA integrated modelling code to follow the time evolution of the temperature profiles during ELM cycles. The ELM frequencies obtained in the simulations of low and high triangularity discharges are observed to increase with increasing heating power. There is a transition from second stability to first ballooning mode stability as the heating power is increased in the high triangularity simulations. The results from the ideal MHD stability codes are compared with results from the resistive MHD stability code NIMROD.
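As a schematic of how such a parametric trigger can sit inside an integrated modelling loop, the check below flags an ELM crash when the pedestal state crosses a combined peeling-ballooning boundary. The threshold form here is hypothetical, standing in only for the published parametric expressions derived from the BALOO, DCON and ELITE results.

```python
def elm_triggered(alpha, j_par, alpha_crit, j_crit):
    """Illustrative ELM-trigger check (hypothetical boundary shape, not the
    paper's fitted parametrization). An ELM crash is flagged when the
    pedestal crosses a stability boundary expressed in the normalized
    pressure gradient (alpha, ballooning drive) and the parallel current
    density (j_par, peeling drive)."""
    # Toy coupled boundary: unstable when the combined normalized drives
    # exceed unity; alpha_crit and j_crit would come from the stability-code
    # fits and depend on shaping (e.g., triangularity).
    return (alpha / alpha_crit) ** 2 + (j_par / j_crit) ** 2 >= 1.0

# Example: high-triangularity equilibria would effectively enlarge
# alpha_crit, shrinking the unstable region of parameter space.
print(elm_triggered(alpha=2.8, j_par=0.9, alpha_crit=3.0, j_crit=1.2))
```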
Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki
2009-10-01
To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
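The quoted computation efficiency follows from the standard speedup and efficiency definitions for a cluster; a minimal sketch is below, with timing numbers invented purely for illustration.

```python
def parallel_efficiency(t_serial, t_parallel, n_cpus):
    """Standard parallel speedup and efficiency (generic definitions;
    the timing values used below are illustrative, not the MCVS data)."""
    speedup = t_serial / t_parallel
    return speedup, speedup / n_cpus

s, e = parallel_efficiency(t_serial=3600.0, t_parallel=115.0, n_cpus=32)
print(f"speedup {s:.1f}x, efficiency {e:.1%}")  # ~31.3x, ~97.8%
```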
NASA Astrophysics Data System (ADS)
Hardy, Luke A.; Chang, Chun-Hung; Myers, Erinn M.; Kennelly, Michael J.; Fried, Nathaniel M.
2016-02-01
Treatment of female stress urinary incontinence (SUI) by laser thermal remodeling of subsurface tissues is studied. Light transport, heat transfer, and thermal damage simulations were performed for transvaginal and transurethral methods. Monte Carlo (MC) provided absorbed photon distributions in tissue layers (vaginal wall, endopelvic fascia, urethral wall). Optical properties (n,μa,μs,g) were assigned to each tissue at λ=1064 nm. A 5-mm-diameter laser beam with a power of 5 W applied for 15 s was used, based on previous experiments. MC output was converted into absorbed energy, serving as input for ANSYS finite element heat transfer simulations of tissue temperatures over time. Convective heat transfer was simulated with a contact cooling probe set at 0 °C. Thermal properties (κ,c,ρ) were assigned to each tissue layer. MATLAB code was used for Arrhenius integral thermal damage calculations. A temperature matrix was constructed from ANSYS output, and a finite sum was incorporated to approximate the Arrhenius integral calculations. Tissue damage properties (Ea,A) were used to compute Arrhenius sums. For the transvaginal approach, 37% of energy was absorbed in the endopelvic fascia layer with 0.8% deposited beyond it. Peak temperature was 71°C, the treatment zone was 0.8 mm in diameter, and almost all of the 2.7-mm-thick vaginal wall was preserved. For the transurethral approach, 18% of energy was absorbed in the endopelvic fascia with 0.3% deposited beyond it. Peak temperature was 80°C, the treatment zone was 2.0 mm in diameter, and only 0.6 mm of the 2.4-mm-thick urethral wall was preserved. A transvaginal approach is more feasible than a transurethral approach for laser treatment of SUI.
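The finite-sum Arrhenius step can be reproduced in a few lines. The sketch below implements the standard damage integral with placeholder tissue parameters; A and Ea here are illustrative values, not the study's fitted properties.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius_damage(T, dt, A, Ea):
    """Finite-sum approximation of the Arrhenius thermal damage integral,
    Omega(t) = integral of A * exp(-Ea / (R * T(tau))) dtau,
    in the spirit of the MATLAB post-processing step described above.

    T  : temperature history in kelvin, one sample per time step
    dt : time step (s)
    A  : frequency factor (1/s)      -- placeholder value below
    Ea : activation energy (J/mol)   -- placeholder value below
    """
    return np.sum(A * np.exp(-Ea / (R * np.asarray(T)))) * dt

# Omega >= 1 is the conventional threshold for irreversible thermal damage.
T_history = np.linspace(310.0, 353.0, 1500)  # 37 -> 80 C over 15 s
omega = arrhenius_damage(T_history, dt=0.01, A=1.0e75, Ea=5.0e5)
print(omega)
```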
Software for Collaborative Engineering of Launch Rockets
NASA Technical Reports Server (NTRS)
Stanley, Thomas Troy
2003-01-01
The Rocket Evaluation and Cost Integration for Propulsion and Engineering software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.
Kashikar-Zuck, Susmita; Tran, Susan T; Barnett, Kimberly; Bromberg, Maggie H; Strotman, Daniel; Sil, Soumitri; Thomas, Staci M; Joffe, Naomi; Ting, Tracy V; Williams, Sara E; Myer, Gregory D
2016-01-01
Adolescents with juvenile fibromyalgia (JFM) are typically sedentary despite recommendations for physical exercise, a key component of pain management. Interventions such as cognitive-behavior therapy (CBT) are beneficial but do not improve exercise participation. The objective of this study was to obtain preliminary information about the feasibility, safety, and acceptability of a new intervention, Fibromyalgia Integrative Training for Teens (FIT Teens), which combines CBT with specialized neuromuscular exercise training modified from evidence-based injury prevention protocols. Participants were 17 adolescent females (aged 12 to 18 y) with JFM. Of these, 11 completed the 8-week (16-session) FIT Teens program in a small-group format with 3 to 4 patients per group. Patients provided detailed qualitative feedback via individual semistructured interviews after treatment. Interview content was coded using thematic analysis. Interventionist feedback about treatment implementation was also obtained. The intervention was found to be feasible, well tolerated, and safe for JFM patients. Barriers to enrollment (50% of those approached) included difficulties with transportation or time conflicts. Treatment completers enjoyed the group format and reported increased self-efficacy, strength, and motivation to exercise. Participants also reported decreased pain and increased energy levels. Feedback from participants and interventionists was incorporated into a final treatment manual to be used in a future trial. Results of this study provided initial support for the new FIT Teens program. An integrative strategy combining pain coping skills training via CBT with tailored exercise specifically designed to improve confidence in movement and activity participation holds promise in the management of JFM.
Positive futures? The impact of HIV infection on achieving health, wealth and future planning.
Harding, Richard; Molloy, Tim
2008-05-01
Although HIV is now cast as a chronic condition with favourable clinical outcomes under new treatments, it is unclear how living with HIV affects expectations and planning for the future. This mixed-methods study aimed to investigate UK gay men's expectations of their own future when living with HIV, and to identify the health and social interventions required to enhance roles, participation and personal fulfilment. A preliminary focus group identified relevant domains of enquiry for a subsequent online cross-sectional survey. A total of 347 gay men living in the UK with HIV participated in the survey, and 56.6% were currently on treatment. However, high 7-day prevalence of psychological and physical symptoms was identified (42.6% in pain, 80.2% worrying); 57.8% perceived reduced career options due to their infection and 71.8% reduced life expectancy. Being on treatment was not significantly associated with perceived life expectancy. Coded open-ended survey data identified eight principal themes related to goal planning and attainment. The integrated open and closed data items offer an understanding of barriers and challenges: poor mental health due to clinical inattention, discrimination and stigma; poor career and job opportunities due to benefit and workplace inflexibility and lack of understanding; and a lack of personal goals and an associated skills deficit related to confidence and self-esteem. Gay men living with HIV require an integrated holistic approach to wellbeing that incorporates clinical, social and individual intervention in order to lead productive lives with maximum benefit from treatment gains.
Zaytsev, Yury V; Morrison, Abigail
2012-01-01
High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.
Semantic Integration and Age of Acquisition Effects in Code-Blend Comprehension
ERIC Educational Resources Information Center
Giezen, Marcel R.; Emmorey, Karen
2016-01-01
Semantic and lexical decision tasks were used to investigate the mechanisms underlying code-blend facilitation: the finding that hearing bimodal bilinguals comprehend signs in American Sign Language (ASL) and spoken English words more quickly when they are presented together simultaneously than when each is presented alone. More robust…
49 CFR 178.337-3 - Structural integrity.
Code of Federal Regulations, 2011 CFR
2011-10-01
... stress at any point in the cargo tank may not exceed the maximum allowable stress value prescribed in... ASME Code or the ASTM standard to which the material is manufactured. (3) The maximum design stress at... ASME Code. The cargo tank design must include calculation of stresses generated by design pressure, the...
[The QR code in society, economy and medicine--fields of application, options and chances].
Flaig, Benno; Parzeller, Markus
2011-01-01
2D codes like the QR Code ("Quick Response") are becoming more and more common in society and medicine. The application spectrum and benefits in medicine and other fields are described. 2D codes can be created free of charge on any computer with internet access without any previous knowledge. The codes can be easily used in publications, presentations, on business cards and posters. Editors choose between contact details, text or a hyperlink as information behind the code. At expert conferences, linkage by QR Code allows the audience to download presentations and posters quickly. The documents obtained can then be saved, printed, processed etc. Fast access to stored data in the internet makes it possible to integrate additional and explanatory multilingual videos into medical posters. In this context, a combination of different technologies (printed handout, QR Code and screen) may be reasonable.
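In practice, generating such a code takes only a few lines. For example, in Python with the third-party qrcode package (the URL below is a placeholder):

```python
# Generate a QR code that links a printed poster to supplementary material
# online. Requires the third-party "qrcode" package (pip install qrcode);
# the target URL is a placeholder for illustration.
import qrcode

img = qrcode.make("https://example.org/poster-supplement")
img.save("poster_qr.png")  # embed this PNG in the poster or handout
```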
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation and code application. For model development, this has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed PDF model development and chemical kinetics development. It is expected that this work will continue under the new grant.
Primary care nurses' performance in motivational interviewing: a quantitative descriptive study.
Östlund, Ann-Sofi; Kristofferzon, Marja-Leena; Häggström, Elisabeth; Wadensten, Barbro
2015-07-25
Motivational interviewing is a collaborative conversational style intended to strengthen motivation to change. It has been shown to be effective in addressing many different lifestyle problems as well as in chronic disease management, and many disease prevention guidelines promote use of motivational interviewing. The aim of the present study was twofold: to assess to what extent the primary care nurses in the study perform motivational interviewing according to the Motivational Interviewing Treatment Integrity Code, and to investigate how the participating primary care nurses rated their own performance in motivational interviewing. The study was based on twelve primary care nurses' audio-recorded motivational interviewing sessions with patients (32 sessions in total). After each session, the nurses completed a questionnaire regarding their experience of their own performance in motivational interviewing. The audio-recorded sessions were analyzed using the Motivational Interviewing Treatment Integrity Code 3.1.1. None of the nurses achieved beginning proficiency in all parts of any motivational interviewing session, and two nurses did not achieve beginning proficiency in any parts or sessions. Making more complex than simple reflections was the specific verbal behavior/summary score that most nurses achieved. Beginning proficiency/competency in "percent open questions" was the summary score that fewest achieved. Primary care nurses did not achieve beginning proficiency/competency in all aspects of motivational interviewing in their recorded sessions with patients in which lifestyle change was discussed. This indicates a need for improvement and thus additional training, feedback and supervision in clinical practice with motivational interviewing.
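Two of the MITI summary scores mentioned above can be computed directly from behavior counts. A small sketch follows; the beginning-proficiency thresholds shown are the commonly cited benchmarks and are marked as assumptions to be checked against the MITI 3.1.1 manual.

```python
def miti_summary(open_q, closed_q, simple_refl, complex_refl):
    """Sketch of two MITI summary scores computed from session behavior
    counts (illustrative; thresholds are assumed benchmarks, verify
    against the MITI 3.1.1 manual)."""
    pct_open = open_q / (open_q + closed_q) * 100
    pct_complex = complex_refl / (simple_refl + complex_refl) * 100
    return {
        "percent_open_questions": pct_open,
        "meets_open_q_benchmark": pct_open >= 50.0,      # assumed threshold
        "percent_complex_reflections": pct_complex,
        "meets_complex_benchmark": pct_complex >= 40.0,  # assumed threshold
    }

# Counts below are invented for illustration.
print(miti_summary(open_q=6, closed_q=14, simple_refl=5, complex_refl=7))
```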
Wallace, Sarah J; Worrall, Linda; Rose, Tanya; Le Dorze, Guylaine
2017-11-12
This study synthesised the findings of three separate consensus processes exploring the perspectives of key stakeholder groups about important aphasia treatment outcomes. This process was conducted to generate recommendations for outcome domains to be included in a core outcome set for aphasia treatment trials. International Classification of Functioning, Disability, and Health codes were examined to identify where the groups of: (1) people with aphasia, (2) family members, (3) aphasia researchers, and (4) aphasia clinicians/managers, demonstrated congruence in their perspectives regarding important treatment outcomes. Codes were contextualized using qualitative data. Congruence across three or more stakeholder groups was evident for ICF chapters: Mental functions; Communication; and Services, systems, and policies. Quality of life was explicitly identified by clinicians/managers and researchers, while people with aphasia and their families identified outcomes known to be determinants of quality of life. Core aphasia outcomes include: language, emotional wellbeing, communication, patient-reported satisfaction with treatment and impact of treatment, and quality of life. International Classification of Functioning, Disability, and Health coding can be used to compare stakeholder perspectives and identify domains for core outcome sets. Pairing coding with qualitative data may ensure important nuances of meaning are retained. Implications for rehabilitation: The outcomes measured in treatment research should be relevant to stakeholders and support health care decision making. Core outcome sets (agreed, minimum sets of outcomes and outcome measures) are increasingly being used to ensure the relevancy and consistency of the outcomes measured in treatment studies. Important aphasia treatment outcomes span all components of the International Classification of Functioning, Disability, and Health. Stakeholders demonstrated congruence in the identification of important outcomes which related to Mental functions; Communication; Services, systems, and policies; and Quality of life. A core outcome set for aphasia treatment research should include measures relating to: language, emotional wellbeing, communication, patient-reported satisfaction with treatment and impact of treatment, and quality of life. Coding using the International Classification of Functioning, Disability, and Health presents a novel methodology for the comparison of stakeholder perspectives to inform recommendations for outcome constructs to be included in a core outcome set. Coding can be paired with qualitative data to ensure nuances of meaning are retained.
NASA Technical Reports Server (NTRS)
Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.
1982-01-01
Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady-state response of rotor-bearing-stator structures associated with gas turbine engines are outlined. The two main areas aim at (1) implementing the squeeze-film damper element into a general-purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach to FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit methodologies of direct integration, and demonstration problems.
Tsuyuki, Kiyomi; Surratt, Hilary L.; Levi-Minzi, Maria A.; O’Grady, Catherine L.; Kurtz, Steven P.
2014-01-01
The diversion of antiretroviral medications (ARVs) has implications for the integrity and success of HIV care, however little is known about the ARV illicit market. This paper aimed to identify the motivations for buying illicit ARVs and to describe market dynamics. Semi-structured interviews (n=44) were conducted with substance-involved individuals living with HIV with a history of purchasing ARVs on the street. Grounded theory was used to code and analyze interviews. Motivations for buying ARVs on the illicit market were: to repurchase ARVs after having diverted them for money or drugs; having limited access or low quality health care; to replace lost or ruined ARVs; and to buy a back-up stock of ARVs. This study identified various structural barriers to HIV treatment and ARV adherence that incentivized ARV diversion. Findings highlight the need to improve patient-provider relationships, ensure continuity of care, and integrate services to engage and retain high-needs populations. PMID:25092512
Methadone, Buprenorphine and Preferences for Opioid Agonist Treatment: A Qualitative Analysis
Yarborough, Bobbi Jo H.; Stumbo, Scott P.; McCarty, Dennis; Mertens, Jennifer; Weisner, Constance; Green, Carla A.
2016-01-01
Background Patients and clinicians have begun to recognize the advantages and disadvantages of buprenorphine relative to methadone, but factors that influence choices between these two medications remain unclear. For example, we know little about how patients’ preferences and previous experiences influence treatment decisions. Understanding these issues may enhance treatment engagement and retention. Methods Adults with opioid dependence (n = 283) were recruited from two integrated health systems to participate in interviews focused on prior experiences with treatment for opioid dependence, knowledge of medication options, preferences for treatment, and experiences with treatment for chronic pain in the context of problems with opioids. Interviews were audio-recorded, transcribed verbatim, and coded using Atlas.ti. Results Our analysis revealed seven areas of consideration for opioid agonist treatment decision-making: 1) awareness of treatment options; 2) expectations and goals for duration of treatment and abstinence; 3) prior experience with buprenorphine or methadone; 4) need for accountability and structured support; 5) preference to avoid methadone clinics or associated stigma; 6) fear of continued addiction and perceived difficulty of withdrawal; and 7) pain control. Conclusion The availability of medication options increases the need for clear communication between clinicians and patients, for additional patient education about these medications, and for collaboration and patient influence over choices in treatment decision-making. Our results suggest that access to both methadone and buprenorphine will increase treatment options and patient choice and may enhance treatment adherence and outcomes. PMID:26796596
HZETRN: A heavy ion/nucleon transport code for space radiations
NASA Technical Reports Server (NTRS)
Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.
1991-01-01
The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
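In codes of the HZETRN family, the Boltzmann equation is reduced, under the straight-ahead and continuous-slowing-down approximations, to a one-dimensional marching form along the shield depth x. A standard statement of this form follows; the notation is assumed here for illustration.

```latex
\left[\frac{\partial}{\partial x}
      - \frac{\partial}{\partial E}\,\tilde{S}_j(E)
      + \sigma_j(E)\right]\phi_j(x,E)
  = \sum_{k} \int_{E}^{\infty} \sigma_{jk}(E,E')\,\phi_k(x,E')\,\mathrm{d}E'
```

Here \(\phi_j\) is the flux of particle type \(j\) at depth \(x\) and energy \(E\), \(\tilde{S}_j\) the scaled stopping power, \(\sigma_j\) the total macroscopic cross section, and \(\sigma_{jk}\) the differential cross section for producing type \(j\) from collisions of type \(k\).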
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
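A minimal sketch of what such a REST endpoint can look like is given below, using Flask. The route, the FIPS key, and the domain scores are hypothetical placeholders for illustration, not the actual HWBI API.

```python
# Minimal REST endpoint sketch in the spirit of the abstract above.
# Route names, FIPS keys, and scores are hypothetical placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

# Placeholder well-being domain scores keyed by a hypothetical county FIPS code.
FAKE_SCORES = {"55025": {"health": 52.1, "safety": 61.4, "leisure": 48.9}}

@app.get("/hwbi/v1/scores/<fips>")
def get_scores(fips):
    scores = FAKE_SCORES.get(fips)
    if scores is None:
        return jsonify(error="unknown FIPS code"), 404
    return jsonify(fips=fips, domains=scores)

if __name__ == "__main__":
    app.run(port=5000)
```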
Integrated decision support systems for regulatory applications benefit from standardindustry practices such as code reuse, test-driven development, and modularization. Theseapproaches make meeting the federal government’s goals of transparency, efficiency, and quality assurance ...
Design of supercritical cascades with high solidity
NASA Technical Reports Server (NTRS)
Sanz, J. M.
1982-01-01
The method of complex characteristics of Garabedian and Korn was successfully used to design shockless cascades with solidities of up to one. A code was developed using this method and a new hodograph transformation of the flow onto an ellipse. This code allows the design of cascades with solidities of up to two and larger turning angles. The equations of potential flow are solved in a complex hodograph like domain by setting a characteristic initial value problem and integrating along suitable paths. The topology that the new mapping introduces permits a simpler construction of these paths of integration.
Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles
NASA Technical Reports Server (NTRS)
Wieting, A. R.
1979-01-01
The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.
A model of transverse fuel injection applied to the computation of supersonic combustor flow
NASA Technical Reports Server (NTRS)
Rogers, R. C.
1979-01-01
A two-dimensional, nonreacting flow model of the aerodynamic interaction of a transverse hydrogen jet within a supersonic mainstream has been developed. The model assumes profile shapes of mass flux, pressure, flow angle, and hydrogen concentration and produces downstream profiles of the other flow parameters under the constraints of the integrated conservation equations. These profiles are used as starting conditions for an existing finite difference parabolic computer code for the turbulent supersonic combustion of hydrogen. Integrated mixing and flow profile results obtained from the computer code compare favorably with existing data for the supersonic combustion of hydrogen.
Integration of Rotor Aerodynamic Optimization with the Conceptual Design of a Large Civil Tiltrotor
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.
2010-01-01
Coupling of aeromechanics analysis with vehicle sizing is demonstrated with the CAMRAD II aeromechanics code and NDARC sizing code. The example is optimization of cruise tip speed with rotor/wing interference for the Large Civil Tiltrotor (LCTR2) concept design. Free-wake models were used for both rotors and the wing. This report is part of a NASA effort to develop an integrated analytical capability combining rotorcraft aeromechanics, structures, propulsion, mission analysis, and vehicle sizing. The present paper extends previous efforts by including rotor/wing interference explicitly in the rotor performance optimization and implicitly in the sizing.
Yang, Y. M.; Geurts, M.; Smilowitz, J. B.; Sterpin, E.; Bednarz, B. P.
2015-01-01
Purpose: Several groups are exploring the integration of magnetic resonance (MR) image guidance with radiotherapy to reduce tumor position uncertainty during photon radiotherapy. The therapeutic gain from reducing tumor position uncertainty using intrafraction MR imaging during radiotherapy could be partially offset if the negative effects of magnetic field-induced dose perturbations are not appreciated or accounted for. The authors hypothesize that a more rotationally symmetric modality such as helical tomotherapy will permit a systematic mediation of these dose perturbations. This investigation offers a unique look at the dose perturbations due to a homogeneous transverse magnetic field during the delivery of Tomotherapy® Treatment System plans under varying degrees of rotational beamlet symmetry. Methods: The authors accurately reproduced treatment plan beamlet and patient configurations using the Monte Carlo code geant4. This code has a thoroughly benchmarked electromagnetic particle transport physics package well-suited for the radiotherapy energy regime. The three approved clinical treatment plans for this study were for a prostate, head and neck, and lung treatment. The dose heterogeneity index metric was used to quantify the effect of the dose perturbations to the target volumes. Results: The authors demonstrate the ability to reproduce the clinical dose–volume histograms (DVH) to within 4% dose agreement at each DVH point for the target volumes and most planning structures, and therefore are able to confidently examine the effects of transverse magnetic fields on the plans. The authors investigated field strengths of 0.35, 0.7, 1, 1.5, and 3 T. Changes to the dose heterogeneity index of 0.1% were seen in the prostate and head and neck cases, reflecting negligible dose perturbations to the target volumes; a change from 5.5% to 20.1% was observed for the lung case. Conclusions: This study demonstrated that the effect of external magnetic fields can be mitigated by exploiting a more rotationally symmetric treatment modality. PMID:25652485
Barth, Jürgen; Michlig, Nadja; Munder, Thomas
2014-01-01
Randomised controlled trials (RCTs) of psychotherapeutic interventions assume that the specific techniques used in treatments are responsible for changes in the client's symptoms. This assumption also holds true for meta-analyses, where evidence for specific interventions and techniques is compiled. However, it has also been argued that different treatments share important techniques and that an emerging consensus about useful treatment strategies is leading to a greater integration of treatments. This makes assumptions about the effectiveness of specific intervention ingredients questionable if the shared (common) techniques are more often used in interventions than are the unique techniques. This study investigated the unique or shared techniques in RCTs of cognitive-behavioural therapy (CBT) and short-term psychodynamic psychotherapy (STPP). Psychotherapeutic techniques were coded from 42 masked treatment descriptions of RCTs in the field of depression (1979–2010). CBT techniques were often used in studies identified as either CBT or STPP. However, STPP techniques were only used in STPP-identified studies. Empirical clustering of treatment descriptions did not confirm the original distinction of CBT versus STPP, but instead showed substantial heterogeneity within both approaches. Extraction of psychotherapeutic techniques from the treatment descriptions is feasible and could be used as a content-based approach to classify treatments in systematic reviews and meta-analyses. PMID:25750827
Reggiani, Claudio; Coppens, Sandra; Sekhara, Tayeb; Dimov, Ivan; Pichon, Bruno; Lufin, Nicolas; Addor, Marie-Claude; Belligni, Elga Fabia; Digilio, Maria Cristina; Faletra, Flavio; Ferrero, Giovanni Battista; Gerard, Marion; Isidor, Bertrand; Joss, Shelagh; Niel-Bütschi, Florence; Perrone, Maria Dolores; Petit, Florence; Renieri, Alessandra; Romana, Serge; Topa, Alexandra; Vermeesch, Joris Robert; Lenaerts, Tom; Casimir, Georges; Abramowicz, Marc; Bontempi, Gianluca; Vilain, Catheline; Deconinck, Nicolas; Smits, Guillaume
2017-07-19
Tissue-specific integrative omics has the potential to reveal new genic elements important for developmental disorders. Two pediatric patients with global developmental delay and intellectual disability phenotype underwent array-CGH genetic testing, both showing a partial deletion of the DLG2 gene. From independent human and murine omics datasets, we combined copy number variations, histone modifications, developmental tissue-specific regulation, and protein data to explore the molecular mechanism at play. Integrating genomics, transcriptomics, and epigenomics data, we describe two novel DLG2 promoters and coding first exons expressed in human fetal brain. Their murine conservation and protein-level evidence allowed us to produce new DLG2 gene models for human and mouse. These new genic elements are deleted in 90% of 29 patients (public and in-house) showing partial deletion of the DLG2 gene. The patients' clinical characteristics expand the neurodevelopmental phenotypic spectrum linked to DLG2 gene disruption to cognitive and behavioral categories. While protein-coding genes are regarded as well known, our work shows that integration of multiple omics datasets can unveil novel coding elements. From a clinical perspective, our work demonstrates that two new DLG2 promoters and exons are crucial for the neurodevelopmental phenotypes associated with this gene. In addition, our work brings evidence for the lack of cross-annotation in human versus mouse reference genomes and nucleotide versus protein databases.
Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil
2015-02-01
The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integrated structural gene annotation pipeline (ISGAP), which identified 54,165 protein-coding genes among 165,179 assembled transcripts totalling 203.0 Mb by eliminating the intron sequences. ISGAP performed reliable annotation, recognizing accurate gene structures based on reference proteins, and ab initio gene models of the assembled transcripts. Integrative functional annotation and gene-based SNP analysis revealed a whole biological repertoire of genes and transcriptomic variation in the onion. The method developed in this study provides a powerful tool for the construction of reference gene sets for organisms based solely on de novo transcriptome data. Furthermore, the reference genes and their variation described here for the onion represent essential tools for molecular breeding and gene cloning in Allium spp. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
A New Code SORD for Simulation of Polarized Light Scattering in the Earth Atmosphere
NASA Technical Reports Server (NTRS)
Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent
2016-01-01
We report a new publicly available radiative transfer (RT) code for numerical simulation of polarized light scattering in plane-parallel atmosphere of the Earth. Using 44 benchmark tests, we prove high accuracy of the new RT code, SORD (Successive ORDers of scattering). We describe capabilities of SORD and show run time for each test on two different machines. At present, SORD is supposed to work as part of the Aerosol Robotic NETwork (AERONET) inversion algorithm. For natural integration with the AERONET software, SORD is coded in Fortran 90/95. The code is available by email request from the corresponding (first) author or from ftp://climate1.gsfc.nasa.gov/skorkin/SORD/.
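The name SORD refers to the successive-orders-of-scattering technique, in which the radiation field is built up order by order. The toy scalar iteration below conveys the idea for an isotropically scattering homogeneous slab; it is an illustration under simplifying assumptions, not the SORD algorithm itself, which handles polarization and the full phase matrix.

```python
import numpy as np

def successive_orders(tau_grid, omega0, n_orders=20):
    """Toy scalar successive-orders-of-scattering iteration for a
    homogeneous, isotropically scattering slab (illustrative only).

    tau_grid : optical-depth levels
    omega0   : single-scattering albedo
    """
    dtau = np.diff(tau_grid).mean()
    # First order: direct beam attenuated to each level, scattered once.
    order = omega0 * np.exp(-tau_grid)
    total = order.copy()
    for _ in range(n_orders - 1):
        # Next order: previous order acts as the source, re-scattered and
        # attenuated between levels (crude rectangle-rule transfer).
        kernel = np.exp(-np.abs(tau_grid[:, None] - tau_grid[None, :]))
        order = 0.5 * omega0 * (kernel @ order) * dtau
        total += order
    return total

tau = np.linspace(0.0, 1.0, 21)
print(successive_orders(tau, omega0=0.9)[:3])  # field at the top levels
```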
Boltzmann Transport Code Update: Parallelization and Integrated Design Updates
NASA Technical Reports Server (NTRS)
Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.
2003-01-01
The ongoing effort to develop a web site for radiation analysis is expected to result in increased usage of the High Charge and Energy Transport Code HZETRN, so it is desirable to perform the requested calculations quickly and efficiently. Therefore the question arose, "Could the implementation of parallel processing speed up the calculations required?" To answer this question, two modifications of the HZETRN computer code were created. The first modification selected the shield materials Al(2219), then polyethylene, and then Al(2219); the modified Fortran code was labeled 1SSTRN.F. The second modification considered the shield materials CO2 and Martian regolith; this modified Fortran code was labeled MARSTRN.F.
NASA Technical Reports Server (NTRS)
Steyn, J. J.; Born, U.
1970-01-01
A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions measured with lithium-drifted germanium semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
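The iterative continuum treatment is in the spirit of classical spectrum-unfolding iterations. A generic van Cittert-style sketch is shown below as a stand-in technique; it is not the original FORTRAN code, and the response matrix is assumed to be known and normalized.

```python
import numpy as np

def unfold_van_cittert(measured, response, n_iter=50):
    """Generic iterative unfolding of a measured pulse-height spectrum
    given a detector response matrix (van Cittert-style iteration; an
    illustrative stand-in, not the 1970 code itself).

    measured : measured counts per channel
    response : response matrix mapping true flux -> measured counts
    """
    estimate = measured.copy()
    for _ in range(n_iter):
        # Correct the estimate by the residual between measurement and
        # the re-folded estimate.
        estimate = estimate + (measured - response @ estimate)
        estimate = np.clip(estimate, 0.0, None)  # keep fluxes non-negative
    return estimate
```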
Scherr, Karen A.; Fagerlin, Angela; Williamson, Lillie D.; Davis, J. Kelly; Fridman, Ilona; Atyeo, Natalie; Ubel, Peter A.
2016-01-01
Background Physicians' recommendations affect patients' treatment choices. However, most research relies on physicians' or patients' retrospective reports of recommendations, which offer a limited perspective and have limitations such as recall bias. Objective To develop a reliable and valid method to measure the strength of physician recommendations using direct observation of clinical encounters. Methods Clinical encounters (n = 257) were recorded as part of a larger study of prostate cancer decision making. We used an iterative process to create the 5-point Physician Recommendation Coding System (PhyReCS). To determine reliability, research assistants double-coded 50 transcripts. To establish content validity, we used one-way ANOVAs to determine whether relative treatment recommendation scores differed as a function of which treatment patients received. To establish concurrent validity, we examined whether patients' perceived treatment recommendations matched our coded recommendations. Results The PhyReCS was highly reliable (Krippendorff's alpha = .89, 95% CI [.86, .91]). The average relative treatment recommendation score for each treatment was higher for individuals who received that particular treatment. For example, the average relative surgery recommendation score was higher for individuals who received surgery versus radiation (mean difference = .98, SE = .18, p < .001) or active surveillance (mean difference = 1.10, SE = .14, p < .001). Patients' perceived recommendations matched coded recommendations 81% of the time. Conclusion The PhyReCS is a reliable and valid way to capture the strength of physician recommendations. We believe that the PhyReCS would be helpful for other researchers who wish to study physician recommendations, an important part of patient decision making. PMID:27343015
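Inter-rater reliability of double-coded transcripts of this kind can be checked with Krippendorff's alpha. A small sketch follows, using the third-party krippendorff Python package; the ratings are invented for illustration.

```python
# Checking double-coded transcripts for inter-rater reliability.
# Requires the third-party "krippendorff" package; ratings are made up.
import numpy as np
import krippendorff

# Rows = coders, columns = transcripts; values are 1-5 recommendation
# strengths, with np.nan where a coder did not rate a transcript.
ratings = np.array([
    [3, 5, 1, 4, 2, np.nan],
    [3, 4, 1, 4, 2, 5],
], dtype=float)

alpha = krippendorff.alpha(reliability_data=ratings,
                           level_of_measurement="ordinal")
print(round(alpha, 2))
```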
Horizontal vectorization of electron repulsion integrals.
Pritchard, Benjamin P; Chow, Edmond
2016-10-30
We present an efficient implementation of the Obara-Saika algorithm for the computation of electron repulsion integrals that utilizes vector intrinsics to calculate several primitive integrals concurrently in a SIMD vector. Initial benchmarks display a 2-4 times speedup with AVX instructions over comparable scalar code, depending on the basis set. Speedup over scalar code is found to be sensitive to the level of contraction of the basis set, and is best for (lA lB|lC lD) quartets when lD = 0 or lB = lD = 0, which makes such a vectorization scheme particularly suitable for density fitting. The basic Obara-Saika algorithm, how it is vectorized, and the performance bottlenecks are analyzed and discussed. © 2016 Wiley Periodicals, Inc.
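The flavor of horizontal vectorization — evaluating a whole batch of primitive quantities concurrently rather than one at a time — can be illustrated in NumPy with the zeroth-order Boys function, a kernel that appears in primitive ERI evaluation. This is only an analogy for the paper's AVX-intrinsics implementation, not a reproduction of it:

```python
import numpy as np
from scipy.special import erf

def boys_f0(t):
    """Zeroth-order Boys function F0(t), evaluated elementwise over a batch.
    Feeding a whole vector of primitive arguments through one call is the
    NumPy analogue of packing several primitive integrals into a SIMD lane."""
    t = np.asarray(t, dtype=float)
    small = t < 1e-12
    safe = np.where(small, 1.0, t)                 # avoid divide-by-zero
    f = 0.5 * np.sqrt(np.pi / safe) * erf(np.sqrt(safe))
    return np.where(small, 1.0, f)                 # F0(0) = 1 in the limit

print(boys_f0([0.0, 0.5, 5.0, 50.0]))
```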
Light Water Reactor Sustainability Program Status Report on the Grizzly Code Enhancements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novascone, Stephen R.; Spencer, Benjamin W.; Hales, Jason D.
2013-09-01
This report summarizes work conducted during fiscal year 2013 toward developing a full capability to evaluate fracture contour J-integrals in the Grizzly code. This is a progress report on ongoing work. A usable, limited capability has been developed that can evaluate these integrals on 2D geometry, without considering the effects of material nonlinearity, thermal stress, or large deformation. During the next fiscal year, this capability will be completed, and Grizzly will be capable of evaluating these contour integrals for 3D geometry, including the effects of thermal stress and large deformation. This report presents an overview of the approach used, along with a demonstration of the current capability in Grizzly, including a comparison with an analytical solution.
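For reference, the quantity such a capability computes is the classical Rice contour integral, which for a crack lying along x in 2D reads

\[
J \;=\; \int_{\Gamma}\left( W\,\mathrm{d}y \;-\; T_{i}\,\frac{\partial u_{i}}{\partial x}\,\mathrm{d}s \right),
\qquad T_{i} = \sigma_{ij}\, n_{j},
\]

where W is the strain energy density, Γ is a contour enclosing the crack tip, and n is its outward normal. Supporting thermal stress and large deformation is what forces the move from this simple contour form to equivalent domain-integral evaluations.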
2011-01-01
Background: There is increasing interest by chiropractors in North America regarding integration into mainstream healthcare; however, there is limited information about attitudes towards the profession among conventional healthcare providers, including orthopaedic surgeons. Methods: We administered a 43-item cross-sectional survey to 1000 Canadian and American orthopaedic surgeons that inquired about demographic variables and their attitudes towards chiropractic. Our survey included an option for respondents to include written comments, and our present analysis is restricted to these comments. Two reviewers, independently and in duplicate, coded all written comments using thematic analysis. Results: 487 surgeons completed the survey (response rate 49%), and 174 provided written comments. Our analysis revealed 8 themes and 24 sub-themes represented in surgeons' comments. Reported themes were: variability amongst chiropractors (n = 55); concerns with chiropractic treatment (n = 54); areas where chiropractic is perceived as effective (n = 43); unethical behavior (n = 43); patient interaction (n = 36); the scientific basis of chiropractic (n = 26); personal experiences with chiropractic (n = 21); and chiropractic training (n = 18). Common sub-themes endorsed by surgeons were diversity within the chiropractic profession as a barrier to increased interprofessional collaboration, endorsement for chiropractic treatment of musculoskeletal complaints, criticism for treatment of non-musculoskeletal complaints, and concern over whether chiropractic care was evidence-based. Conclusions: Our analysis identified a number of issues that will have to be considered by the chiropractic profession as part of its efforts to further integrate chiropractic into mainstream healthcare. PMID:21970333
ERIC Educational Resources Information Center
Naidoo, Devika
2010-01-01
This paper provides an analysis of the extent of integration at a historically advantaged school. A qualitative multi-method case study allowed for in-depth analysis of integration in the school. Bernstein's theory of code, classification, boundary and power framed the study. Data analysis showed that: racial desegregation was achieved at student…
Podlogar, Matthew C.; Novins, Douglas K.
2015-01-01
Research regarding the quality of behavioral health care for American Indian (AI) children and adolescents is extremely limited, and no study has considered the qualitative perspectives of the AI children receiving such services or those of their families. This pilot study investigated AI patient and family perspectives of what quality of care means to them. Data were drawn from interviews of the parents (n = 15) and youth (n = 11, if age 11 or older) of 16 children and adolescents who received treatment at three behavioral health programs serving AI communities. Interview transcripts were coded and analyzed for key themes that related to treatment structure, process, and outcomes. According to these participants, the principal indicator of treatment quality was “being able to trust the clinician.” The most valued treatment outcomes for improvement were the youth’s “self-efficacy and self-worth,” “functioning in school,” and “relationship with the family.” Future research is needed on how to best integrate these domains into specific and objective indicators for standardized quality of care assessments of AI child and adolescent behavioral health services. PMID:25961647
Matrix factorization-based data fusion for the prediction of lncRNA-disease associations.
Fu, Guangyuan; Wang, Jun; Domeniconi, Carlotta; Yu, Guoxian
2018-05-01
Long non-coding RNAs (lncRNAs) play crucial roles in complex disease diagnosis, prognosis, prevention and treatment, but only a small portion of lncRNA-disease associations have been experimentally verified. Various computational models have been proposed to identify lncRNA-disease associations by integrating heterogeneous data sources. However, existing models generally ignore the intrinsic structure of data sources or treat them as equally relevant, while they may not be. To accurately identify lncRNA-disease associations, we propose a Matrix Factorization based LncRNA-Disease Association prediction model (MFLDA in short). MFLDA decomposes data matrices of heterogeneous data sources into low-rank matrices via matrix tri-factorization to explore and exploit their intrinsic and shared structure. MFLDA can select and integrate the data sources by assigning different weights to them. An iterative solution is further introduced to simultaneously optimize the weights and low-rank matrices. Next, MFLDA uses the optimized low-rank matrices to reconstruct the lncRNA-disease association matrix and thus to identify potential associations. In 5-fold cross validation experiments to identify verified lncRNA-disease associations, MFLDA achieves an area under the receiver operating characteristic curve (AUC) of 0.7408, at least 3% higher than those given by state-of-the-art data fusion based computational models. An empirical study on identifying masked lncRNA-disease associations again shows that MFLDA can identify potential associations more accurately than competing models. A case study on identifying lncRNAs associated with breast, lung and stomach cancers shows that 38 out of 45 (84%) associations predicted by MFLDA are supported by recent biomedical literature, further proving the capability of MFLDA in identifying novel lncRNA-disease associations. MFLDA is a general data fusion framework, and as such it can be adopted to predict associations between other biological entities. The source code for MFLDA is available at: http://mlda.swu.edu.cn/codes.php?name=MFLDA. Contact: gxyu@swu.edu.cn. Supplementary data are available at Bioinformatics online.
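A minimal alternating-least-squares sketch of the tri-factorization machinery at the core of such models (generic, with hypothetical dimensions; MFLDA's per-source weighting and its specific optimization scheme are omitted):

```python
import numpy as np

def tri_factorize(X, k=10, l=8, n_iter=100, seed=0):
    """Alternating least-squares matrix tri-factorization X ~= F @ S @ G.T.
    A generic sketch of the low-rank machinery only; MFLDA additionally
    weights and fuses multiple heterogeneous data sources."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k)); S = rng.random((k, l)); G = rng.random((n, l))
    for _ in range(n_iter):
        F = X @ np.linalg.pinv(S @ G.T)                       # fix S, G; solve F
        G = (np.linalg.pinv(F @ S) @ X).T                     # fix F, S; solve G
        S = np.linalg.pinv(F) @ X @ np.linalg.pinv(G.T)       # fix F, G; solve S
    return F, S, G

# Toy binary association matrix (200 lncRNAs x 60 diseases, ~5% known links).
X = (np.random.default_rng(1).random((200, 60)) < 0.05).astype(float)
F, S, G = tri_factorize(X)
scores = F @ S @ G.T   # unobserved pairs are ranked by the reconstruction
```

Unobserved lncRNA-disease pairs are then ranked by the entries of the reconstructed association matrix, mirroring the prediction step described in the abstract.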
Jeong, Jong Seob; Cannata, Jonathan Matthew; Shung, K Kirk
2010-01-01
It was previously demonstrated that it is feasible to simultaneously perform ultrasound therapy and imaging of a coagulated lesion during treatment with an integrated transducer that is capable of high intensity focused ultrasound (HIFU) and B-mode ultrasound imaging. It was found that coded excitation and fixed notch filtering upon reception could significantly reduce interference caused by the therapeutic transducer. During HIFU sonication, the imaging signal generated with coded excitation and fixed notch filtering had a range side-lobe level of less than −40 dB, while traditional short-pulse excitation and fixed notch filtering produced a range side-lobe level of −20 dB. The shortcoming is, however, that relatively complicated electronics may be needed to utilize coded excitation in an array imaging system. It is for this reason that in this paper an adaptive noise canceling technique is proposed to improve image quality by minimizing not only the therapeutic interference, but also the remnant side-lobe ‘ripples’ when using the traditional short-pulse excitation. The performance of this technique was verified through simulation and experiments using a prototype integrated HIFU/imaging transducer. Although it is known that the remnant ripples are related to the notch attenuation value of the fixed notch filter, in reality it is difficult to find the optimal notch attenuation value due to changes in the targets or the media resulting from motion or different acoustic properties, even during one sonication pulse. In contrast, the proposed adaptive noise canceling technique is capable of optimally minimizing both the therapeutic interference and residual ripples without such constraints. The prototype integrated HIFU/imaging transducer is composed of three rectangular elements. The 6 MHz center element is used for imaging and the outer two identical 4 MHz elements work together to transmit the HIFU beam. Two HIFU elements of 14.4 mm × 20.0 mm dimensions could increase the temperature of the soft biological tissue from 55 °C to 71 °C within 60 s. Two types of experiments for simultaneous therapy and imaging were conducted to acquire a single scan-line and B-mode image with an aluminum plate and a slice of porcine muscle, respectively. The B-mode image was obtained using the single element imaging system during HIFU beam transmission. The experimental results proved that the combination of the traditional short-pulse excitation and the adaptive noise canceling method could significantly reduce therapeutic interference and remnant ripples and thus may be a better way to implement real-time simultaneous therapy and imaging. PMID:20224162
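The adaptive noise canceling idea can be sketched with a standard LMS canceller. This is a generic illustration, assuming a reference channel correlated with the HIFU interference (e.g., a tap of the therapeutic drive signal); it is not the authors' exact receive chain:

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=32, mu=1e-3):
    """Classic LMS adaptive noise canceller: an FIR filter shapes the
    reference channel to match the interference embedded in the primary
    (imaging) channel; the error output is the cleaned echo signal."""
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # most recent reference samples
        y = w @ x                           # interference estimate
        e = primary[n] - y                  # cleaned imaging sample
        w += mu * e * x                     # LMS weight update
        out[n] = e
    return out

# Toy demonstration: narrowband interference on top of a weak echo signal.
rng = np.random.default_rng(0)
n = 5000
interference = np.sin(2 * np.pi * 0.2 * np.arange(n))
echo = 0.1 * rng.standard_normal(n)
cleaned = lms_cancel(echo + interference, interference)
```

Convergence depends on the step size mu relative to the reference power and tap count, which is the practical tuning burden this class of filters carries.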
A Developmental Approach to the Teaching of Ethical Decision Making.
ERIC Educational Resources Information Center
Neukrug, Edward S.
1996-01-01
Examines the newly adopted code of ethics, reviews some ethical decision-making models, and hypothesizes how the maturity of a student might mediate the effective use of codes and of decision-making models. Provides a model for human service educators that integrates ethical guidelines and ethical decision-making models. (RJM)
49 CFR 178.338-3 - Structural integrity.
Code of Federal Regulations, 2010 CFR
2010-10-01
... calculated design stress at any point in the tank may not exceed the lesser of the maximum allowable stress... Code or the ASTM standard to which the material is manufactured. (3) The maximum design stress at any... ASME Code (IBR, see § 171.7 of this subchapter). The tank design must include calculation of stress due...
49 CFR 178.345-3 - Structural integrity.
Code of Federal Regulations, 2010 CFR
2010-10-01
... requirements and acceptance criteria. (1) The maximum calculated design stress at any point in the cargo tank wall may not exceed the maximum allowable stress value prescribed in Section VIII of the ASME Code (IBR... Code or the ASTM standard to which the material is manufactured. (3) The maximum design stress at any...
Codes of Ethics in Australian Education: Towards a National Perspective
ERIC Educational Resources Information Center
Forster, Daniella J.
2012-01-01
Teachers have a dual moral responsibility as both values educators and moral agents representing the integrity of the profession. Codes of ethics and conduct in teaching articulate shared professional values and aim to provide some guidance for action around recognised issues special to the profession but are also instruments of regulation which…
40 CFR 264.340 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) through (b)(4) of this section... hazardous waste in part 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I... chapter solely because it is reactive (Hazard Code R) for characteristics other than those listed in § 261...
Development Of A Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Yoon, Seokkwan; Kwak, Dochan
1993-01-01
Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.
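In its standard form, the LU-SGS scheme approximately factors the implicit operator as

\[
(D + L)\,D^{-1}\,(D + U)\,\Delta Q^{n} = -R^{n},
\]

where D, L, and U are the (block-)diagonal, strictly lower, and strictly upper parts of the implicit Jacobian and R is the residual. It is solved by a forward sweep \((D+L)\Delta Q^{*} = -R^{n}\) followed by a backward sweep \((D+U)\Delta Q^{n} = D\,\Delta Q^{*}\), with no large linear system stored — which is what yields the very low computer time per iteration noted above.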
49 CFR 178.345-3 - Structural integrity.
Code of Federal Regulations, 2011 CFR
2011-10-01
... acceptance criteria. (1) The maximum calculated design stress at any point in the cargo tank wall may not exceed the maximum allowable stress value prescribed in Section VIII of the ASME Code (IBR, see § 171.7... Code or the ASTM standard to which the material is manufactured. (3) The maximum design stress at any...
Enhancing Nursing and Midwifery Student Learning Through the Use of QR Codes.
Downer, Terri; Oprescu, Florin; Forbes, Helen; Phillips, Nikki; McTier, Lauren; Lord, Bill; Barr, Nigel; Bright, Peter; Simbag, Vilma
A recent teaching and learning innovation using new technologies involves the use of quick response codes, which are read by smartphones and tablets. Integrating this technology as a teaching and learning strategy in nursing and midwifery education has been embraced by academics and students at a regional university.
Enforcing Hardware-Assisted Integrity for Secure Transactions from Commodity Operating Systems
2015-08-17
OS. First, we dedicate one hard disk to each OS. A System Management Mode (SMM)-based monitoring module monitors if an OS is accessing another hard... hypervisor-based systems. An adversary can only target the BIOS-anchored SMM code, which is tiny, and without any need for foreign code (i.e. third
NASA Astrophysics Data System (ADS)
The present conference on global telecommunications discusses topics in the fields of Integrated Services Digital Network (ISDN) technology field trial planning and results to date, motion video coding, ISDN networking, future network communications security, flexible and intelligent voice/data networks, Asian and Pacific lightwave and radio systems, subscriber radio systems, the performance of distributed systems, signal processing theory, satellite communications modulation and coding, and terminals for the handicapped. Also discussed are knowledge-based technologies for communications systems, future satellite transmissions, high quality image services, novel digital signal processors, broadband network access interface, traffic engineering for ISDN design and planning, telecommunications software, coherent optical communications, multimedia terminal systems, advanced speech coding, portable and mobile radio communications, multi-Gbit/second lightwave transmission systems, enhanced capability digital terminals, communications network reliability, advanced antimultipath fading techniques, undersea lightwave transmission, image coding, modulation and synchronization, adaptive signal processing, integrated optical devices, VLSI technologies for ISDN, field performance of packet switching, CSMA protocols, optical transport system architectures for broadband ISDN, mobile satellite communications, indoor wireless communication, echo cancellation in communications, and distributed network algorithms.
Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks.
Scarpetta, Silvia; Giacco, Ferdinando
2013-04-01
We study the collective dynamics of a Leaky Integrate and Fire network in which precise relative phase relationships of spikes among neurons are stored, as attractors of the dynamics, and selectively replayed at different time scales. Using an STDP-based learning process, we store several phase-coded spike patterns in the connectivity, and we find that, depending on the excitability of the network, different working regimes are possible, with transient or persistent replay activity induced by a brief signal. We introduce an order parameter to evaluate the similarity between stored and recalled phase-coded patterns, and measure the storage capacity. Modulation of spiking thresholds during replay changes the frequency of the collective oscillation or the number of spikes per cycle, while preserving the phase relationships. This allows a coding scheme in which phase, rate and frequency are dissociable. Robustness with respect to noise and heterogeneity of neuron parameters is studied, showing that, since the dynamics is a retrieval process, neurons maintain stable, precise phase relationships among units, keeping a unique frequency of oscillation, even in noisy conditions and with heterogeneity of the internal parameters of the units.
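An order parameter of the kind described can be sketched as the magnitude of the mean phase-difference vector between stored and recalled patterns; this follows the spirit of the abstract, though the paper's exact definition may differ in detail:

```python
import numpy as np

def phase_overlap(stored, recalled):
    """Similarity between stored and replayed phase-coded patterns: the
    magnitude of the mean phase-difference vector (1 = perfect recall of the
    relative phases, 0 = unrelated). Invariant to a uniform phase shift,
    as befits a pattern defined only by relative phases."""
    d = np.asarray(recalled) - np.asarray(stored)
    return np.abs(np.mean(np.exp(1j * d)))

rng = np.random.default_rng(0)
phi = rng.uniform(0, 2 * np.pi, 100)                 # stored relative phases
print(phase_overlap(phi, phi + 0.05 * rng.standard_normal(100)))  # near 1
```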
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Viken, Sally A.; Carter, Melissa B.; Viken, Jeffrey K.; Derlaga, Joseph M.; Stoll, Alex M.
2017-01-01
A variety of tools, from fundamental to high order, have been used to better understand applications of distributed electric propulsion to aid the wing and propulsion system design of the Leading Edge Asynchronous Propulsion Technology (LEAPTech) project and the X-57 Maxwell airplane. Three high-fidelity, Navier-Stokes computational fluid dynamics codes used during the project with results presented here are FUN3D, STAR-CCM+, and OVERFLOW. These codes employ various turbulence models to predict fully turbulent and transitional flow. Results from these codes are compared for two distributed electric propulsion configurations: the wing tested at NASA Armstrong on the Hybrid-Electric Integrated Systems Testbed truck, and the wing designed for the X-57 Maxwell airplane. Results from these computational tools for the high-lift wing tested on the Hybrid-Electric Integrated Systems Testbed truck and for the X-57 high-lift wing compare reasonably well. The goal of the X-57 wing and distributed electric propulsion system design, achieving or exceeding the required C_L = 3.95 for stall speed, was confirmed with all of the computational codes.
Yuan, Mingquan; Jiang, Qisheng; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu
2018-06-01
This paper addresses two key challenges toward an integrated forward error-correcting biosensor based on our previously reported self-assembled quick-response (QR) code. The first challenge involves the choice of the paper substrate for printing and self-assembling the QR code. We have compared four different substrates that includes regular printing paper, Whatman filter paper, nitrocellulose membrane and lab synthesized bacterial cellulose. We report that out of the four substrates bacterial cellulose outperforms the others in terms of probe (gold nanorods) and ink retention capability. The second challenge involves remote activation of the analyte sampling and the QR code self-assembly process. In this paper, we use light as a trigger signal and a graphite layer as a light-absorbing material. The resulting change in temperature due to infrared absorption leads to a temperature gradient that then exerts a diffusive force driving the analyte toward the regions of self-assembly. The working principle has been verified in this paper using assembled biosensor prototypes where we demonstrate higher sample flow rate due to light induced thermal gradients.
Resonance Parameter Adjustment Based on Integral Experiments
Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...
2016-06-02
Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Recognizing that the GLLS methodology in TSURFER is identical to the mathematical description of a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters. Integral experimental data, such as in the International Criticality Safety Benchmark Experiments Project, remain a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Integral data can then be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. Moreover, SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data. Using GLLS ensures proper weight is given to the differential data.
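In standard notation, the GLLS/Bayesian update shared by TSURFER and SAMMY adjusts the parameters and their covariance as

\[
p' = p + M S^{T}\left(S M S^{T} + V\right)^{-1} d,
\qquad
M' = M - M S^{T}\left(S M S^{T} + V\right)^{-1} S M,
\]

where p are the resonance parameters, M their prior covariance, S the sensitivity matrix of the calculated integral responses to the parameters, V the experimental covariance, and d the vector of measured-minus-calculated responses. The relative sizes of M and V are what guarantee that well-measured differential data retain their proper weight.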
Qiu, Guo-Hua
2016-01-01
In this review, the protective function of the abundant non-coding DNA in the eukaryotic genome is discussed from the perspective of genome defense against exogenous nucleic acids. Peripheral non-coding DNA has been proposed to act as a bodyguard that protects the genome and the central protein-coding sequences from ionizing radiation-induced DNA damage. In the proposed mechanism of protection, the radicals generated by water radiolysis in the cytosol and IR energy are absorbed, blocked and/or reduced by peripheral heterochromatin; then, the DNA damage sites in the heterochromatin are removed and expelled from the nucleus to the cytoplasm through nuclear pore complexes, most likely through the formation of extrachromosomal circular DNA. To strengthen this hypothesis, this review summarizes the experimental evidence supporting the protective function of non-coding DNA against exogenous nucleic acids. Based on these data, I hypothesize herein about the presence of an additional line of defense formed by small RNAs in the cytosol in addition to their bodyguard protection mechanism in the nucleus. Therefore, exogenous nucleic acids may be initially inactivated in the cytosol by small RNAs generated from non-coding DNA via mechanisms similar to the prokaryotic CRISPR-Cas system. Exogenous nucleic acids may enter the nucleus, where some are absorbed and/or blocked by heterochromatin and others integrate into chromosomes. The integrated fragments and the sites of DNA damage are removed by repetitive non-coding DNA elements in the heterochromatin and excluded from the nucleus. Therefore, the normal eukaryotic genome and the central protein-coding sequences are triply protected by non-coding DNA against invasion by exogenous nucleic acids. This review provides evidence supporting the protective role of non-coding DNA in genome defense. Copyright © 2016 Elsevier B.V. All rights reserved.
Zelingher, Julian; Ash, Nachman
2013-05-01
The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT; its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors of the emergency department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding systems for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR. We reviewed the characteristics, strengths and weaknesses of these two coding systems. In summary, the adoption of ICD-10-CM is in line with the USA decision to abandon ICD-9-CM, and the Israeli healthcare system could benefit from USA healthcare efforts in this direction. The large content of SNOMED-CT and its sophisticated hierarchical data structure will enable advanced clinical decision support and quality improvement applications.
Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics
NASA Astrophysics Data System (ADS)
Takabe, Hideaki
2016-10-01
This is an article for the memorial lecture of the Edward Teller Medal, presented at the IFSA03 conference held on September 12th, 2003, in Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the fields of theory and computation by picking up five topics. The first is the anomalous resistivity experienced by hot electrons penetrating the over-dense region, caused by the ion-wave turbulence driven by the return current that compensates the current carried by the hot electrons. It is concluded that a potential almost equal to the average kinetic energy of the hot electrons is established to prevent their penetration. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The third is the development of the integrated code ILESTA (1D & 2D) for analyses and design of laser-produced plasmas including implosion dynamics, which was also applied to the design of high-gain targets. The fourth is Laboratory Astrophysics with intense lasers. This consists of two parts: one is a review of its historical background, and the other concerns how laser plasma relates to wide-ranging astrophysics and the purposes of promoting such research. In relation to one purpose, the author comments on anomalous transport of relativistic electrons in the Fast Ignition laser fusion scheme. Finally, recent activity is briefly summarized concerning application of the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics.
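In its commonly quoted form, the Takabe formula for the ablative Rayleigh-Taylor growth rate is

\[
\gamma \;\simeq\; \alpha\sqrt{k g} \;-\; \beta\, k\, v_{a},
\qquad \alpha \approx 0.9,\quad \beta \approx 3\text{--}4,
\]

where k is the perturbation wavenumber, g the acceleration of the ablation front, and v_a the ablation velocity. The stabilizing second term, absent from the classical growth rate, is what made stable target design possible: short-wavelength modes with \(k > (\alpha/\beta)^{2} g / v_{a}^{2}\) are suppressed.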
Software Security Knowledge: CWE. Knowing What Could Make Software Vulnerable to Attack
2011-05-01
...Buffer • CWE-642: External Control of Critical State Data • CWE-73: External Control of File Name or Path • CWE-426: Untrusted Search Path • CWE-94: Failure to Control Generation of Code (aka 'Code Injection') • CWE-494: Download of Code Without Integrity Check • CWE-404: Improper Resource
NASA Technical Reports Server (NTRS)
Chan, William M.
1995-01-01
Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.
Computer code for controller partitioning with IFPC application: A user's manual
NASA Technical Reports Server (NTRS)
Schmidt, Phillip H.; Yarkhan, Asim
1994-01-01
A user's manual for the computer code for partitioning a centralized controller into decentralized subcontrollers with applicability to Integrated Flight/Propulsion Control (IFPC) is presented. Partitioning of a centralized controller into two subcontrollers is described, and the algorithm on which the code is based is discussed. The algorithm uses parameter optimization of a cost function, which is described. The major data structures and functions are described. Specific instructions for using the code are given. The user is led through an example of an IFPC application.
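As a purely hypothetical illustration of the "parameter optimization of a cost function" approach (the manual's actual cost function, controller structure, and variable names differ), one can tune the gains of two decentralized subcontrollers so that their block-diagonal composition approximates a centralized gain matrix:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical centralized gain matrix coupling flight and propulsion loops.
K = np.array([[1.0, 0.2, 0.0, 0.1],
              [0.3, 2.0, 0.1, 0.0],
              [0.0, 0.1, 1.5, 0.4],
              [0.1, 0.0, 0.2, 1.0]])

def cost(theta):
    """Frobenius-norm mismatch between the centralized gains and the
    block-diagonal (decentralized) composition -- a stand-in cost only."""
    k1 = theta[:4].reshape(2, 2)      # flight subcontroller gains
    k2 = theta[4:].reshape(2, 2)      # propulsion subcontroller gains
    Kp = np.block([[k1, np.zeros((2, 2))],
                   [np.zeros((2, 2)), k2]])
    return np.linalg.norm(K - Kp, "fro") ** 2

res = minimize(cost, np.zeros(8), method="BFGS")
print(res.x[:4].reshape(2, 2))        # optimized flight subcontroller block
```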
ALE3D: An Arbitrary Lagrangian-Eulerian Multi-Physics Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noble, Charles R.; Anderson, Andrew T.; Barton, Nathan R.
ALE3D is a multi-physics numerical simulation software tool utilizing arbitrary-Lagrangian- Eulerian (ALE) techniques. The code is written to address both two-dimensional (2D plane and axisymmetric) and three-dimensional (3D) physics and engineering problems using a hybrid finite element and finite volume formulation to model fluid and elastic-plastic response of materials on an unstructured grid. As shown in Figure 1, ALE3D is a single code that integrates many physical phenomena.
Nisar, Asim; Afzulpurkar, Nitin; Tuantranont, Adisorn; Mahaisavariya, Banchong
2008-12-01
In this paper, we present the design of a transdermal drug delivery system for treatment of cardiovascular or hemodynamic disorders such as hypertension. The system comprises integrated control electronics and microelectromechanical system devices such as a micropump, a micro blood pressure sensor and a microneedle array. The objective is to overcome the limitations of oral therapy, such as a variable absorption profile and the need for frequent dosing, by fabricating a safe, reliable and cost effective transdermal drug delivery system to dispense various pharmacological agents through the skin for treatment of hemodynamic dysfunction such as hypertension. Moreover, design optimization of a piezoelectrically actuated valveless micropump is presented for the drug delivery system. Because analysis of a piezoelectric micropump involves structural and fluid field couplings in a complicated geometrical arrangement, finite element (FE) numerical simulation rather than an analytical approach has been used. The behavior of the piezoelectric actuator with a biocompatible polydimethylsiloxane membrane is first studied by conducting piezoelectric analysis. Then the performance of the valveless micropump is analyzed by building a three dimensional electric-solid-fluid model of the micropump. The effect of geometrical dimensions on micropump characteristics and the efficiency of the nozzle/diffuser elements of a valveless micropump is investigated in the transient analysis using the multiple code coupling method. The membrane deformation results from the multifield code-coupling analysis are in good agreement with analytical results as well as with those of a single-code coupling analysis of a piezoelectric micropump. The analysis predicts that to enhance the performance of the micropump, diffuser geometrical dimensions such as diffuser length, diffuser neck width and diffuser angle need to be optimized. Micropump flow rate is not strongly affected at low excitation frequencies from 10 to 200 Hz. The excitation voltage is the more dominant factor that affects the flow rate of the micropump as compared with the excitation frequency. However, at extremely high excitation frequencies beyond 8,000 Hz, the flow rate drops as the membrane exhibits multiple bending peaks, which are not desirable for fluid flow. Following the extensive numerical analysis, actual fabrication and performance characterization of the micropump are presented. The performance of the micropump is characterized in terms of piezoelectric actuator deflection and micropump flow rate at different operational parameters. The set of multifield simulations and experimental measurements of deflection and flow rate at varying voltage and excitation frequency is a significant advance in the study of electric-solid-fluid coupled field effects, as it allows transient, three dimensional piezoelectric and fluid analysis of the micropump, thereby facilitating a more realistic multifield analysis. The results of the present study will also help to conduct relevant strength duration tests of the integrated drug delivery device with micropump and microneedle array in the future.
NASA Astrophysics Data System (ADS)
Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.
2017-10-01
Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of light bulbs, by integrating the programming of sensor inputs and light outputs and examining how the two domains interact. We implemented an electronic textiles unit with 23 high school students ages 16-17 years who learned how to craft and code circuits with the LilyPad Arduino, an electronic textile construction kit. Our analyses not only confirm significant increases in students' understanding of functional circuits but also showcase students' ability in designing and remixing program code for controlling circuits. In our discussion, we address opportunities and challenges of introducing codeable circuit design for integrating maker activities that include engineering and computing into classrooms.
Comparison of different methods used in integral codes to model coagulation of aerosols
NASA Astrophysics Data System (ADS)
Beketov, A. I.; Sorokin, A. A.; Alipchenkov, V. M.; Mosunova, N. A.
2013-09-01
The methods for calculating coagulation of particles in the carrying phase that are used in the integral codes SOCRAT, ASTEC, and MELCOR, as well as the Hounslow and Jacobson methods used to model aerosol processes in the chemical industry and in atmospheric investigations are compared on test problems and against experimental results in terms of their effectiveness and accuracy. It is shown that all methods are characterized by a significant error in modeling the distribution function for micrometer particles if calculations are performed using rather "coarse" spectra of particle sizes, namely, when the ratio of the volumes of particles from neighboring fractions is equal to or greater than two. With reference to the problems considered, the Hounslow method and the method applied in the aerosol module used in the ASTEC code are the most efficient ones for carrying out calculations.
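All of the compared methods are, at bottom, discretizations of the discrete Smoluchowski coagulation equation,

\[
\frac{\mathrm{d}n_{k}}{\mathrm{d}t} \;=\; \frac{1}{2}\sum_{i+j=k} K_{ij}\, n_{i} n_{j} \;-\; n_{k}\sum_{i\ge 1} K_{ki}\, n_{i},
\]

where n_k is the number concentration of particles in size fraction k and K_ij is the coagulation kernel. The errors discussed above arise from how coarsely the particle-size spectrum is binned when these sums are evaluated, which is why spectra with volume ratios of two or more between neighboring fractions degrade the computed distribution function.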
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaufmann, John R.; Hand, James R.; Halverson, Mark A.
This report evaluates how and when to best integrate renewable energy requirements into building energy codes. The basic goals were to: (1) provide a rough guide of where we're going and how to get there; (2) identify key issues that need to be considered, including a discussion of various options with pros and cons, to help inform code deliberations; and (3) help foster alignment among energy code-development organizations. The authors researched current approaches nationally and internationally, conducted a survey of key stakeholders to solicit input on various approaches, and evaluated the key issues related to integration of renewable energy requirements and various options to address those issues. The report concludes with recommendations and a plan to engage stakeholders. This report does not evaluate whether the use of renewable energy should be required on buildings; that question involves a political decision that is beyond the scope of this report.
OpenFOAM: Open source CFD in research and industry
NASA Astrophysics Data System (ADS)
Jasak, Hrvoje
2009-12-01
The current focus of development in industrial Computational Fluid Dynamics (CFD) is integration of CFD into Computer-Aided product development, geometrical optimisation, robust design and similar. On the other hand, CFD research aims to extend the boundaries of practical engineering use in "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of partial differential equations in software, with code functionality provided in library form. The Open Source deployment and development model allows the user to achieve desired versatility in physical modeling without the sacrifice of complex geometry support and execution efficiency.
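The equation-mimicking idea is easiest to see on a momentum equation of the form

\[
\frac{\partial(\rho\mathbf{U})}{\partial t} + \nabla\cdot(\phi\,\mathbf{U}) - \nabla\cdot(\mu\,\nabla\mathbf{U}) = -\nabla p,
\]

which the often-quoted OpenFOAM example renders almost verbatim as solve(fvm::ddt(rho, U) + fvm::div(phi, U) - fvm::laplacian(mu, U) == -fvc::grad(p)), with fvm:: operators contributing implicit matrix terms and fvc:: operators evaluated explicitly. The correspondence between the written equation and the source line is the "mimicking" the abstract refers to.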
Atomic Physics Effects on Convergent, Child-Langmuir Ion Flow between Nearly Transparent Electrodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santarius, John F.; Emmert, Gilbert A.
Research during this project at the University of Wisconsin Fusion Technology Institute (UW FTI) on ion and neutral flow through an arbitrary, monotonic potential difference created by nearly transparent electrodes accomplished the following: (1) developed and implemented an integral equation approach for atomic physics effects in helium plasmas; (2) extended the analysis to coupled integral equations that treat atomic and molecular deuterium ions and neutrals; (3) implemented the key deuterium and helium atomic and molecular cross sections; (4) added negative ion production and related cross sections; and (5) benchmarked the code against experimental results. The analysis and codes treat the species D0, D20, D+, D2+, D3+, D and, separately at present, He0 and He+.
Simulation of Jet Noise with OVERFLOW CFD Code and Kirchhoff Surface Integral
NASA Technical Reports Server (NTRS)
Kandula, M.; Caimi, R.; Voska, N. (Technical Monitor)
2002-01-01
An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute time-dependent acoustic pressure in the nonlinear source-field. Based on the specification of acoustic pressure, its temporal and normal derivatives on the Kirchhoff surface, the near-field and the far-field sound pressure levels are computed via Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by a comparison of the predictions of sound pressure levels with the available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.
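For a stationary Kirchhoff surface S enclosing the nonlinear source region, the surface integral referred to above takes the classical form

\[
p'(\mathbf{x},t) \;=\; \frac{1}{4\pi}\int_{S}\left[\,\frac{p'}{r^{2}}\,\frac{\partial r}{\partial n} \;-\; \frac{1}{r}\,\frac{\partial p'}{\partial n} \;+\; \frac{1}{c\,r}\,\frac{\partial r}{\partial n}\,\frac{\partial p'}{\partial \tau}\,\right]_{\tau = t - r/c}\mathrm{d}S,
\]

where r is the distance from a surface element to the observer, n is the outward surface normal, c is the ambient sound speed, and the bracket is evaluated at retarded time. This is why the CFD solution must supply the acoustic pressure together with its temporal and normal derivatives on the Kirchhoff surface, as stated in the abstract.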
Decoding the non-coding RNAs in Alzheimer's disease.
Schonrock, Nicole; Götz, Jürgen
2012-11-01
Non-coding RNAs (ncRNAs) are integral components of biological networks with fundamental roles in regulating gene expression. They can integrate sequence information from the DNA code, epigenetic regulation and functions of multimeric protein complexes to potentially determine the epigenetic status and transcriptional network in any given cell. Humans potentially contain more ncRNAs than any other species, especially in the brain, where they may well play a significant role in human development and cognitive ability. This review discusses their emerging role in Alzheimer's disease (AD), a human pathological condition characterized by the progressive impairment of cognitive functions. We discuss the complexity of the ncRNA world and how this is reflected in the regulation of the amyloid precursor protein and Tau, two proteins with central functions in AD. By understanding this intricate regulatory network, there is hope for a better understanding of disease mechanisms and ultimately developing diagnostic and therapeutic tools.
Can Disability Code Activation Promote Sustainable Development in Egypt... After the Arab Spring?
Mahmoud Issa Abdou, Safaa
2015-01-01
In January 2011, Egypt followed Tunisia in its uprising against the ruling oppressive regimes in search of democracy, freedom and better living conditions. The movement, later known as the Arab Spring, had implications for the country's economic and political systems. Hence the need to adopt Sustainable Development strategies, in order to ensure all people's well-being and the implementation of their human rights. This would only be realized when the built environment becomes accessible to vulnerable people as well as to persons with disabilities, enabling them to participate and be included in various living activities. This paper reviews the impact of the Egyptian disability code, published in 2003, and how its activation could help to provide an environment that supports persons with disabilities and allows their integration. Key Words: Disability Code; Sustainable Development; Arab Spring; Accessible Enabling Environment; People with Disabilities Integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
Documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. Documentation includes a review of the method, the structure of the code, input formats, and examples.
Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica
2017-12-28
The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. © Hara et al.
Guo, Jun; Zhou, Yuan; Cheng, Yafen; Fang, Weiwei; Hu, Gang; Wei, Jie; Lin, Yajun; Man, Yong; Guo, Lixin; Sun, Mingxiao; Cui, Qinghua; Li, Jian
2018-01-01
Recent studies have suggested that changes in non-coding RNA play a key role in the progression of non-alcoholic fatty liver disease (NAFLD). Metformin is now recommended and effective for the treatment of NAFLD. We hope the current analyses of the non-coding RNA transcriptome will provide a better presentation of the potential roles of mRNAs and long non-coding RNAs (lncRNAs) that underlie NAFLD and metformin intervention. The present study mainly analysed changes in the coding transcriptome and non-coding RNAs after the application of a five-week metformin intervention. Liver samples from three groups of mice were harvested for transcriptome profiling, which covered mRNA, lncRNA, microRNA (miRNA) and circular RNA (circRNA), using a microarray technique. A systematic alleviation of high-fat diet (HFD)-induced transcriptome alterations by metformin was observed. The metformin treatment largely reversed the correlations with diabetes-related pathways. Our analysis also suggested interaction networks between differentially expressed lncRNAs and known hepatic disease genes and interactions between circRNAs and their disease-related miRNA partners. Eight HFD-responsive lncRNAs and three metformin-responsive lncRNAs were noted due to their widespread associations with disease genes. Moreover, seven miRNAs that interacted with multiple differentially expressed circRNAs were highlighted because they were likely to be associated with metabolic or liver diseases. The present study identified novel changes in the coding transcriptome and non-coding RNAs in the livers of NAFLD mice after metformin treatment that might shed light on the underlying mechanism by which metformin impedes the progression of NAFLD. © 2018 The Author(s). Published by S. Karger AG, Basel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, J.; Kucukboyaci, V. N.; Nguyen, L.
2012-07-01
The Westinghouse Small Modular Reactor (SMR) is an 800 MWt (> 225 MWe) integral pressurized water reactor (iPWR) with all primary components, including the steam generator and the pressurizer, located inside the reactor vessel. The reactor core is based on a partial-height 17x17 fuel assembly design used in the AP1000® reactor core. The Westinghouse SMR utilizes passive safety systems and proven components from the AP1000 plant design with a compact containment that houses the integral reactor vessel and the passive safety systems. A preliminary loss of coolant accident (LOCA) analysis of the Westinghouse SMR has been performed using the WCOBRA/TRAC-TF2 code, simulating a transient caused by a double ended guillotine (DEG) break in the direct vessel injection (DVI) line. WCOBRA/TRAC-TF2 is a new generation Westinghouse LOCA thermal-hydraulics code evolving from the US NRC licensed WCOBRA/TRAC code. It is designed to simulate PWR LOCA events from the smallest break size to the largest break size (DEG cold leg). A significant number of fluid dynamics models and heat transfer models were developed or improved in WCOBRA/TRAC-TF2. A large number of separate effects and integral effects tests were performed for a rigorous code assessment and validation. WCOBRA/TRAC-TF2 was introduced into the Westinghouse SMR design phase to assist a quick and robust passive cooling system design and to identify thermal-hydraulic phenomena for the development of the SMR Phenomena Identification Ranking Table (PIRT). The LOCA analysis of the Westinghouse SMR demonstrates that the DEG DVI break LOCA is mitigated by the injection and venting from the Westinghouse SMR passive safety systems without core heat up, achieving long term core cooling. (authors)
Good Trellises for IC Implementation of Viterbi Decoders for Linear Block Codes
NASA Technical Reports Server (NTRS)
Moorthy, Hari T.; Lin, Shu; Uehara, Gregory T.
1997-01-01
This paper investigates trellis structures of linear block codes for the integrated circuit (IC) implementation of Viterbi decoders capable of achieving high decoding speed while satisfying a constraint on the structural complexity of the trellis in terms of the maximum number of states at any particular depth. Only uniform sectionalizations of the code trellis diagram are considered. An upper bound on the number of parallel and structurally identical (or isomorphic) subtrellises in a proper trellis for a code without exceeding the maximum state complexity of the minimal trellis of the code is first derived. Parallel structures of trellises with various section lengths for binary BCH and Reed-Muller (RM) codes of lengths 32 and 64 are analyzed. Next, the complexity of IC implementation of a Viterbi decoder based on an L-section trellis diagram for a code is investigated. A structural property of a Viterbi decoder called add-compare-select (ACS) connectivity, which is related to state connectivity, is introduced. This parameter affects the complexity of wire routing (interconnections within the IC). The effect of five parameters, namely (1) effective computational complexity, (2) complexity of the ACS circuit, (3) traceback complexity, (4) ACS connectivity, and (5) branch complexity of a trellis diagram, on the very large scale integration (VLSI) complexity of a Viterbi decoder is investigated. It is shown that an IC implementation of a Viterbi decoder based on a nonminimal trellis requires less area and is capable of operation at higher speed than one based on the minimal trellis when the commonly used ACS-array architecture is considered.
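The add-compare-select recursion referenced above is easy to show in miniature. The following Python sketch runs Viterbi decoding over a generic sectionalized trellis; the two-state toy trellis and the squared-Euclidean branch metric are illustrative assumptions, not the BCH/RM trellises analyzed in the paper, and a hardware decoder would map this loop onto an ACS array.

```python
import math

# Viterbi decoding over a sectionalized trellis. Each section is a list of
# branches (prev_state, next_state, branch_bits); uniform sectionalization
# (equal branch length within a section) is assumed.
def viterbi(sections, received, n_states):
    metric = [0.0] + [math.inf] * (n_states - 1)   # start in state 0
    paths = [[] for _ in range(n_states)]
    pos = 0
    for branches in sections:
        new_metric = [math.inf] * n_states
        new_paths = [None] * n_states
        for prev, nxt, bits in branches:
            if metric[prev] == math.inf:           # unreachable state
                continue
            seg = received[pos:pos + len(bits)]
            bm = sum((r - b) ** 2 for r, b in zip(seg, bits))  # soft branch metric
            m = metric[prev] + bm                              # add
            if m < new_metric[nxt]:                            # compare-select
                new_metric[nxt], new_paths[nxt] = m, paths[prev] + list(bits)
        pos += len(branches[0][2])
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best]

sections = [[(0, 0, (0,)), (0, 1, (1,))],   # toy 2-state, 2-section trellis
            [(0, 0, (0,)), (1, 0, (1,))]]
print(viterbi(sections, [0.1, 0.95], n_states=2))   # -> [1, 1]
```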
Methodology, status and plans for development and assessment of the code ATHLET
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teschendorff, V.; Austregesilo, H.; Lerchl, G.
1997-07-01
The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is being developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) for the analysis of anticipated and abnormal plant transients, small and intermediate leaks as well as large breaks in light water reactors. The aim of the code development is to cover the whole spectrum of design basis and beyond design basis accidents (without core degradation) for PWRs and BWRs with only one code. The main code features are: advanced thermal-hydraulics; modular code architecture; separation between physical models and numerical methods; pre- and post-processing tools; portability. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialization by a steady-state calculation, full-range drift-flux model, dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The code development is accompanied by a systematic and comprehensive validation program. A large number of integral experiments and separate effect tests, including the major International Standard Problems, have been calculated by GRS and by independent organizations. The ATHLET validation matrix is a well balanced set of integral and separate effects tests derived from the CSNI proposal emphasizing, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities.
Tamori, Akihiro; Yamanishi, Yoshihiro; Kawashima, Shuichi; Kanehisa, Minoru; Enomoto, Masaru; Tanaka, Hiromu; Kubo, Shoji; Shiomi, Susumu; Nishiguchi, Shuhei
2005-08-15
Integration of hepatitis B virus (HBV) DNA into the human genome is one of the most important steps in HBV-related carcinogenesis. This study attempted to find the link between HBV DNA, the adjoining cellular sequence, and altered gene expression in hepatocellular carcinoma (HCC) with integrated HBV DNA. We examined 15 cases of HCC infected with HBV by cassette ligation-mediated PCR. The human DNA adjacent to the integrated HBV DNA was sequenced. Protein coding sequences were searched for in the human sequence. In five cases with HBV DNA integration, from which good quality RNA was extracted, gene expression was examined by cDNA microarray analysis. The human DNA sequence successive to integrated HBV DNA was determined in the 15 HCCs. Eight protein-coding regions were involved: ras-responsive element binding protein 1, calmodulin 1, mixed lineage leukemia 2 (MLL2), FLJ333655, LOC220272, LOC255345, LOC220220, and LOC168991. The MLL2 gene was expressed in three cases with HBV DNA integrated into exon 3 of MLL2 and in one case with HBV DNA integrated into intron 3 of MLL2. Gene expression analysis suggested that two HCCs with HBV integrated into MLL2 had similar patterns of gene expression compared with three HCCs with HBV integrated into other loci of human chromosomes. HBV DNA was integrated at random sites of human DNA, and the MLL2 gene was one of the targets for integration. Our results suggest that HBV DNA might modulate human genes near integration sites, followed by integration site-specific expression of such genes during hepatocarcinogenesis.
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
A new code SORD for simulation of polarized light scattering in the Earth atmosphere
NASA Astrophysics Data System (ADS)
Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent
2016-05-01
We report a new publicly available radiative transfer (RT) code for numerical simulation of polarized light scattering in plane-parallel Earth atmosphere. Using 44 benchmark tests, we prove high accuracy of the new RT code, SORD (Successive ORDers of scattering). We describe capabilities of SORD and show run time for each test on two different machines. At present, SORD is supposed to work as part of the Aerosol Robotic NETwork (AERONET) inversion algorithm. For natural integration with the AERONET software, SORD is coded in Fortran 90/95. The code is available by email request from the corresponding (first) author or from ftp://climate1.gsfc.nasa.gov/skorkin/SORD/ or ftp://maiac.gsfc.nasa.gov/pub/SORD.zip
Image compression using quad-tree coding with morphological dilation
NASA Astrophysics Data System (ADS)
Wu, Jiaji; Jiang, Weiwei; Jiao, Licheng; Wang, Lei
2007-11-01
In this paper, we propose a new algorithm that integrates a morphological dilation operation into quad-tree coding, so that each technique compensates for the other's drawbacks. The new algorithm can not only quickly find the seed significant coefficient for dilation but also overcome the block-boundary limitation of quad-tree coding. We also make full use of both within-subband and cross-subband correlation to avoid the expensive cost of representing insignificant coefficients. Experimental results show that our algorithm outperforms SPECK and SPIHT. Without using any arithmetic coding, our algorithm achieves good performance with low computational cost, making it more suitable for mobile devices or scenarios with strict real-time requirements.
The moving mesh code SHADOWFAX
NASA Astrophysics Data System (ADS)
Vandenbroucke, B.; De Rijcke, S.
2016-07-01
We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.
Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN
NASA Technical Reports Server (NTRS)
Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.
1996-01-01
A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code incorporates refined methodologies for increased accuracy and robustness, as well as simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and a validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.
SNPConvert: SNP Array Standardization and Integration in Livestock Species.
Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra
2016-06-09
One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to whole genome sequence data analysis, SNP array data does not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
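To illustrate the allele-coding problem the suite addresses, the following Python sketch converts Illumina-style A/B genotype calls to nucleotides using a per-SNP allele map; the toy manifest and layout are invented for illustration and are not SNPConvert's actual file formats.

```python
# Toy manifest: SNP name -> (allele coded A, allele coded B). A real array
# requires the vendor's official manifest to resolve A/B to nucleotides.
AB_MAP = {
    "snp1": ("A", "G"),
    "snp2": ("T", "C"),
}

def ab_to_nucleotides(snp, call):
    """Convert an A/B genotype call (e.g. 'AB') to nucleotides for one SNP."""
    a, b = AB_MAP[snp]
    return "".join(a if c == "A" else b for c in call)

print(ab_to_nucleotides("snp1", "AB"))  # AG
print(ab_to_nucleotides("snp2", "BB"))  # CC
```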
Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data
NASA Astrophysics Data System (ADS)
Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.
2014-12-01
The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of this data and information are prepared and shared for program specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings, and constructed with open source software based on industry standards, protocols, and state-of-the-art technology.
Tompkins, Connie A.; Blake, Margaret T.; Wambaugh, Julie; Meigh, Kimberly
2012-01-01
Background This manuscript reports the initial phase of testing for a novel “Contextual Constraint” treatment, designed to stimulate inefficient language comprehension processes in adults with right hemisphere brain damage (RHD). Two versions of treatment were developed to target two normal comprehension processes that have broad relevance for discourse comprehension and that are often disrupted by RHD: coarse semantic coding and suppression. The development of the treatment was informed by two well-documented strengths of the RHD population. The first is consistently better performance on assessments that are implicit, or nearly so, than on explicit, metalinguistic measures of language and cognitive processing. The second is improved performance when given linguistic context that moderately-to-strongly biases an intended meaning. Treatment consisted of providing brief context sentences to prestimulate, or constrain, intended interpretations. Participants made no explicit associations or judgments about the constraint sentences; rather, these contexts served only as implicit primes. Aims This Phase I treatment study aimed to determine the effects of a novel, implicit, Contextual Constraint treatment in adults with RHD whose coarse coding or suppression processes were inefficient. Treatment was hypothesized to speed coarse coding or suppression function in these individuals. Methods & Procedures Three adults with RHD participated in this study, one (P1) with a coarse coding deficit and two (P2, P3) with suppression deficits. Probe tasks were adapted from prior studies of coarse coding and suppression in RHD. The dependent measure was the percentage of responses that met predetermined response time criteria. When pre-treatment baseline performance was stable, treatment was initiated. There were two levels of contextual constraint, Strong and Moderate, and treatment for each item began with the provision of the Strong constraint context. Outcomes & Results Treatment-contingent gains were evident after brief periods of treatment, for P1 on two treatment lists, and for P2. P3 made slower but still substantial gains. Maintenance of gains was evident for P1, the only participant for whom it was measured. Conclusions This Phase I treatment study documents the potential for considerable gains from an implicit, Contextual Constraint treatment. If replicated, this approach to treatment may hold promise for individuals who do poorly with effortful, metalinguistic treatment tasks, or for whom it is desirable to minimize errors during treatment. The real test of this treatment’s benefit will come from later-phase studies, which will test broad-based generalization to various aspects of discourse comprehension. PMID:22368317
Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.
2012-01-01
Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system either has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886
Mensi, Skander; Hagens, Olivier; Gerstner, Wulfram; Pozzorini, Christian
2016-02-01
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. To this end, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter--describing somatic integration--and the spike-history filter--accounting for spike-frequency adaptation--dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights into the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.
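A minimal simulation conveys the core idea of a moving firing threshold. The Python sketch below implements a leaky integrate-and-fire neuron whose threshold jumps at each spike and relaxes back between spikes; it omits the model's conductance-based adaptation and nonlinear threshold coupling, and all parameter values are illustrative.

```python
import numpy as np

def simulate_adaptive_if(I, dt=0.1e-3, C=200e-12, gL=10e-9, EL=-70e-3,
                         V_T0=-50e-3, dV_T=5e-3, tau_T=50e-3, V_reset=-65e-3):
    """Leaky integrate-and-fire with an adaptive firing threshold V_T."""
    V, V_T = EL, V_T0
    spike_times = []
    for i, I_t in enumerate(I):
        V += dt * (-gL * (V - EL) + I_t) / C     # leaky membrane integration
        V_T += dt * (V_T0 - V_T) / tau_T         # threshold decays to baseline
        if V >= V_T:                             # spike: reset V, raise threshold
            spike_times.append(i * dt)
            V = V_reset
            V_T += dV_T
    return spike_times

# Drive with a noisy, in vivo-like current (illustrative statistics).
rng = np.random.default_rng(0)
I = 0.3e-9 + 0.1e-9 * rng.standard_normal(20000)
print(len(simulate_adaptive_if(I)), "spikes in 2 s")
```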
ERIC Educational Resources Information Center
Ingraham, Nissa; Nuttall, Susanne
2016-01-01
This qualitative case study of a southwest regional elementary school used interviews, focus groups, and document collection to better understand how this arts-integrated school is meeting the needs of English-language learner (ELL) students, discerning increased test performance on state standardized tests. Data were analyzed using open coding.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Christopher J.; Stone, James M.; Gammie, Charles F.
2016-08-01
We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.
Kalapos, Miklós Péter
2011-01-01
Discussing Act LXXX of 2009, which amended Act IV of 1978 on the Criminal Code, the author reviews the history of Hungarian regulations concerning mentally ill offenders. He also discusses the treatment regulations applying to offenders found legally insane. From historical and legal-philosophical points of view, the author analyzes those parts of the Criminal Code amendment dealing with mandatory treatment that took effect in May 2010. The changes are judged to be a paradigm shift in a negative direction, representing a doubtful step away from perpetrator-based criminal law towards act-based criminal law.
Spectral fitting, shock layer modeling, and production of nitrogen oxides and excited nitrogen
NASA Technical Reports Server (NTRS)
Blackwell, H. E.
1991-01-01
An analysis was made of N2 emission from an 8.72 MJ/kg shock layer at positions of 2.54, 1.91, and 1.27 cm, and vibrational state distributions, temperatures, and relative electronic state populations were obtained from the data sets. Other recorded arc jet N2 and air spectral data were reviewed, and NO emission characteristics were studied. A review of the operational procedures of the DSMC code was made. Information on other appropriate codes and modifications, including ionization, was gathered, and the applicability of the reviewed codes to the task requirements was determined. A review was also made of the computational procedures used in the CFD codes of Li and in other codes on JSC computers. An analysis was made of problems associated with integrating task-specific chemical kinetics into CFD codes.
Integrated Composite Analyzer (ICAN): Users and programmers manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1986-01-01
The use of, and the relevant equations programmed in, a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites are described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, the composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
i-PI: A Python interface for ab initio path integral molecular dynamics simulations
NASA Astrophysics Data System (ADS)
Ceriotti, Michele; More, Joshua; Manolopoulos, David E.
2014-03-01
Recent developments in path integral methodology have significantly reduced the computational expense of including quantum mechanical effects in the nuclear motion in ab initio molecular dynamics simulations. However, the implementation of these developments requires a considerable programming effort, which has hindered their adoption. Here we describe i-PI, an interface written in Python that has been designed to minimise the effort required to bring state-of-the-art path integral techniques to an electronic structure program. While it is best suited to first principles calculations and path integral molecular dynamics, i-PI can also be used to perform classical molecular dynamics simulations, and can just as easily be interfaced with an empirical forcefield code. To give just one example of the many potential applications of the interface, we use it in conjunction with the CP2K electronic structure package to showcase the importance of nuclear quantum effects in high-pressure water.
Catalogue identifier: AERN_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERN_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: GNU General Public License, version 3
No. of lines in distributed program, including test data, etc.: 138626
No. of bytes in distributed program, including test data, etc.: 3128618
Distribution format: tar.gz
Programming language: Python
Computer: Multiple architectures
Operating system: Linux, Mac OSX, Windows
RAM: Less than 256 Mb
Classification: 7.7
External routines: NumPy
Nature of problem: Bringing the latest developments in the modelling of nuclear quantum effects with path integral molecular dynamics to ab initio electronic structure programs with minimal implementational effort.
Solution method: State-of-the-art path integral molecular dynamics techniques are implemented in a Python interface. Any electronic structure code can be patched to receive the atomic coordinates from the Python interface, and to return the forces and energy that are used to integrate the equations of motion.
Restrictions: This code only deals with distinguishable particles. It does not include fermionic or bosonic exchanges between equivalent nuclei, which can become important at very low temperatures.
Running time: Depends dramatically on the nature of the simulation being performed. A few minutes for short tests with empirical force fields, up to several weeks for production calculations with ab initio forces. The examples provided with the code run in less than an hour.
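The decoupling that i-PI exploits, an integrator that only needs a callable mapping positions to forces and energies, can be sketched in a few lines. The toy velocity-Verlet loop below with a harmonic "force provider" is a hypothetical stand-in; i-PI's real interface is a socket protocol to external electronic structure codes, and all names here are illustrative.

```python
import numpy as np

def harmonic_forces(q, k=1.0):
    """Stand-in for an external force call (ab initio or empirical)."""
    return -k * q

def velocity_verlet(q, p, m, forces, dt, nsteps):
    """Integrate classical equations of motion; `forces` is the only
    contact point with the force-providing code."""
    f = forces(q)
    for _ in range(nsteps):
        p += 0.5 * dt * f
        q += dt * p / m
        f = forces(q)          # position out, forces back in
        p += 0.5 * dt * f
    return q, p

q = np.array([1.0]); p = np.array([0.0]); m = np.array([1.0])
q, p = velocity_verlet(q, p, m, harmonic_forces, dt=0.01, nsteps=1000)
print(q, p)   # harmonic oscillator after 10 time units
```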
Rapid Calculation of Spacecraft Trajectories Using Efficient Taylor Series Integration
NASA Technical Reports Server (NTRS)
Scott, James R.; Martini, Michael C.
2011-01-01
A variable-order, variable-step Taylor series integration algorithm was implemented in NASA Glenn's SNAP (Spacecraft N-body Analysis Program) code. SNAP is a high-fidelity trajectory propagation program that can propagate the trajectory of a spacecraft about virtually any body in the solar system. The Taylor series algorithm's very high order accuracy and excellent stability properties lead to large reductions in computer time relative to the code's existing 8th order Runge-Kutta scheme. Head-to-head comparison on near-Earth, lunar, Mars, and Europa missions showed that Taylor series integration is 15.8 times faster than Runge-Kutta on average, and is more accurate. These speedups were obtained for calculations involving central body, other body, thrust, and drag forces. Similar speedups have been obtained for calculations that include the J2 spherical harmonic for central body gravitation. The algorithm includes a step size selection method that directly calculates the step size and never requires a repeat step. High-order Taylor series integration algorithms have been shown to provide major reductions in computer time over conventional integration methods in numerous scientific applications. The objective here was to directly implement Taylor series integration in an existing trajectory analysis code and demonstrate that large reductions in computer time (order of magnitude) could be achieved while simultaneously maintaining high accuracy. This software greatly accelerates the calculation of spacecraft trajectories. At each time level, the spacecraft position, velocity, and mass are expanded in a high-order Taylor series whose coefficients are obtained through efficient differentiation arithmetic. This makes it possible to take very large time steps at minimal cost, resulting in large savings in computer time. The Taylor series algorithm is implemented primarily through three subroutines: (1) a driver routine that automatically introduces auxiliary variables and sets up initial conditions and integrates; (2) a routine that calculates system reduced derivatives using recurrence relations for quotients and products; and (3) a routine that determines the step size and sums the series. The order of accuracy used in a trajectory calculation is arbitrary and can be set by the user. The algorithm directly calculates the motion of other planetary bodies and does not require ephemeris files (except to start the calculation). The code also runs with Taylor series and Runge-Kutta used interchangeably for different phases of a mission.
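The flavor of the method can be sketched on the linear test problem y' = λy, where the derivative recurrence y^(k+1) = λ y^(k) plays the role of SNAP's recurrence relations and the step size is chosen directly so the last retained term meets the tolerance, so no step is ever repeated. The Python sketch below uses illustrative settings and is not SNAP's actual implementation.

```python
import math

def taylor_step(y, lam, h_max, order=15, tol=1e-12):
    """One Taylor step for y' = lam*y using the term recurrence
    T_k = T_{k-1} * lam*h/k, with h sized so |T_order| ~ tol."""
    c_last = abs(y) * abs(lam) ** order / math.factorial(order)
    h = (tol / c_last) ** (1.0 / order) if c_last > 0 else h_max
    h = min(h, h_max)                      # never overshoot the end time
    term, total = y, y
    for k in range(1, order + 1):
        term *= lam * h / k                # recurrence for the next term
        total += term                      # sum the series
    return total, h

y, t, lam = 1.0, 0.0, -10.0
while t < 1.0:
    y, h = taylor_step(y, lam, h_max=1.0 - t)
    t += h
print(y, math.exp(lam))                    # Taylor result vs exact exp(-10)
```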
A finite element conjugate gradient FFT method for scattering
NASA Technical Reports Server (NTRS)
Collins, Jeffery D.; Ross, Dan; Jin, J.-M.; Chatterjee, A.; Volakis, John L.
1991-01-01
Validated results are presented for the new 3D body of revolution finite element boundary integral code. A Fourier series expansion of the vector electric and magnetic fields is employed to reduce the dimensionality of the system, and the exact boundary condition is employed to terminate the finite element mesh. The mesh termination boundary is chosen such that it leads to convolutional boundary operators of low O(n) memory demand. Improvements of this code are discussed along with the proposed formulation for a full 3D implementation of the finite element boundary integral method in conjunction with a conjugate gradient fast Fourier transform (CGFFT) solution.
Video data compression using artificial neural network differential vector quantization
NASA Technical Reports Server (NTRS)
Krishnamurthy, Ashok K.; Bibyk, Steven B.; Ahalt, Stanley C.
1991-01-01
An artificial neural network vector quantizer is developed for use in data compression applications such as Digital Video. Differential Vector Quantization is used to preserve edge features, and a new adaptive algorithm, known as Frequency-Sensitive Competitive Learning, is used to develop the vector quantizer codebook. To achieve real-time performance, a custom Very Large Scale Integration Application Specific Integrated Circuit (VLSI ASIC) is being developed to realize the associative memory functions needed in the vector quantization algorithm. By using vector quantization, the need for Huffman coding can be eliminated, resulting in superior robustness to channel bit errors compared with methods that use variable-length codes.
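Frequency-sensitive competitive learning can be sketched compactly: the distance to each codeword is scaled by how often that codeword has already won, keeping rarely used codewords competitive. The following Python sketch simplifies the fairness function and training schedule relative to the paper; all settings are illustrative.

```python
import numpy as np

def fscl_codebook(data, n_codes=16, epochs=5, lr=0.1, seed=0):
    """Train a VQ codebook with frequency-sensitive competitive learning."""
    rng = np.random.default_rng(seed)
    codes = data[rng.choice(len(data), n_codes, replace=False)].copy()
    wins = np.ones(n_codes)                # win counts (the "frequency")
    for _ in range(epochs):
        for x in data:
            d = np.sum((codes - x) ** 2, axis=1) * wins  # fairness-weighted distance
            w = int(np.argmin(d))                        # winner takes the update
            codes[w] += lr * (x - codes[w])              # move toward the input
            wins[w] += 1
    return codes

# Toy training set of 2-D "image vectors".
rng = np.random.default_rng(1)
data = rng.normal(size=(1000, 2))
print(fscl_codebook(data).shape)   # (16, 2)
```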
Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing
2003-01-01
SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as “part of the code that solves a fundamental equation by a specified numerical method.” SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.
Embedded Systems Hardware Integration and Code Development for Maraia Capsule and E-MIST
NASA Technical Reports Server (NTRS)
Carretero, Emmanuel S.
2015-01-01
The cost of sending large spacecraft to orbit makes them undesirable for carrying out smaller scientific missions. Small spacecraft are more economical and can be tailored for missions where specific tasks need to be carried out; the Maraia capsule is such a spacecraft. Maraia will allow for samples of experiments conducted on the International Space Station to be returned to Earth. The use of balloons to conduct experiments at the edge of space is a practical approach to reducing the large expense of using rockets. E-MIST is a payload designed to fly on a high altitude balloon. It can maintain science experiments in a controlled manner at the edge of space. The work covered here entails the integration of hardware onto each of the mentioned systems and the code associated with such work. In particular, the resistance temperature detector, pressure transducers, cameras, and thrusters for Maraia are discussed. The integration of the resistance temperature detectors and motor controllers into E-MIST is described. Several issues associated with sensor accuracy, code lock-up, and in-flight resets are mentioned. The solutions and proposed solutions to these issues are explained.
NASA Technical Reports Server (NTRS)
Rowe, C. K.
1971-01-01
The symbolic manipulation capabilities of the FORMAC (Formula Manipulation Compiler) language are employed to expand and analytically evaluate integrals. The program integration is effected by expanding the integral(s) into a series of subintegrals and then substituting a pre-derived and pre-coded solution for that particular subintegral. Derivation of the integral solutions necessary for precoding is included, as is a discussion of the FORMAC system limitations encountered in the programming effort.
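The expand-then-substitute pattern described above is easy to reproduce with a modern symbolic package. The SymPy sketch below (SymPy standing in for FORMAC, which targeted IBM systems of that era) expands an integrand into a series of subintegrals and integrates them term by term; the integrand is an arbitrary illustrative choice.

```python
import sympy as sp

x = sp.symbols('x')
integrand = (x + 1) ** 3 * sp.exp(2 * x)

# Expand the integral into a series of subintegrals...
subintegrals = sp.expand(integrand).as_ordered_terms()
# ...then substitute a known solution for each subintegral.
solution = sum(sp.integrate(term, x) for term in subintegrals)

# Check against direct integration of the original expression.
print(sp.simplify(solution - sp.integrate(integrand, x)))   # 0
```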
2-Step scalar deadzone quantization for bitplane image coding.
Auli-Llinas, Francesc
2013-12-01
Modern lossy image coding systems generate a quality progressive codestream that, truncated at increasing rates, produces an image with decreasing distortion. Quality progressivity is commonly provided by an embedded quantizer that employs uniform scalar deadzone quantization (USDQ) together with a bitplane coding strategy. This paper introduces a 2-step scalar deadzone quantization (2SDQ) scheme that achieves the same coding performance as USDQ while reducing the coding passes and the emitted symbols of the bitplane coding engine. This serves to reduce the computational costs of the codec and/or to code high dynamic range images. The main insights behind 2SDQ are the use of two quantization step sizes that approximate wavelet coefficients with more or less precision depending on their density, and a rate-distortion optimization technique that adjusts the distortion decreases produced when coding 2SDQ indexes. The integration of 2SDQ into current codecs is straightforward. The applicability and efficiency of 2SDQ are demonstrated within the framework of JPEG2000.
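A rough sketch of the quantizer family helps fix ideas. The Python functions below implement a uniform scalar deadzone quantizer and a two-step variant with a fine step for small-magnitude coefficients and a coarse step for large ones; the thresholds and step sizes are arbitrary illustrative choices, not the paper's rate-distortion-optimized settings.

```python
import numpy as np

def usdq(c, delta):
    """Uniform scalar deadzone quantization: sign and magnitude index,
    with the deadzone centered on zero."""
    return np.sign(c) * np.floor(np.abs(c) / delta)

def two_step_sdq(c, delta_small, delta_large, split):
    """Two step sizes: fine steps below `split`, coarse steps above it."""
    a = np.abs(c)
    idx_small = np.floor(a / delta_small)
    idx_large = np.floor(split / delta_small + (a - split) / delta_large)
    return np.sign(c) * np.where(a < split, idx_small, idx_large)

coeffs = np.array([0.2, -1.4, 3.0, 40.0, -120.0])
print(usdq(coeffs, 1.0))                            # [ 0. -1.  3.  40. -120.]
print(two_step_sdq(coeffs, 1.0, 8.0, split=8.0))    # fewer indexes for large values
```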
Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Hayward, William G; Ewing, Louise
2014-06-01
Despite their similarity as visual patterns, we can discriminate and recognize many thousands of faces. This expertise has been linked to 2 coding mechanisms: holistic integration of information across the face and adaptive coding of face identity using norms tuned by experience. Recently, individual differences in face recognition ability have been discovered and linked to differences in holistic coding. Here we show that they are also linked to individual differences in adaptive coding of face identity, measured using face identity aftereffects. Identity aftereffects correlated significantly with several measures of face-selective recognition ability. They also correlated marginally with own-race face recognition ability, suggesting a role for adaptive coding in the well-known other-race effect. More generally, these results highlight the important functional role of adaptive face-coding mechanisms in face expertise, taking us beyond the traditional focus on holistic coding mechanisms. PsycINFO Database Record (c) 2014 APA, all rights reserved.
CFD Code Validation of Wall Heat Fluxes for a GO2/GH2 Single Element Combustor
NASA Technical Reports Server (NTRS)
Lin, Jeff; West, Jeff S.; Williams, Robert W.; Tucker, P. Kevin
2005-01-01
This paper puts forth the case for the need for improved injector design tools to meet NASA's Vision for Space Exploration goals. Requirements for this improved tool are outlined and discussed. The potential for Computational Fluid Dynamics (CFD) to meet these requirements is noted along with its current shortcomings, especially relative to demonstrated solution accuracy. The concept of verification and validation is introduced as the primary process for building and quantifying the confidence necessary for CFD to be useful as an injector design tool. The verification and validation process is considered in the context of the Marshall Space Flight Center (MSFC) Combustion Devices CFD Simulation Capability Roadmap via the Simulation Readiness Level (SRL) concept. The portion of the validation process which demonstrates the ability of a CFD code to simulate heat fluxes to a rocket engine combustor wall is the focus of the current effort. The FDNS and Loci-CHEM codes are used to simulate a shear coaxial single element GO2/GH2 injector experiment. The experiment was conducted at a chamber pressure of 750 psia using hot propellants from preburners. A measured wall temperature profile is used as a boundary condition to facilitate the calculations. Converged solutions, obtained from both codes by using wall functions with the k-ε turbulence model and by integrating to the wall using Menter's baseline turbulence model, are compared to the experimental data. The initial solutions from both codes revealed significant issues with the wall function implementation associated with the recirculation zone between the shear coaxial jet and the chamber wall. The FDNS solution with a corrected implementation shows marked improvement in overall character and level of comparison to the data. With the FDNS code, integrating to the wall with Menter's baseline turbulence model actually produced a degraded solution when compared to the wall function solution with the k-ε model. The Loci-CHEM solution, produced by integrating to the wall with Menter's baseline turbulence model, matches both the heat flux rise rate in the near injector region and the peak heat flux level very well. However, it moderately over-predicts the heat fluxes downstream of the reattachment point. The Loci-CHEM solution achieved by integrating to the wall with Menter's baseline turbulence model was clearly superior to the other solutions produced in this effort.
Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, an interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neely, J. R.; Hornung, R.; Black, A.
This document serves as a detailed companion to the powerpoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782 titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014, and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials, and a letter of completion signed by the review committee will act as proof of completion for this milestone.
Guarani Morphology in Paraguayan Spanish: Insights from Code-Mixing Typology
ERIC Educational Resources Information Center
Estigarribia, Bruno
2017-01-01
In this paper we examine the use of Guarani affixes and clitics in colloquial Paraguayan Spanish. We depart from the traditional view of these as "borrowings," and instead explore the idea that these phenomena can be integrated within Muysken's (2000, 2013, 2014) typology of code-mixing. We claim that most of these uses may stem from a…
A Hybrid Constraint Representation and Reasoning Framework
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wan-Lin
2003-01-01
This paper introduces JNET, a novel constraint representation and reasoning framework that supports procedural constraints and constraint attachments, providing a flexible way of integrating the constraint reasoner with a run-time software environment. Attachments in JNET are constraints over arbitrary Java objects, which are defined using Java code, at runtime, with no changes to the JNET source code.
Secured Transactions: An Integrated Classroom Approach Using Financial Statements and Acronyms
ERIC Educational Resources Information Center
Seganish, W. Michael
2005-01-01
Students struggle with the subject of secured transactions under the Uniform Commercial Code. In this article, the author presents a method that uses balance-sheet information to help students visualize the difference between secured and unsecured creditors. The balance sheet is also used in the Uniform Commercial Code process, in which one must…
Multiplier Architecture for Coding Circuits
NASA Technical Reports Server (NTRS)
Wang, C. C.; Truong, T. K.; Shao, H. M.; Deutsch, L. J.
1986-01-01
Multipliers based on a new algorithm for Galois-field (GF) arithmetic are regular and expandable. Pipeline structures are used for computing both multiplications and inverses. The designs are suitable for implementation in very-large-scale integrated (VLSI) circuits. This general type of inverter and multiplier architecture is especially useful in performing the finite-field arithmetic of Reed-Solomon error-correcting codes and of some cryptographic algorithms.
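The finite-field arithmetic in question can be illustrated in software. The Python sketch below multiplies in GF(2^8) by shift-and-add with reduction modulo an irreducible polynomial (0x11D is a common Reed-Solomon choice) and inverts by brute force; the VLSI architectures in the article compute the same arithmetic with pipelined structures rather than loops.

```python
def gf256_mul(a, b, poly=0x11D):
    """Multiply in GF(2^8): carry-less shift-and-add, reducing by `poly`."""
    r = 0
    while b:
        if b & 1:
            r ^= a               # "addition" is XOR in characteristic 2
        b >>= 1
        a <<= 1
        if a & 0x100:            # degree-8 overflow: reduce by field polynomial
            a ^= poly
    return r

def gf256_inv(a, poly=0x11D):
    """Brute-force inverse; real decoders use log tables or a^254 (Fermat)."""
    for x in range(1, 256):
        if gf256_mul(a, x, poly) == 1:
            return x
    raise ZeroDivisionError("0 has no inverse")

# Known worked example from FIPS-197 (AES field polynomial 0x11B).
assert gf256_mul(0x57, 0x13, poly=0x11B) == 0xFE
print(hex(gf256_inv(0x53)))
```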
Teachers' Perceptions of the Effect of Their Attire on Middle-School Students' Behavior and Learning
ERIC Educational Resources Information Center
Sampson, Elizabeth Clemons
2016-01-01
Teachers were once held to a professional dress code. This code has become lax, resulting in teachers dressing in more casual attire. A local middle school in rural Georgia was experiencing complaints about teachers' unprofessional attire from other teachers, administrators, and parents. Teachers play an integral role in modeling cultural and…
Geographic Information Systems: A Primer
1990-10-01
Approved for public release; distribution unlimited. …utilizing sophisticated integrated databases (usually vector-based), avoid the indirect value coding scheme by recognizing names or direct magnitudes… intricate involvement required by the operator in order to establish a functional coding scheme. A simple raster system, in which cell values indicate…
NASA Technical Reports Server (NTRS)
Pratt, D. T.
1984-01-01
Conventional algorithms for the numerical integration of ordinary differential equations (ODEs) are based on the use of polynomial functions as interpolants. However, the exact solutions of stiff ODEs behave like decaying exponential functions, which are poorly approximated by polynomials. An obvious choice of interpolant is the exponential functions themselves, or their low-order diagonal Pade (rational function) approximants. A number of explicit, A-stable integration algorithms were derived from the use of a three-parameter exponential function as interpolant, and their relationship to low-order, polynomial-based and rational-function-based implicit and explicit methods was shown by examining their low-order diagonal Pade approximants. A robust implicit formula was derived by exponentially fitting the trapezoidal rule. Application of these algorithms to the integration of the ODEs governing homogeneous, gas-phase chemical kinetics was demonstrated in a developmental code CREK1D, which compares favorably with the Gear-Hindmarsh code LSODE in spite of the use of a primitive stepsize control strategy.
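Exponential fitting of the trapezoidal rule can be demonstrated on the scalar test problem y' = λy: the quadrature weight θ is chosen so the step is exact for exp(λh), and θ → 1/2 as λh → 0, recovering the classical rule. The following Python sketch is a minimal illustration under those assumptions, not the CREK1D implementation.

```python
import math

def theta_fit(z):
    """theta = (e^z - 1 - z) / (z*(e^z - 1)); limit 1/2 as z -> 0."""
    if abs(z) < 1e-8:
        return 0.5
    ez = math.exp(z)
    return (ez - 1.0 - z) / (z * (ez - 1.0))

def exp_fitted_trap(y0, lam, h, nsteps):
    """Fitted trapezoidal rule for y' = lam*y:
    y_{n+1} = y_n + h*((1-th)*f_n + th*f_{n+1}), solved for f = lam*y."""
    th = theta_fit(lam * h)
    y = y0
    for _ in range(nsteps):
        y = y * (1.0 + (1.0 - th) * h * lam) / (1.0 - th * h * lam)
    return y

# One large step over a very stiff decay reproduces exp(-10) to roundoff.
print(exp_fitted_trap(1.0, -1000.0, 0.01, 1), math.exp(-10.0))
```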
NOAA draft scientific integrity policy: Comment period open through 20 August
NASA Astrophysics Data System (ADS)
Showstack, Randy
2011-08-01
The National Oceanic and Atmospheric Administration (NOAA) is aiming to finalize its draft scientific integrity policy possibly by the end of the year, Larry Robinson, NOAA assistant secretary for conservation and management, indicated during a 28 July teleconference. The policy “is key to fostering an environment where science is encouraged, nurtured, respected, rewarded, and protected,” Robinson said, adding that the agency's comment period for the draft policy, which was released on 16 June, ends on 20 August. “Science underpins all that NOAA does. This policy is one piece of a broader effort to strengthen NOAA science,” Robinson said, noting that the draft “represents the first ever scientific integrity policy for NOAA. Previously, our policy only addressed research misconduct and focused on external grants. What's new about this policy is that it establishes NOAA's principles for scientific integrity, a scientific code of conduct, and a code of ethics for science supervision and management.”
van den Berg, Ronald; Roerdink, Jos B. T. M.; Cornelissen, Frans W.
2010-01-01
An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called “crowding”. Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, “compulsory averaging”, and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality. PMID:20098499
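The averaging behavior falls out of a few lines of population-coding arithmetic. In the Python sketch below, a bank of orientation-tuned units responds to two nearby orientations, the responses are pooled (spatial integration), and a population-vector decoder returns roughly the mean orientation; the tuning curves and all numbers are illustrative, not fitted to the paper's model.

```python
import numpy as np

prefs = np.linspace(0, 180, 36, endpoint=False)   # preferred orientations (deg)

def population_response(theta, kappa=2.0):
    """Von Mises tuning on the doubled angle (orientation period is 180 deg)."""
    d = np.deg2rad(2 * (theta - prefs))
    return np.exp(kappa * (np.cos(d) - 1))

def decode(r):
    """Population-vector readout on the doubled angle."""
    z = np.sum(r * np.exp(1j * np.deg2rad(2 * prefs)))
    return np.rad2deg(np.angle(z)) / 2 % 180

# Pool responses to a target and a nearby flanker, then decode.
pooled = population_response(20) + population_response(40)
print(decode(pooled))   # ~30: the average, not either component
```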
NASA Technical Reports Server (NTRS)
Cheng, Michael K.; Lyubarev, Mark; Nakashima, Michael A.; Andrews, Kenneth S.; Lee, Dennis
2008-01-01
Low-density parity-check (LDPC) codes are the state of the art in forward error correction (FEC) technology, exhibiting capacity-approaching performance. The Jet Propulsion Laboratory (JPL) has designed a family of LDPC codes that are similar in structure and therefore lead to a single decoder implementation. The Accumulate-Repeat-by-4-Jagged-Accumulate (AR4JA) code design offers a family of codes with rates 1/2, 2/3, 4/5 and lengths 1024, 4096, 16384 information bits. Performance is less than one dB from capacity for all combinations. Integrating a stand-alone LDPC decoder with a commercial-off-the-shelf (COTS) receiver faces additional challenges beyond those of building a single receiver-decoder unit from scratch. In this work, we outline the issues and show that these additional challenges can be overcome by simple solutions. To demonstrate that an LDPC decoder can be made to work seamlessly with a COTS receiver, we interface an AR4JA LDPC decoder developed on a field-programmable gate array (FPGA) with a modern high data rate receiver and measure the combined receiver-decoder performance. Through optimizations that include an improved frame synchronizer and different soft-symbol scaling algorithms, we show that a combined implementation loss of less than one dB is possible and, therefore, most of the coding gain evident in theory can also be obtained in practice. Our techniques can benefit any modem that utilizes an advanced FEC code.
Holsclaw, Tracy; Hallgren, Kevin A; Steyvers, Mark; Smyth, Padhraic; Atkins, David C
2015-12-01
Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased Type I and Type II error rates, yet these statistical issues are not widely known within SUD treatment research or, more generally, within psychotherapy coding research. Using minimally technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in online supplemental materials. (c) 2016 APA, all rights reserved.
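A hedged sketch of such a model in Python (the paper's own syntax is in SPSS and R): a negative binomial GLM with frequency weights and session length entered as an exposure offset, via statsmodels. The simulated data, weights, and dispersion value are illustrative only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
therapist_skill = rng.normal(size=n)               # predictor
session_minutes = rng.uniform(20, 60, size=n)      # exposure (session length)
mu = np.exp(0.5 * therapist_skill) * session_minutes / 40
y = rng.poisson(mu * rng.gamma(2.0, 0.5, size=n))  # overdispersed behavior counts
w = rng.uniform(0.5, 1.5, size=n)                  # e.g., reliability weights

# Negative binomial GLM; `exposure` enters as a log offset, so counts are
# modeled per unit of session time rather than conflated with it.
X = sm.add_constant(therapist_skill)
model = sm.GLM(y, X,
               family=sm.families.NegativeBinomial(alpha=0.5),
               exposure=session_minutes,
               freq_weights=w)
print(model.fit().summary())
```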
Integrating automated structured analysis and design with Ada programming support environments
NASA Technical Reports Server (NTRS)
Hecht, Alan; Simmons, Andy
1986-01-01
Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also promote the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.
Combined Wavelet Video Coding and Error Control for Internet Streaming and Multicast
NASA Astrophysics Data System (ADS)
Chu, Tianli; Xiong, Zixiang
2003-12-01
This paper proposes an integrated approach to Internet video streaming and multicast (e.g., receiver-driven layered multicast (RLM) by McCanne) based on combined wavelet video coding and error control. We design a packetized wavelet video (PWV) coder to facilitate its integration with error control. The PWV coder produces packetized layered bitstreams that are independent among layers while being embedded within each layer. Thus, a lost packet only renders the following packets in the same layer useless. Based on the PWV coder, we search for a multilayered error-control strategy that optimally trades off source and channel coding for each layer under a given transmission rate to mitigate the effects of packet loss. While both the PWV coder and the error-control strategy are new—the former incorporates embedded wavelet video coding and packetization and the latter extends the single-layered approach for RLM by Chou et al.—the main distinction of this paper lies in the seamless integration of the two parts. Theoretical analysis shows a gain of up to 1 dB on a channel with 20% packet loss using our combined approach over separate designs of the source coder and the error-control mechanism. This is also substantiated by our simulations with a gain of up to 0.6 dB. In addition, our simulations show a gain of up to 2.2 dB over previous results reported by Chou et al.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter Andrew
The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-04-01
The phase appearance/disappearance issue presents serious numerical challenges in two-phase flow simulations. Many existing reactor safety analysis codes use different kinds of treatments for the phase appearance/disappearance problem; however, to the best of our knowledge, none is fully satisfactory. Additionally, the majority of existing reactor system analysis codes were developed using low-order numerical schemes in both space and time. In many situations, it is desirable to use high-resolution spatial discretization and fully implicit time integration schemes to reduce numerical errors. In this work, we adapted a high-resolution spatial discretization scheme on a staggered grid mesh and fully implicit time integration methods (such as BDF1 and BDF2) to solve two-phase flow problems. The discretized nonlinear system was solved by the Jacobian-free Newton-Krylov (JFNK) method, which does not require the derivation and implementation of an analytical Jacobian matrix. These methods were tested on two-phase flow problems in which phase appearance/disappearance phenomena arise, such as a linear advection problem, an oscillating manometer problem, and a sedimentation problem. The JFNK method demonstrated extremely robust and stable behavior in solving two-phase flow problems with phase appearance/disappearance. No special treatments such as water level tracking or void fraction limiting were used. The high-resolution spatial discretization and second-order fully implicit method also demonstrated their capability to significantly reduce numerical errors.
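The defining feature of JFNK is that the Newton linear solve needs only Jacobian-vector products, which can be approximated by finite differences of the residual, so the Jacobian is never formed or stored. A minimal sketch of that mechanism, using a toy residual from a BDF1 step rather than the two-phase flow equations of this work:

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def jfnk_solve(F, u0, tol=1e-8, max_newton=20, eps=1e-7):
        u = u0.astype(float).copy()
        for _ in range(max_newton):
            r = F(u)
            if np.linalg.norm(r) < tol:
                break
            # Jacobian-vector product J(u) v ~ (F(u + eps*v) - F(u)) / eps
            J = LinearOperator((u.size, u.size),
                               matvec=lambda v: (F(u + eps * v) - r) / eps)
            du, _ = gmres(J, -r)  # Krylov solve sees only matvec, no matrix
            u = u + du
        return u

    # Toy residual: one BDF1 step of du/dt = f(u), i.e. (u - u_old)/dt - f(u) = 0
    u_old, dt = np.array([1.0, 0.5]), 0.1
    f = lambda u: np.array([-u[0] * u[1], u[0] - u[1] ** 2])
    F = lambda u: (u - u_old) / dt - f(u)
    print(jfnk_solve(F, u_old))

In a production code the residual would come from the discretized two-phase equations on the staggered grid, and the Krylov iteration would normally be preconditioned.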
Personalized Guideline-Based Treatment Recommendations Using Natural Language Processing Techniques.
Becker, Matthias; Böckmann, Britta
2017-01-01
Clinical guidelines and clinical pathways are accepted and proven instruments for quality assurance and process optimization. Today, electronic representations of clinical guidelines exist as unstructured text but are not well integrated with patient-specific information from electronic health records. Consequently, the generic content of clinical guidelines is accessible, but it is not possible to visualize the patient's position on the clinical pathway, and decision support for the next treatment step cannot be provided by personalized guidelines. The Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT) provides a common reference terminology as well as the semantic link for combining the pathways with patient-specific information. This paper proposes a model-based approach to support the development of guideline-compliant pathways combined with patient-specific structured and unstructured information using SNOMED CT. To identify SNOMED CT concepts, a software tool was developed that extracts SNOMED CT codes from structured and unstructured German data and maps them to clinical pathways annotated in accordance with the systematized nomenclature.
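The extraction step can be pictured as a terminology lookup over normalized tokens. The sketch below is a deliberately minimal stand-in for the paper's software, assuming a tiny hand-built German lexicon; a real system would load a full SNOMED CT translation table and handle multi-word terms and inflections.

    import re

    # Illustrative mini-lexicon mapping German terms to SNOMED CT concept IDs
    # (the mapping table itself is an assumption of this sketch).
    LEXICON = {
        "mammakarzinom": "254837009",   # malignant tumor of breast
        "chemotherapie": "367336001",   # chemotherapy (procedure)
    }

    def extract_snomed(text):
        """Return (term, concept ID) pairs found in free-text German input."""
        tokens = re.findall(r"[a-zäöüß]+", text.lower())
        return [(t, LEXICON[t]) for t in tokens if t in LEXICON]

    print(extract_snomed("Patientin mit Mammakarzinom, Beginn der Chemotherapie."))

The resulting concept IDs are what allow the free text to be matched against pathway steps annotated with the same terminology.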
Integrating Buprenorphine Treatment into Office-based Practice: a Qualitative Study
Irwin, Kevin S.; Jones, Emlyn S.; Becker, William C.; Tetrault, Jeanette M.; Sullivan, Lynn E.; Hansen, Helena; O’Connor, Patrick G.; Schottenfeld, Richard S.; Fiellin, David A.
2008-01-01
BACKGROUND Despite the availability and demonstrated effectiveness of office-based buprenorphine maintenance treatment (BMT), the systematic examination of physicians’ attitudes towards this new medical practice has been largely neglected. OBJECTIVE To identify facilitators and barriers to the potential or actual implementation of BMT by office-based medical providers. DESIGN Qualitative study using individual and group semi-structured interviews. PARTICIPANTS Twenty-three practicing office-based physicians in New England. APPROACH Interviews were audiotaped, transcribed, and entered into a qualitative software program. The transcripts were thematically coded using the constant comparative method by a multidisciplinary team. RESULTS Eighty percent of the physicians were white; 55% were women. The mean number of years since graduating medical school was 14 (SD = 10). The primary areas of clinical specialization were internal medicine (50%), infectious disease (20%), and addiction medicine (15%). Physicians identified physician, patient, and logistical factors that would either facilitate or serve as a barrier to their integration of BMT into clinical practice. Physician facilitators included promoting continuity of patient care, positive perceptions of BMT, and viewing BMT as a positive alternative to methadone maintenance. Physician barriers included competing activities, lack of interest, and lack of expertise in addiction treatment. Physicians’ perceptions of patient-related barriers included concerns about confidentiality and cost, and low motivation for treatment. Perceived logistical barriers included lack of remuneration for BMT, limited ancillary support for physicians, not enough time, and a perceived low prevalence of opioid dependence in physicians’ practices. CONCLUSIONS Addressing physicians’ perceptions of facilitators and barriers to BMT is crucial to supporting the further expansion of BMT into primary care and office-based practices. PMID:19089500
Are Registration of Disease Codes for Adult Anaphylaxis Accurate in the Emergency Department?
Choi, Byungho; Lee, Hyeji
2018-01-01
Purpose There has been active research on anaphylaxis, but many studies are limited to patients registered under anaphylaxis codes. However, anaphylaxis codes tend to be underused. The aim of this study was to investigate the accuracy of anaphylaxis code registration and the clinical characteristics of accurately and inaccurately registered anaphylactic patients. Methods This retrospective study evaluated the medical records of adult patients who visited the university hospital emergency department between 2012 and 2016. The study subjects were divided into an accurate coding group, registered under anaphylaxis disease codes, and an inaccurate coding group, registered under other allergy-related or symptom-related codes. Results Among 211,486 patients, 618 (0.29%) had anaphylaxis. Of these, 161 and 457 were assigned to the accurate and inaccurate coding groups, respectively. The average age, transportation to the emergency department, past anaphylaxis history, cancer history, and the cause of anaphylaxis differed between the 2 groups. Cutaneous symptoms manifested more frequently in the inaccurate coding group, while cardiovascular and neurologic symptoms were more frequently observed in the accurate coding group. Severe symptoms and non-alert consciousness were more common in the accurate coding group. Oxygen supply, intubation, and epinephrine were more commonly used as treatments for anaphylaxis in the accurate coding group. Anaphylactic patients with cardiovascular symptoms, severe symptoms, and epinephrine use were more likely to be accurately registered with anaphylaxis disease codes. Conclusions In cases of anaphylaxis, more patients were registered inaccurately under other allergy-related or symptom-related codes than accurately under anaphylaxis disease codes. Cardiovascular symptoms, severe symptoms, and epinephrine treatment were factors associated with accurate registration of anaphylaxis disease codes in patients with anaphylaxis. PMID:29411554
PMD mitigation through interleaving LDPC codes with polarization scramblers
NASA Astrophysics Data System (ADS)
Han, Dahai; Chen, Haoran; Xi, Lixia
2013-09-01
The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has proven to be an effective method for mitigating polarization mode dispersion (PMD) in high-speed optical fiber communication systems. In this article, low-density parity-check (LDPC) codes, among the most promising FEC codes for achieving better performance, are introduced into the PMD mitigation scheme with D-FPSs. The scrambling speed of the FPS for an LDPC (2040, 1903) coded system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in practical large-scale integrated (LSI) circuits, the number of iterations in decoding the LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of the LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulations, interleaving the LDPC codes brings an incremental improvement in error correction, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results show that LDPC codes can substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.
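Interleaving helps here because polarization-related impairments produce bursty errors, while LDPC decoding works best when the errors within a codeword look random. A minimal block-interleaver sketch (generic, with assumed dimensions, not the paper's LDPC system) shows how a channel burst is dispersed across codewords:

    import numpy as np

    def interleave(bits, rows, cols):
        # Write row-wise, read column-wise.
        return np.asarray(bits).reshape(rows, cols).T.reshape(-1)

    def deinterleave(bits, rows, cols):
        # Inverse permutation of interleave().
        return np.asarray(bits).reshape(cols, rows).T.reshape(-1)

    rng = np.random.default_rng(1)
    data = rng.integers(0, 2, 24)          # 4 codewords of 6 bits each (toy sizes)
    tx = interleave(data, 4, 6)
    tx[8:12] ^= 1                          # a burst of 4 flipped bits on the channel
    rx = deinterleave(tx, 4, 6)
    # After deinterleaving, the burst lands one error per codeword (row).
    print("errors per codeword:", (rx != data).reshape(4, 6).sum(axis=1))

Each row then presents the decoder with a single, easily correctable error instead of one row carrying the whole burst.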
Auvray, F; Coddeville, M; Ritzenthaler, P; Dupont, L
1997-01-01
Bacteriophage mv4 is a temperate phage infecting Lactobacillus delbrueckii subsp. bulgaricus. During lysogenization, the phage integrates its genome into the host chromosome at the 3' end of a tRNA(Ser) gene through a site-specific recombination process (L. Dupont et al., J. Bacteriol., 177:586-595, 1995). A nonreplicative vector (pMC1) based on the mv4 integrative elements (attP site and integrase-coding int gene) is able to integrate into the chromosome of a wide range of bacterial hosts, including Lactobacillus plantarum, Lactobacillus casei (two strains), Lactococcus lactis subsp. cremoris, Enterococcus faecalis, and Streptococcus pneumoniae. Integrative recombination of pMC1 into the chromosomes of all of these species is dependent on the int gene product and occurs specifically at the pMC1 attP site. The isolation and sequencing of pMC1 integration sites from these bacteria showed that in lactobacilli, pMC1 integrated into the conserved tRNA(Ser) gene. In the other bacterial species, where this tRNA gene is less conserved or absent, secondary integration sites either in potential protein-coding regions or in intergenic DNA were used. A consensus sequence was deduced from the analysis of the different integration sites. The comparison of these sequences demonstrated the flexibility of the integrase with respect to the bacterial integration site and suggested the importance of the trinucleotide CCT at the 5' end of the core in the strand exchange reaction. PMID:9068626
Reconciling Spiritual Values Conflicts for Counselors and Lesbian and Gay Clients
ERIC Educational Resources Information Center
Fallon, Kathleen M.; Dobmeier, Robert A.; Reiner, Summer M.; Casquarelli, Elaine J.; Giglia, Lauren A.; Goodwin, Eric
2013-01-01
Counselors and lesbian and gay clients experience parallel values conflicts between religious beliefs/spirituality and sexual orientation. This article uses critical thinking to assist counselors to integrate religious/spiritual beliefs with professional ethical codes. Clients are assisted to integrate religious/spiritual beliefs with sexual…
Production code control system for hydrodynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slone, D.M.
1997-08-18
We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.
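The backplane idea is that the controller stores only a minimal launch recipe per tool, so new tools plug in without modifying existing codes. A rough sketch of that pattern, written here in Python rather than the Perl of the original, with hypothetical tool names:

    import subprocess

    class Backplane:
        def __init__(self):
            self.tools = {}  # tool name -> command template

        def register(self, name, cmd_template):
            """Teach the backplane the minimal launch recipe for a tool."""
            self.tools[name] = cmd_template

        def run(self, name, **params):
            # Fill in run-specific parameters and launch the tool uniformly.
            cmd = [part.format(**params) for part in self.tools[name]]
            return subprocess.run(cmd, capture_output=True, text=True)

    bp = Backplane()
    bp.register("preprocess", ["mesh_gen", "--input", "{deck}"])   # hypothetical tools
    bp.register("hydro", ["hydrocode", "--restart", "{dump}"])
    # bp.run("preprocess", deck="problem.in")  # every tool sees the same interface

The design choice is the same one the abstract describes: the framework knows almost nothing about each tool, so adding one is a registration step rather than a code change.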
Summary of papers on current and anticipated uses of thermal-hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low-pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; and build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
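The essence of steps (3) through (5) is sampling the input-parameter distributions and pushing each sample through the deterministic code. A generic Monte Carlo sketch of that idea follows; the dose kernel and distributions are toy stand-ins, not RESRAD's actual models or defaults.

    import numpy as np

    rng = np.random.default_rng(42)
    N = 10_000

    # Step-3-style parameter distributions (forms and values assumed for illustration)
    soil_conc = rng.lognormal(mean=0.0, sigma=0.5, size=N)   # Bq/g
    intake    = rng.triangular(0.5, 1.0, 2.0, size=N)        # g/day
    dose_coef = 1.3e-4                                       # mSv per Bq (assumed)

    def dose_model(conc, intake, coef):
        """Deterministic kernel standing in for the dose calculation."""
        return conc * intake * coef * 365.0

    doses = dose_model(soil_conc, intake, dose_coef)
    print(f"mean dose {doses.mean():.3g} mSv/yr, "
          f"95th percentile {np.percentile(doses, 95):.3g} mSv/yr")

The output distribution, rather than a single deterministic number, is what lets the probabilistic versions quantify how uncertain inputs propagate into dose uncertainty.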
Treatment of isomers in nucleosynthesis codes
NASA Astrophysics Data System (ADS)
Reifarth, René; Fiebiger, Stefan; Göbel, Kathrin; Heftrich, Tanja; Kausch, Tanja; Köppchen, Christoph; Kurtulgil, Deniz; Langer, Christoph; Thomas, Benedikt; Weigand, Mario
2018-03-01
The decay properties of long-lived excited states (isomers) can have a significant impact on the destruction channels of isotopes under stellar conditions. In sufficiently hot environments, the population of isomers can be altered via thermal excitation or de-excitation. If the corresponding lifetimes are of the same order of magnitude as the typical time scales of the environment, the isomers have to be treated explicitly. We present a general approach to the treatment of isomers in stellar nucleosynthesis codes and discuss a few illustrative examples. The corresponding code is available online at http://exp-astro.de/isomers/.
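The criterion in the abstract, comparing isomer coupling lifetimes with the environment's timescale, can be illustrated with a two-state toy model: the ground state and the isomer each decay at their own rate and exchange population through thermal excitation and de-excitation. All rates below are assumed for illustration and do not correspond to any particular isotope.

    import numpy as np
    from scipy.integrate import solve_ivp

    lam_g, lam_m = 1e-6, 1e-2      # decay rates of ground state / isomer (1/s)
    exc, deexc   = 1e-4, 1e-3      # thermal excitation / de-excitation rates (1/s)

    def rhs(t, n):
        ng, nm = n
        return [-(lam_g + exc) * ng + deexc * nm,
                exc * ng - (lam_m + deexc) * nm]

    sol = solve_ivp(rhs, (0.0, 1e4), [1.0, 0.0], rtol=1e-8)
    ng, nm = sol.y[:, -1]
    print(f"surviving fraction after 1e4 s: {ng + nm:.4f}")
    # When the excitation/de-excitation timescales are comparable to the
    # environment's, the isomer must be tracked as a separate species rather
    # than folded into a thermal-equilibrium effective decay rate.

Here the feeding of the short-lived isomer noticeably enhances the destruction of the ground state, which is exactly the effect an explicit isomer treatment is meant to capture.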