ERIC Educational Resources Information Center
Chou, Chun-Mei; Shen, Chien-Hua; Hsiao, Hsi-Chi; Chen, Su-Chang
2017-01-01
This study examines 1630 tertiary students in regard to their entrepreneurial career intentions (ECIs) and their influencing factors. The findings may serve as academic reference for the development of entrepreneurship-related education. The results show that students' computer self-efficacy (CSE) has a significant direct effect on ECIs, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, Daniel; Berzins, Martin; Pennington, Robert
On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Escher, J. E.
Calculations for total cross sections and compound-nucleus (CN) formation cross sections for americium isotopes are described, for use in the 2017 NA-22 evaluation effort. The code ECIS 2006 was used in conjunction with Frank Dietrich's wrapper `runtemplate'.
van Weeghel, Michel; te Brinke, Heleen; van Lenthe, Henk; Kulik, Wim; Minkler, Paul E; Stoll, Maria S K; Sass, Jörn Oliver; Janssen, Uwe; Stoffel, Wilhelm; Schwab, K Otfried; Wanders, Ronald J A; Hoppel, Charles L; Houten, Sander M
2012-10-01
Mitochondrial enoyl-CoA isomerase (ECI1) is an auxiliary enzyme involved in unsaturated fatty acid oxidation. In contrast to most of the other enzymes involved in fatty acid oxidation, a deficiency of ECI1 has yet to be identified in humans. We used wild-type (WT) and Eci1-deficient knockout (KO) mice to explore a potential presentation of human ECI1 deficiency. Upon food withdrawal, Eci1-deficient mice displayed normal blood β-hydroxybutyrate levels (WT 1.09 mM vs. KO 1.10 mM), a trend to lower blood glucose levels (WT 4.58 mM vs. KO 3.87 mM, P=0.09) and elevated blood levels of unsaturated acylcarnitines, in particular C12:1 acylcarnitine (WT 0.03 μM vs. KO 0.09 μM, P<0.01). Feeding an olive oil-rich diet induced an even greater increase in C12:1 acylcarnitine levels (WT 0.01 μM vs. KO 0.04 μM, P<0.01). Overall, the phenotypic presentation of Eci1-deficient mice is mild, possibly caused by the presence of a second enoyl-CoA isomerase (Eci2) in mitochondria. Knockdown of Eci2 in Eci1-deficient fibroblasts caused a more pronounced accumulation of C12:1 acylcarnitine on incubation with unsaturated fatty acids (12-fold, P<0.05). We conclude that Eci2 compensates for Eci1 deficiency explaining the mild phenotype of Eci1-deficient mice. Hypoglycemia and accumulation of C12:1 acylcarnitine might be diagnostic markers to identify ECI1 deficiency in humans.
Gliko, Olga; Saggau, Peter; Brownell, William E
2009-08-19
In the outer hair cell (OHC), the extracisternal space (ECiS) is a conduit and reservoir of the molecular and ionic substrates of the lateral wall, including those necessary for electromotility. To determine the mechanisms through which molecules are transported in the ECiS of the OHC, we selectively imaged the time-dependent spatial distribution of fluorescent molecules in a <100 nm layer near the cell/glass interface of the recording chamber after their photolytic activation in a diffraction-limited volume. The effective diffusion coefficient was calculated using the analytical solution of the diffusion equation. It was found that diffusion in the ECiS is isotropic and not affected by depolarizing the OHC. Compared with free solution, the diffusion of 10 kDa dextran was slowed down in both the ECiS and the axial core by a factor of 4.6 and 1.6, respectively.
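The general idea behind extracting an effective diffusion coefficient from the time-dependent spread of a photoreleased tracer can be sketched as follows; this is a generic illustration, not the authors' code. It assumes a 2-D Gaussian concentration profile whose variance grows as sigma^2(t) = sigma0^2 + 4*D*t, and the values used (D = 5 um^2/s, sigma0^2 = 0.25 um^2) are invented for the example.

```python
import numpy as np

# After photorelease from a point-like source, a 2-D Gaussian profile
# spreads so that its variance grows linearly with time:
#   sigma^2(t) = sigma0^2 + 4*D*t
# so D follows from a linear fit of measured widths against time.

t = np.linspace(0.1, 2.0, 20)             # imaging times (s)
sigma0_sq = 0.25                          # initial spot variance (um^2), illustrative
D_true = 5.0                              # diffusion coefficient (um^2/s), illustrative
sigma_sq = sigma0_sq + 4.0 * D_true * t   # "measured" profile variances (um^2)

# Linear fit: slope = 4*D, intercept = sigma0^2
slope, intercept = np.polyfit(t, sigma_sq, 1)
D_est = slope / 4.0
```

With noisy experimental widths the same fit would return a least-squares estimate of D; a slowdown relative to free solution (the factor 4.6 quoted above) is the ratio of the free-solution D to the fitted effective D.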
The Extreme Climate Index: a novel and multi-hazard index for extreme weather events.
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2017-04-01
In this presentation we introduce the Extreme Climate Index (ECI): an objective, multi-hazard index capable of tracking changes in the frequency or magnitude of extreme weather events in African countries, thus indicating that a shift to a new climate regime is underway in a particular area. This index has been developed in the context of the XCF (eXtreme Climate Facilities) project led by the ARC (African Risk Capacity, a specialised agency of the African Union), and will be used in the payout-triggering mechanism of an insurance programme against risks related to the increase in frequency and magnitude of extreme weather events due to changes in climate regimes. The main hazards covered by the ECI will be extreme dry, wet and heat events, with the possibility of adding region-specific risk events such as tropical cyclones for the most vulnerable areas. It will be based on data coming from consistent, sufficiently long, high-quality historical records and will be standardized across broad geographical regions, so that extreme events occurring under different climatic regimes in Africa can be compared. The first step in constructing such an index is to define single-hazard indicators. In this first study we focused on extreme dry/wet and heat events, describing them respectively with the well-known SPI (Standardized Precipitation Index) and an index we developed, called the SHI (Standardized Heat-waves Index). The second step consists in developing a computational strategy to combine these, and possibly other, indices, so that the ECI can describe, by means of a single indicator, different types of climatic extremes. According to the methodology proposed in this paper, the ECI is defined by two statistical components: the ECI intensity, which indicates whether an event is extreme or not, and the angular component, which represents the contribution of each hazard to the overall intensity of the index.
The ECI can thus be used to identify "extremes" after defining a suitable threshold above which events are held to be extreme. In this presentation, after describing the methodology used for the construction of the ECI, we present results obtained for different African regions, using the NCEP Reanalysis dataset for air temperature at the sig995 level and the CHIRP dataset for precipitation. Particular attention will be devoted to the 2015/2016 Malawi drought, which received some media attention due to the failure of the risk-assessment model used to trigger due payouts: it will be shown how, on the contrary, the combination of hydrological and temperature data used in the ECI succeeds in evaluating the extremeness of this event.
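The two-component construction described above (an intensity plus an angular component over standardized single-hazard indices) can be sketched in a minimal form. This is an illustrative reading of the idea, not the authors' actual formula: it treats an event's SPI and SHI values as Cartesian coordinates, taking the Euclidean norm as the intensity and the polar angle as the hazard mix, and the threshold of 2.0 is an arbitrary example value.

```python
import math

def eci(spi, shi):
    """Combine two standardized hazard indices into a multi-hazard index
    with an intensity and an angular component (illustrative sketch)."""
    intensity = math.hypot(spi, shi)            # overall "extremeness"
    angle = math.degrees(math.atan2(shi, spi))  # contribution of each hazard
    return intensity, angle

def is_extreme(spi, shi, threshold=2.0):
    # An event counts as extreme when the intensity exceeds a chosen threshold.
    return eci(spi, shi)[0] > threshold
```

For example, an event with SPI = 0 and SHI = 2.5 has intensity 2.5 with a 90-degree angle, i.e. a pure heat extreme; a mixed wet-and-hot event would land at an intermediate angle.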
An approach to collective behavior in cell cultures: modeling and analysis of ECIS data
NASA Astrophysics Data System (ADS)
Rabson, David; Lafalce, Evan; Lovelady, Douglas; Lo, Chun-Min
2011-03-01
We review recent results in which statistical measures of noise in ECIS data distinguished healthy cell cultures from cancerous or poisoned ones: after subtracting the "signal," the 1/f^α noise in the healthy cultures shows longer short-time and long-time correlations. We discuss application of an artificial neural network to detect the cancer signal, and we demonstrate a computational model of cell-cell communication that produces signals similar to those of the experimental data. The simulation is based on the q-state Potts model with inspiration from the Bak-Tang-Wiesenfeld sand-pile model. We view the level of organization larger than cells but smaller than organs or tissues as a kind of "mesoscopic" biological physics, in which few-body interactions dominate, and the experiments and computational model as ways of exploring this regime.
Neutron displacement cross-sections for tantalum and tungsten at energies up to 1 GeV
NASA Astrophysics Data System (ADS)
Broeders, C. H. M.; Konobeyev, A. Yu.; Villagrasa, C.
2005-06-01
The neutron displacement cross-section has been evaluated for tantalum and tungsten at energies from 10^-5 eV up to 1 GeV. The nuclear optical model and the intranuclear cascade model combined with pre-equilibrium and evaporation models were used for the calculations. The number of defects produced by recoil atoms in the materials was calculated with the Norgett-Robinson-Torrens model and with an approach combining binary-collision-approximation calculations with results of molecular dynamics simulations. The numerical calculations were done using the NJOY code, the ECIS96 code, the MCNPX code and the IOTA code.
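The Norgett-Robinson-Torrens (NRT) model referenced above gives the number of stable displacements per recoil as a piecewise function of the damage energy and a threshold displacement energy. A minimal sketch follows; the 90 eV default threshold is a commonly quoted value for tungsten, used here purely as an illustrative default, not a value taken from this paper.

```python
def nrt_displacements(damage_energy_ev, e_d_ev=90.0):
    """Stable Frenkel pairs per recoil, Norgett-Robinson-Torrens model.

    damage_energy_ev: energy available for elastic collisions, T_dam (eV)
    e_d_ev: threshold displacement energy (eV); 90 eV is illustrative
    """
    if damage_energy_ev < e_d_ev:
        return 0.0                       # below threshold: no displacement
    if damage_energy_ev < 2.0 * e_d_ev / 0.8:
        return 1.0                       # single-displacement regime
    return 0.8 * damage_energy_ev / (2.0 * e_d_ev)   # cascade regime
```

Folding this per-recoil count over the recoil-energy spectrum produced by the cascade/evaporation models yields the displacement cross-section evaluated in the paper.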
Dello Russo, Antonio; Fassini, Gaetano; Casella, Michela; Bologna, Fabrizio; Al-Nono, Osama; Colombo, Daniele; Biagioli, Viviana; Santangeli, Pasquale; Di Biase, Luigi; Zucchetti, Martina; Majocchi, Benedetta; Marino, Vittoria; Gallinghouse, Joseph J; Natale, Andrea; Tondo, Claudio
2014-06-01
Contact with cardiac tissue is a determinant of lesion efficacy during atrial fibrillation (AF) ablation. The Sensei®X Robotic Catheter System (Hansen Medical, CA) has been validated for contact force sensing. The electrical coupling index (ECI) from the EnSite Contact™ system (St. Jude Medical, MN) has been validated as an indicator of tissue contact. We aimed to analyze ECI behavior during radiofrequency (RF) pulses while maintaining stable contact through the robotic navigation contact system. In 15 patients (age, 59 ± 12) undergoing AF ablation, pulmonary vein (PV) isolation was guided by the Sensei®X System, employing the Contact™ catheter. During the procedure, we assessed ECI changes associated with adequate contact based on the IntelliSense® force-sensing technology (Hansen Medical, CA). The baseline contact (27 ± 8 g/cm²) ECI value was 99 ± 13, whereas ECI values at a noncontact site (0 g/cm²) and at a light-contact site (1-10 g/cm²) were 66 ± 12 and 77 ± 10, respectively (p < 0.0001). Baseline contact ECI values did not differ by AF presentation (paroxysmal AF, 98 ± 9; persistent AF, 100 ± 9) or by cardiac rhythm (sinus rhythm, 97 ± 7; AF, 101 ± 10). In all PVs, ECI was significantly reduced during and after ablation (ECI during RF, 56 ± 15; ECI after RF, 72 ± 16; p < 0.001). A mean reduction of 32.2% during RF delivery and 25.4% immediately after RF discontinuation compared with baseline ECI was observed. Successful PV isolation is associated with a significant decrease in ECI of at least 20%. This may be used as a surrogate marker of effective lesion formation in AF ablation.
Evaluation of the VTI ECI-1 corrosion monitoring device.
DOT National Transportation Integrated Search
2006-01-01
This report describes an evaluation performed by the California Department of Transportation (Department) of the : ECI-1 Embedded Corrosion Instrument, developed by Virginia Technologies, Incorporated (VTI). : The ECI-1 Embedded Corrosion Instrument ...
Crespo-Maraver, Mariacruz; Doval, Eduardo; Fernández-Castro, Jordi; Giménez-Salinas, Jordi; Prat, Gemma; Bonet, Pere
2018-04-04
To adapt and validate the Experience of Caregiving Inventory (ECI) in a Spanish population, providing empirical evidence of its internal consistency, internal structure and validity. Psychometric validation of the adapted version of the ECI. One hundred and seventy-two caregivers (69.2% women), mean age 57.51 years (range: 21-89), participated. Demographic and clinical data and standardized measures (ECI, suffering scale of the SCL-90-R, Zarit burden scale) were used. The two ECI scales of negative evaluation most related to serious mental disorders (disruptive behaviours [DB] and negative symptoms [NS]) and the two scales of positive appreciation (positive personal experiences [PPE] and good aspects of the relationship [GAR]) were analyzed. Exploratory structural equation modelling was used to analyze the internal structure. The relationship between the ECI scales and the SCL-90-R and Zarit scores was also studied. The four-factor model presented a good fit. Cronbach's alpha (DB: 0.873; NS: 0.825; PPE: 0.720; GAR: 0.578) showed higher homogeneity in the negative scales. The SCL-90-R scores correlated with the negative ECI scales, and none of the ECI scales correlated with the Zarit scale. The Spanish version of the ECI can be considered a valid, reliable, understandable and feasible self-report measure for administration in health and community settings. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
Decoding of exon splicing patterns in the human RUNX1-RUNX1T1 fusion gene.
Grinev, Vasily V; Migas, Alexandr A; Kirsanava, Aksana D; Mishkova, Olga A; Siomava, Natalia; Ramanouskaya, Tatiana V; Vaitsiankova, Alina V; Ilyushonak, Ilia M; Nazarov, Petr V; Vallar, Laurent; Aleinikova, Olga V
2015-11-01
The t(8;21) translocation is the most widespread genetic defect found in human acute myeloid leukemia. This translocation results in the RUNX1-RUNX1T1 fusion gene that produces a wide variety of alternative transcripts and influences the course of the disease. The rules of combinatorics and splicing of exons in the RUNX1-RUNX1T1 transcripts are not known. To address this issue, we developed an exon graph model of the fusion gene organization and evaluated its local exon combinatorics by the exon combinatorial index (ECI). Here we show that the local exon combinatorics of the RUNX1-RUNX1T1 gene follows a power-law behavior and (i) the vast majority of exons has a low ECI, (ii) only a small part is represented by "exons-hubs" of splicing with very high ECI values, and (iii) it is scale-free and very sensitive to targeted skipping of "exons-hubs". Stochasticity of the splicing machinery and preferred usage of exons in alternative splicing can explain such behavior of the system. Stochasticity may explain up to 12% of the ECI variance and results in a number of non-coding and unproductive transcripts that can be considered as a noise. Half-life of these transcripts is increased due to the deregulation of some key genes of the nonsense-mediated decay system in leukemia cells. On the other hand, preferred usage of exons may explain up to 75% of the ECI variability. Our analysis revealed a set of splicing-related cis-regulatory motifs that can explain "attractiveness" of exons in alternative splicing but only when they are considered together. Cis-regulatory motifs are guides for splicing trans-factors and we observed a leukemia-specific profile of expression of the splicing genes in t(8;21)-positive blasts. Altogether, our results show that alternative splicing of the RUNX1-RUNX1T1 transcripts follows strict rules and that the power-law component of the fusion gene organization confers a high flexibility to this process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Early Career Investigator Opportunities in Geophysics with IRIS
NASA Astrophysics Data System (ADS)
Colella, H.; Sumy, D. F.; Schutt, D.
2016-12-01
Early career geoscientists face many challenges as they transition from senior level graduate students into postdoctoral researchers, tenure-track faculty positions, or the vast array of employment opportunities outside of academia. However, few receive adequate mentoring or guidance on how to successfully make the leap from graduate school to a fulfilling career. In recognition of these hurdles and challenges, the Incorporated Research Institutions for Seismology (IRIS) created an early-career investigator (ECI) program in 2011 to help reduce barriers for newly minted scientists, researchers, and educators on their path to success. The core mission of the ECI program is to organize practical resources and professional development opportunities for ECIs. The initiative has encouraged and supported collaboration between ECIs and senior scientists through colloquium lectureships and visiting scientist collaborations, which aimed to increase the visibility of ECIs and their research and to promote interaction between junior and senior scientists outside of their home institutions. Additionally, ECI-centric events are held at various national meetings to showcase the range of career paths available in geophysics, openly discuss the challenges ECIs face (e.g., work-life balance, job search difficulties, teaching challenges), expose participants to the ECI program's initiatives and resources, and better inform IRIS about the needs of the community. Post-workshop evaluations reveal ECIs are eager to have exposure to a variety of workforce options and a forum in which to ask difficult questions. Of note, there is a variety of cultural knowledge and expectations assumed in both the academic and professional worlds that is often not formally disseminated. The ECI program aims to better understand and facilitate transfer of this knowledge and reduce barriers to success for ECIs from both traditional and non-traditional backgrounds. 
The program also features webinars focused on a range of research and career topics, which can be attended in real-time or viewed on YouTube, with over 6,000 total views to date. Moving forward, an additional focus will be on intentional mentorship opportunities as we continue to build a productive, supportive, and successful early career community.
Evaluating effective pair and multisite interactions for Ni-Mo system
NASA Astrophysics Data System (ADS)
Banerjee, Rumu H.; Arya, A.; Banerjee, S.
2018-04-01
The cluster expansion (CE) method was used to calculate the energies of various Ni-Mo phases. Clusters comprising a few nearest neighbours can describe any phase of the Ni-Mo system given a suitable choice of effective pair and multisite interaction parameters (ECIs). In the present study, the ECIs were evaluated by fitting the ground-state energies obtained from first-principles calculations. The ECIs evaluated for the Ni-Mo system were mostly pair clusters, followed by triplet and quadruplet clusters, with cluster diameters in the range 2.54-10.20 Å. The ECI values diminished for multi-body (triplet and quadruplet) clusters compared with 2-point or pair clusters, indicating good convergence of the CE model. With these ECIs, the predicted energies of all the Ni-Mo structures across the Mo concentration range 0-100 at% were obtained. The quantitative error between the energies calculated by the CE approach and by first principles is very small (< 0.026 meV/atom). The appreciable values of 2-point ECIs up to the 4th nearest neighbour reveal that two-body interactions are dominant in the Ni-Mo system. These ECIs are compared with reported values of composition-dependent effective pair interactions evaluated by first principles as well as by the Monte Carlo method.
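The fitting step described above, obtaining ECIs from first-principles ground-state energies, amounts to a linear least-squares problem: the energy of each structure is a linear combination of cluster correlation functions with the ECIs as coefficients. The sketch below uses made-up correlation functions and energies generated from known coefficients so the recovery can be checked; it is not the paper's data or cluster basis.

```python
import numpy as np

# Correlation functions <phi_c> for five hypothetical structures:
# columns = (empty cluster, point, nearest-neighbour pair, triplet).
# Values are illustrative, not taken from the paper.
corr = np.array([
    [1.0,  1.0,  1.0,  1.0],
    [1.0, -1.0,  1.0, -1.0],
    [1.0,  1.0, -1.0, -1.0],
    [1.0, -1.0, -1.0,  1.0],
    [1.0,  0.0,  0.0,  0.0],
])

# Ground-state energies per atom (eV) that would come from DFT; here
# they are generated from known ECIs so the fit can be verified.
true_eci = np.array([0.10, -0.05, 0.02, 0.01])
e_dft = corr @ true_eci

# The ECIs J_c are the least-squares solution of E = sum_c J_c <phi_c>.
fitted_eci, *_ = np.linalg.lstsq(corr, e_dft, rcond=None)
```

In a real CE fit the energies come from first-principles calculations for many structures, and cross-validation is typically used to decide which clusters to retain in the basis.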
Early Childhood Intervention in China from the Families' Perspective
ERIC Educational Resources Information Center
Zheng, Yuzhu; Maude, Susan P.; Brotherson, Mary Jane; Merritts, Ashley
2016-01-01
Research highlights the importance of early childhood intervention (ECI) for children with disabilities, and there is an increasing interest in China with respect to research on ECI. However, little research exists exploring the experience of families of young children with disabilities receiving ECI services and supports in China. The purpose of…
Watterson, Jeannette M; Stallcup, Paulina; Escamilla, David; Chernay, Patrick; Reyes, Alfred; Trevino, Sylvia C
2007-01-01
After observing a high incidence of low-positive hepatitis C virus (HCV) antibody screens by the Ortho-Clinical Vitros ECi test (Orthoclinical Diagnostics, Raritan, NJ), we compared results against those obtained using another chemiluminescent analyzer, as well as two U.S. Food and Drug Administration (FDA)-approved confirmatory methodologies. To ascertain the true anti-HCV status of samples deemed low-positive by the Ortho-Clinical Vitros ECi test, we tested samples using the ADVIA Centaur HCV screen test (Siemens Medical Solutions Diagnostics), the Chiron recombinant immunoblot assay (RIBA) test (Chiron Corp., Emeryville, CA), and the Roche COBAS Amplicor HCV qualitative test (Roche Diagnostics, Indianapolis, IN) in a series of studies. Of 94 specimens positive by Vitros ECi, 19% were negative by Centaur. A separate study of 91 samples with signal-to-cutoff (s/co) values less than 8.0 showed that all but one were negative for HCV ribonucleic acid (RNA). In comparison with RIBA, all 77 samples (100%) positive by the Vitros ECi test with s/co values less than 12.0 were negative or indeterminate by RIBA. A final study comparing all four methods side by side showed 63% disagreement by Centaur for Vitros ECi low-positive samples, 75% disagreement by RIBA, and 97% disagreement by polymerase chain reaction (PCR). In conclusion, the Ortho-Clinical Vitros ECi anti-HCV test yields a high rate of false-positive results in the low s/co range in our patient population. (c) 2007 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Helmschrot, J.; Malherbe, J.; Chamunorwa, M.; Muthige, M.; Petitta, M.; Calmanti, S.; Cucchi, M.; Syroka, J.; Iyahen, E.; Engelbrecht, F.
2017-12-01
Climate services are a key component of National Adaptation Plan (NAP) processes, which require the analysis of current climate conditions and future climate change scenarios and the identification of adaptation strategies, including the capacity to finance and implement effective adaptation options. The Extreme Climate Facility (XCF) proposed by the African Risk Capacity (ARC) developed a climate index insurance scheme based on the Extreme Climate Index (ECI): an objective, multi-hazard index capable of tracking changes in the frequency or magnitude of extreme weather events, thus indicating possible shifts to a new climate regime in various regions. The main hazards covered by the ECI are extreme dry, wet and heat events, with the possibility of adding other region-specific risk events. The ECI is standardized across broad geographical regions, so that extreme events occurring under different climatic regimes in Africa can be compared. The index was initially developed by an Italian company specializing in climate services; research is now being conducted at the CSIR and SASSCAL to verify and further develop the ECI for application in southern African countries, through a project initiated by the World Food Programme (WFP) and the ARC. The paper will present findings on the most appropriate definitions of extremely wet and dry conditions in Africa, in terms of their impact across a multitude of sub-regional climates of the African continent. Findings of a verification analysis of the ECI, as determined through vegetation-monitoring data and the SASSCAL weather station network, will be discussed. Changes in the ECI under climate change will subsequently be projected, using detailed regional projections generated by the CSIR and through the Coordinated Regional Downscaling Experiment (CORDEX). This work will be concluded by the development of a web-based climate service informing African stakeholders about climate extremes.
Kesinger, Matthew R; Juengst, Shannon B; Bertisch, Hillary; Niemeier, Janet P; Krellman, Jason W; Pugh, Mary Jo; Kumar, Raj G; Sperry, Jason L; Arenth, Patricia M; Fann, Jesse R; Wagner, Amy K
2016-08-01
To determine whether severity of head and extracranial injuries (ECI) is associated with suicidal ideation (SI) or suicide attempt (SA) after traumatic brain injury (TBI). Factors associated with SI and SA were assessed in this inception cohort study using data collected 1, 2, and 5 years post-TBI from the National Trauma Data Bank and Traumatic Brain Injury Model Systems (TBIMS) databases. Level I trauma centers, inpatient rehabilitation centers, and the community. Participants with TBI from 15 TBIMS Centers with linked National Trauma Data Bank trauma data (N=3575). Not applicable. SI was measured via the Patient Health Questionnaire 9 (question 9). SA in the last year was assessed via interview. ECI was measured by the Injury Severity Scale (nonhead) and categorized as none, mild, moderate, or severe. There were 293 (8.2%) participants who had SI without SA and 109 (3.0%) who had SA at least once in the first 5 years postinjury. Random effects logit modeling showed a higher likelihood of SI when ECI was severe (odds ratio=2.73; 95% confidence interval, 1.55-4.82; P=.001). Drug use at time of injury was also associated with SI (odds ratio=1.69; 95% confidence interval, 1.11-2.86; P=.015). Severity of ECI was not associated with SA. Severe ECI carried a nearly 3-fold increase in the odds of SI after TBI, but it was not related to SA. Head injury severity and less severe ECI were not associated with SI or SA. These findings warrant additional work to identify factors associated with severe ECI that make individuals more susceptible to SI after TBI. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Massumi, Mohammad; Hoveizi, Elham; Baktash, Parvaneh; Hooti, Abdollah; Ghazizadeh, Leili; Nadri, Samad; Pourasgari, Farzaneh; Hajarizadeh, Athena; Soleimani, Masoud; Nabiuni, Mohammad; Khorramizadeh, Mohammad R
2014-03-10
Due to the pluripotency of induced pluripotent stem (iPS) cells and the lack of immunological incompatibility and ethical issues, iPS cells have been considered an invaluable cell source for future cell replacement therapy. This study aimed, first, to establish novel iPS cells, ECiPS, directly reprogrammed from human Eye Conjunctiva-derived Mesenchymal Stem Cells (EC-MSCs); and second, to compare the inductive effects of the Wnt3a/Activin A biomolecules with the IDE1 small molecule in the derivation of definitive endoderm (DE) from the ECiPS cells. To that end, the EC-MSCs were first transduced with SOKM-expressing lentiviruses and characterized for endogenous expression of embryonic markers. The established ECiPS cells were then induced to DE formation by Wnt3a/Activin A or IDE1. Quantification of GSC, Sox17 and Foxa2 expression, as DE-specific markers, at both the mRNA and protein levels revealed that induction of ECiPS cells by either Wnt3a/Activin A or IDE1 could enhance the expression of these genes; however, the increases were greater in Wnt3a/Activin A-induced ECiPS-EBs than in IDE1-induced cells. Furthermore, flow cytometry analyses showed no synergistic effect between Activin A and Wnt3a in deriving DE-like cells from ECiPS cells. These comparative findings suggest that although both Wnt3a/Activin A signaling and the IDE1 molecule could be used for differentiation of iPS cells into DE cells, the DE-inducing effect of Wnt3a/Activin A was statistically higher than that of IDE1. Copyright © 2014 Elsevier Inc. All rights reserved.
Quantifying China's regional economic complexity
NASA Astrophysics Data System (ADS)
Gao, Jian; Zhou, Tao
2018-02-01
China has experienced an outstanding economic expansion during the past decades; however, literature on non-monetary metrics that reveal the status of China's regional economic development is still lacking. In this paper, we fill this gap by quantifying the economic complexity of China's provinces through analyzing 25 years of firm data. First, we estimate the regional economic complexity index (ECI) and show that the overall time evolution of provinces' ECI is relatively stable and slow. Then, after linking the ECI to economic development and income inequality, we find that the explanatory power of the ECI is positive for the former but negative for the latter. Next, we compare different measures of economic diversity and explore their relationships with monetary macroeconomic indicators. Results show that the ECI and the non-linear-iteration-based Fitness index are comparable, and both have stronger explanatory power than other benchmark measures. Further multivariate regressions suggest the robustness of our results after controlling for other socioeconomic factors. Our work moves a step towards better understanding China's regional economic development and non-monetary macroeconomic indicators.
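The economic complexity index for regions is commonly computed as the eigenvector associated with the second-largest eigenvalue of a region-region matrix normalized by diversity and ubiquity. A minimal sketch on a toy, perfectly nested region-by-product matrix follows; the data are invented for illustration (the paper itself uses 25 years of Chinese firm data).

```python
import numpy as np

# Binary region-by-product matrix: 1 if the region exports/produces the
# product with revealed comparative advantage. Toy nested data.
M = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
], dtype=float)

k_r = M.sum(axis=1)  # diversity of each region
k_p = M.sum(axis=0)  # ubiquity of each product

# ECI = eigenvector of M~ = D_r^-1 M D_p^-1 M^T for the second-largest
# eigenvalue, standardized to zero mean and unit variance.
M_tilde = (M / k_r[:, None]) @ (M / k_p).T
vals, vecs = np.linalg.eig(M_tilde)
order = np.argsort(vals.real)[::-1]
eci = vecs[:, order[1]].real
eci = (eci - eci.mean()) / eci.std()
# Sign convention: more diversified regions get higher ECI.
if np.corrcoef(eci, k_r)[0, 1] < 0:
    eci = -eci
```

On this nested toy matrix, the most diversified region (first row) receives the highest complexity score; the Fitness index mentioned above replaces this linear spectral step with a non-linear iteration over the same matrix.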
Lipid degradation promotes prostate cancer cell survival.
Itkonen, Harri M; Brown, Michael; Urbanucci, Alfonso; Tredwell, Gregory; Ho Lau, Chung; Barfeld, Stefan; Hart, Claire; Guldvik, Ingrid J; Takhar, Mandeep; Heemers, Hannelore V; Erho, Nicholas; Bloch, Katarzyna; Davicioni, Elai; Derua, Rita; Waelkens, Etienne; Mohler, James L; Clarke, Noel; Swinnen, Johan V; Keun, Hector C; Rekvig, Ole P; Mills, Ian G
2017-06-13
Prostate cancer is the most common male cancer and androgen receptor (AR) is the major driver of the disease. Here we show that Enoyl-CoA delta isomerase 2 (ECI2) is a novel AR-target that promotes prostate cancer cell survival. Increased ECI2 expression predicts mortality in prostate cancer patients (p = 0.0086). ECI2 encodes for an enzyme involved in lipid metabolism, and we use multiple metabolite profiling platforms and RNA-seq to show that inhibition of ECI2 expression leads to decreased glucose utilization, accumulation of fatty acids and down-regulation of cell cycle related genes. In normal cells, decrease in fatty acid degradation is compensated by increased consumption of glucose, and here we demonstrate that prostate cancer cells are not able to respond to decreased fatty acid degradation. Instead, prostate cancer cells activate incomplete autophagy, which is followed by activation of the cell death response. Finally, we identified a clinically approved compound, perhexiline, which inhibits fatty acid degradation, and replicates the major findings for ECI2 knockdown. This work shows that prostate cancer cells require lipid degradation for survival and identifies a small molecule inhibitor with therapeutic potential.
Engineering Software Suite Validates System Design
NASA Technical Reports Server (NTRS)
2007-01-01
EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDASTAR-created models. Initial commercialization for EDASTAR included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... AD using a drawdown plan that includes a borescope inspection (BSI) or eddy current inspection (ECI... inspection (BSI) or eddy current inspection (ECI) of the disk rim according to the following schedule: (i...
Ovalle, Alfredo; Romero, Roberto; Gómez, Ricardo; Martínez, M Angélica; Nien, Jyh Kae; Ferrand, Pedro; Aspillaga, Carlos; Figueroa, Jorge
2006-08-01
To determine whether broad-spectrum antibiotic administration to patients with preterm labor and intact membranes is associated with an improvement in neonatal and maternal outcomes, particularly in patients with microbial invasion of the amniotic cavity (MIAC) or endocervical inflammation (ECI). A prospective clinical trial was conducted in which women in premature labor were alternately allocated to receive either antibiotics or placebo, and information about MIAC and ECI collected. Eighty-four pregnant women between 24 and 34 weeks of gestation with spontaneous preterm labor were enrolled. Exclusion criteria were cervical dilatation greater than 3 cm, clinical chorioamnionitis, abruption, rupture of membranes, vaginal bleeding, and several additional fetal and maternal conditions that may influence perinatal outcome. Amniocentesis was offered to all patients and the cervix and vagina were sampled for microbiological and cytological studies. Eligible patients were allocated to receive either clindamycin-gentamycin or placebo for 7 days. Corticosteroids and tocolysis with beta-adrenergic agents were used according to the standard management of our institution. MIAC was defined as the presence of a positive amniotic fluid culture obtained by trans-abdominal amniocentesis. ECI was diagnosed when a significant increase in the white blood cell count of the endocervical secretions was found. A composite neonatal morbidity/mortality outcome was created, including severe neonatal morbidity (respiratory distress syndrome, asphyxia, sepsis, pneumonia, intraventricular hemorrhage) and mortality. Thirty-nine women received antibiotics and 40 received placebo. The prevalence of ECI and MIAC in both groups was comparable (antibiotic group ECI 61.5% (24/39) and MIAC 20.5% (8/39); placebo group ECI 62.5% (25/40) and MIAC 20% (8/40); p > 0.05). 
Overall, there were no significant differences in maternal infections and composite neonatal outcomes between antibiotic and placebo groups. Women who received antibiotics had a lower rate of subsequent rupture of membranes compared to patients who received placebo (2.6% (1/39) vs. 25% (10/40), respectively; p = 0.007). A sub-analysis showed that among patients with ECI, antibiotic administration was associated with a lower rate of composite neonatal morbidity/mortality outcome compared to those who received placebo (4.2% (1/24) vs. 28% (7/25), respectively; p < 0.05). This association was also present in patients with ECI without MIAC (0% (0/16) vs. 27.8% (5/18); p < 0.05), but not in patients with ECI and MIAC (antibiotic group 12.5% (1/8) vs. placebo group 28.6% (2/7); p > 0.05). The combination of antibiotics used in this study did not improve maternal or perinatal outcome in patients with preterm labor and intact membranes. Further studies are required to determine if women with endocervicitis presenting with preterm labor and intact membranes may benefit from antibiotic administration.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-24
... (BSI) or eddy current inspection (ECI) of the disk outer rim front rail for cracks prior to... date of this AD. (ii) Perform a borescope inspection (BSI) or eddy current inspection (ECI) of the...
Early Childhood Inclusion in Israel
ERIC Educational Resources Information Center
Al-Yagon, Michal; Aram, Dorit; Margalit, Malka
2016-01-01
This article describes conceptual aspects, current policies and practices, and research representing the Israeli perspective regarding early childhood inclusion (ECI) at preschool ages (3-6 years). We review legislative, historical, attitudinal, philosophical, practical, empirical, and cultural issues regarding ECI in Israel. Finally, we focus on…
Professional Training in Early Intervention: A European Perspective
ERIC Educational Resources Information Center
Pretis, Manfred
2006-01-01
Professional training in early childhood intervention (ECI), particularly additional certificates, degrees, or continuing education, is currently a major topic within European working groups. The complexity of ECI, including medical, pedagogical, psychological, and social involvement, the need for both family- and child-centered work, and the…
Enhanced collective influence: A paradigm to optimize network disruption
NASA Astrophysics Data System (ADS)
Wu, Tao; Chen, Leiting; Zhong, Linfeng; Xian, Xingping
2017-04-01
The function of complex networks typically relies on the integrity of the underlying structure. Some practical applications need to attack a network's function, namely, to inactivate and fragment its underlying structure. To effectively dismantle complex networks and regulate their function, a centrality measure named CI (Morone and Makse, 2015) was proposed for node ranking. We observe that the performance of CI centrality in the network disruption problem may deteriorate when it is applied to networks with different topology properties. Specifically, the structural features of local network topology are overlooked by CI centrality, even though the local topologies of nodes with the same CI value may be organized very differently. To improve the ranking accuracy of CI, this paper proposes ECI, a variant of CI that accounts for the loop density and degree diversity of the local network topology. The proposed ECI centrality degenerates into CI centrality as the loop density and the degree diversity decrease. Comparing ECI with CI and classical centrality measures in both synthetic and real networks, the experimental results suggest that ECI can largely improve the performance of CI for network disruption. Based on these results, we analyze the correlation between the improvement and the properties of the networks. We find that the performance of ECI is positively correlated with the assortative coefficient and community modularity and negatively correlated with the degree inequality of networks, which can serve as guidance for practical applications.
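The ECI variant described in this abstract builds on the baseline collective influence measure of Morone and Makse (2015), which scores a node by its reduced degree times the sum of reduced degrees on the boundary of a ball of radius ℓ around it. Below is a minimal pure-Python sketch of that baseline only; the graph, function names, and parameters are illustrative, and the loop-density/degree-diversity refinements of ECI are not implemented here.

```python
# Baseline collective influence (Morone & Makse, 2015):
#   CI_l(i) = (k_i - 1) * sum_{j on boundary of Ball(i, l)} (k_j - 1)
from collections import deque

def collective_influence(adj, node, ell=2):
    """CI_ell for `node` in a graph given as {node: set(neighbors)}."""
    k = len(adj[node])
    if k == 0:
        return 0
    # BFS out to distance `ell`; the ball boundary is the set of nodes
    # exactly `ell` hops away from `node`.
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        if dist[u] == ell:
            continue  # do not expand past the boundary
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    boundary = [v for v, d in dist.items() if d == ell]
    return (k - 1) * sum(len(adj[v]) - 1 for v in boundary)

# Tiny example: hub 0 joined to 1, 2, 3, with a path 3-4-5 hanging off it.
adj = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0, 4}, 4: {3, 5}, 5: {4}}
scores = {n: collective_influence(adj, n, ell=2) for n in adj}
```

Ranking by these scores and removing the top node, then recomputing, is the greedy dismantling procedure that both CI and ECI plug into.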
Collaboration in Early Childhood Intervention Services in Gauteng: Caregiver Perspectives
ERIC Educational Resources Information Center
Kyarkanaye, Thilendree; Dada, Shakila; Samuels, Alecia E.
2017-01-01
A central tenet of early childhood intervention (ECI) is collaboration between professionals and the caregivers of children receiving these services. There are limited studies on caregiver perceptions of collaboration in ECI teams, particularly in resource-limited countries. Sixty-four caregivers participated in this study by completing a…
Collaborative Practice in Early Childhood Intervention from the Perspectives of Service Providers
ERIC Educational Resources Information Center
Yang, Chih-Hung; Hossain, Syeda Zakia; Sitharthan, Gomathi
2013-01-01
Effective early childhood intervention (ECI) relies on collaboration among agencies, service providers, and families. Although previous literature has primarily focused on segments of collaboration within ECI service delivery, the actual process and how the adult stakeholders perceive and engage in collaborative practice have important…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-22
... applies to Rolls-Royce Corporation (RRC) AE 3007A series turbofan engines with high-pressure turbine (HPT... eddy current inspection (ECI) or surface wave ultrasonic test (SWUT) inspection on each affected high-pressure turbine (HPT) wheel. This AD requires removing or performing initial and repetitive ECIs or SWUT...
Depressive Symptoms, Rumination, and Emotion Reactivity Among Youth: Moderation by Gender.
Somers, Jennifer A; Borelli, Jessica L; Hilt, Lori M
2018-06-07
The emotion context-insensitivity hypothesis (ECI; Rottenberg et al., 2005) posits that depressive symptoms are associated with blunted emotional reactivity and is supported by the results of a meta-analysis (Bylsma et al., 2008). Yet it remains unclear how strongly ECI holds across emotional response domains, whether ECI operates similarly in male and female individuals, and whether this pattern of underreactivity is observed in youth. In contrast, rumination, a cognitive style strongly associated with depressive symptoms, may be associated with heightened reactivity. We assessed the effects of youth's depressive symptoms and rumination on subjective and physiological emotion reactivity (N = 160; M age = 12.67, SD age = 1.12; 48% female; 94% non-Hispanic). State sadness and respiratory sinus arrhythmia were assessed during a baseline activity (nature video) and a sad mood induction. As hypothesized, depressive symptoms predicted less subjective emotional reactivity, whereas rumination predicted more subjective reactivity. Exploratory analyses revealed that associations for physiological reactivity differed by child gender. ECI may be stronger in terms of subjective rather than physiological emotional reactivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pecover, J. D.; Chittenden, J. P.
A critical limitation of magnetically imploded systems such as magnetized liner inertial fusion (MagLIF) [Slutz et al., Phys. Plasmas 17, 056303 (2010)] is the magneto-Rayleigh-Taylor (MRT) instability, which primarily disrupts the outer surface of the liner. MagLIF-relevant experiments have shown large-amplitude multi-mode MRT instability growth from surface roughness [McBride et al., Phys. Rev. Lett. 109, 135004 (2012)], which is only reproduced by 3D simulations using our MHD code Gorgon when an artificially azimuthally correlated initialisation is added. We have shown that the missing azimuthal correlation could be provided by a combination of the electro-thermal instability (ETI) and an “electro-choric” instability (ECI); describing, respectively, the tendency of current to correlate azimuthally early in time due to temperature-dependent Ohmic heating, and an amplification of the ETI driven by density-dependent resistivity around vapourisation. We developed and implemented a material strength model in Gorgon to improve simulation of the solid phase of liner implosions which, when applied to simulations exhibiting the ETI and ECI, gave a significant increase in wavelength and amplitude. Full-circumference simulations of the MRT instability provided a significant improvement on previous randomly initialised results and approached agreement with experiment.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
...) stages 2-5 spools installed. This proposed AD would require eddy current inspection (ECI) or spot... on the static seal. This proposed AD would require ECI or spot FPI of the stages 1-2 rotating seal... part 39 continues to read as follows: [[Page 30575
Early Childhood Intervention in Portugal: An Overview Based on the Developmental Systems Model
ERIC Educational Resources Information Center
Pinto, Ana Isabel; Grande, Catarina; Aguiar, Cecilia; de Almeida, Isabel Chaves; Felgueiras, Isabel; Pimentel, Julia Serpa; Serrano, Ana Maria; Carvalho, Leonor; Brandao, Maria Teresa; Boavida, Tania; Santos, Paula; Lopes-dos-Santos, Pedro
2012-01-01
Research studies on early childhood intervention (ECI) in Portugal are diffuse regarding both program components and the geographical area under scrutiny. Since the 1990s, a growing body of knowledge and evidence in ECI is being gathered, based on postgraduate teaching, in-service training, and research. This article draws on the systems theory…
ERIC Educational Resources Information Center
Bagnato, Stephen J.; Suen, Hoi K.; Fevola, Antonio V.
2011-01-01
The accountability movement in education, in general, and early childhood intervention (ECI), specifically, have fueled the debate about the quality, benefits, and limitations of various types of publicly funded ECI and human service programs (PEW Charitable Trusts, 2008; National Research Council/National Academy of Sciences, 2009) not only in…
ERIC Educational Resources Information Center
Greenwood, Charles R.; Carta, Judith J.; Walker, Dale; Hughes, Kere; Weathers, Mary
2006-01-01
Early interventionists are accountable for the progress of children receiving their services. Technically adequate measures of the progress of individual children are needed. While the Early Communication Indicator (ECI) for infants and toddlers is one such measure, data to support its use are limited to a single research report. In this…
Identification of Prostate Cancer Prognostic Markers
2016-10-01
Technologies). For this, the oxygen consumption rate (OCR) in the PC-3 control and ECI1-overexpressing clones was measured following their maintenance... Figure 10: Mitochondrial Respiration in ECI1-overexpressing PC-3 Clones. Oxygen consumption rate... FISH), prognostic markers, biomarkers, tissue microarrays, autophagy
Note on a Cross-cultural Test of Gilligan's Ethic of Care
ERIC Educational Resources Information Center
Vikan, Arne; Camino, Cleonice; Biaggio, Angela
2005-01-01
One hundred and twenty students from both Brazil and Norway were tested with Skoe's Ethic of Care Interview (ECI), which is a test of Gilligan's hypothesized gender-related ethic of care. Subjects were also tested with Bem's Sex Role Inventory and Triandis's Test of Cultural Orientations. The ECI was shown to be related neither to gender nor to…
ERIC Educational Resources Information Center
Poon, Kenneth K.; Lim, Ai-Keow
2012-01-01
Singapore is a young island nation with a diverse population. Its support for young children at risk has its roots in the 1950s, but early childhood intervention (ECI) programs for young children with disabilities emerged only in the 1980s. ECI programs have proliferated in the subsequent years, offering an increasing range of service delivery…
75 FR 38179 - Proposed Collection; Comment Request for Forms W-8BEN, W-8ECI, W-8EXP, and W-8IMY
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... States Tax Withholding, Form W-8ECI, Certificate of Foreign Person's Claim for Exemption From Withholding on Income Effectively Connected With the Conduct of a Trade or Business in the United States, Form W-8EXP, Certificate of Foreign Government or Other Foreign Organization for United States Tax Withholding...
Morpho-syntactic reading comprehension in children with early and late cochlear implants.
López-Higes, Ramón; Gallego, Carlos; Martín-Aragoneses, María Teresa; Melle, Natalia
2015-04-01
This study explores morpho-syntactic reading comprehension in 19 Spanish children who received a cochlear implant (CI) before 24 months of age (early CI [e-CI]) and 19 Spanish children who received a CI after 24 months (late CI [l-CI]). They all were in primary school and were compared to a hearing control (HC) group of 19 children. Tests of perceptual reasoning, working memory, receptive vocabulary, and morpho-syntactic comprehension were used in the assessment. It was observed that while children with l-CI showed a delay, those with e-CI reached a level close to that which was obtained by their control peers in morpho-syntactic comprehension. Thus, results confirm a positive effect of early implantation on morpho-syntactic reading comprehension. Inflectional morphology and simple sentence comprehension were noted to be better in the e-CI group than in the l-CI group. The most important factor in distinguishing between the HC and l-CI groups or the e-CI and l-CI groups was verbal inflectional morphology.
Soroa, Goretti; Aritzeta, Aitor; Balluerka, Nekane; Gorostiaga, Arantxa
2016-06-03
Emotional creativity is defined as the ability to feel and express emotions in a new, effective and authentic way. There are currently no Basque-language self-report instruments to provide valid and reliable measures of this construct. Thus, this paper describes the process of adapting and validating the Emotional Creativity Inventory (ECI) for the Basque-speaking population. The sample comprised 594 higher education students (388 women and 206 men) aged between 18 and 32 years (M age = 20.47, SD = 2.48). The Basque version of the ECI was administered along with the TMMS-23, NEO PI-R, and PANAS. The results of exploratory and confirmatory factor analyses on the Basque ECI corroborated the original scale's three-factor structure (preparedness, novelty, and effectiveness/authenticity). Those dimensions showed acceptable indexes of internal consistency (α = .80, .83, and .83) and temporal stability (r = .70, .69, and .74). The study also provided some evidence of external validity (p < .05) based on the relationships found between emotional creativity and emotional intelligence, personality, affect, and sex. The Basque ECI can be regarded as a useful tool to evaluate perceived emotional creativity during the preparation and verification phases of the creative process.
ERIC Educational Resources Information Center
Diken, Ibrahim H.; Bayhan, Pinar; Turan, Figen; Sipal, R. Firat; Sucuoglu, Bulbin; Ceber-Bakkaloglu, Hatice; Gunel, Mintaze Kerem; Kara, Ozgun Kaya
2012-01-01
The purpose of this article was to provide an overview of early childhood intervention and early childhood special education (ECI/ECSE) services and practices in Turkey by using the Developmental System Approach (M. J. Guralnick, 2001). After pointing out the history of early childhood and ECI/ECSE services and current legislations with regard to…
Hughes, Josie S; Hurford, Amy; Finley, Rita L; Patrick, David M; Wu, Jianhong; Morris, Andrew M
2016-12-16
We aimed to construct widely useable summary measures of the net impact of antibiotic resistance on empiric therapy. Summary measures are needed to communicate the importance of resistance, plan and evaluate interventions, and direct policy and investment. As an example, we retrospectively summarised the 2011 cumulative antibiogram from a Toronto academic intensive care unit. We developed two complementary indices to summarise the clinical impact of antibiotic resistance and drug availability on empiric therapy. The Empiric Coverage Index (ECI) measures susceptibility of common bacterial infections to available empiric antibiotics as a percentage. The Empiric Options Index (EOI) varies from 0 to 'the number of treatment options available', and measures the empiric value of the current stock of antibiotics as a depletable resource. The indices account for drug availability and the relative clinical importance of pathogens. We demonstrate meaning and use by examining the potential impact of new drugs and threatening bacterial strains. In our intensive care unit coverage of device-associated infections measured by the ECI remains high (98%), but 37-44% of treatment potential measured by the EOI has been lost. Without reserved drugs, the ECI is 86-88%. New cephalosporin/β-lactamase inhibitor combinations could increase the EOI, but no single drug can compensate for losses. Increasing methicillin-resistant Staphylococcus aureus (MRSA) prevalence would have little overall impact (ECI=98%, EOI=4.8-5.2) because many Gram-positives are already resistant to β-lactams. Aminoglycoside resistance, however, could have substantial clinical impact because they are among the few drugs that provide coverage of Gram-negative infections (ECI=97%, EOI=3.8-4.5). Our proposed indices summarise the local impact of antibiotic resistance on empiric coverage (ECI) and available empiric treatment options (EOI) using readily available data. 
Policymakers and drug developers can use the indices to help evaluate and prioritise initiatives in the effort against antimicrobial resistance.
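The abstract above defines the Empiric Coverage Index only at a high level: a percentage summarising how well available empiric antibiotics cover common infections, weighted by the clinical importance of the pathogens. The sketch below is therefore a hedged illustration, not the authors' published formula; the pathogen weights, the best-available-drug rule, and all numbers in the example antibiogram are assumptions made for the sake of a runnable demonstration.

```python
# Hedged sketch of an empiric-coverage summary in the spirit of the ECI.
def empiric_coverage_index(susceptibility, weights, available):
    """susceptibility: {pathogen: {drug: fraction of isolates susceptible}}
    weights: {pathogen: relative clinical importance}
    available: set of drugs currently on formulary.
    Assumes each pathogen is covered by its single best available drug."""
    total = sum(weights.values())
    covered = 0.0
    for pathogen, w in weights.items():
        drugs = susceptibility.get(pathogen, {})
        best = max((s for d, s in drugs.items() if d in available), default=0.0)
        covered += w * best
    return 100.0 * covered / total

# Invented two-pathogen antibiogram, purely illustrative.
antibiogram = {
    "E. coli": {"ceftriaxone": 0.90, "meropenem": 0.99},
    "S. aureus": {"cefazolin": 0.70, "vancomycin": 1.00},
}
weights = {"E. coli": 2.0, "S. aureus": 1.0}

eci_full = empiric_coverage_index(
    antibiogram, weights, {"ceftriaxone", "meropenem", "cefazolin", "vancomycin"})
eci_no_reserve = empiric_coverage_index(
    antibiogram, weights, {"ceftriaxone", "cefazolin"})
```

Comparing the index with and without reserved drugs mirrors the abstract's "without reserved drugs, the ECI is 86-88%" style of analysis, here on made-up data.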
Banham, David; Roder, David; Brown, Alex
2018-02-01
Aboriginal Australians have poorer cancer survival than other Australians. Diagnoses at later stages and correlates of remote area living influence, but do not fully explain, these disparities. Little is known of the prevalence and influence of comorbid conditions experienced by Aboriginal people, including their effect on cancer survival. This study quantifies hospital recorded comorbidities using the Elixhauser Comorbidity Index (ECI), examines their influence on risk of cancer death, then considers effect variation by Aboriginality. Cancers diagnosed among Aboriginal South Australians in 1990-2010 (N = 777) were matched with randomly selected non-Aboriginal cases by birth year, diagnostic year, sex, and primary site, then linked to administrative hospital records to the time of diagnosis. Competing risk regression summarised associations of Aboriginal status, stage, geographic attributes and comorbidities with risk of cancer death. A threshold of four or more ECI conditions was associated with increased risk of cancer death (sub-hazard ratio SHR 1.66, 95%CI 1.11-2.46). Alternatively, the presence of any one of a subset of ECI conditions was associated with similarly increased risk (SHR = 1.62, 95%CI 1.23-2.14). The observed effects did not differ between Aboriginal and matched non-Aboriginal cases. However, Aboriginal cases experienced three times higher exposure than non-Aboriginal to four or more ECI conditions (14.2% versus 4.5%) and greater exposure to the subset of ECI conditions (20.7% versus 8.0%). Comorbidities at diagnosis increased the risk of cancer death in addition to risks associated with Aboriginality, remoteness of residence and disease stage at diagnosis. The Aboriginal cohort experienced comparatively greater exposure to comorbidities which adds to disparities in cancer outcomes.
Probing inhibitory effects of nanocrystalline cellulose: inhibition versus surface charge
NASA Astrophysics Data System (ADS)
Male, Keith B.; Leung, Alfred C. W.; Montes, Johnny; Kamen, Amine; Luong, John H. T.
2012-02-01
NCC derived from different biomass sources was probed for its plausible cytotoxicity by electric cell-substrate impedance sensing (ECIS). Two different cell lines, Spodoptera frugiperda Sf9 insect cells and Chinese hamster lung fibroblast V79, were exposed to NCC and their spreading and viability were monitored and quantified by ECIS. Based on the 50%-inhibition concentration (ECIS50), none of the NCC produced was judged to have any significant cytotoxicity on these two cell lines. However, NCC derived from flax exhibited the most pronounced inhibition on Sf9 compared to hemp and cellulose powder. NCCs from flax and hemp pre-treated with pectate lyase were also less inhibitory than NCCs prepared from untreated flax and hemp. Results also suggested a correlation between the inhibitory effect and the carboxylic acid contents on the NCC.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... Owner for United States Tax Withholding, Form W-8BEN-E, Certificate of Status of Beneficial Owner for United States Tax Withholding, Form W-8ECI, Certificate of Foreign Person's Claim for Exemption From Withholding on Income Effectively Connected With the Conduct of a Trade or Business in the United States, Form...
A Noble Bet in Early Care and Education: Lessons from One Community’s Experience
2002-01-01
Finally, the RAND study team explored existing empirical literature on ECE. Although ECI led to the establishment of new, high-quality ECE ser...objectives and articulates the breadth of the initiative’s ambition to create a comprehensive new system for delivering ECE to low-income children in... Incentives to Neighborhood Agencies... Who Would Provide ECI Services?... New Providers and
"SWING": A European project for a new application of an ionospheric network
NASA Astrophysics Data System (ADS)
Zolesi, B.; Bianchi, C.; Meloni, A.; Baskaradas, J. A.; Belehaki, A.; Altadill, D.; Dalle Mese, E.
2016-05-01
The SWING (Short Wave critical Infrastructure Network based on a new Generation high survival radio communication system) is a European project aimed at studying a high-survival high-frequency (HF) radio network to link European Critical Infrastructures (ECIs). This system is intended to replace broadband internet communication, maintaining the minimum flux of essential information for ECI management and control, in case of wide-scale threats, including terrorist attacks, able to put internet links over the Mediterranean region out of order. SWING is designed to evaluate the threat and increase the security awareness, as well as the level of protection, of analogous and/or interdependent ECIs. In order to meet these goals, SWING aims to recognize how and when internet communication fails and to develop the standard software and hardware tools necessary for implementing communication protocols suited for a reliable and interoperable short-wave (SW), or high-frequency (HF), radio network backup. The internet broadband description and internet failure recognition were taken into consideration in the project but are not treated in this paper. It has been assessed that, in case of complete failure of broadband internet communication, fundamental information for the management and control of ECIs over the Mediterranean region can be maintained with an HF network, even in case of moderate ionospheric perturbations.
An Automated Method to Monitor Cell Migration.
NASA Astrophysics Data System (ADS)
Giaever, Ivar; Keese, Charles R.
2002-03-01
Electric cell-substrate impedance sensing (ECIS) has been developed as a non-invasive means to follow cell behavior in culture. In this method, cells are cultured on small (250 micrometer diameter) gold film electrodes. The impedance of the electrode is measured with an AC current of about 1 microampere. When challenged by biochemical or physical stimuli, the cells respond by changing their morphology and motion. These changes are reflected in the measured impedance values. In this study, the basic ECIS system was used in both a non-invasive and an invasive mode to carry out an automated wound-healing assay for quantifying cell migration activity. BSC-1, MDCK, and NRK cell lines were grown to confluence in ECIS wells before data were collected. An AC current of approximately 1 milliampere at 40,000 Hz was applied for several seconds, killing the cells in contact with the ECIS electrode and dropping the impedance to that of a cell-free electrode. Over the next few hours following this incursion, neighboring cells migrate into the wounded area, replacing the dead cells, and the electrodes return to the impedance values of unwounded controls. The data show that the time required for the completion of this activity is strongly dependent upon cell type, medium composition, and the type of protein adsorbed to the substrate.
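The wound-healing readout described above (impedance collapses when the cells on the electrode are killed, then recovers as neighbors migrate in) reduces naturally to a single number: the time for the trace to regain the control level. A minimal analysis sketch on synthetic data follows; the recovery threshold, the function name, and the trace values are invented for illustration and are not from the paper.

```python
# Estimate wound-healing time from an ECIS impedance trace: find the
# post-wounding minimum, then the first time the impedance regains a
# chosen fraction of the unwounded-control level.
def healing_time(times, impedance, control_level, fraction=0.95):
    """Return the first time at which `impedance` regains `fraction` of
    `control_level` after its post-wounding minimum, or None if never."""
    i_min = min(range(len(impedance)), key=lambda i: impedance[i])
    target = fraction * control_level
    for t, z in zip(times[i_min:], impedance[i_min:]):
        if z >= target:
            return t
    return None

# Synthetic trace: wounding at t = 1 h drops the impedance, which then
# recovers as neighboring cells migrate onto the electrode.
times = [0, 1, 2, 3, 4, 5, 6]                     # hours
z = [1800, 400, 700, 1100, 1500, 1750, 1800]      # ohms
t_heal = healing_time(times, z, control_level=1800)
```

Running the same analysis per well would quantify the cell-type and medium dependence the abstract reports.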
Peptide Fragmentation by Corona Discharge Induced Electrochemical Ionization
Lloyd, John R.; Hess, Sonja
2010-01-01
Fundamental studies have greatly improved our understanding of electrospray, including the underlying electrochemical reactions. Although corona discharge (CD) is generally regarded as disadvantageous, we have recently shown that it can be used as an effective method to create a radical cation species [M]+•, thus optimizing the electrochemical reactions that occur on the surface of the stainless steel (SS) electrospray capillary tip. This technique is known as CD-initiated electrochemical ionization (CD-ECI). Here, we report on fundamental studies using CD-ECI to induce analytically useful in-source fragmentation of a range of molecules that complex transition metals. Compounds that have been selectively fragmented using CD-ECI include enolate-forming phenylglycine-containing peptides, glycopeptides, nucleosides and phosphopeptides. Collision-induced dissociation (CID) or other activation techniques were not necessary for CD-ECI fragmentation. A four-step mechanism was proposed: 1. Complexation using either Fe in the SS capillary tip material or Cu(II) as an offline complexation reagent; 2. Electrochemical oxidation of the complexed metal and thus formation of a radical cation (e.g., Fe − e− → Fe+•); 3. Radical fragmentation of the complexed compound; 4. Electrospray ionization of the fragmented neutrals. Fragmentation patterns resembling b- and y-type ions were observed and allowed the localization of the phosphorylation sites. PMID:20869880
Widder, Mark W; Brennan, Linda M; Hanft, Elizabeth A; Schrock, Mary E; James, Ryan R; van der Schalie, William H
2015-07-01
The US Army's need for a reliable and field-portable drinking water toxicity sensor was the catalyst for the development and evaluation of an electric cell-substrate impedance sensing (ECIS) device. Water testing technologies currently available to soldiers in the field are analyte-specific and have limited capabilities to detect broad-based water toxicity. The ECIS sensor described here uses rainbow trout gill epithelial cells seeded on fluidic biochips to measure changes in impedance for the detection of possible chemical contamination of drinking water supplies. Chemicals selected for testing were chosen as representatives of a broad spectrum of toxic industrial compounds. Results of a US Environmental Protection Agency (USEPA)-sponsored evaluation of the field portable device were similar to previously published US Army testing results of a laboratory-based version of the same technology. Twelve of the 18 chemicals tested following USEPA Technology Testing and Evaluation Program procedures were detected by the ECIS sensor within 1 h at USEPA-derived human lethal concentrations. To simplify field-testing methods further, elimination of a procedural step that acclimated cells to serum-free media streamlined the test process with only a slight loss of chemical sensitivity. For field use, the ECIS sensor will be used in conjunction with an enzyme-based sensor that is responsive to carbamate and organophosphorus pesticides.
Abdolahad, Mohammad; Taghinejad, Mohammad; Taghinejad, Hossein; Janmaleki, Mohsen; Mohajerzadeh, Shams
2012-03-21
A novel vertically aligned carbon nanotube-based electrical cell impedance sensing biosensor (CNT-ECIS) was demonstrated for the first time as a more rapid, sensitive and specific device for the detection of cancer cells. This biosensor is based on the fast entrapment of cancer cells on vertically aligned carbon nanotube arrays; the resulting mechanical and electrical interactions between CNT tips and entrapped cell membranes change the impedance of the biosensor. CNT-ECIS was fabricated through a photolithography process on Ni/SiO2/Si layers. Carbon nanotube arrays were grown on 9 nm thick patterned Ni microelectrodes by DC-PECVD. SW48 colon cancer cells were passed over the surface of the CNT-covered electrodes to be specifically entrapped on elastic nanotube beams. The CNT arrays act as both adhesive and conductive agents, and impedance changes occurred in as little as 30 s (for the whole entrapment and signaling process). CNT-ECIS detected cancer cells at surface concentrations as low as 4000 cells cm−2, with a sensitivity of 1.7 × 10−3 Ω cm2. Time and cell efficiency factors (TEF and CEF) were defined, which describe the sensor's rapidity and resolution, respectively. The TEF and CEF of CNT-ECIS were much higher than those of the other cell-based electrical biosensors compared in this paper.
Grillon, Christian; Franco-Chaves, Jose A.; Mateus, Camilo F.; Ionescu, Dawn F.; Zarate, Carlos A.
2013-01-01
According to the emotion-context insensitivity (ECI) hypothesis, major depressive disorder (MDD) is associated with a diminished ability to react emotionally to positive stimuli and with blunting of defensive responses to threat. That defensive responses are blunted in MDD seems inconsistent with the conceptualization and diagnostic nosology of MDD. The present study tested the ECI hypothesis in MDD using a threat of shock paradigm. Twenty-eight patients with MDD (35.5±10.4 years) were compared with 28 controls (35.1±7.4 years). Participants were exposed to three conditions: no shock, predictable shock, and unpredictable shock. Startle magnitude was used to assess defensive responses. Inconsistent with the ECI hypothesis, startle potentiation to predictable and unpredictable shock was not reduced in the MDD group. Rather, MDD patients showed elevated startle throughout testing as well as increased contextual anxiety during the placement of the shock electrodes and in the predictable condition. A regression analysis indicated that illness duration and Beck depression inventory scores explained 37% (p<.005) of the variance in patients’ startle reactivity. MDD is not associated with emotional blunting but rather enhanced defensive reactivity during anticipation of harm. These results do not support a strong version of the ECI hypothesis. Understanding the nature of stimuli or situations that lead to blunted or enhanced defensive reactivity will provide better insight into dysfunctional emotional experience in MDD. PMID:23951057
Extracorporeal irradiation for malignant bone tumors.
Hong, A; Stevens, G; Stalley, P; Pendlebury, S; Ahern, V; Ralston, A; Estoesta, E; Barrett, I
2001-06-01
Extracorporeal irradiation (ECI) has been used selectively in the management of primary malignant bone tumors since 1996. We report our techniques for ECI and the short-term oncologic and orthopedic outcomes. Sixteen patients with primary malignant bone tumors were treated with ECI from 1996 to 2000. The median age was 14 years. The histologic diagnoses were Ewing's sarcoma (11), osteosarcoma (4) and chondrosarcoma (1). The treated sites were femur (7), tibia (4), humerus (2), ilium (2), and sacrum (1). Following induction chemotherapy in Ewing's sarcomas and osteosarcoma, en bloc resection of the tumor and tumor-bearing bone was performed. A single dose of 50 Gy was delivered to the bone extracorporeally using either a linear accelerator (9 cases) or a blood product irradiator (7 cases). The orthopedic outcome was recorded using a standard functional scale. At a median follow-up of 19.5 months, there were no cases of local recurrence or graft failure. One patient required amputation due to chronic osteomyelitis. For the 10 patients with follow-up greater than 18 months, the functional outcomes were graded good to excellent. The short-term oncologic and orthopedic results are encouraging and suggest that ECI provides a good alternative for reconstruction in limb conservative surgery in selected patients. This technique should only be used in a multidisciplinary setting, where careful follow-up is available to assess the long-term outcomes.
Jones, Michael A; Webster, David; Wong, Kelvin C K; Hayes, Christopher; Qureshi, Norman; Rajappan, Kim; Bashir, Yaver; Betts, Timothy R
2014-12-01
We sought to investigate the use of tissue contact monitoring by means of the electrical coupling index (ECI) in a prospective randomised control trial of patients undergoing cavotricuspid isthmus (CTI) ablation for atrial flutter. Patients with ECG-documented typical flutter undergoing their first CTI ablation were randomised to ECI-guided or non-ECI-guided ablation. An irrigated-tip ablation catheter was used in all cases. Consecutive 50-W, 60-s radiofrequency lesions were applied to the CTI, from the tricuspid valve to inferior vena cava, with no catheter movement permitted during radiofrequency (RF) delivery. The ablation endpoint was durable CTI block at 20 min post-ablation. Patients underwent routine clinic follow-up post-operatively. A total of 101 patients (79 male), mean age 66 (±11) years, 50 ECI-guided and 51 control cases were enrolled in the study. CTI block was achieved in all. There were no acute complications. All patients were alive at follow-up. CTI block was achieved in a single pass in 36 ECI-guided and 30 control cases (p = 0.16), and at 20 min post-ablation, re-conduction was seen in 5 and 12 cases, respectively (p = 0.07). There was no significant difference in total procedure time (62.7 ± 33 vs. 62.3 ± 33 min, p = 0.92), RF requirement (580 ± 312 vs. 574 ± 287 s, p = 0.11) or fluoroscopy time (718 ± 577 vs. 721 ± 583 s, p = 0.78). After 6 ± 4 months, recurrence of flutter had occurred in 1 (2 %) ECI vs. 8 (16 %) control cases (OR 0.13, 95 % CI 0.01-1.08, p = 0.06). ECI-guided CTI ablation demonstrated a non-statistically significant reduction in late recurrence of atrial flutter, at no cost to procedural time, radiation exposure or RF requirement.
Justus, Jennifer; Weigand, Edgar
2014-06-01
Auxiliary enzymes participate in β-oxidation of unsaturated fatty acids. The objective of the study was to investigate the impact of a moderate zinc deficiency and a high intake of polyunsaturated fat on Δ(3)Δ(2)-enoyl-CoA isomerase (ECI) in the liver and other tissues. Five groups of eight weanling rats each were fed moderately zinc-deficient (ZD) or zinc-adequate (ZA) semisynthetic diets (7 or 50 mg Zn/kg) enriched with 22 % cocoa butter (CB) or 22 % safflower oil (SO) for 4 weeks: (1) ZD-CB, fed free choice; (2) ZA-CBR, ZA-CB diet fed in equivalent amounts consumed by the ZD-CB group; (3) ZD-SO, fed free choice; (4) ZA-SOR, ZA-SO diet fed in equivalent amounts consumed by the ZD-SO group; and (5) ZA-SO, fed free choice. Growth and Zn status markers were markedly reduced in the ZD groups. ECI activity in the liver of the animals fed the ZD- and ZA-SO diets was significantly higher (approximately 2- and 3-fold, respectively) as compared with the CB-fed animals, whereas activities in extrahepatic tissues (kidneys, heart, skeletal muscle, testes, adipose tissue) were not altered by dietary treatments. Transcript levels of the mitochondrial Eci gene in the liver did not significantly differ between ZD and ZA rats, but were 1.6-fold higher in the ZA-SO- than in the ZD-CB-fed animals (P < 0.05). It is concluded that diets enriched with safflower oil as a source high in linoleic acid induce markedly increased hepatic ECI activities and that a moderate Zn deficiency does not affect transcription of the mitochondrial Eci gene in the liver.
McGowan, Laura; Pot, Gerda K; Stephen, Alison M; Lavelle, Fiona; Spence, Michelle; Raats, Monique; Hollywood, Lynsey; McDowell, Dawn; McCloat, Amanda; Mooney, Elaine; Caraher, Martin; Dean, Moira
2016-10-26
Interventions to increase cooking skills (CS) and food skills (FS) as a route to improving overall diet are popular within public health. This study tested a comprehensive model of diet quality by assessing the influence of socio-demographic, knowledge- and psychological-related variables alongside perceived CS and FS abilities. The correspondence of two measures of diet quality further validated the Eating Choices Index (ECI) for use in quantitative research. A cross-sectional survey was conducted in a quota-controlled nationally representative sample of 1049 adults aged 20-60 years drawn from the Island of Ireland. Surveys were administered in participants' homes via computer-assisted personal interviewing (CAPI) assessing a range of socio-demographic, knowledge- and psychological-related variables alongside perceived CS and FS abilities. Regression models were used to model factors influencing diet quality. Correspondence between 2 measures of diet quality was assessed using chi-square and Pearson correlations. ECI score was significantly negatively correlated with DINE Fat intake (r = -0.24, p < 0.001), and ECI score was significantly positively correlated with DINE Fibre intake (r = 0.38, p < 0.001), demonstrating a high agreement. Findings indicated that males, younger respondents and those with no/few educational qualifications scored significantly lower on both CS and FS abilities. The relative influence of socio-demographic, knowledge, psychological variables and CS and FS abilities on dietary outcomes varied, with regression models explaining 10-20 % of diet quality variance. CS ability exerted the strongest relationship with saturated fat intake (β = -0.296, p < 0.001) and was a significant predictor of fibre intake (β = -0.113, p < 0.05), although not for healthy food choices (ECI) (β = 0.04, p > 0.05). 
Greater CS and FS abilities may not lead directly to healthier dietary choices given the myriad of other factors implicated; however, CS appear to have differential influences on aspects of the diet, most notably in relation to lowering saturated fat intake. Findings suggest that CS and FS should not be singular targets of interventions designed to improve diet; but targeting specific sub-groups of the population e.g. males, younger adults, those with limited education might be more fruitful. A greater understanding of the interaction of factors influencing cooking and food practices within the home is needed.
CheMentor Software System by H. A. Peoples
NASA Astrophysics Data System (ADS)
Reid, Brian P.
1997-09-01
CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.
Krupat, Edward; Borges, Nicole J; Brower, Richard D; Haidet, Paul M; Schroth, W Scott; Fleenor, Thomas J; Uijtdehaage, Sebastian
2017-12-01
To develop an instrument to assess educational climate, a critical aspect of the medical school learning environment that previous tools have not explored in depth. Fifty items were written, capturing aspects of Dweck's performance-learning distinction, to distinguish students' perceptions of the educational climate as learning/mastery oriented (where the goal is growth and development) versus performance oriented (where the goal is appearance of competence). These items were included in a 2014 survey of first-, second-, and third-year students at six diverse medical schools. Students rated their preclerkship or clerkship experiences and provided demographic and other data. The final Educational Climate Inventory (ECI) was determined via exploratory and confirmatory factor analysis. Relationships between scale scores and other variables were calculated. Responses were received from 1,441/2,590 students (56%). The 20-item ECI resulted, with three factors: centrality of learning and mutual respect; competitiveness and stress; and passive learning and memorization. Clerkship students' ratings of their learning climate were more performance oriented than preclerkship students' ratings (P < .001). Among preclerkship students, ECI scores were more performance oriented in schools with grading versus pass-fail systems (P < .04). Students who viewed their climate as more performance oriented were less satisfied with their medical school (P < .001) and choice of medicine as a career (P < .001). The ECI allows educators to assess students' perceptions of the learning climate. It has potential as an evaluation instrument to determine the efficacy of attempts to move health professions education toward learning and mastery.
The Use of Personal Information Technology in Military Area of Operations
2013-04-19
[OCR-garbled approval-page text omitted] …more capable and enable Coast Guard members to access the network. The Defense Department is testing Android and iOS devices with CAC card …Access Card authorizes use on desktop computers. Another item of concern as the military attempts to move into the PIT realm is who is making the
500-Watt Solid-State RF Power Amplifier AM-7209( )/VRC.
1983-03-18
AD-A127 462. 500-Watt Solid-State RF Power Amplifier AM-7209( )/VRC (U). M. Harris, E-Systems Inc., ECI Division, St. Petersburg, FL; 18 Mar 83. Research and Development Technical Report CECOM-82-C-J231; contract DAAB07-82-C-J231.
Müller, Jakob; Thirion, Christian; Pfaffl, Michael W
2011-01-15
Recombinant viral vectors are widespread tools for the transfer of genetic material in various modern biotechnological applications such as RNA interference (RNAi). Accurate and reproducible titer assignment is the basic prerequisite for most downstream applications, since it underpins precise multiplicity of infection (MOI) adjustment. As the necessary scaffold for the studies described in this work, we introduce a quantitative real-time PCR (qPCR) based approach for viral particle measurement. A further problem concerning physiological effects is that the application of viral vectors is often accompanied by toxic effects on the target cells. To determine the critical viral dose leading to cell death, we developed an electric cell-substrate impedance sensing (ECIS) based assay. With ECIS technology, the impedance change of a current flowing through the cell culture medium in an array plate is measured in a non-invasive manner, visualizing effects such as cell attachment, cell-cell contacts, and proliferation. Here we describe the potential of this online measurement technique in an in vitro model using the porcine ileal epithelial cell line IPI-2I in combination with an adenoviral transfection vector (Ad5 derivative). This approach shows a clear dose-dependent toxic effect, as the amount of applied virus correlates highly (p<0.001) with the level of cell death. Thus this assay offers the possibility to discriminate the minimal non-toxic dose of the individual transfection method. In addition, this work suggests that the ECIS device can be adapted to multiple other cytotoxicological questions. Copyright © 2010 Elsevier B.V. All rights reserved.
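The "precise MOI adjustment" that an accurate titer enables reduces, in the simplest case, to a volume calculation. The helper below is an illustrative sketch (the function name is ours, and it assumes the qPCR particle titer is used directly as the dosing titer, which overestimates infectious units):

```python
def volume_for_moi(target_moi, cell_count, titer_per_ml):
    """Volume of viral stock (in mL) that delivers `target_moi`
    particles per cell to `cell_count` cells."""
    return target_moi * cell_count / titer_per_ml

# MOI 10 on 1e5 cells from a 1e9 particles/mL stock:
print(volume_for_moi(10, 1e5, 1e9))   # → 0.001 (mL), i.e. 1 µL
```

Because qPCR counts genome copies rather than infectious particles, a dose found non-toxic by the ECIS assay at a given particle MOI need not correspond to the same infectious MOI.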
Direct Geolocation of Satellite Images with the EO-CFI Libraries
NASA Astrophysics Data System (ADS)
de Miguel, Eduardo; Prado, Elena; Estebanez, Monica; Martin, Ana I.; Gonzalez, Malena
2016-08-01
The INTA Remote Sensing Laboratory has implemented a tool for the direct geolocation of satellite images. The core of the tool is a C code based on the "Earth Observation Mission CFI SW" from ESA. The tool accepts different types of inputs for satellite attitude (Euler angles, quaternions, default attitude models). Satellite position can be provided either in ECEF or ECI coordinates. The line of sight of each individual detector is imported from an external file or is generated by the tool from camera parameters. The global DEM ACE2 is used to define the ground intersection of the LOS. The tool has already been tailored for georeferencing images from the forthcoming Spanish Earth Observation mission SEOSat/Ingenio, and for the camera APIS onboard the INTA cubesat OPTOS. The next step is to configure it for the geolocation of Sentinel 2 L1b images. The tool has been internally validated by different means. This validation shows that the tool is suitable for georeferencing images from high spatial resolution missions. As part of the validation efforts, a code for simulating orbital info for LEO missions using EO-CFI has been produced.
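For readers unfamiliar with the two frames mentioned, the ECI-to-ECEF step can be sketched as a single rotation about the z-axis by the Greenwich Mean Sidereal Time (GMST) angle. This is our simplification for illustration only; the EO-CFI library the tool actually uses also accounts for precession, nutation and polar motion:

```python
import numpy as np

def rz(theta_rad):
    """Rotation of a vector's frame about the z-axis by theta_rad."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return np.array([[  c,   s, 0.0],
                     [ -s,   c, 0.0],
                     [0.0, 0.0, 1.0]])

def eci_to_ecef(r_eci, gmst_rad):
    """Rotate an ECI position into ECEF using Earth rotation (GMST) only."""
    return rz(gmst_rad) @ r_eci

# A point on the ECI x-axis observed when GMST = 90 deg lands at ECEF
# longitude -90 deg, since Greenwich has rotated a quarter turn eastward:
r = eci_to_ecef(np.array([7000.0, 0.0, 0.0]), np.pi / 2)
print(np.round(r, 3))   # → [0., -7000., 0.] up to rounding
```

The z-component is unchanged by this rotation, which is why a pure-GMST model cannot capture polar motion.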
Zhou, Ruilian; Dov, Paternak; Zhao, Halin
2002-08-01
Responses of six varieties (Suwannee, Coast cross, Tifton44, Tifton68, Tifton78 and Tifton85) of Cynodon dactylon to irrigation-water salinity were investigated in the field by means of a double line source experimental design. The digestibility of the grass by goats was analyzed using the rumen gastric juice digestion method. The results showed that the six varieties grew well, and had a high yield of fresh grass when electro-conductivity (Eci) < 10 dS·m-1. Particularly when Eci = 4.4 dS·m-1, the fresh grass yield of Coast cross, Suwannee, Tifton44, Tifton68, Tifton78 and Tifton85 was respectively increased by 142.1%, 61.1%, 136%, 121.1%, 202.3% and 109.7%, in comparison with fresh water (Eci = 1.2 dS·m-1) irrigation. Dry matter yield was also elevated with the increasing salinity of irrigated water. There was no obvious effect of salinity on crude protein, fiber and ash contents in the six varieties. Goats were fond of eating the hay irrigated by saline water, perhaps because of its higher digestibility. Overall, the yield and quality of Cynodon dactylon were not adversely affected by saltwater irrigation.
Lovelady, Douglas C.; Friedman, Jennifer; Patel, Sonali; Rabson, David A.; Lo, Chun-Min
2009-01-01
We performed micromotion experiments using electric cell-substrate impedance sensing (ECIS) on a confluent layer of 3T3 fibroblasts exposed to different low levels of the toxin cytochalasin B. This toxin is known to affect actin polymerization and to disrupt cytoskeletal structure and function in cells, changing the morphology of confluent cell cultures and altering the nature of the cellular micromotion, which is measured by ECIS as changes in impedance. By looking at several measures to characterize the long- and short-term correlations in the noise of the impedance time series, we are able to detect the effects of the toxin at concentrations down to 1 μM; there are intriguing hints that the effects may be discernible at levels as low as 0.1 μM. These measures include the power spectrum, the Hurst and detrended-fluctuation-analysis exponents, and the first zero and first 1/e crossings of the autocorrelation function. While most published work with ECIS uses only average impedance values, we demonstrate that noise analysis provides a more sensitive probe. PMID:19026529
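Of the noise measures listed, the detrended-fluctuation-analysis (DFA) exponent is the least standard; a minimal sketch follows (illustrative only, with window sizes chosen by us, not taken from the paper). The signal is integrated, split into windows of size n, linearly detrended per window, and the RMS fluctuation F(n) is fitted against n on log-log axes:

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    flucts = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
print(dfa_exponent(white))   # close to 0.5 for uncorrelated noise
```

Exponents near 0.5 indicate uncorrelated fluctuations; persistent (long-range correlated) micromotion yields larger values, which is what makes the exponent a candidate marker for cytoskeletal disruption.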
Commercial Off-the-Shelf (COTS) Components and Enterprise Component Information System (eCIS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Minihan; Ed Schmidt; Greg Enserro
The purpose of the project was to develop the processes for using commercial off-the-shelf (COTS) parts for WR production and to put in place a system for implementing the data management tools required to disseminate, store, track procurement, and qualify vendors. Much of the effort was devoted to determining if the use of COTS parts was possible. A basic question: How does the Nuclear Weapons Complex (NWC) begin to use COTS in the weapon Stockpile Life Extension Programs with high reliability and affordability, while managing risk at acceptable levels? In FY00, it was determined that a certain weapon refurbishment program could not be accomplished without the use of COTS components. The elements driving the use of COTS components included decreased cost, greater availability, and shorter delivery time. Key factors that required implementation included identifying the best suppliers and components, defining life cycles and predictions of obsolescence, testing the feasibility of using COTS components with a test contractor to ensure capability, as well as quality and reliability, and implementing the data management tools required to disseminate, store, track procurement, and qualify vendors. The primary effort of this project then was to concentrate on the risks involved in the use of COTS and address the issues of part and vendor selection, procurement and acceptance processes, and qualification of the parts via part and sample testing. The Enterprise Component Information System (eCIS) was used to manage the information generated by the COTS process. eCIS is a common interface for both the design and production of NWC components and systems, integrating information between Sandia National Laboratories (SNL) and the Kansas City Plant (KCP). The implementation of COTS components utilizes eCIS from part selection through qualification release. All part-related data is linked across an unclassified network for access by both SNL and KCP personnel.
The system includes not only NWC part information but also technical reference data for over 25 million electronic and electromechanical commercial and military parts via a data subscription. With the capabilities added to the system through this project, eCIS provides decision support, parts list/BOM analysis, editing, tracking, workflows, reporting, and history/legacy information, integrating manufacturer reference, company technical, company business, and design data.
Hoshino, Masami; Haraguchi, Yoshikura; Hirasawa, Hiroyuki; Sakai, Motohiro; Saegusa, Hiroshi; Hayashi, Kazushiro; Horita, Naoki; Ohsawa, Hiroyuki
2001-01-01
Background: Glucose tolerance (GT) has not been taken into consideration in investigations concerning relationships between coagulopathy and multiple organ dysfunction syndrome (MODS), and endothelial cell activation/endothelial cell injury (ECA/ECI) in septic patients, although coagulopathy is known to be influenced by blood glucose level. We investigated those relationships under strict blood glucose control and evaluation of GT with the glucose clamp method by means of the artificial pancreas in nine septic patients with glucose intolerance. The relationships between GT and blood stress related hormone levels (SRH) were also investigated. Methods: The amount of metabolized glucose (M value), as the parameter of GT, was measured by the euglycemic hyperinsulinemic glucose clamp method, in which the blood glucose level was clamped at 80 mg/dl under a continuous insulin infusion rate of 1.12 mU/kg per min, using the artificial pancreas, STG-22. Multiple organ failure (MOF) score was calculated using the MOF criteria of Japanese Association for Critical Care Medicine. Regarding coagulopathy, the following parameters were used: disseminated intravascular coagulation (DIC) score (calculated from the DIC criteria of the Ministry of Health and Welfare of Japan) and the parameters used for calculating DIC score, protein-C, protein-S, plasminogen, antithrombin III (AT-III), plasminogen activator inhibitor-1 (PAI-1), and tissue plasminogen activator-PAI-1 (tPA-PAI-1) complex. Thrombomodulin (TM) was measured as the indicator of ECI. Results: There were no significant correlations between M value and SRH, parameters indicating coagulopathy and the MOF score. The MOF score and blood TM levels were positively correlated with DIC score, thrombin-AT-III complex and tPA-PAI-1 complex, and negatively correlated with blood platelet count. Conclusions: GT was not significantly related to SRH, coagulopathy and MODS under strict blood glucose control. 
Hypercoagulability was closely related to MODS and ECI. Among the parameters indicating coagulopathy, the tPA-PAI-1 complex, which is considered to originate from ECA, seemed to be a sensitive parameter of MODS and ECI, and might be a predictive marker of MODS. Treatment aimed at reducing hypercoagulability and ECA/ECI was thought to be justified as one of the therapies for acutely ill septic patients. PMID:11299067
Effect of a prenatal nutritional intervention program on initiation and duration of breastfeeding.
Léger-Leblanc, Gisèle; Rioux, France M
2008-01-01
To evaluate initiation and duration of breastfeeding of infants born to mothers who participated in the Early Childhood Initiative (ECI) program. Factors affecting the initiation and the early cessation of breastfeeding were also explored. Twenty-five pregnant women participating in the ECI program completed this prospective study. At 36 weeks' gestation, a questionnaire was administered to assess socioeconomic status, intention to breastfeed and breastfeeding experience. When the infants were three and six months of age, feeding practices were assessed with a questionnaire. The breastfeeding initiation rate was 62.5%. At one and three months postpartum, exclusive breastfeeding rates were 39% and 4%, respectively. At six months, none of the women was exclusively breastfeeding. Primiparity, prenatal classes, having been breastfed and intention to breastfeed at 36 weeks' gestation were positively associated with breastfeeding initiation. Father's education, intention to breastfeed at 36 weeks' gestation, no water or formula given to the infant during hospitalization and higher maternal hemoglobin level at 36 weeks' gestation were positively associated with the duration of breastfeeding. The rate of initiation and duration of breastfeeding for ECI participants were low. To achieve successful interventions, it is important to target modifiable factors known to influence the initiation and duration of breastfeeding within this population.
Enhancing the care of people with HIV.
Goodwin, J
1999-01-01
30% of South Africa's adults are infected with HIV. Of those who have been infected with HIV, many have died and left behind orphans, making South Africa's large number of orphans one of the country's most important problems. As the number of people infected with HIV in the developing world has grown dramatically, the cost of their care has quickly surpassed regionally available resources. In countries where the HIV/AIDS epidemic is exacting the heaviest damage, most people have no access to antiretroviral therapies and routine prophylaxes against opportunistic infections. The search for cost-effective methods to treat and care for the ill has therefore become a global imperative. The Enhancing Care Initiative (ECI), a multinational program recently launched by the Harvard AIDS Institute, aims to improve the care of all HIV-infected people living in resource-scarce countries. Funded by an initial 5-year grant from the Merck Company Foundation, ECI is coordinated by the Harvard AIDS Institute. The core of ECI is its AIDS care teams, groups formed by, composed of, and led by people from the countries involved. Each team comprises regional experts in clinical care, epidemiology, international health, human rights, behavioral science, economics, political science, and public health, working to initiate meaningful changes in regional AIDS policies.
The astrophysical S-factor of the direct 18O(p, γ)19F capture by the ANC method
NASA Astrophysics Data System (ADS)
Burjan, V.; Hons, Z.; Kroha, V.; Mrázek, J.; Piskoř, Š.; Mukhamedzhanov, A. M.; Trache, L.; Tribble, R. E.; La Cognata, M.; Lamia, L.; Pizzone, G. R.; Romano, S.; Spitaleri, C.; Tumino, A.
2018-01-01
We attempted to determine the astrophysical S-factor of the direct part of the 18O(p, γ)19F capture by the indirect method of asymptotic normalization coefficients (ANC). We measured the differential cross section of the transfer reaction 18O(3He, d)19F at a 3He energy of 24.6 MeV. The measurement was realized on the cyclotron of the NPI in Řež, Czech Republic, with a gas target of high-purity 18O (99.9%). The reaction products were measured by eight ΔE-E telescopes composed of thin and thick silicon surface-barrier detectors. The parameters of the optical model for the input channel were deduced by means of the code ECIS, and the analysis of transfer reactions to 12 levels of the 19F nucleus up to 8.014 MeV was performed with the code FRESCO. The deduced ANCs were then used to specify the direct contribution to the 18O(p, γ)19F capture process and were compared with the mutually inconsistent results of two earlier works.
NASA Astrophysics Data System (ADS)
Bagnaninchi, Pierre O.; Holmes, Christina; Drummond, Nicola; Daoud, Jamal; Tabrizian, Maryam
2011-08-01
Cell viability assays are essential tools for cell biology. They assess healthy cells in a sample and enable the quantification of cellular responses to reagents of interest. Noninvasive and label-free assays are desirable in two-dimensional (2D) and three-dimensional (3D) cell culture to facilitate time-course viability studies. Cellular micromotion, emanating from cell to substrate distance variations, has been demonstrated as a marker of cell viability with electric cell-substrate impedance sensing (ECIS). In this study we investigated if optical coherence phase microscopy (OCPM) was able to report phase fluctuations of adult stem cells in 2D and 3D that could be associated with cellular micromotion. An OCPM has been developed around a Thorlabs engine (λo = 930 nm) and integrated in an inverted microscope with a custom scanning head. Human adipose derived stem cells (ADSCs, Invitrogen) were cultured in Mesenpro RS medium and seeded either on ECIS arrays, 2D cell culture dishes, or in 3D highly porous microplotted polymeric scaffolds. ADSC micromotion was confirmed by ECIS analysis. Live and fixed ADSCs were then investigated in 2D and 3D with OCPM. Significant differences were found in phase fluctuations between the different conditions. This study indicated that OCPM could potentially assess cell vitality in 2D and in 3D microstructures.
Cognitive individualism and the child as scientist program.
Wringe, Bill
2011-12-01
In this paper, I examine the charge that Gopnik and Meltzoff's 'Child as Scientist' program, outlined and defended in their 1997 book Words, Thoughts and Theories is vitiated by a form of 'cognitive individualism' about science. Although this charge has often been leveled at Gopnik and Meltzoff's work, it has rarely been developed in any detail. I suggest that we should distinguish between two forms of cognitive individualism which I refer to as 'ontic' and 'epistemic' cognitive individualism (OCI and ECI respectively). I then argue - contra Ronald Giere - that Gopnik and Meltzoff's commitment to OCI is relatively unproblematic, since it is an easily detachable part of their view. By contrast, and despite their explicit discussion of the issue, their commitment to ECI is much more problematic. Copyright © 2011 Elsevier Ltd. All rights reserved.
29 CFR 4.52 - Fringe benefit determinations.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., pension, etc., which are not required as a matter of law (i.e., excluding Social Security, unemployment... benefits contained in the U.S. Bureau of Labor Statistics, Employment Cost Index (ECI), for all employees...
Eccentric connectivity index of chemical trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haoer, R. S., E-mail: raadsehen@gmail.com; Department of Mathematics, Faculty of Computer Sciences and Mathematics, University Of Kufa, Najaf; Atan, K. A., E-mail: kamel@upm.edu.my
Let G = (V, E) be a simple connected molecular graph, in which vertices and edges represent atoms and chemical bonds, respectively; we denote the vertex set by V(G) and the edge set by E(G). Let d(u, v) be the distance between two vertices u, v ∈ V(G), defined as the length of a shortest path joining them. The eccentric connectivity index (ECI) of a molecular graph G is ξ(G) = ∑_{v∈V(G)} d(v) ec(v), where d(v) is the degree of a vertex v ∈ V(G) and ec(v) is the eccentricity of v, that is, the length of a longest shortest path from v to any other vertex of G. In this study, we derive the general formula for the eccentric connectivity index (ECI) of some chemical trees such as alkenes.
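The definition can be made concrete with a short sketch (ours, not part of the paper): compute each vertex's eccentricity by breadth-first search, then sum degree × eccentricity. For the path P4, the carbon skeleton of butane, ξ(P4) = 1·3 + 2·2 + 2·2 + 1·3 = 14.

```python
from collections import deque

def eccentricity(adj, v):
    """BFS from v; eccentricity is the largest shortest-path distance."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return max(dist.values())

def eccentric_connectivity_index(adj):
    """xi(G) = sum over all vertices v of deg(v) * ec(v)."""
    return sum(len(adj[v]) * eccentricity(adj, v) for v in adj)

# Path P4 (butane skeleton): 0-1-2-3
p4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(eccentric_connectivity_index(p4))   # → 14
```

Deriving the paper's closed-form formulae then amounts to evaluating this sum symbolically over the repeating units of the chemical tree.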
Impacts of ambient temperature on the burden of bacillary dysentery in urban and rural Hefei, China.
Cheng, J; Xie, M Y; Zhao, K F; Wu, J J; Xu, Z W; Song, J; Zhao, D S; Li, K S; Wang, X; Yang, H H; Wen, L Y; Su, H; Tong, S L
2017-06-01
Bacillary dysentery continues to be a major health issue in developing countries and ambient temperature is a possible environmental determinant. However, evidence about the risk of bacillary dysentery attributable to ambient temperature under climate change scenarios is scarce. We examined the attributable fraction (AF) of temperature-related bacillary dysentery in urban and rural Hefei, China during 2006-2012 and projected its shifting pattern under climate change scenarios using a distributed lag non-linear model. The risk of bacillary dysentery increased with the temperature rise above a threshold (18·4 °C), and the temperature effects appeared to be acute. The proportion of bacillary dysentery attributable to hot temperatures was 18·74% (95% empirical confidence interval (eCI): 8·36-27·44%). A marked difference in AF was observed between urban and rural areas, with AF varying from 26·87% (95% eCI 16·21-36·68%) in the urban area to -1·90% (95% eCI -25·03 to 16·05%) in the rural area. Under the climate change scenarios alone (1-4 °C rise), the AF from extreme hot temperatures (>31·2 °C) would rise greatly, accompanied by a relatively stable AF from moderate hot temperatures (18·4-31·2 °C). If climate change proceeds, urban areas may be more likely to suffer from a rapidly increasing burden of disease from extreme hot temperatures in the absence of effective mitigation and adaptation strategies.
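The attributable fraction (AF) figures above come from a distributed lag non-linear model; the underlying bookkeeping can be illustrated with a toy version (a simplified sketch, not the study's method): given a daily case series and the fitted relative risk RR for each day's temperature (RR = 1 at or below the threshold), the attributable cases are Σ casesᵢ·(RRᵢ − 1)/RRᵢ, and AF is that sum divided by total cases.

```python
def attributable_fraction(cases, rr):
    """Fraction of cases attributable to exposure, from daily case
    counts and daily relative risks (rr = 1.0 on baseline days)."""
    attributable = sum(c * (r - 1.0) / r for c, r in zip(cases, rr))
    return attributable / sum(cases)

# One baseline day followed by three increasingly hot days:
print(attributable_fraction([10, 20, 30, 40], [1.0, 1.25, 1.5, 2.0]))  # → 0.34
```

A negative AF, as reported for the rural area, simply means the fitted RR series dips below 1 often enough that exposure is associated with fewer cases than baseline.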
Lu, Fuhua; Lu, Zhaoyu; Liu, Xusheng; Chen, Cha; Qu, Pinghua; Li, Dingcheng; Hua, Zhengshuang; Qu, Yanni; Zou, Chuan
2016-01-01
Gut microbiota plays a dual role in chronic kidney disease (CKD) and is closely linked to production of uremic toxins. Strategies of reducing uremic toxins by targeting gut microbiota are emerging. It is known that the Chinese medicine rhubarb enema can reduce uremic toxins and improve renal function. However, it remains unknown which ingredient or mechanism mediates its effect. Here we utilized a rat CKD model of 5/6 nephrectomy to evaluate the effect of emodin, a main ingredient of rhubarb, on gut microbiota and uremic toxins in CKD. Emodin was administered via colonic irrigation at 5 ml (1 mg/day) for four weeks. We found that emodin via colonic irrigation (ECI) altered levels of two important uremic toxins, urea and indoxyl sulfate (IS), and changed gut microbiota in rats with CKD. ECI remarkably reduced urea and IS and improved renal function. Pyrosequencing and Real-Time qPCR analyses revealed that ECI restored the microbial balance from an abnormal status in CKD. We also demonstrated that ten genera were positively correlated with urea while four genera exhibited a negative correlation. Moreover, three genera were positively correlated with IS. Therefore, emodin altered the gut microbiota structure. It reduced the number of harmful bacteria, such as Clostridium spp. that is positively correlated with both urea and IS, but augmented the number of beneficial bacteria, including Lactobacillus spp. that is negatively correlated with urea. Thus, changes in gut microbiota induced by emodin via colonic irrigation are closely associated with reduction in uremic toxins and mitigation of renal injury. PMID:27003359
Kling, Beata; Bücherl, Daniel; Palatzky, Peter; Matysik, Frank-Michael; Decker, Michael; Wegener, Joachim; Heilmann, Jörg
2014-03-28
A real-time and label-free in vitro assay based on electric cell-substrate impedance sensing (ECIS) was established, validated, and compared to an end-point MTT assay within an experimental trial addressing the cytoprotective effects of 19 different flavonoids, flavonoid metabolites, and phenolic acids and their methyl esters on the HT-22 neuronal cell line, after induction of oxidative stress with tert-butyl hydroperoxide. Among the flavonoids under study, only those with a catechol unit and an additional 4-keto group provided cytoprotection. The presence of a 2,3-double bond was not a structural prerequisite for a neuroprotective effect. In the case of the phenolics, catechol substitution was the only structural requirement for activity. The flavonoids and other phenolics with a ferulic acid substitution or a single hydroxy group showed no activity. Electrochemical characterization of all compounds via square-wave voltammetry provided a rather specific correlation between cytoprotective activity and redox potential for the active flavonoids, but not for the active low-molecular-weight phenolics. Moreover, this study was used to compare label-free ECIS recordings with results of the established MTT assay. Whereas the former provides time-resolved and thus entirely unbiased information on changes of cell morphology that are unequivocally associated with cell death, the latter requires predefined exposure times and a strict causality between metabolic activity and cell death. However, MTT assays are based on standard lab equipment and provide a more economical route to higher throughput.
A Novel Cell-Based Hybrid Acoustic Wave Biosensor with Impedimetric Sensing Capabilities
Liu, Fei; Li, Fang; Nordin, Anis Nurashikin; Voiculescu, Ioana
2013-01-01
A novel multiparametric biosensor system based on living cells is presented. The biosensor system includes two biosensing techniques on a single device: resonant frequency measurements and electric cell-substrate impedance sensing (ECIS). The multiparametric sensor system is based on the innovative use of the upper electrode of a quartz crystal microbalance (QCM) resonator as the working electrode for the ECIS technique. The QCM acoustic wave sensor consists of a thin AT-cut quartz substrate with two gold electrodes on opposite sides. For integration of the QCM with the ECIS technique, a semicircular counter electrode was fabricated near the upper electrode on the same side of the quartz crystal. Live bovine aortic endothelial cells (BAECs) were successfully cultured on this hybrid biosensor. Finite element modeling of the bulk acoustic wave resonator using COMSOL simulations was performed. Simultaneous gravimetric and impedimetric measurements performed over a period of time on the same cell culture were conducted to validate the device's sensitivity. The time necessary for the BAECs to attach and form a compact monolayer on the biosensor was 35-45 minutes for 1.5 × 10⁴ cells/cm²; 60 minutes for 2.0 × 10⁴ cells/cm²; 70 minutes for 3.0 × 10⁴ cells/cm²; and 100 minutes for 5.0 × 10⁴ cells/cm². It was demonstrated that this time is the same for both gravimetric and impedimetric measurements. This hybrid biosensor will be employed in the future for water toxicity detection. PMID:23459387
BC measurement activities at the U.S. Environmental Protection Agency
Black carbon (BC)--sometimes referred to as soot, char, or elemental carbon (EC)--is a refractory form of light-absorbing carbon produced from incomplete combustion. Accurate measurement of BC in combustion source emissions is important for understanding anthropogenic climate for...
Department of Defense Fire and Emergency Services Certification Program
1995-12-12
Support Agency 2. CDC Career Development Course 3. CIMP Certification Information Management Program 4. ECI Extension Course Institute 5. HAZMAT...10. Notify IFSAC of the date and location of performance evaluations. 11. Maintain and update the Certification Information Management Program ( CIMP
Liakopoulos, Oliver J; Ho, Jonathan K; Yezbick, Aaron B; Sanchez, Elizabeth; Singh, Vivek; Mahajan, Aman
2010-11-01
Augmentation of coronary perfusion may improve right ventricular (RV) failure following acute increases in RV afterload. We investigated whether intra-aortic balloon counterpulsation (IABP) can improve cardiac function by enhancing myocardial perfusion and reversing compromised biventricular interactions, using a model of acute pressure overload. In 10 anesthetized pigs, RV failure was induced by pulmonary artery constriction, and systemic hypertension strategies with IABP, phenylephrine (PE), or the combination of both were tested. Systemic and ventricular hemodynamics [cardiac index (CI), ventricular pressures, coronary driving pressure (CDP)] were measured, and echocardiography was used to assess tricuspid valve regurgitation (TR), septal positioning [eccentricity index (ECI)], and changes in ventricular and septal dimensions and function [myocardial performance index (MPI), peak longitudinal strain]. Pulmonary artery constriction resulted in a doubling of RV systolic pressure (54 ± 4 mm Hg), RV distension, severe TR (4+) with decreased RV function (strain: -33%; MPI: +56%), septal flattening (Wt%: -35%) and leftward septal shift (ECI: 1.36), resulting in global hemodynamic deterioration (CI: -51%; SvO(2): -26%) and impaired CDP (-30%; P<0.05). IABP support alone failed to improve RV function despite higher CDP (+33%; P<0.05). Systemic hypertension by PE improved CDP (+70%), RV function (strain: +22%; MPI: -21%), septal positioning (ECI: 1.12) and minimized TR, but LV dysfunction (strain: -25%; MPI: +31%) occurred after LV afterloading (P<0.05). With IABP, less PE (-41%) was needed to maintain hypertension, and CDP was further augmented (+25%). IABP resulted in LV unloading and restored LV function, and increased CI (+46%) and SvO(2) (+29%; P<0.05). IABP with minimal vasopressors augments myocardial perfusion pressure and optimizes RV function after pressure-induced failure. Copyright © 2010 Elsevier Inc. All rights reserved.
A portable cell-based impedance sensor for toxicity testing of drinking water.
Curtis, Theresa M; Widder, Mark W; Brennan, Linda M; Schwager, Steven J; van der Schalie, William H; Fey, Julien; Salazar, Noe
2009-08-07
A major limitation to using mammalian cell-based biosensors for field testing of drinking water samples is the difficulty of maintaining cell viability and sterility without an on-site cell culture facility. This paper describes a portable automated bench-top mammalian cell-based toxicity sensor that incorporates enclosed fluidic biochips containing endothelial cells monitored by Electric Cell-substrate Impedance Sensing (ECIS) technology. Long-term maintenance of cells on the biochips is made possible by using a compact, self-contained disposable media delivery system. The toxicity sensor monitors changes in impedance of cell monolayers on the biochips after the introduction of water samples. The fluidic biochip includes an ECIS electronic layer and a polycarbonate channel layer, which together reduce initial impedance disturbances seen in commercially available open well ECIS chips caused by the mechanics of pipetting, while maintaining the ability of the cells to respond to toxicants. A curve discrimination program was developed that compares impedance values over time between the control and treatment channels on the fluidic biochip and determines if they are significantly different. Toxicant responses of bovine pulmonary artery endothelial cells grown on fluidic biochips are similar to cells on commercially available open well chips, and these cells can be maintained in the toxicity sensor device for at least nine days using an automated media delivery system. Longer-term cell storage is possible; bovine lung microvessel endothelial cells survive for up to four months on the fluidic biochips and remain responsive to a model toxicant. This is the first demonstration of a portable bench-top system capable of both supporting cell health over extended periods of time and obtaining impedance measurements from endothelial cell monolayers after toxicant exposure.
Kho, Dan T; Johnson, Rebecca H; O'Carroll, Simon J; Angel, Catherine E; Graham, E Scott
2017-09-21
Herein we demonstrate the sensitive nature of human blood-brain barrier (BBB) endothelial cells to sodium azide and its gaseous product. Sodium azide is known to be acutely cytotoxic at low millimolar concentrations, hence its use as a biological preservative (e.g., in antibodies). Loss of barrier integrity was noticed in experiments using Electric Cell-substrate Impedance Sensing (ECIS) biosensor technology to measure endothelial barrier integrity continuously in real time. Initially, the effect of sodium azide was observed as an artefact, as it was present in antibodies being employed in neutralisation experiments. This was confirmed when antibody clones that were azide-free did not mediate loss of barrier function. A delayed loss of barrier function in neighbouring wells implied the influence of a liberated gaseous product. ECIS technology demonstrated that the BBB endothelial cells had a lower level of direct sensitivity to sodium azide of ~3 µM. Evidence of gaseous toxicity was consistently observed at 30 µM and above, with disrupted barrier function and cell death in neighbouring wells. We highlight the ability of this cellular biosensor technology to reveal both the direct and gaseous toxicity mediated by sodium azide. The sensitivity and temporal dimension of ECIS technology were instrumental in these observations. These findings have substantial implications for the wide use of sodium azide in biological reagents, raising issues for their application in live-cell assays and for the protection of the user. This research also has wider relevance in highlighting the sensitivity of brain endothelial cells to a known mitochondrial disruptor. It is logical to hypothesise that BBB endothelial dysfunction due to mitochondrial dysregulation could have an important but underappreciated role in a range of neurological diseases.
Morales-Ramos, Juan A; Rojas, M Guadalupe
2015-10-01
Crowding conditions of larvae may have a significant impact on the commercial production efficiency of some insects, such as Tenebrio molitor L. (Coleoptera: Tenebrionidae). Although larval densities are known to affect developmental time and growth in T. molitor, no reports were found on the effects of crowding on food utilization. The effect of larval density on the food utilization efficiency of T. molitor larvae was studied by measuring efficiency of ingested food conversion (ECI), efficiency of digested food conversion (ECD), and mg of larval weight gain per gram of food consumed (LWGpFC) at increasing larval densities (12, 24, 36, 48, 50, 62, 74, and 96 larvae per dm²) over four consecutive 3-wk periods. Individual larval weight gain and food consumption were negatively impacted by larval density. Similarly, ECI, ECD, and LWGpFC were negatively impacted by larval density. Larval ageing, measured as four consecutive 3-wk periods, significantly and independently impacted ECI, ECD, and LWGpFC in a negative way. General linear model analysis showed that age had a higher impact than density on the food utilization parameters of T. molitor larvae. Larval growth was determined to be responsible for the age effects, as measurements of larval mass density (in grams of larvae per dm²) had a significant impact on food utilization parameters across ages and density treatments (in numbers of larvae per dm²). The importance of mass versus numbers per unit of area as measurements of larval density and the implications of the negative effects of density on food utilization for insect biomass production are discussed. Published by Oxford University Press on behalf of Entomological Society of America 2015. This work is written by US Government employees and is in the public domain in the US.
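The indices named in this abstract follow the standard Waldbauer-style nutritional definitions. A minimal sketch of those definitions, with illustrative numbers that are not from the study:

```python
# Hedged sketch of the nutritional indices named in the abstract:
#   ECI = 100 * weight gain / food ingested
#   ECD = 100 * weight gain / food digested  (digested = ingested - frass)
#   LWGpFC = mg of larval weight gain per gram of food consumed
# All input values below are made up for illustration.

def nutritional_indices(gain_mg, ingested_mg, frass_mg):
    """Return (ECI %, ECD %, LWGpFC mg/g) for one feeding period."""
    digested_mg = ingested_mg - frass_mg        # food actually assimilated
    eci = 100.0 * gain_mg / ingested_mg         # efficiency of ingested food conversion
    ecd = 100.0 * gain_mg / digested_mg         # efficiency of digested food conversion
    lwg_per_g = 1000.0 * gain_mg / ingested_mg  # mg gain per g of food consumed
    return eci, ecd, lwg_per_g


eci, ecd, lwg = nutritional_indices(gain_mg=120.0, ingested_mg=600.0, frass_mg=200.0)
# eci = 20.0 (%), ecd = 30.0 (%), lwg = 200.0 (mg/g)
```

ECD is always at least as large as ECI for the same period, since the digested mass cannot exceed the ingested mass.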
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
...); Electronic Contracting Initiative (ECI) AGENCY: Office of Acquisition Policy, General Services Administration..., Electronic Contracting Initiative, by any of the following methods: Regulations.gov : http://www.regulations... the rewrite of GSAR Part 538, Electronic Contracting Initiative (Modifications). On December 17, 2012...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hintermeier, Peter H.; Eckstein, Sebastian; Mei, Donghai
Hydronium ions in the pores of zeolite H-ZSM5 show high catalytic activity in the elimination of water from cyclohexanol in the aqueous phase. Substitution induces subtle changes in rates and reaction pathways, which are concluded to be related to steric effects. Exploring the reaction pathways of 2-, 3-, and 4-methylcyclohexanol (2-McyOH, 3-McyOH, and 4-McyOH), 2- and 4-ethylcyclohexanol (2-EcyOH and 4-EcyOH), 2-n-propylcyclohexanol (2-PcyOH), and cyclohexanol (CyOH), it is shown that the E2 character increases with closer positioning of the alkyl and hydroxyl groups. Thus, 4-McyOH dehydration proceeds via an E1-type elimination, while cis-2-McyOH preferentially reacts via an E2 pathway. The entropy of activation decreased with increasing alkyl chain length (ca. 20 J mol⁻¹ K⁻¹ per CH₂ unit) for 2-substituted alcohols, which is concluded to result from constraints influencing the configurational entropy of the transition states.
Manual and automation testing and verification of TEQ [ECI PROPRIETARY]
NASA Astrophysics Data System (ADS)
Abhichandra, Ravi; Jasmine Pemeena Priyadarsini, M.
2017-11-01
The telecommunication industry has progressed from 1G to 4G, and now 5G is gaining prominence. Given the pace of this abrupt transformation, technological obsolescence is becoming a serious issue to deal with. Adding to this is the fact that the execution of each technology requires ample investment in networks, infrastructure, development, etc. As a result, the industry is becoming more dynamic and strategy oriented. It requires professionals who not only understand technology but can also evaluate it from a business perspective. The “Information Revolution”, and the dramatic advances in telecommunications technology that have made it possible, currently drive the global economy in large part. As wireless networks become more advanced and far-reaching, we are redefining the notion of connectivity and the possibilities of communications technology. In this paper I test and verify the optical cards and automate this test procedure using a new in-house technology, “TEQ”, developed by ECI TELECOM, which uses one of the optical cards itself to pump traffic at 100 Gbps.
USDA-ARS?s Scientific Manuscript database
The plant genus Erythroxylum is known for four cultivated taxa, Erythroxylum coca var. coca (Ecc), Erythroxylum coca var. ipadu (Eci), Erythroxylum novogranatense var. novogranatense (Enn) and Erythroxylum novogranatense var. truxillense (Ent) that are cultivated primarily for the illicit extraction...
76 FR 71073 - Proposed Collection, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... reverting to a national survey design in order to preserve the reliability of the ECI and EBS, after the... comments concerning the proposed revision of the ``National Compensation Survey.'' A copy of the proposed... INFORMATION: I. Background The National Compensation Survey (NCS) is an ongoing survey of earnings and...
Materials and Equipment: ECI-4. Early Childhood Intervention Catalog Module.
ERIC Educational Resources Information Center
Evans, Joyce; Bricker, Donna
The fourth of seven modules for professionals working with young (birth to age 3) handicapped children provides information on equipment, materials, and activities in early intervention. Background information discusses ways to catalog materials and equipment and describes teaching approaches using toys and manipulatives. A variety of teaching…
Early Childhood Intervention in China
ERIC Educational Resources Information Center
Zheng, Yuzhu; Maude, Susan P.; Brotherson, Mary Jane
2015-01-01
With rapid economic development and increasing awareness of the importance of early childhood intervention (ECI), China is re-examining its social and educational practices for young children with disabilities. This re-examination may have a significant impact on young children with disabilities in China. It may also set an example for other…
Code of Federal Regulations, 2011 CFR
2011-07-01
... Department of Pollution Control and Ecology, 8001 National Drive, P.O. Box 9583, Little Rock, AR 72209. (F... State Department of Ecology (Ecology), P.O. Box 47600, Olympia, WA 98504-7600, http://www.ecy.wa.gov... not delegated. NSPS Subparts Delegated to Washington Air Agencies Subpart 1 Washington Ecology 2 BCAA...
Discriminating Emotional Intelligence-Based Competencies of IT Employees and Salespeople
ERIC Educational Resources Information Center
Yildirim, Osman
2007-01-01
Purpose: This study aims at investigating emotional intelligence based on competencies for sales and IT people. Design/methodology/approach: A study was conducted on 111 employees of 12 firms from four different sectors in which firms benefited extensively from IT and sales activities. Findings: Emotional Competency Inventory (ECI, 2.0) was used…
Defense Systems Modernization and Sustainment Initiative
2008-07-21
surface coatings, including metals and plastics, and coating application processes were developed for repairing bearing surfaces. The Modernization through...technologies and applications utilized by the NC3R team.
Should the Increase in Military Pay Be Slowed?
2012-01-01
similar and are not shown. However, the results differ for workers ages 23-27 with more than a bachelor’s degree. These people are “fast-burners ...overestimated civilian wage growth and have argued that in a time of economic expansion the ECI might underestimate civilian wage growth. Since the CPS
1984-04-12
fodder. There is no bacteria down there, so no decay. Amundsen’s successful mission also began from the Ross ice shelf but from the west, far from...cause the premises have long ago been internalized. In Nueva Ecija when we arrive in a new barrio, long arms and all, the peasants receive us
Parenting the Young Handicapped Child: ECI-6. Early Childhood Intervention Catalog Module.
ERIC Educational Resources Information Center
Evans, Joyce; Bricker, Donna
The sixth of seven monographs on early intervention for young (birth to age 3) handicapped children is intended for parents and other family members. Materials which focus on five topics appropriate for parent groups (overcoming communication barriers, legal rights and responsibility, self concept and their children, toy construction for learning,…
Space Based Satellite Tracking and Characterization Utilizing Non-Imaging Passive Sensors
2008-03-01
vary from only slightly here. The classical orbital elements are: a, the semimajor axis; e, eccentricity; i, inclination; Ω, right ascension of the... e, eccentricity ... ĥ, axis normal to the orbital plane ... Ω, right ascension of... state transition matrix ... i, orbital inclination ... Ẑ, unit vector in the ECI frame
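The classical orbital elements in this nomenclature snippet determine an orbit's geometry; for instance, the orbital radius follows from the semimajor axis a, eccentricity e, and the true anomaly (a standard relation not spelled out in the snippet, so treat this as a hedged illustration rather than code from the report):

```python
# Hedged illustration of the conic-orbit radius r = a(1 - e^2) / (1 + e*cos(nu)),
# where a is the semimajor axis, e the eccentricity and nu the true anomaly.
# The numeric values below are illustrative only.
import math


def orbit_radius_km(a_km, e, nu_rad):
    """Distance from the focus for a conic orbit defined by (a, e) at true anomaly nu."""
    return a_km * (1.0 - e**2) / (1.0 + e * math.cos(nu_rad))


# A circular orbit (e = 0) keeps r equal to a at every anomaly;
# an eccentric orbit reaches its minimum radius a(1 - e) at perigee (nu = 0).
r_circular = orbit_radius_km(7000.0, 0.0, 1.2)   # -> 7000.0
r_perigee = orbit_radius_km(10000.0, 0.1, 0.0)   # -> 9000.0
```

Positions computed this way in the orbital plane are then rotated by i, Ω, and the argument of perigee into the ECI frame.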
77 FR 2932 - Airworthiness Directives; Rolls-Royce plc (RR) Turbofan Engines
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-20
...-pressure (IP) compressor rotor shaft rear balance land for cracks, which could lead to engine failure. This... and when in the shop, repetitive eddy current inspections (ECIs) for cracks on the rear balance land... for cracks on the rear balance land. That NPRM also proposed certain optional terminating actions...
ERIC Educational Resources Information Center
Daniel, Shannon M.
2015-01-01
In this self-study, the author reflects on her implementation of empathetic, critical integrations of multiple perspectives (ECI), which she designed to afford preservice teachers the opportunity to discuss and collectively reflect upon the oft-diverging multiple perspectives, values, and practices they experience during their practicum (Daniel,…
Emotional Creativity and Real-Life Involvement in Different Types of Creative Leisure Activities
ERIC Educational Resources Information Center
Trnka, Radek; Zahradnik, Martin; Kuška, Martin
2016-01-01
The role of emotional creativity in practicing creative leisure activities and in the preference of college majors remains unknown. This study aims to explore how emotional creativity measured by the Emotional Creativity Inventory (ECI; Averill, 1999) is interrelated with the real-life involvement in different types of specific creative leisure…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... repetitive fluorescent penetrant inspections (FPIs) and eddy current inspections (ECIs) of the high-pressure... high-pressure compressor rotor (HPCR) 8-10 stage spool, part number (P/N) 1844M90G01 or 1844M90G02...
The Entrepreneurial Campus Initiative: Understanding the Entrepreneurial Orientation of Students
ERIC Educational Resources Information Center
Center for Rural Policy and Development, 2005
2005-01-01
The Entrepreneurial Campus Initiative (ECI) is a collaborative effort between the Northland Foundation, the Northeast Entrepreneur Fund and the Center for Rural Policy and Development. The purpose of the initiative is to explore and design new ways in which rural college campuses can be more engaged in entrepreneurship and business development in…
1975-09-01
This report assumes a familiarity with the GIFT and MAGIC computer codes. The EDIT-COMGEOM code is a FORTRAN computer code. The EDIT-COMGEOM code...converts the target description data which was used in the MAGIC computer code to the target description data which can be used in the GIFT computer code
Szulcek, Robert; van Bezu, Jan; Boonstra, Johannes; van Loon, Jack J. W. A.; van Nieuw Amerongen, Geerten P.
2015-01-01
Background: Endothelial cells (EC) guard vascular functions by forming a dynamic barrier throughout the vascular system that sensitively adapts to ‘classical’ biomechanical forces, such as fluid shear stress and hydrostatic pressure. Alterations in gravitational forces might similarly affect EC integrity, but remain insufficiently studied. Methods: In a unique approach, we utilized Electric Cell-substrate Impedance Sensing (ECIS) in the gravity-simulators at the European Space Agency (ESA) to study dynamic responses of human EC to simulated micro- and hyper-gravity as well as to classical forces. Results: Short intervals of micro- or hyper-gravity evoked distinct endothelial responses. Simulated micro-gravity led to decreased endothelial barrier integrity, whereas hyper-gravity caused sustained barrier enhancement through rapid improvement of cell-cell integrity, evidenced by significant junctional accumulation of VE-cadherin (p = 0.011) and significant enforcement of peripheral F-actin (p = 0.008), accompanied by a slower enhancement of cell-matrix interactions. The hyper-gravity-triggered EC responses were force dependent and nitric oxide (NO) mediated, showing a maximal resistance increase of 29.2±4.8 ohms at 2g and 60.9±6.2 ohms at 4g vs. baseline values, which was significantly suppressed by NO blockage (p = 0.011). Conclusion: Short-term application of hyper-gravity caused a sustained improvement of endothelial barrier integrity, whereas simulated micro-gravity weakened the endothelium. In clear contrast, the classical forces of shear stress and hydrostatic pressure induced either short-lived or no changes to the EC barrier.
Here, ECIS proved a powerful tool for characterizing subtle and distinct EC gravity-responses owing to its high temporal resolution. ECIS therefore has great potential for the study of gravity-responses, such as in real space flights, providing quantitative assessment of a variety of cell-biological characteristics of any adherently growing cell type in an automated and continuous fashion. PMID:26637177
Global estimate of the incidence of clinical pneumonia among children under five years of age.
Rudan, Igor; Tomaskovic, Lana; Boschi-Pinto, Cynthia; Campbell, Harry
2004-01-01
OBJECTIVE: Clinical pneumonia (defined as respiratory infections associated with clinical signs of pneumonia, principally pneumonia and bronchiolitis) in children under five years of age is still the leading cause of childhood mortality in the world. In this paper we aim to estimate the worldwide incidence of clinical pneumonia in young children. METHODS: Our estimate for the developing world is based on an analysis of published data on the incidence of clinical pneumonia from community-based longitudinal studies. Among more than 2000 studies published since 1961, we identified 46 studies that reported the incidence of clinical pneumonia, and 28 of these met pre-defined quality criteria. FINDINGS: The estimate of the median incidence from those studies was 0.28 episodes per child-year (e/cy). The 25-75% interquartile range was 0.21-0.71. We assessed the plausibility of this estimate using estimates of global mortality from acute respiratory infections, together with either the case-fatality rates reported for all episodes of clinical pneumonia in community-based studies, or the case-fatality rate reported only for severe cases combined with estimates of the proportion of severe cases occurring in a defined population or community. CONCLUSION: The overlap between the ranges of the estimates implies that a plausible incidence estimate of clinical pneumonia for developing countries is 0.29 e/cy. This equates to an annual incidence of 150.7 million new cases, 11-20 million (7-13%) of which are severe enough to require hospital admission. In the developed world no comparable data are available. However, large population-based studies report that the incidence of community-acquired pneumonia among children less than five years old is approximately 0.026 e/cy, suggesting that more than 95% of all episodes of clinical pneumonia in young children worldwide occur in developing countries. PMID:15654403
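The abstract's arithmetic can be checked directly. The under-five child-year figure below is back-calculated from the reported totals (150.7 million cases at 0.29 e/cy) and is therefore an assumption for illustration, not a number from the paper:

```python
# Hedged check of the abstract's arithmetic: incidence (episodes per
# child-year) times exposed child-years gives annual episodes, of which
# 7-13% are reported to be severe enough to require hospital admission.

INCIDENCE_E_PER_CY = 0.29   # plausible incidence estimate from the abstract
CHILD_YEARS = 519.7e6       # assumed; back-calculated as 150.7e6 / 0.29

annual_cases = INCIDENCE_E_PER_CY * CHILD_YEARS          # ~150.7 million
severe_low = 0.07 * annual_cases                         # ~10.6 million
severe_high = 0.13 * annual_cases                        # ~19.6 million
```

Rounded, the severe range recovers the abstract's 11-20 million hospital-admission-level episodes per year.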
ERIC Educational Resources Information Center
Greenwood, Charles R.; Walker, Dale; Buzhardt, Jay
2010-01-01
The Early Communication Indicator (ECI) is a measure relevant to intervention decision making and progress monitoring for infants and toddlers. With increasing recognition of the importance of quality early childhood education and intervention for all children, measurement plays an important role in documenting children's progress and outcomes of…
ERIC Educational Resources Information Center
Pratt, Michael W.; Skoe, Eva E.; Arnold, Mary Louise
2004-01-01
Over the past 20 years, care reasoning has been increasingly recognised as an important aspect of moral development. Skoe has developed an interview measure of levels of care reasoning about the needs of self and other in relationships, the Ethic of Care Interview or ECI. In the present longitudinal research, we investigated developmental changes…
USDA-ARS?s Scientific Manuscript database
Listeria monocytogenes remains a major foodborne pathogen with three serotype 4b clonal groups (ECI, ECII, ECIa) repeatedly implicated in human listeriosis. For reasons that are unknown, many of these strains are also resistant to heavy metals, i.e. cadmium and arsenic. The acquisition and fitness i...
An Environmental Sentinel Biomonitor System for Drinking Water Protection
2008-12-01
threat chemicals. Potential interferences include chemicals commonly used for drinking water disinfection (chlorine and chloramine), byproducts of...range. Of the potential interferences tested, the ECIS test was affected only by the disinfectants chlorine and chloramine; chlorine is typically...Industrial and Potential Interfering Chemicals Used to Evaluate ESB System Toxicity Sensors TICs Potential Interferences Acrylonitrile Chloramine
ERIC Educational Resources Information Center
Giallourakis, Angie; Kent, Kristie Pretti-Frontczak; Cook, Bryan
2005-01-01
This research project sought to develop a measure to examine the family-centered beliefs, skills, work systems, and work practices of early childhood intervention (ECI) and early childhood education (ECE) graduate students. The study was guided by four research questions: (1) To what extent do graduate students from preservice preparation…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-18
... with front compressor front hub (fan hub), part number (P/N) 5000501-01 installed. That AD currently requires cleaning the front compressor front hubs (fan hubs), initial and repetitive eddy current (ECI) and.... In addition, that AD currently requires reporting the findings of cracked fan hubs and monthly...
ERIC Educational Resources Information Center
LoCasale-Crouch, Jennifer; Hamre, Bridget; Roberts, Amy; Neesen, Kathy
2016-01-01
The "Effective Classroom Interactions" (ECI) online courses were designed to provide an engaging, effective and scalable approach to enhancing early childhood teachers' use of classroom practices that impact children's school readiness. The created courses included several versions aimed at testing whether or not certain design aspects…
Welcome Home and Early Start: An Assessment of Program Quality and Outcomes
ERIC Educational Resources Information Center
Daro, Deborah; Howard, Eboni; Tobin, Jennifer; Harden, Allen
2005-01-01
Chapin Hall Center for Children at the University of Chicago, in collaboration with Westat Associates, designed and implemented a comprehensive evaluation of the Early Childhood Initiative's (ECI) two home visitation programs: Welcome Home, a universal home visitation program that provides a single home visit to all first-time and teen parents,…
Wang, Fulin; He, Jiewang; Shi, Jianghua; Zheng, Tao; Xu, Fei; Wu, Guanting; Liu, Renhu; Liu, Shengyi
2016-01-01
Seed coat color is determined by the type of pigment deposited in the seed coat cells. It is related to important agronomic traits of seeds such as seed dormancy, longevity, oil content, protein content and fiber content. In Brassica napus, inheritance of seed coat color is related to maternal effects and pollen effects (xenia effects). In this research we isolated a mutation of yellow-seeded B. napus controlled by a single Mendelian locus, which is named Embryonal Control of Yellow seed coat 1 (Ecy1). Microscopy of transverse sections of the mature seed showed that pigment is deposited only in the outer layer of the seed coat. Using Illumina HiSeq 2000 sequencing technology, a total of 12 GB of clean data, 116× coverage of the coding sequences of B. napus, was obtained from seeds 26 d after pollination (DAP). It was assembled into 172,238 independent transcripts and 55,637 unigenes. A total of 139 orthologous genes of Arabidopsis transparent testa (TT) genes were mapped in silico to the 19 chromosomes of B. napus. Only 49 of the TT orthologous genes are transcribed in seeds. However, transcription of all orthologs was independent of embryonal control of seed coat color. Only 55 genes were found to be differentially expressed between brown seeds and the yellow mutant. Of these 55, 50 were upregulated and five were downregulated in yellow seeds as compared to their brown counterparts. By KEGG classification, 14 metabolic pathways were significantly enriched. Of these, five pathways (phenylpropanoid biosynthesis, cyanoamino acid metabolism, plant hormone signal transduction, metabolic pathways, and biosynthesis of secondary metabolites) were related to seed coat pigmentation. Free amino acid quantification showed that Ala and Phe were present at higher levels in the embryos of yellow seeds as compared to those of brown seeds. This increase was not observed in the seed coat. Moreover, the excess amount of free Ala was exactly twice that of Phe in the embryo.
The pigment substrate chalcone is synthesized from two molecules of Ala and one molecule of Phe. The correlation between the accumulation of Ala and Phe and the disappearance of pigment in the yellow-seeded mutant suggests that embryonal control of seed coat color is related to Phe and Ala metabolism in the embryo of B. napus. PMID:26896439
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-13
... (ECI) of the propeller hub for cracks. That proposed AD resulted from reports of 3 hubs found cracked... NPRM results from reports of 7 additional hubs found cracked during propeller overhaul, totaling 10 cracked hubs. We are proposing this supplemental AD to prevent failure of the propeller hub, which could...
Trajectory Control and Optimization for Responsive Spacecraft
2012-03-22
[Front-matter excerpt: figures "Orbital Elements and Local-Vertical-Local-Horizontal Frame" and "Equinoctial Frame with respect to ECI Frame [17]".] ...position and velocity, classical orbital elements, and equinoctial elements. These methods are detailed in the following sections. Inertial Position...trajectory. However, if the singularities are unavoidable, equinoctial orbital elements could be used.
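The state representations this thesis catalogs (inertial position/velocity, classical elements, equinoctial elements) are linked by standard astrodynamics relations. As an illustrative sketch, not code from the cited work, the textbook perifocal-to-ECI rotation through RAAN Ω, inclination i, and argument of perigee ω converts classical orbital elements to an ECI position:

```python
import math

def coe_to_eci(a, e, i, raan, argp, nu):
    """Classical orbital elements -> ECI position (km).
    a in km, all angles in radians; two-body dynamics assumed."""
    # Position in the perifocal (PQW) frame
    p = a * (1 - e**2)                      # semi-latus rectum
    r = p / (1 + e * math.cos(nu))          # orbit equation
    x_pf, y_pf = r * math.cos(nu), r * math.sin(nu)
    # First two columns of the 3-1-3 perifocal -> ECI rotation matrix
    cO, sO = math.cos(raan), math.sin(raan)
    ci, si = math.cos(i), math.sin(i)
    cw, sw = math.cos(argp), math.sin(argp)
    R = [
        [cO * cw - sO * sw * ci, -cO * sw - sO * cw * ci],
        [sO * cw + cO * sw * ci, -sO * sw + cO * cw * ci],
        [sw * si,                 cw * si],
    ]
    return tuple(R[k][0] * x_pf + R[k][1] * y_pf for k in range(3))

# An equatorial circular orbit at nu = 0 lies on the ECI x-axis
x, y, z = coe_to_eci(a=7000.0, e=0.0, i=0.0, raan=0.0, argp=0.0, nu=0.0)
```

Equinoctial elements exist precisely to avoid the singularities of this parameterization at e = 0 or i = 0, which is the point the excerpt above is making.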
A Trade Study of Thermosphere Empirical Neutral Density Models
2014-08-01
n, m = degree and order, respectively; ′ = geocentric latitude. Approved for public release; distribution is unlimited. λ = geocentric...coordinate. The ECI coordinate system, also known as the geocentric equatorial system, has...seconds for numerical integration. The EGM96 model specifies V in the Earth-Centered, Earth-Fixed (ECEF) coordinate frame, a geocentric coordinate
Family Quality of Life: A Key Outcome in Early Childhood Intervention Services--A Scoping Review
ERIC Educational Resources Information Center
Bhopti, Anoo; Brown, Ted; Lentin, Primrose
2016-01-01
A scoping review was conducted to identify factors influencing the quality of life of families of children with disability. The review also explored the scales used to measure family quality of life (FQOL) as an outcome in early childhood intervention services (ECIS). Multiple databases were searched from 2000 to 2013 to include studies pertinent…
The research on the fairness of carbon emissions for China's energy based on GIS
NASA Astrophysics Data System (ADS)
Wang, Qiuxian; Gao, Zhiqiang; Ning, Jicai; Lu, Qingshui; Shi, Runhe; Gao, Wei
2013-09-01
This article first calculated China's energy carbon emissions for 30 provinces in 2010 using the 2006 IPCC carbon emission inventory method and data from the China Energy Statistical Yearbook, and then calculated carbon emission intensity using GDP data from China's Statistical Yearbook. Next, the EEI (Economic Efficiency Index) and ECI (Ecological Carrying Index) were calculated according to the stated formulas, and corresponding maps were produced with the help of GIS to analyze the fairness of China's energy CO2 emissions in 2010. The results showed that the distribution of China's energy CO2 emissions in 2010 decreases outward from Bohai Bay, with the westernmost ring of provinces having the lowest energy carbon emissions. The intensity of China's energy CO2 emissions in 2010 increases from southeast China toward north China. The distributions of EEI and ECI for China's energy CO2 emissions differ markedly from each other and from their combined result. As to the fairness of China's energy CO2 emissions in 2010, the southern provinces fare better than the Bohai Bay areas (except Beijing and Tianjin).
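The abstract does not reproduce the EEI/ECI formulas. A common convention in the emissions-fairness literature, assumed here purely for illustration, defines each index as a region's share of GDP (or of ecological carrying capacity) divided by its share of emissions:

```python
def fairness_indices(emissions, gdp, capacity):
    """Hypothetical EEI/ECI sketch: each province's share of GDP
    (EEI) or ecological capacity (ECI) divided by its share of CO2
    emissions. Values > 1 suggest the province emits less than its
    economic or ecological weight would 'entitle' it to."""
    e_tot = sum(emissions.values())
    g_tot = sum(gdp.values())
    c_tot = sum(capacity.values())
    out = {}
    for prov in emissions:
        e_share = emissions[prov] / e_tot
        out[prov] = (gdp[prov] / g_tot / e_share,       # EEI
                     capacity[prov] / c_tot / e_share)  # ECI
    return out

# Two made-up provinces: "A" is emission-efficient, "B" is not
idx = fairness_indices(
    emissions={"A": 100.0, "B": 300.0},
    gdp={"A": 200.0, "B": 200.0},
    capacity={"A": 150.0, "B": 50.0},
)
```

The provinces, values, and index definitions above are all illustrative assumptions, not data or formulas from the cited article.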
Effects of osmolarity on human epithelial conjunctival cells using an electrical technique.
Bellotti, Mariela; Bast, Walter; Berra, Alejandro; Bonetto, Fabian J
2011-12-01
The purpose of this study is to report the effect of different media osmolarities on a cell line monolayer of normal human conjunctival epithelia (IOBA-NHC) using Electric Cell-substrate Impedance Sensing (ECIS). We built our own ECIS system and fabricated biocompatible microelectrodes. We used a monolayer of IOBA-NHC cells with media at different osmolarities (315, 360, 446, and 617 mOsm/l). When hyperosmolarity increases, there is a slight decrease in the measured resistance of the naked microelectrode (without cells), whereas its capacitance remains practically unchanged. The evaluation of resistance and capacitance of a microelectrode covered by a monolayer of IOBA-NHC relative to a naked microelectrode showed no difference in the standard media (315 mOsm/l), a small difference at 360 mOsm/l, and significant differences at hyperosmolarities of 446 mOsm/l and 617 mOsm/l. The resistance with a confluent cell monolayer is up to three times greater than the resistance of the naked electrode in standard media. Both resistance and capacitance measurements for the cell monolayer were sensitive to changes in osmolarity.
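ECIS instruments infer barrier properties from electrode impedance; full analysis uses the Giaever-Keese cell-electrode model, but the headline observation above (cell-covered resistance up to three times the naked-electrode value) can be illustrated with a much-simplified series-RC sketch using made-up component values:

```python
import math

def electrode_impedance(freq_hz, r_sol, c_elec, r_cell=0.0):
    """Series-RC sketch of an ECIS electrode: solution resistance
    r_sol [ohm] in series with electrode capacitance c_elec [F],
    plus an extra series resistance r_cell [ohm] contributed by a
    cell monolayer. A gross simplification of real ECIS models."""
    w = 2 * math.pi * freq_hz
    return r_sol + r_cell + 1 / (1j * w * c_elec)  # complex impedance

# Hypothetical values chosen only to reproduce a ~3x resistance rise
naked = electrode_impedance(4000.0, r_sol=2000.0, c_elec=5e-9)
covered = electrode_impedance(4000.0, r_sol=2000.0, c_elec=5e-9,
                              r_cell=4000.0)
ratio = covered.real / naked.real  # resistance ratio, covered vs naked
```

All component values here are assumptions for illustration; the study's microelectrode parameters are not given in the abstract.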
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.
1995-12-31
In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a neutronics computation series to compare Western and VNIIEF codes and assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computation. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA), performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) control movement in a core.
Development and application of the GIM code for the Cyber 203 computer
NASA Technical Reports Server (NTRS)
Stalnaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.
1982-01-01
The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code used to compute a number of example cases. Turbulence models, algebraic and differential equations, were added to the basic viscous code. An equilibrium reacting chemistry model and implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.
2009-12-01
biblio/92109.html WEF 2004a. "Proposed Research for Developing the Biotic Ligand Model for Establishing Water Quality Criteria." Water Environment...Systems Center, Bremerton, WA. August 2006. Ecology Publication Number 06-10-54, http://www.ecy.wa.gov/biblio/0610054.html Eriksen, R.S., Mackey
ERIC Educational Resources Information Center
Bakar, Zaharah Abu; Brown, P. Margaret; Remine, Maria D.
2010-01-01
This study investigated the potential effects of cochlear implantation and age at implantation on maternal interactional sensitivity. Three groups of dyads were studied at two points over 1 year. The hearing aid (HA) group wore hearing aids throughout the study, the early cochlear implanted (ECI) group were implanted prior to 22 months of age, and…
Tay, Sze Yan; Davison, John; Jin, Ng Chong; Yap, Philip Lin Kiat
2015-11-01
Advance care planning (ACP) is an important component of holistic care for people with early cognitive impairment (ECI), who are generally still mentally capable of making rational decisions. This study explores the willingness of people with ECI to engage in ACP and how clinical and behavioral variables shape their decisions. Ninety-eight persons with ECI (mild cognitive impairment or early dementia, Clinical Dementia Rating = 0.5/1.0) and assessed to have adequate insight (Clinical Insight Rating Scale) underwent an ACP information session. They were assessed on their knowledge of dementia (Knowledge of Alzheimer Disease) and willingness to engage in ACP, and were interviewed on their beliefs and attitudes toward ACP (Perceived Barriers and Benefits Scale to ACP). Univariate analysis and logistic regression identified factors that predicted persons willing or unwilling to engage in ACP based on their responses to the perceived barriers and benefits of ACP as well as sociodemographic and clinical variables. Fifty-two persons (54%) were keen on ACP. On univariate analysis, persons with better executive function [Frontal Assessment Battery (FAB); t = -3.63, P < .0001], more education (t = -2.11, P = .038), and lower barriers to ACP (t = 2.47, P = .015), particularly less "passive coping" (t = 2.83, P = .006), were more keen on ACP. Logistic regression revealed that the combined predictive model comprising FAB, education, and passive coping reliably differentiated those unwilling to engage in ACP from those who were willing (χ² = 18.995, df = 4, P = .001). The Wald criterion showed that FAB (P = .016) and passive coping (P = .023) contributed most to prediction. Better education and executive function predict willingness to engage in ACP, and these factors are instrumental in a person's ability to acquire knowledge and process information.
Initiating ACP discussions early, increasing efforts at education, and providing tailored information are important interventions that facilitate completion of ACP. Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
Implementation of a 3D mixing layer code on parallel computers
NASA Technical Reports Server (NTRS)
Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.
1995-01-01
This paper summarizes our progress and experience in the development of a Computational-Fluid-Dynamics code on parallel computers to simulate three-dimensional spatially-developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers. The code was then converted for use on parallel computers using the conventional message-passing technique; we have not yet been able to compile the code with the present version of HPF compilers.
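The cited code solves the 3D Euler equations; as a far simpler stand-in for its finite-volume explicit time-marching structure, a first-order upwind update for 1D linear advection (illustrative only, not taken from the NASA code) looks like this:

```python
def advect(u, c, dx, dt, steps):
    """First-order upwind finite-volume update for u_t + c u_x = 0
    with c > 0 and periodic boundaries. Each step is one explicit
    time-marching sweep over the cells."""
    n = len(u)
    lam = c * dt / dx  # CFL number; stability requires lam <= 1
    for _ in range(steps):
        # u[j-1] wraps to the last cell at j = 0 (periodic boundary)
        u = [u[j] - lam * (u[j] - u[j - 1]) for j in range(n)]
    return u

# With lam = 1 the scheme shifts the profile exactly one cell per step
u = advect([1.0, 0.0, 0.0, 0.0], c=1.0, dx=1.0, dt=1.0, steps=1)
```

A real finite-volume Euler solver replaces the scalar upwind flux with a vector flux (mass, momentum, energy) at each cell face, but the march-in-time loop has the same shape.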
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
Insecticidal activity of certain medicinal plants.
Pavela, Roman
2004-12-01
The methanol extracts of eight species of medicinal plants were tested for insecticidal activity in third instar larvae of Egyptian cottonworm (Spodoptera littoralis). All extracts showed a certain degree of larval toxicity. The extracts of Ocimum basilicum, Origanum majorana and Salvia officinalis appeared to be highly toxic. The extracts significantly affected the growth indexes [relative growth rate (RGR), efficiency of conversion of ingested food (ECI), efficiency of conversion of digested food (ECD)].
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... repetitive fluorescent penetrant inspections (FPIs) and eddy current inspections (ECIs) of the high-pressure... engine operation, since the high-pressure module overhaul interval is 48,000 hours or 6,000 cycles when...-90B; and GE90-94B turbofan engines with a high- pressure compressor rotor (HPCR) 8-10 stage spool...
Strengthening United States National Security Through Education in the African American Community
1993-04-01
Howard University, 1974, "Can America Solve Its Biggest Problems?" Focus, February & March 1992: 5-6. Chubb, John E. and... Antoine M., "Educating and Motivating African American Males to Succeed," The Journal of Negro Education, Howard University, Washington DC: Howard University Press... Jerome, "Strategies for Success," The Journal of Negro Education, Howard University, Washington DC: Howard University Press, Volume 59, Number 2,
Cytoprotective effects of atmospheric-pressure plasmas against hypoxia-induced neuronal injuries
NASA Astrophysics Data System (ADS)
Yan, Xu; Meng, Zhaozhong; Ouyang, Jiting; Qiao, Yajun; Li, Jiaxin; Jia, Mei; Yuan, Fang; Ostrikov, Kostya (Ken)
2018-02-01
Atmospheric pressure plasma jet (APPJ) treatment has recently been the focus of cytoprotective research due to the physiological roles of ROS and RNS. In the current study, we investigated the effect of APPJ treatment on hypoxia-induced (1% oxygen) cell injuries. SH-SY5Y cells were treated with APPJ for different durations and incubated in normoxic conditions (20% oxygen) for 5 h, followed by 24 h of hypoxia treatment. Cell viability was evaluated by lactate dehydrogenase (LDH) release and further monitored using the electric cell-substrate impedance sensing (ECIS) system after APPJ treatment. Results showed that APPJ could reduce cell injuries after 24 h of hypoxia, which was consistent with the ECIS results. Furthermore, extracellular NO and H2O2 production was significantly increased with APPJ treatment. It was also interesting to find that APPJ treatment reduced SH-SY5Y cell proliferation in the hypoxic microenvironment during the first 20 h of hypoxia. Although more work is still needed to clarify whether the maintenance of cell viability is related to cell proliferation during hypoxia, our results provide the first evidence of real-time cell viability changes after APPJ treatment under both normoxic and hypoxic conditions, which could support the neuroprotective applications of APPJ.
NASA Astrophysics Data System (ADS)
Schulze, Martin H.; Heuer, Henning
2012-04-01
Carbon fiber based materials are used in many lightweight applications in aeronautical, automotive, machine and civil engineering. With increasing automation in the production process of CFRP laminates, a manual optical inspection of each resin transfer molding (RTM) layer is not practicable. Because they are limited to surface inspection, optical systems cannot observe the quality parameters of multilayer 3-dimensional materials. Imaging Eddy-Current (EC) NDT is the only suitable inspection method for non-resin materials in the textile state that allows inspection of surface and hidden layers in parallel. The HF-ECI method has the capability to measure layer displacements (misaligned angle orientations) and gap sizes in a multilayer carbon fiber structure. The EC technique uses the variation of the electrical conductivity of carbon based materials to obtain material properties. Besides the determination of textural parameters like layer orientation and gap sizes between rovings, the method can also detect foreign polymer particles and fuzzy balls, and visualize undulations. For all of these typical parameters, an imaging classification process chain based on a high-resolution directional EC-imaging device named EddyCus® MPECS and a 2D-FFT with adapted preprocessing algorithms are developed.
Cryogenic lifetime tests on a commercial epoxy resin high voltage bushing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwenterly, S W; Pleva, Ed; Ha, Tam T
2012-06-12
High-temperature superconducting (HTS) power devices operating in liquid nitrogen frequently require high-voltage bushings to carry the current leads from the superconducting windings to the room temperature grid connections. Oak Ridge National Laboratory is collaborating with Waukesha Electric Systems, SuperPower, and Southern California Edison to develop and demonstrate an HTS utility power transformer. Previous dielectric high voltage tests in support of this program have been carried out in test cryostats with commercial epoxy resin bushings from Electro Composites Inc. (ECI). Though the bushings performed well in these short-term tests, their long-term operation at high voltage in liquid nitrogen needs to be verified for use on the utility grid. Long-term tests are being carried out on a sample 28-kV-class ECI bushing. The bushing has a monolithic cast, cycloaliphatic resin body and is fire- and shatter-resistant. The test cryostat is located in an interlocked cage and is energized at 25 kVac around the clock. Liquid nitrogen (LN) is automatically refilled every 9.5 hours. Partial discharge, capacitance, and leakage resistance tests are periodically performed to check for deviations from factory values. At present, over 2400 hours have been accumulated with no changes in these parameters. The tests are scheduled to run for four to six months.
Green, Theresa; Haley, Eryka; Eliasziw, Michael; Hoyte, Keith
2007-01-01
Motivational interviewing and stages of change are approaches to increasing knowledge and effecting behavioural change. This study examined the application of this approach to stroke knowledge acquisition and changing individual lifestyle risk factors in an outpatient clinic. An RCT was conducted in which 200 participants were allocated to an education-counselling interview (ECI) or a control group. ECI group participants mapped their individual risk factors on a stage of change model and received an appointment to the next group lifestyle class. Participants completed a stroke knowledge questionnaire at baseline (T1), post-appointment, and three months (T3) post-appointment. Passive to active changes in lifestyle behaviour were self-reported at three months. There was a statistically significant difference between groups from T1 to T3 in stroke knowledge (p < 0.001). While there was a significant shift from a passive to active stage of change for the overall study sample (p < 0.001), there was no significant difference between groups on the identified risk factors. Although contact with patients in ambulatory clinical settings is limited due to time constraints, it is still possible to improve knowledge and initiate lifestyle changes utilizing motivational interviewing and a stage of change model. Stroke nurses may wish to consider these techniques in their practice setting.
Cryogenic lifetime tests on a commercial epoxy resin high voltage bushing
NASA Astrophysics Data System (ADS)
Schwenterly, S. W.; Pleva, E. F.; Ha, T. T.
2012-06-01
High-temperature superconducting (HTS) power devices operating in liquid nitrogen frequently require high-voltage bushings to carry the current leads from the superconducting windings to the room temperature grid connections. Oak Ridge National Laboratory (ORNL) is collaborating with Waukesha Electric Systems (WES), SuperPower (SP), and Southern California Edison (SCE) to develop and demonstrate an HTS utility power transformer. Previous dielectric high voltage tests in support of this program have been carried out in test cryostats with commercial epoxy resin bushings from Electro Composites Inc. (ECI). Though the bushings performed well in these short-term tests, their long-term operation at high voltage in liquid nitrogen (LN) needs to be verified for use on the utility grid. Long-term tests are being carried out on a sample 28-kV-rms-class ECI bushing. The bushing has a monolithic cast, cycloaliphatic resin body and is fire- and shatter-resistant. The test cryostat is located in an interlocked cage and is continuously energized at 25 kVac rms. LN is automatically refilled every 9.5 hours. Partial discharge, capacitance, and leakage resistance tests are periodically performed to check for deviations from factory values. At present, over 2400 hours have been accumulated with no changes in these parameters. The tests are scheduled to run for four to six months.
Computer Description of Black Hawk Helicopter
1979-06-01
Keywords: combinatorial geometry model; Black Hawk helicopter; GIFT computer code; geometric description of targets. ABSTRACT: ...the description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code, which generates... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents
NASA Astrophysics Data System (ADS)
Iwamoto, C.; Utsunomiya, H.; Tamii, A.; Akimune, H.; Nakada, H.; Shima, T.; Yamagata, T.; Kawabata, T.; Fujita, Y.; Matsubara, H.; Shimbara, Y.; Nagashima, M.; Suzuki, T.; Fujita, H.; Sakuda, M.; Mori, T.; Izumi, T.; Okamoto, A.; Kondo, T.; Bilgier, B.; Kozer, H. C.; Lui, Y.-W.; Hatanaka, K.
2012-06-01
A high-resolution measurement of inelastic proton scattering off 90Zr near 0° was performed at 295 MeV with a focus on a pronounced strength previously reported in the low-energy tail of the giant dipole resonance. A forest of fine structure was observed in the excitation energy region 7-12 MeV. A multipole decomposition analysis of the angular distribution for the forest was carried out using the ECIS95 distorted-wave Born approximation code with the Hartree-Fock plus random-phase approximation model of E1 and M1 transition densities and inclusion of E1 Coulomb excitation. The analysis separated pygmy dipole and M1 resonances in the forest at E(PDR) = 9.15 ± 0.18 MeV with Γ(PDR) = 2.91 ± 0.64 MeV and at E(M1) = 9.53 ± 0.06 MeV with Γ(M1) = 2.70 ± 0.17 MeV in the Lorentzian function, respectively. The B(E1)↑ value for the pygmy dipole resonance over 7-11 MeV is 0.75 ± 0.08 e²fm², which corresponds to 2.1 ± 0.2% of the Thomas-Reiche-Kuhn sum rule.
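The quoted resonance parameters are Lorentzian fits; a small sketch evaluating a unit-peak Lorentzian line shape with the reported PDR centroid and width (overall normalization arbitrary, so only the shape is meaningful):

```python
def lorentzian(e, e0, gamma):
    """Unit-peak Lorentzian line shape with centroid e0 and
    full width at half maximum gamma, both in MeV."""
    half_w = gamma / 2.0
    return half_w ** 2 / ((e - e0) ** 2 + half_w ** 2)

E_PDR, G_PDR = 9.15, 2.91  # centroid and width quoted above (MeV)
peak = lorentzian(E_PDR, E_PDR, G_PDR)               # 1.0 at the centroid
half = lorentzian(E_PDR + G_PDR / 2, E_PDR, G_PDR)   # 0.5 at half width
```

The actual analysis fits two such shapes (PDR and M1) simultaneously to the extracted multipole strength; this snippet only illustrates the functional form.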
User manual for semi-circular compact range reflector code: Version 2
NASA Technical Reports Server (NTRS)
Gupta, Inder J.; Burnside, Walter D.
1987-01-01
A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability as well as being samples of input/output sets.
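The reflector geometry underlying such codes is a paraboloid, whose surface sag follows z = (x² + y²)/4F for focal length F; rays arriving parallel to the axis reflect to the focus. A tiny sketch with hypothetical dimensions (not parameters of the OSU code):

```python
def paraboloid_depth(x, y, focal):
    """Surface sag z of the paraboloid z = (x^2 + y^2) / (4F)
    at transverse point (x, y), for focal length F (same units)."""
    return (x * x + y * y) / (4.0 * focal)

# Rim depth of a hypothetical dish: 10-unit radius, 12-unit focal length
z_rim = paraboloid_depth(10.0, 0.0, 12.0)
z_axis = paraboloid_depth(0.0, 0.0, 12.0)  # vertex sits at z = 0
```

The code described above adds a rolled edge and skirt to this basic surface to reduce diffracted fields in the quiet zone; that shaping is beyond this sketch.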
Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burk, K.W.; Andrews, G.L.
1989-02-01
The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in databases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those databases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of quality assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the database custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.
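The QA codes themselves are not described in detail in the abstract; a minimal sketch of the kind of range check such meteorological QA codes typically perform, with made-up field names and limits, might look like:

```python
def qa_range_check(records, limits):
    """Flag records whose values fall outside plausible limits.
    `records` is a list of dicts; `limits` maps a field name to an
    inclusive (lo, hi) range. Returns (good, flagged) lists, each
    entry a (record, bad_fields) pair, as a QA pass might produce
    before data are archived or transferred."""
    good, flagged = [], []
    for rec in records:
        bad = [f for f, (lo, hi) in limits.items()
               if f in rec and not lo <= rec[f] <= hi]
        (flagged if bad else good).append((rec, bad))
    return good, flagged

# Hypothetical limits and observations, purely for illustration
limits = {"temp_c": (-45.0, 50.0), "wind_mps": (0.0, 60.0)}
good, flagged = qa_range_check(
    [{"temp_c": 21.0, "wind_mps": 4.0},
     {"temp_c": 99.0, "wind_mps": 4.0}],  # implausible temperature
    limits)
```

A production QA suite would add temporal-consistency and cross-sensor checks and write the detailed output files the abstract mentions, but the range check is the usual first gate.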
Eurocan plus report: feasibility study for coordination of national cancer research activities.
2008-01-01
The EUROCAN+PLUS Project, called for by the European Parliament, was launched in October 2005 as a feasibility study for coordination of national cancer research activities in Europe. Over the course of the next two years, the Project process organized over 60 large meetings and countless smaller meetings that gathered in total over a thousand people, the largest Europe-wide consultation ever conducted in the field of cancer research. Despite a strong tradition in biomedical science in Europe, fragmentation and lack of sustainability remain formidable challenges for implementing innovative cancer research and cancer care improvement. There is an enormous duplication of research effort in the Member States, which wastes time, wastes money and severely limits the total intellectual concentration on the wide cancer problem. There is a striking lack of communication between some of the biggest actors on the European scene, and there are palpable tensions between funders and those researchers seeking funds. It is essential to include the patients' voice in the establishment of priority areas in cancer research at the present time. The necessity to have dialogue between funders and scientists to establish the best mechanisms to meet the needs of the entire community is evident. A top priority should be the development of translational research (in its widest form), leading to the development of effective and innovative cancer treatments and preventive strategies. Translational research ranges from bench-to-bedside innovative cancer therapies and extends to include bringing about changes in population behaviours when a risk factor is established. The EUROCAN+PLUS Project recommends the creation of a small, permanent and independent European Cancer Initiative (ECI). This should be a model structure and was widely supported at both General Assemblies of the project.
The ECI should assume responsibility for stimulating innovative cancer research and facilitating processes, becoming the common voice of the cancer research community and serving as an interface between the cancer research community and European citizens, patients' organizations, European institutions, Member States, industry and small and medium enterprises (SMEs), putting into practice solutions aimed at alleviating barriers to collaboration and coordination of cancer research activities in the European Union, and dealing with legal and regulatory issues. The development of an effective ECI will require time, but this entity should be established immediately. As an initial step, coordination efforts should be directed towards the creation of a platform on translational research that could encompass (1) coordination between basic, clinical and epidemiological research; (2) formal agreements of co-operation between comprehensive cancer centres and basic research laboratories throughout Europe and (3) networking between funding bodies at the European level. The European Parliament and its instruments have had a major influence in cancer control in Europe, notably in tobacco control and in the implementation of effective population-based screening. To make further progress there is a need for novelty and innovation in cancer research and prevention in Europe, and having a platform such as the ECI, where those involved in all aspects of cancer research can meet, discuss and interact, is a decisive development for Europe.
User's manual for semi-circular compact range reflector code
NASA Technical Reports Server (NTRS)
Gupta, Inder J.; Burnside, Walter D.
1986-01-01
A computer code was developed to analyze a semi-circular paraboloidal reflector antenna with a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the antenna or its individual components at a given distance from the center of the paraboloid. Thus, it is very effective in computing the size of the sweet spot for RCS or antenna measurement. The operation of the code is described. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability and to serve as sample input/output sets.
Highly fault-tolerant parallel computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spielman, D.A.
We re-introduce the coded model of fault-tolerant computation in which the input and output of a computational device are treated as words in an error-correcting code. A computational device correctly computes a function in the coded model if its input and output, once decoded, are a valid input and output of the function. In the coded model, it is reasonable to hope to simulate all computational devices by devices whose size is greater by a constant factor but which are exponentially reliable even if each of their components can fail with some constant probability. We consider fine-grained parallel computations in which each processor has a constant probability of producing the wrong output at each time step. We show that any parallel computation that runs for time t on w processors can be performed reliably on a faulty machine in the coded model using w log^O(1) w processors and time t log^O(1) w. The failure probability of the computation will be at most t · exp(-w^(1/4)). The codes used to communicate with our fault-tolerant machines are generalized Reed-Solomon codes and can thus be encoded and decoded in O(n log^O(1) n) sequential time and are independent of the machine they are used to communicate with. We also show how coded computation can be used to self-correct many linear functions in parallel with arbitrarily small overhead.
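The coded model described above can be illustrated with a much simpler code than the generalized Reed-Solomon codes the paper uses. The following Python sketch uses a repetition code with majority-vote decoding; the gate, failure probability, and code length are illustrative assumptions, not the paper's construction.

```python
import random

def encode(bit, n=5):
    """Repetition-encode a bit into an n-symbol codeword."""
    return [bit] * n

def decode(word):
    """Majority-vote decode a codeword back to a single bit."""
    return int(sum(word) > len(word) / 2)

def faulty_not(bit, p=0.1):
    """A NOT gate that fails (outputs the wrong value) with probability p."""
    out = 1 - bit
    return 1 - out if random.random() < p else out

def coded_not(word, p=0.1):
    """Apply the faulty gate component-wise. The computation is 'correct in
    the coded model' if decode(output) == NOT(decode(input))."""
    return [faulty_not(b, p) for b in word]

random.seed(0)
trials = 10_000
bare_errs = sum(faulty_not(0) != 1 for _ in range(trials)) / trials
coded_errs = sum(decode(coded_not(encode(0))) != 1 for _ in range(trials)) / trials
print(bare_errs, coded_errs)  # the coded error rate is far lower than the bare rate
```

With a length-5 repetition code, an undetected error needs at least 3 of 5 components to fail, so the coded failure rate drops from roughly p to order p^3, a toy version of the exponential reliability the abstract claims.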
An emulator for minimizing computer resources for finite element analysis
NASA Technical Reports Server (NTRS)
Melosh, R.; Utku, S.; Islam, M.; Salama, M.
1984-01-01
A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
A generalized one-dimensional computer code for turbomachinery cooling passage flow calculations
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Roelke, Richard J.; Meitner, Peter L.
1989-01-01
A generalized one-dimensional computer code for analyzing the flow and heat transfer in the turbomachinery cooling passages was developed. This code is capable of handling rotating cooling passages with turbulators, 180 degree turns, pin fins, finned passages, by-pass flows, tip cap impingement flows, and flow branching. The code is an extension of a one-dimensional code developed by P. Meitner. In the subject code, correlations for both heat transfer coefficient and pressure loss computations were developed to model each of the above-mentioned types of coolant passages. The code has the capability of independently computing the friction factor and heat transfer coefficient on each side of a rectangular passage. Either the mass flow at the inlet to the channel or the exit plane pressure can be specified. For a specified inlet total temperature, inlet total pressure, and exit static pressure, the code computes the flow rates through the main branch and the subbranches, and the flow through the tip cap for impingement cooling, in addition to computing the coolant pressure, temperature, and heat transfer coefficient distribution in each coolant flow branch. Predictions from the subject code for both nonrotating and rotating passages agree well with experimental data. The code was used to analyze the cooling passage of a research cooled radial rotor.
Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †
Murdani, Muhammad Harist; Hong, Bonghee
2018-01-01
In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.
Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee
2018-03-24
In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes ( Ad-Hoc ) and neighborhood proximity ( Top-K ). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.
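The weighted-sum metric and Top-K query described above can be sketched as follows. The weight alpha, the (x, y) centroid coordinates, and the precomputed road-network term are illustrative assumptions, not the paper's actual parameters or data model.

```python
import heapq
import math

def centroid_distance(a, b):
    """Euclidean distance between two ZIP-code centroids given as (x, y)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def combined_distance(a, b, road_term, alpha=0.5):
    """Weighted sum of the centroid distance and a road-network term,
    as in the paper's redefined distance metric (weights are assumed)."""
    return alpha * centroid_distance(a, b) + (1 - alpha) * road_term

def top_k(query, candidates, k=3, alpha=0.5):
    """Top-K neighborhood proximity: candidates maps a ZIP code to
    (centroid, road_term); returns the k nearest ZIP codes."""
    return heapq.nsmallest(
        k, candidates,
        key=lambda z: combined_distance(
            query, candidates[z][0], candidates[z][1], alpha))
```

An Ad-Hoc query is then a single `combined_distance` call between two ZIP codes, while Top-K ranks a candidate set by the same metric.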
Volume accumulator design analysis computer codes
NASA Technical Reports Server (NTRS)
Whitaker, W. D.; Shimazaki, T. T.
1973-01-01
The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAUs under conditions of possible modes of failure which still permit continued system operation.
ERIC Educational Resources Information Center
Bliss, Leonard B.; Sandiford, Janice R.
The study behaviors of Spanish-speaking students at a large two-year public college in the United States were studied using the Inventario de Comportamiento de Estudio (ECI) (L. Bliss, D. Vinay, and F. Koenigner), the Spanish version of the Learning and Study Strategies Inventory (C. Weinstein, 1987). Behaviors of these students were compared with…
"Hour of Code": Can It Change Students' Attitudes toward Programming?
ERIC Educational Resources Information Center
Du, Jie; Wimmer, Hayden; Rada, Roy
2016-01-01
The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…
Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses
ERIC Educational Resources Information Center
Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan
2013-01-01
Given the increasing importance of soft skills in the computing profession, there is good reason to provide students withmore opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…
Guidelines for developing vectorizable computer programs
NASA Technical Reports Server (NTRS)
Miner, E. W.
1982-01-01
Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.
Enhanced fault-tolerant quantum computing in d-level systems.
Campbell, Earl T
2014-12-05
Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Nonuniform code concatenation for universal fault-tolerant quantum computing
NASA Astrophysics Data System (ADS)
Nikahd, Eesa; Sedighi, Mehdi; Saheb Zamani, Morteza
2017-09-01
Using transversal gates is a straightforward and efficient technique for fault-tolerant quantum computing. Since transversal gates alone cannot be computationally universal, they must be combined with other approaches such as magic state distillation, code switching, or code concatenation to achieve universality. In this paper we propose an alternative approach for universal fault-tolerant quantum computing, mainly based on the code concatenation approach proposed in [T. Jochym-O'Connor and R. Laflamme, Phys. Rev. Lett. 112, 010505 (2014), 10.1103/PhysRevLett.112.010505], but in a nonuniform fashion. The proposed approach is described based on nonuniform concatenation of the 7-qubit Steane code with the 15-qubit Reed-Muller code, as well as the 5-qubit code with the 15-qubit Reed-Muller code, which lead to two 49-qubit and 47-qubit codes, respectively. These codes can correct any arbitrary single physical error with the ability to perform a universal set of fault-tolerant gates, without using magic state distillation.
Green's function methods in heavy ion shielding
NASA Technical Reports Server (NTRS)
Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.
1993-01-01
An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.
1982-01-01
A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.
Automated apparatus and method of generating native code for a stitching machine
NASA Technical Reports Server (NTRS)
Miller, Jeffrey L. (Inventor)
2000-01-01
A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
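The generation logic described above (emit a stitch command for each point, and emit a head-condition change whenever a constraint is found between the present and next stitching points) can be sketched in Python. The command names and data layout are hypothetical illustrations, not the patent's actual CNC dialect.

```python
def generate_stitch_code(points, constraints=frozenset()):
    """Generate pseudo-CNC commands for a stitching machine.

    points      -- ordered (x, y) stitching points
    constraints -- set of (prev_point, next_point) segment pairs along which
                   the stitching head's condition (e.g., direction) must change
    """
    code = []
    for i, (x, y) in enumerate(points):
        # if a constraint exists between the present and next stitching
        # points, change the stitching head's condition first
        if i > 0 and (points[i - 1], (x, y)) in constraints:
            code.append("TURN")
        # otherwise (and afterwards) make a stitch at the point
        code.append(f"STITCH X{x:.1f} Y{y:.1f}")
    return code
```

Running this over a point list with one constrained segment yields one `TURN` command inserted between the corresponding `STITCH` moves.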
Computer codes developed and under development at Lewis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1992-01-01
The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru
2010-12-15
The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.
NASA Astrophysics Data System (ADS)
Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.
2010-12-01
The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.
Users manual and modeling improvements for axial turbine design and performance computer code TD2-2
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.
An Object-Oriented Approach to Writing Computational Electromagnetics Codes
NASA Technical Reports Server (NTRS)
Zimmerman, Martin; Mallasch, Paul G.
1996-01-01
Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
Computer Description of the Field Artillery Ammunition Supply Vehicle
1983-04-01
Keywords: Combinatorial Geometry (COM-GEOM); GIFT computer code; computer target description. A Combinatorial Geometry (COM-GEOM) description of the target serves as input to the Geometric Information for Targets (GIFT) computer code, which accepts the COM-GEOM description and generates target vulnerability data.
Global change in the trophic functioning of marine food webs.
Maureaud, Aurore; Gascuel, Didier; Colléter, Mathieu; Palomares, Maria L D; Du Pontavice, Hubert; Pauly, Daniel; Cheung, William W L
2017-01-01
The development of fisheries in the oceans, and other human drivers such as climate warming, have led to changes in species abundance, assemblages, trophic interactions, and ultimately in the functioning of marine food webs. Here, using a trophodynamic approach and global databases of catches and life history traits of marine species, we tested the hypothesis that anthropogenic ecological impacts may have led to changes in the global parameters defining the transfers of biomass within the food web. First, we developed two indicators to assess such changes: the Time Cumulated Indicator (TCI) measuring the residence time of biomass within the food web, and the Efficiency Cumulated Indicator (ECI) quantifying the fraction of secondary production reaching the top of the trophic chain. Then, we assessed, at the large marine ecosystem scale, the worldwide change of these two indicators over the 1950-2010 time period. Global trends were identified and cluster analyses were used to characterize the variability of trends between ecosystems. Results showed that the most common pattern over the study period is a global decrease in TCI, while the ECI indicator tends to increase. Thus, changes in species assemblages would induce faster and apparently more efficient biomass transfers in marine food webs. Results also suggested that the main driver of change over that period had been the large increase in fishing pressure. The largest changes occurred in ecosystems where 'fishing down the marine food web' is most intensive.
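The two indicators defined above can be caricatured in a few lines of Python. This is a toy sketch under stated assumptions: per-trophic-level transfer efficiencies and residence times are given directly, whereas the published indicators are derived from catch and life-history databases and are biomass-weighted.

```python
from math import prod

def eci(transfer_efficiencies):
    """Toy Efficiency Cumulated Indicator: fraction of secondary production
    surviving successive trophic transfers, taken here as the product of
    assumed per-level transfer efficiencies (a simplification)."""
    return prod(transfer_efficiencies)

def tci(residence_times):
    """Toy Time Cumulated Indicator: total residence time of biomass
    accumulated along the trophic chain (assumed per-level times)."""
    return sum(residence_times)

# three trophic transfers at an assumed 10% efficiency each
print(eci([0.1, 0.1, 0.1]))  # fraction reaching the top of the chain
print(tci([0.5, 1.0, 2.0]))  # cumulated residence time in years (assumed)
```

Under this caricature, a shift toward shorter-lived, lower-trophic-level species lowers `tci` and raises `eci`, matching the direction of the global trends the abstract reports.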
Stolwijk, Judith A.; Matrougui, Khalid; Renken, Christian W.; Trebak, Mohamed
2014-01-01
The past 20 years have seen significant growth in using impedance-based assays to understand the molecular underpinning of endothelial and epithelial barrier function in response to physiological agonists, pharmacological and toxicological compounds. Most studies on barrier function use G protein coupled receptor (GPCR) agonists which couple to fast and transient changes in barrier properties. The power of impedance based techniques such as Electric Cell-Substrate Impedance Sensing (ECIS) reside in its ability to detect minute changes in cell layer integrity label-free and in real-time ranging from seconds to days. We provide a comprehensive overview of the biophysical principles, applications and recent developments in impedance-based methodologies. Despite extensive application of impedance analysis in endothelial barrier research little attention has been paid to data analysis and critical experimental variables, which are both essential for signal stability and reproducibility. We describe the rationale behind common ECIS data presentation and interpretation and illustrate practical guidelines to improve signal intensity by adapting technical parameters such as electrode layout, monitoring frequency or parameter (resistance versus impedance magnitude). Moreover, we discuss the impact of experimental parameters, including cell source, liquid handling and agonist preparation on signal intensity and kinetics. Our discussions are supported by experimental data obtained from human microvascular endothelial cells challenged with three GPCR agonists, thrombin, histamine and Sphingosine-1-Phosphate. PMID:25537398
Stolwijk, Judith A; Matrougui, Khalid; Renken, Christian W; Trebak, Mohamed
2015-10-01
The past 20 years has seen significant growth in using impedance-based assays to understand the molecular underpinning of endothelial and epithelial barrier function in response to physiological agonists and pharmacological and toxicological compounds. Most studies on barrier function use G protein-coupled receptor (GPCR) agonists which couple to fast and transient changes in barrier properties. The power of impedance-based techniques such as electric cell-substrate impedance sensing (ECIS) resides in its ability to detect minute changes in cell layer integrity label-free and in real-time ranging from seconds to days. We provide a comprehensive overview of the biophysical principles, applications, and recent developments in impedance-based methodologies. Despite extensive application of impedance analysis in endothelial barrier research, little attention has been paid to data analysis and critical experimental variables, which are both essential for signal stability and reproducibility. We describe the rationale behind common ECIS data presentation and interpretation and illustrate practical guidelines to improve signal intensity by adapting technical parameters such as electrode layout, monitoring frequency, or parameter (resistance versus impedance magnitude). Moreover, we discuss the impact of experimental parameters, including cell source, liquid handling, and agonist preparation on signal intensity and kinetics. Our discussions are supported by experimental data obtained from human microvascular endothelial cells challenged with three GPCR agonists, thrombin, histamine, and sphingosine-1-phosphate.
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2011 CFR
2011-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2012 CFR
2012-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2014 CFR
2014-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2010 CFR
2010-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...
NASA Technical Reports Server (NTRS)
Harper, Warren
1989-01-01
Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The existing codes and certain supplementary software were updated, and the codes were installed on a computer to be delivered to the customer. Capability was provided for graphic display of the data computed by the codes, and the customer was assisted in the solution of specific problems that demonstrate their use. With the exception of one code revision, all of these tasks were performed.
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2013 CFR
2013-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...
Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB
NASA Technical Reports Server (NTRS)
Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.
2017-01-01
Demonstrating speedup for parallel code on a multicore shared memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit potential for improvement of serial code even for the so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed such that commercial finite element (FE) and multi-body-dynamic (MBD) codes attempt to minimize computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches for implementing parallel code were investigated while only the single program multiple data (spmd) method using composite objects provided positive results. Parallel code speedup is demonstrated but the goal of linear speedup through the addition of processors was not achieved due to PC architecture.
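The numerical Jacobian computation discussed above can be sketched as follows (in Python rather than MATLAB; the function and step size are illustrative). Each column requires one independent extra function evaluation, which is exactly what makes the problem embarrassingly parallel: the column loop is the natural place to distribute work across workers, as spmd does in MATLAB.

```python
def jacobian(f, x, h=1e-6):
    """Forward-difference Jacobian of f: R^n -> R^m at point x.

    Column j is (f(x + h*e_j) - f(x)) / h. The columns depend only on x
    and f, so the loop below could be farmed out to parallel workers.
    """
    fx = f(x)
    n, m = len(x), len(fx)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):          # candidate loop to parallelize
        xp = list(x)
        xp[j] += h              # perturb one coordinate
        fp = f(xp)
        for i in range(m):
            J[i][j] = (fp[i] - fx[i]) / h
    return J

# example system: f(x) = [x0^2, x0*x1]; exact Jacobian at (2, 3) is
# [[4, 0], [3, 2]]
f = lambda x: [x[0] ** 2, x[0] * x[1]]
print(jacobian(f, [2.0, 3.0]))
```

The forward-difference truncation error is O(h), so the computed entries agree with the analytic Jacobian to roughly the step size; central differences halve the evaluations' accuracy trade-off at twice the cost per column.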
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.
2004-09-14
This document contains detailed user instructions for the suite of utility codes developed for Rev. 1 of the Systems Assessment Capability, which performs many functions.
Development of a model and computer code to describe solar grade silicon production processes
NASA Technical Reports Server (NTRS)
Gould, R. K.; Srivastava, R.
1979-01-01
Two computer codes were developed for describing flow reactors in which high purity, solar grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric, marching code which treats two phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.
Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO
NASA Technical Reports Server (NTRS)
Stallworth, R.; Meyers, C. A.; Stinson, H. C.
1989-01-01
Results are presented from the comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two computer codes gave compatible, conservative results when the part-through-crack analysis solutions were evaluated against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.
Computational Predictions of the Performance of Wright 'Bent End' Propellers
NASA Technical Reports Server (NTRS)
Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)
2002-01-01
Computational analyses of two 1911 Wright brothers 'Bent End' wooden propeller reproductions have been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a project on the performance of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG Navier-Stokes code is a CFD (Computational Fluid Dynamics) code based on the Navier-Stokes equations. It is mainly used to compute the lift coefficient and the drag coefficient at specified angles of attack at different radii. Those calculated data are intermediate results of the computation and part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.
Proceduracy: Computer Code Writing in the Continuum of Literacy
ERIC Educational Resources Information Center
Vee, Annette
2010-01-01
This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…
Computer Code Aids Design Of Wings
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1993-01-01
AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
APC: A New Code for Atmospheric Polarization Computations
NASA Technical Reports Server (NTRS)
Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.
2014-01-01
A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. This includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
The Light Infantry Company and Tactical Mobility: A Step in Which Direction?
1997-12-18
Official Guard and Reserve Manpower Strengths and Statistics. FY 2004
2004-01-01
Comments concerned with policy issues of use of these reports for Reserve Component studies should be addressed to the Office of the Assistant Secretary…
The Role of AR- and VDR-Modulated miRNAs in Sensitization of Prostate Cancer Cells to Therapy
2012-10-01
detected elevated levels of the AR-V7 variant in VCaP xenografts after castration (109). Prostate epithelial markers, cytokeratins 8 and 18, are expressed… 6) Gaupel AC, Wang WLW, Mordan-McCombs S, Lee ECY & Tenniswood M. Xenograft, transgenic and knockout models of prostate cancer. In: Conn
The Addition of Enhanced Capabilities to NATO GMTIF STANAG 4607 to Support RADARSAT-2 GMTI Data
2007-12-01
However, the cost is a loss in the accuracy of the position specification and its dependence on the particular ellipsoid and/or geoid models used in… platform provides these parameters. Table B-3, Reference Coordinate Systems, maps coordinate systems to code values: Unidentified = 0; GEI (Geocentric Equatorial Inertial, also known as True Equator and True Equinox of Date, True of Date (TOD), ECI, or GCI) = 1; J2000 (Geocentric Equatorial Inertial for epoch J2000.0) = …
1977-02-01
usually with a ceiling height well in excess of head height. Formations (speleothems) include flowstone, stalagmites, stalactites, helictites, columns, onyx… practices. It is axiomatic that endangered species require some form of management, as, by definition, most of them would become extinct otherwise. The
1983-12-01
observations of gray whales from the waters inside of Washington, including the eastern Strait of Juan de Fuca, the San Juan Islands, Puget Sound, and Hood… waters in winter. In the North Pacific this species is presently estimated to number about 17,000 animals. One fin whale was pursued in Puget Sound i… owns submerged lands from tideland elevation -4.5 feet MLLW to deep water in Puget Sound. The Lummi Tribe (local sponsor) owns Reservation lands above
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanhoy, J. R.; Ramirez, A. P.; Alcorn-Dominguez, D. K.
Neutron inelastic scattering cross sections measured directly through (n,n') or deduced from γ-ray production cross sections following inelastic neutron scattering (n,n'γ) are a focus of basic and applied research at the University of Kentucky Accelerator Laboratory (www.pa.uky.edu/accelerator). For nuclear data applications, angle-integrated cross sections are desired over a wide range of fast neutron energies. Several days of experimental beam time are required for a data set at each incident neutron energy, which limits the number of angular distributions that can be measured in a reasonable amount of time. Approximations can be employed to generate cross sections with a higher energy resolution, since at 125° the a2P2 term of the Legendre expansion is identically zero and the a4P4 term is assumed to be very small. Provided this assumption is true, a single measurement at 125° would produce the γ-ray production cross section. This project tests these assumptions and energy dependences using the codes CINDY/SCAT and TALYS/ECIS06/SCAT. It is found that care must be taken when interpreting γ-ray excitation functions as cross sections when the incident neutron energy is <1000 keV above threshold or before the onset of feeding.
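The 125° approximation rests on expanding the angular distribution as W(θ) = a0[1 + a2P2(cosθ) + a4P4(cosθ)] and on the fact that P2 vanishes near θ = 125.26°, so the angle-integrated yield is approximately 4πW(125°) whenever a4 is small. A quick numerical check (hypothetical expansion coefficients, not experimental data):

```python
import numpy as np
from numpy.polynomial.legendre import legval

# P2 vanishes where cos^2(theta) = 1/3, i.e. near 54.74 and 125.26 deg.
theta = np.radians(125.26)
P2 = legval(np.cos(theta), [0, 0, 1])        # coefficient vector selects P2
P4 = legval(np.cos(theta), [0, 0, 0, 0, 1])  # ... and P4

# Illustrative distribution W(theta) = a0*(1 + a2*P2 + a4*P4);
# the exact angle-integrated yield is 4*pi*a0.
a0, a2, a4 = 10.0, 0.5, 0.02
W = a0 * (1.0 + a2 * P2 + a4 * P4)
estimate = 4.0 * np.pi * W      # single-angle estimate of the integral
exact = 4.0 * np.pi * a0
```

With |a4| at the few-percent level, the single-angle estimate agrees with the exact integral to better than 1%, which is the premise the project tests.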
Vanhoy, J. R.; Ramirez, A. P.; Alcorn-Dominguez, D. K.; ...
2017-09-13
NASA Astrophysics Data System (ADS)
Vanhoy, J. R.; Ramirez, A. P.; Alcorn-Dominguez, D. K.; Hicks, S. F.; Peters, E. E.; McEllistrem, M. T.; Mukhopadhyay, S.; Yates, S. W.
2017-09-01
Neutron inelastic scattering cross sections measured directly through (n,n') or deduced from γ-ray production cross sections following inelastic neutron scattering (n,n'γ) are a focus of basic and applied research at the University of Kentucky Accelerator Laboratory (www.pa.uky.edu/accelerator)…
NASA Technical Reports Server (NTRS)
Norment, H. G.
1980-01-01
Calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Any subsonic, external, non-lifting flow can be accommodated; flow into, but not through, inlets also can be simulated. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Code descriptions include operating instructions, card inputs and printouts for example problems, and listing of the FORTRAN codes. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
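As a sketch of what such a trajectory code does (an illustrative stand-in for the FORTRAN codes, not a reproduction of them), the fragment below integrates a drop's equation of motion with an assumed empirical drag relation (the Schiller-Naumann correction to Stokes drag) and gravity settling:

```python
import numpy as np

def drop_trajectory(x0, v0, air_vel, diam, dt=1e-3, steps=1000):
    """Explicit-Euler integration of a water drop's equation of motion
    with empirical drag and gravity settling (illustrative sketch,
    SI units; 2D with y vertical)."""
    rho_w, rho_a, mu, g = 1000.0, 1.225, 1.81e-5, 9.81
    m = rho_w * np.pi * diam**3 / 6.0            # drop mass
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(steps):
        vrel = air_vel(x) - v                    # air velocity relative to drop
        re = rho_a * np.linalg.norm(vrel) * diam / mu
        corr = 1.0 + 0.15 * re**0.687            # Schiller-Naumann drag correction
        drag = 3.0 * np.pi * mu * diam * corr * vrel
        v = v + (drag / m + np.array([0.0, -g])) * dt
        x = x + v * dt
        path.append(x.copy())
    return np.array(path)
```

Released from rest in still air, a 1 mm drop settles vertically and approaches a terminal speed of roughly 4 m/s within the first second of the integration.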
Utilizing GPUs to Accelerate Turbomachinery CFD Codes
NASA Technical Reports Server (NTRS)
MacCalla, Weylin; Kulkarni, Sameer
2016-01-01
GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.
PASCO: Structural panel analysis and sizing code: Users manual - Revised
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.
1981-01-01
A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.
Computation of Reacting Flows in Combustion Processes
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Chen, Kuo-Huey
1997-01-01
The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D. The ALLSPD-3D computer program is developed for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. Also, it is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs the preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code is written in such a way that it can be run on various computer platforms (supercomputers, workstations, and parallel processors), and the GUI (Graphical User Interface) should provide a user-friendly tool for setting up and running the code.
NASA Rotor 37 CFD Code Validation: Glenn-HT Code
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2010-01-01
In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.
Final report for the Tera Computer TTI CRADA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, G.S.; Pavlakos, C.; Silva, C.
1997-01-01
Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.
Operations analysis (study 2.1). Program listing for the LOVES computer code
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1974-01-01
A listing of the LOVES computer program is presented. The program is coded partially in SIMSCRIPT and FORTRAN. This version of LOVES is compatible with both the CDC 7600 and the UNIVAC 1108 computers. The code has been compiled, loaded, and executed successfully on the EXEC 8 system for the UNIVAC 1108.
ERIC Educational Resources Information Center
Knowlton, Marie; Wetzel, Robin
2006-01-01
This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…
A MATLAB based 3D modeling and inversion code for MT data
NASA Astrophysics Data System (ADS)
Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.
2017-07-01
The development of a MATLAB-based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: a grid generator code and a modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs the core computations in modular form: forward modeling, data functionals, sensitivity computations, and regularization. These modules can be readily extended to other similar inverse problems such as Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities, the results of inversion for two complex models are presented.
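The modular split described here (forward modeling, sensitivities, regularization) can be sketched as a single regularized Gauss-Newton update. The fragment below is a generic illustration under a Tikhonov penalty, not the AP3DMT implementation:

```python
import numpy as np

def gauss_newton_step(forward, jacobian, m, d_obs, lam):
    """One Tikhonov-regularized Gauss-Newton model update:
    solve (J^T J + lam*I) dm = J^T (d_obs - forward(m)).
    'forward' and 'jacobian' are the pluggable modules."""
    r = d_obs - forward(m)                 # data residual
    J = jacobian(m)                        # sensitivity matrix
    A = J.T @ J + lam * np.eye(m.size)     # regularized normal equations
    return m + np.linalg.solve(A, J.T @ r)
```

For a linear forward operator a single step with small damping recovers the true model; for MT the forward map is nonlinear, so the step is iterated with updated sensitivities.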
Applications of automatic differentiation in computational fluid dynamics
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.
1994-01-01
Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or in sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
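The chain-rule bookkeeping that ADIFOR generates as new FORTRAN source can be mimicked at run time with dual numbers. The minimal Python sketch below (the function f is a hypothetical stand-in for a code's computation) shows the same principle: every arithmetic operation propagates a derivative alongside the value.

```python
class Dual:
    """Forward-mode AD value: carries (value, derivative) and applies
    the chain rule through each arithmetic operation."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, other):
        o = self._lift(other)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

y = f(Dual(2.0, 1.0))              # seed dx/dx = 1
# y.val is f(2) = 17; y.dot is the exact derivative f'(2) = 14
```

Unlike centered divided differences, the derivative is exact to machine precision and costs only a small constant factor over the original evaluation, which mirrors the timing advantage reported above.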
NASA Astrophysics Data System (ADS)
Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.
2016-02-01
The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and the phenomena are singled out that require a detailed analysis and development of the models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models to describe the processes that take place during the steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as possibilities of taking advantage of the modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability.
It is shown that the program of development and practical application of the code will allow carrying out in the nearest future the computations to analyze the safety of potential NPP projects at a qualitatively higher level.
Performance assessment of KORAT-3D on the ANL IBM-SP computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.
1999-09-01
The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Thomas; Hamilton, Steven; Slattery, Stuart
Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base, and it is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production-code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.
Fast H.264/AVC FRExt intra coding using belief propagation.
Milani, Simone
2011-01-01
In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly surpasses that of previous still-image coding standards, like JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase in the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity-reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60% of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits accurate control of the computational complexity, unlike other methods where the computational complexity depends upon the coded sequence.
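The selection idea can be sketched as follows: rank the prediction modes by estimated probability, then run the expensive rate-distortion comparison only over the top few. In the sketch below the mode probabilities and RD tables are hypothetical placeholders; in the paper the probabilities come from belief propagation:

```python
def rd_cost(mode, lam=10.0):
    # hypothetical distortion/rate tables for 9 intra prediction modes
    dist = (mode * 7) % 23 + 1
    rate = (mode * 3) % 5 + 1
    return dist + lam * rate     # Lagrangian RD cost D + lambda*R

def best_mode(probs, keep=3):
    """Evaluate the full RD cost only for the 'keep' most probable
    modes, trading a little compression for much less computation."""
    ranked = sorted(range(len(probs)), key=lambda m: -probs[m])
    return min(ranked[:keep], key=rd_cost)
```

Here `keep` controls the complexity/quality trade-off directly, which is one way to obtain the accurate control of computational complexity the abstract refers to.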
2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries
ERIC Educational Resources Information Center
Colby, Jennifer
2015-01-01
This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.
1991-01-01
Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.
User's Manual for FEMOM3DR. Version 1.0
NASA Technical Reports Server (NTRS)
Reddy, C. J.
1998-01-01
FEMOM3DR is a computer code written in FORTRAN 77 to compute radiation characteristics of antennas on a 3D body using the combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code is written to handle different feeding structures such as coaxial line, rectangular waveguide, and circular waveguide. This code uses tetrahedral elements with vector edge basis functions for FEM, and triangular elements with roof-top basis functions for MoM. By virtue of FEM, this code can handle arbitrarily shaped three-dimensional bodies with inhomogeneous lossy materials, and due to MoM the computational domain can be terminated in any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.
Selection of a computer code for Hanford low-level waste engineered-system performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGrail, B.P.; Mahoney, L.A.
Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.
User's manual for a material transport code on the Octopus Computer Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naymik, T.G.; Mendez, G.D.
1978-09-15
A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.
EMPIRE: Nuclear Reaction Model Code System for Data Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman, M.; Capote, R.; Carlson, B.V.
EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy ions) or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (~keV) and goes up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation-dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one, by a pre-equilibrium exciton model with cluster emission (PCROSS), or by another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full-featured Hauser-Feshbach model with γ-cascade and width fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach, and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia, and γ-ray strength functions.
The results can be converted into ENDF-6 formatted files using the accompanying code EMPEND and completed with neutron resonances extracted from the existing evaluations. The package contains the full EXFOR (CSISRS) library of experimental reaction data that are automatically retrieved during the calculations. Publication-quality graphs can be obtained using the powerful and flexible plotting package ZVView. The graphic user interface, written in Tcl/Tk, provides for easy operation of the system. This paper describes the capabilities of the code, outlines physical models, and indicates parameter libraries used by EMPIRE to predict reaction cross sections and spectra, mainly for nucleon-induced reactions. Selected applications of EMPIRE are discussed, the most important being an extensive use of the code in evaluations of neutron reactions for the new US library ENDF/B-VII.0. Future extensions of the system are outlined, including a neutron resonance module as well as capabilities of generating covariances, using both KALMAN and Monte-Carlo methods, that are still being advanced and refined.
NASA Technical Reports Server (NTRS)
Logan, Terry G.
1994-01-01
The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray-YMP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes, along with those on a Cray-YMP with a single processor. The comparison indicates that the parallel CM-FORTRAN code matches or outperforms the equivalent serial FORTRAN code for some cases.
Computer Description of the M561 Utility Truck
1984-10-01
Target vulnerability analysis requires input from the Geometric Information for Targets (GIFT) computer code, which generates target vulnerability data from geometric target descriptions. This report documents the combinatorial geometry (Com-Geom) description of the M561 utility truck used as input to the GIFT computer code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eyler, L L; Trent, D S; Budden, M J
During the course of the TEMPEST computer code development, a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor applications. 47 refs., 94 figs., 6 tabs.
Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August
2018-07-01
Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
Adiabatic topological quantum computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesare, Chris; Landahl, Andrew J.; Bacon, Dave
Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.
Fast Computation of the Two-Point Correlation Function in the Age of Big Data
NASA Astrophysics Data System (ADS)
Pellegrino, Andrew; Timlin, John
2018-01-01
We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to other clustering codes and accuracy consistent with known analytic results.
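The pair-count approach this abstract describes can be sketched with the standard Landy-Szalay estimator, xi = (DD - 2DR + RR)/RR, over normalized pair counts in radial bins. This is a generic, brute-force illustration of the technique, not the authors' parallel implementation; all names and bin choices here are invented for the example.

```python
import numpy as np

def pair_counts(a, b, edges):
    """Brute-force count of point pairs whose separation falls in each radial bin."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.histogram(d.ravel(), bins=edges)[0]

def landy_szalay(data, rand, edges):
    """Two-point correlation function via the Landy-Szalay estimator:
    xi = (DD - 2*DR + RR) / RR, with each count normalized by its pair total."""
    nd, nr = len(data), len(rand)
    dd = pair_counts(data, data, edges) / (nd * (nd - 1))  # self-pairs fall below the first edge
    rr = pair_counts(rand, rand, edges) / (nr * (nr - 1))
    dr = pair_counts(data, rand, edges) / (nd * nr)
    return (dd - 2 * dr + rr) / rr

rng = np.random.default_rng(0)
data = rng.random((200, 3))    # unclustered mock "data" in a unit cube
rand = rng.random((1000, 3))   # random catalog tracing the same volume
edges = np.linspace(0.05, 0.5, 10)
xi = landy_szalay(data, rand, edges)
print(xi)  # fluctuates around zero for unclustered data
```

For genuinely clustered data the same call returns positive xi on small scales; the jackknife errors mentioned in the abstract would come from repeating this with one sky subsample deleted at a time.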
Design of convolutional tornado code
NASA Astrophysics Data System (ADS)
Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu
2017-09-01
As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environments, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which improves burst-erasure protection capability by applying the convolution property to the tTN code, and reduces computational complexity by abrogating the multi-level structure. Simulation results show that the cTN code provides better packet-loss protection with lower computational complexity than the tTN code.
Three-dimensional turbopump flowfield analysis
NASA Technical Reports Server (NTRS)
Sharma, O. P.; Belford, K. A.; Ni, R. H.
1992-01-01
A program was conducted to develop a flow prediction method applicable to rocket turbopumps. The complex nature of a flowfield in turbopumps is described and examples of flowfields are discussed to illustrate that physics based models and analytical calculation procedures based on computational fluid dynamics (CFD) are needed to develop reliable design procedures for turbopumps. A CFD code developed at NASA ARC was used as the base code. The turbulence model and boundary conditions in the base code were modified, respectively, to: (1) compute transitional flows and account for extra rates of strain, e.g., rotation; and (2) compute surface heat transfer coefficients and allow computation through multistage turbomachines. Benchmark quality data from two and three-dimensional cascades were used to verify the code. The predictive capabilities of the present CFD code were demonstrated by computing the flow through a radial impeller and a multistage axial flow turbine. Results of the program indicate that the present code operated in a two-dimensional mode is a cost effective alternative to full three-dimensional calculations, and that it permits realistic predictions of unsteady loadings and losses for multistage machines.
Brennan, Linda M.; Widder, Mark W.; McAleer, Michael K.; Mayo, Michael W.; Greis, Alex P.; van der Schalie, William H.
2016-01-01
This manuscript describes how to prepare fluidic biochips with Rainbow trout gill epithelial (RTgill-W1) cells for use in a field portable water toxicity sensor. A monolayer of RTgill-W1 cells forms on the sensing electrodes enclosed within the biochips. The biochips are then used for testing in a field portable electric cell-substrate impedance sensing (ECIS) device designed for rapid toxicity testing of drinking water. The manuscript further describes how to run a toxicity test using the prepared biochips. A control water sample and the test water sample are mixed with pre-measured powdered media and injected into separate channels of the biochip. Impedance readings from the sensing electrodes in each of the biochip channels are measured and compared by an automated statistical software program. The screen on the ECIS instrument will indicate either "Contamination Detected" or "No Contamination Detected" within an hour of sample injection. Advantages are ease of use and rapid response to a broad spectrum of inorganic and organic chemicals at concentrations that are relevant to human health concerns, as well as the long-term stability of stored biochips in a ready state for testing. Limitations are the requirement for cold storage of the biochips and limited sensitivity to cholinesterase-inhibiting pesticides. Applications for this toxicity detector are for rapid field-portable testing of drinking water supplies by Army Preventative Medicine personnel or for use at municipal water treatment facilities. PMID:27023147
Lee, Sangmi; Ward, Todd J.; Graves, Lewis M.; Tarr, Cheryl L.; Siletzky, Robin M.; Kathariou, Sophia
2014-01-01
Listeria monocytogenes can cause severe food-borne disease (listeriosis). Numerous outbreaks have involved three serotype 4b epidemic clones (ECs): ECI, ECII, and ECIa. However, little is known about the population structure of L. monocytogenes serotype 4b from sporadic listeriosis in the United States, even though most cases of human listeriosis are in fact sporadic. Here we analyzed 136 serotype 4b isolates from sporadic cases in the United States, 2003 to 2008, utilizing multiple tools including multilocus genotyping, pulsed-field gel electrophoresis, and sequence analysis of the inlAB locus. ECI, ECII, and ECIa were frequently encountered (32, 17, and 7%, respectively). However, annually 30 to 68% of isolates were outside these ECs, and several novel clonal groups were identified. An estimated 33 and 17% of the isolates, mostly among the ECs, were resistant to cadmium and arsenic, respectively, but resistance to benzalkonium chloride was uncommon (3%) among the sporadic isolates. The frequency of clonal groups fluctuated within the 6-year study period, without consistent trends. However, on several occasions, temporal clusters of isolates with indistinguishable genotypes were detected, suggesting the possibility of hidden multistate outbreaks. Our analysis suggests a complex population structure of serotype 4b L. monocytogenes from sporadic disease, with important contributions by ECs and several novel clonal groups. Continuous monitoring will be needed to assess long-term trends in clonality patterns and population structure of L. monocytogenes from sporadic listeriosis. PMID:24705322
Global change in the trophic functioning of marine food webs
Gascuel, Didier; Colléter, Mathieu; Palomares, Maria L. D.; Du Pontavice, Hubert; Pauly, Daniel; Cheung, William W. L.
2017-01-01
The development of fisheries in the oceans, and other human drivers such as climate warming, have led to changes in species abundance, assemblages, trophic interactions, and ultimately in the functioning of marine food webs. Here, using a trophodynamic approach and global databases of catches and life history traits of marine species, we tested the hypothesis that anthropogenic ecological impacts may have led to changes in the global parameters defining the transfers of biomass within the food web. First, we developed two indicators to assess such changes: the Time Cumulated Indicator (TCI), measuring the residence time of biomass within the food web, and the Efficiency Cumulated Indicator (ECI), quantifying the fraction of secondary production reaching the top of the trophic chain. Then, we assessed, at the large marine ecosystem scale, the worldwide change of these two indicators over the 1950–2010 period. Global trends were identified and cluster analyses were used to characterize the variability of trends between ecosystems. Results showed that the most common pattern over the study period is a global decrease in TCI, while the ECI indicator tends to increase. Thus, changes in species assemblages would induce faster and apparently more efficient biomass transfers in marine food webs. Results also suggested that the main driver of change over that period had been the large increase in fishing pressure. The largest changes occurred in ecosystems where 'fishing down the marine food web' is most intensive. PMID:28800358
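The intuition behind an efficiency indicator like the ECI can be illustrated with the usual trophodynamic assumption that only a fraction of production (the transfer efficiency) passes from each trophic level to the next, so the fraction reaching the top of the chain is a product of per-step efficiencies. This is a simplified sketch of the idea, not the authors' exact formulation; the 10% figure is the textbook rule-of-thumb value, used here only for illustration.

```python
from math import prod

def cumulative_efficiency(transfer_efficiencies):
    """Fraction of low-trophic-level production reaching the top of the chain,
    as the product of the per-step transfer efficiencies along the chain."""
    return prod(transfer_efficiencies)

# Three trophic steps at ~10% transfer each: about 0.1% of production
# survives to the top of the chain.
print(cumulative_efficiency([0.10, 0.10, 0.10]))
```

Under this picture, a rise in the indicator corresponds to shorter or more efficient transfer chains, consistent with the abstract's interpretation of faster, more efficient biomass transfers.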
Vanderhasselt, Marie-Anne; De Raedt, Rudi; Namur, Victoria; Valiengo, Leandro C L; Lotufo, Paulo A; Bensenor, Isabela M; Baeken, Chris; Boggio, Paulo S; Brunoni, Andre R
2016-01-15
Emotional Context Insensitivity (ECI) is a psychological feature observed in depressed patients, characterized by decreased emotional reactivity when presented with positive- and negative valence-loaded stimuli. Given that fronto-cingulate-limbic circuits are implicated in abnormal reactivity to valence-loaded stimuli, neurocognitive treatments engaging the prefrontal cortex may be able to modulate this emotional blunting observed in major depressive disorder (MDD). Therefore, our goal was to evaluate emotional reactivity in depressed patients before and after a combination of neurocognitive interventions that engage the prefrontal cortex (cognitive control training and/or transcranial direct current stimulation). In line with the premises of the ECI framework, before the start of the antidepressant intervention, patients showed blunted emotional reactivity after exposure to negative valence-loaded stimuli. This emotional reactivity pattern changed after 9 sessions of the intervention: positive affect decreased and negative affect increased after watching a series of negative valence-loaded stimuli (i.e., images). Interestingly, higher emotional reactivity (as indexed by a larger increase in negative affect after watching the valence-loaded stimuli) at baseline predicted reductions in depression symptoms after the intervention. On the other hand, higher emotional reactivity (as indexed by a decrease in positive affect) after the intervention was marginally associated with reductions in depression symptoms. To conclude, emotional reactivity increased after the neurocognitive antidepressant intervention and was directly associated with the degree of depression improvement. Copyright © 2015 Elsevier B.V. All rights reserved.
Hasheminia, Seyedeh M; Sendi, Jalal J; Jahromi, Khalil T; Moharramipour, Saeid
2013-01-01
The methanolic extract of milk thistle, Silybum marianum L. (Asterales: Asteraceae), was investigated for its effects on the mortality, growth, feeding indices, enzymatic activity, and levels of non-enzymatic molecules of the small white butterfly, Pieris rapae L. (Lepidoptera: Pieridae), a pest of cruciferous plants. Feeding indices including approximate digestibility (AD), efficiency of conversion of digested food (ECD), efficiency of conversion of ingested food (ECI), relative growth rate (RGR), and relative consumption rate (RCR) were measured. These indices were variously affected: the RGR, RCR, and AD decreased, but the ECD and ECI increased. The LC50 and LC25 values were estimated as 2.94% and 1.20%, respectively. At the lowest concentration of S. marianum extract (0.625%), the feeding deterrence index was 40.48%. The duration of the pupal stage and the rate of larval growth decreased. These changes may be due to alterations in metabolic activity, such as the increase in alkaline phosphatase activity, which is likely involved in detoxification. Additionally, the activities of alanine aminotransferase and aspartate aminotransferase, which are key components of amino acid catabolism, decreased. The amount of glucose (an energy source) and uric acid (the excreted end product) increased, while total protein (another energy source) and cholesterol decreased. These results indicate that this plant possesses potential secondary metabolites that may be useful for the future study of the control of insect pests.
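The feeding indices listed above follow the conventional Waldbauer definitions from the insect-nutrition literature, which can be made concrete in a short sketch. The formulas below are those standard definitions (on dry weights), not taken from this paper, and the numeric inputs are invented for illustration:

```python
def feeding_indices(ingested, faeces, gain, duration, mean_weight):
    """Standard Waldbauer nutritional indices (dry weights).
    ingested: food ingested (I); faeces: faeces produced (F);
    gain: larval weight gain (G); duration: feeding period (days);
    mean_weight: mean larval weight over the period."""
    AD  = 100 * (ingested - faeces) / ingested   # approximate digestibility, %
    ECD = 100 * gain / (ingested - faeces)       # efficiency of conversion of digested food, %
    ECI = 100 * gain / ingested                  # efficiency of conversion of ingested food, %
    RGR = gain / (mean_weight * duration)        # relative growth rate, mg/mg/day
    RCR = ingested / (mean_weight * duration)    # relative consumption rate, mg/mg/day
    return dict(AD=AD, ECD=ECD, ECI=ECI, RGR=RGR, RCR=RCR)

# Hypothetical larva: 50 mg ingested, 20 mg faeces, 10 mg gained over 5 days
idx = feeding_indices(ingested=50.0, faeces=20.0, gain=10.0,
                      duration=5.0, mean_weight=2.0)
print(idx)
```

Note that ECI = AD x ECD / 100 by construction, which is why a treatment can lower digestibility (AD) while still raising ECI if conversion of digested food (ECD) rises enough, as reported here.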
NASA Technical Reports Server (NTRS)
Smith, S. D.
1984-01-01
A user's manual for the RAMP2 computer code is provided. The RAMP2 code can be used to model the dominant phenomena which affect the prediction of liquid and solid rocket nozzle and orbital plume flow fields. The general structure and operation of RAMP2 are discussed. A user input/output guide for the modified TRAN72 computer code and the RAMP2F code is given. The application and use of the BLIMPJ module are considered. Sample problems involving the space shuttle main engine and motor are included.
NASA Technical Reports Server (NTRS)
Chan, William M.
1995-01-01
Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.
NASA Technical Reports Server (NTRS)
Wigton, Larry
1996-01-01
Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.
ISSYS: An integrated synergistic Synthesis System
NASA Technical Reports Server (NTRS)
Dovi, A. R.
1980-01-01
Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the major functions and rules of ISSYS, and the computer codes currently available in the system are described. Computational sequences frequently used in aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.
User's manual for a two-dimensional, ground-water flow code on the Octopus computer network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naymik, T.G.
1978-08-30
A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.
Interactive Synthesis of Code Level Security Rules
2017-04-01
Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.
NASA Technical Reports Server (NTRS)
1986-01-01
AGDISP, a computer code written for Langley by Continuum Dynamics, Inc., aids crop dusting airplanes in targeting pesticides. The code is commercially available and can be run on a personal computer by an inexperienced operator. Called SWA+H, it is used by the Forest Service, FAA, DuPont, etc. DuPont uses the code to "test" equipment on the computer using a laser system to measure particle characteristics of various spray compounds.
The adaption and use of research codes for performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebetrau, A.M.
1987-05-01
Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.
Topological color codes on Union Jack lattices: a stable implementation of the whole Clifford group
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katzgraber, Helmut G. (Theoretische Physik, ETH Zurich, CH-8093 Zurich); Bombin, H.
We study the error threshold of topological color codes on Union Jack lattices that allow for the full implementation of the whole Clifford group of quantum gates. After mapping the error-correction process onto a statistical mechanical random three-body Ising model on a Union Jack lattice, we compute its phase diagram in the temperature-disorder plane using Monte Carlo simulations. Surprisingly, topological color codes on Union Jack lattices have a similar error stability to color codes on triangular lattices, as well as to the Kitaev toric code. The enhanced computational capabilities of the topological color codes on Union Jack lattices with respect to triangular lattices and the toric code combined with the inherent robustness of this implementation show good prospects for future stable quantum computer implementations.
Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter
2007-08-31
List of figures (excerpt): … (latitude) for 3 different grid spacings. 8. Low-altitude fields produced by a 10-kHz source computed using the FD and TD codes; the agreement is excellent, validating the new FD code. 9. High-altitude fields produced by a 10-kHz source computed using the FD and TD codes; the agreement is again excellent. 10. Low-altitude fields produced by a 20-kHz source computed using the FD and TD codes. 11. High-altitude fields produced …
Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation
NASA Technical Reports Server (NTRS)
Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.
2000-01-01
An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.
NASA Technical Reports Server (NTRS)
Hartenstein, Richard G., Jr.
1985-01-01
Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and a traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.
NASA Astrophysics Data System (ADS)
Wei, Xiaohui; Li, Weishan; Tian, Hailong; Li, Hongliang; Xu, Haixiao; Xu, Tianfu
2015-07-01
The numerical simulation of multiphase flow and reactive transport in the porous media on complex subsurface problem is a computationally intensive application. To meet the increasingly computational requirements, this paper presents a parallel computing method and architecture. Derived from TOUGHREACT that is a well-established code for simulating subsurface multi-phase flow and reactive transport problems, we developed a high performance computing THC-MP based on massive parallel computer, which extends greatly on the computational capability for the original code. The domain decomposition method was applied to the coupled numerical computing procedure in the THC-MP. We designed the distributed data structure, implemented the data initialization and exchange between the computing nodes and the core solving module using the hybrid parallel iterative and direct solver. Numerical accuracy of the THC-MP was verified through a CO2 injection-induced reactive transport problem by comparing the results obtained from the parallel computing and sequential computing (original code). Execution efficiency and code scalability were examined through field scale carbon sequestration applications on the multicore cluster. The results demonstrate successfully the enhanced performance using the THC-MP on parallel computing facilities.
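The domain-decomposition step described above can be illustrated with a toy sketch (not THC-MP's actual data structures): a 1-D grid is split into contiguous subdomains, and each task updates its owned cells while reading one-cell "halo" values from its neighbors. Emulated serially, the decomposed update must reproduce the sequential one:

```python
def jacobi_step(u):
    # One sequential Jacobi smoothing step; the two end values are fixed.
    return [u[0]] + [(u[i - 1] + u[i + 1]) / 2.0
                     for i in range(1, len(u) - 1)] + [u[-1]]

def decompose(u, parts):
    # Split the grid into contiguous subdomains (owned cells only).
    n = len(u)
    bounds = [round(i * n / parts) for i in range(parts + 1)]
    return [(bounds[i], u[bounds[i]:bounds[i + 1]]) for i in range(parts)]

def parallel_jacobi_step(u, parts):
    """Emulate one distributed step: each task updates its owned cells,
    reading halo values (u[i-1], u[i+1]) that may belong to a neighbor."""
    n = len(u)
    out = [0.0] * n
    for start, chunk in decompose(u, parts):
        for j in range(len(chunk)):
            i = start + j
            if i == 0 or i == n - 1:
                out[i] = u[i]                      # fixed boundary values
            else:
                out[i] = (u[i - 1] + u[i + 1]) / 2.0   # halo reads
    return out
```

The check that the decomposed result matches the sequential result mirrors the verification against the original serial TOUGHREACT code described in the abstract.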
Code of Ethical Conduct for Computer-Using Educators: An ICCE Policy Statement.
ERIC Educational Resources Information Center
Computing Teacher, 1987
1987-01-01
Prepared by the International Council for Computers in Education's Ethics and Equity Committee, this code of ethics for educators using computers covers nine main areas: curriculum issues, issues relating to computer access, privacy/confidentiality issues, teacher-related issues, student issues, the community, school organizational issues,…
ERIC Educational Resources Information Center
Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler
2018-01-01
Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…
NASA Technical Reports Server (NTRS)
Norment, H. G.
1985-01-01
Subsonic, external flow about nonlifting bodies, lifting bodies or combinations of lifting and nonlifting bodies is calculated by a modified version of the Hess lifting code. Trajectory calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Inlet flow can be accommodated, and high Mach number compressibility effects are corrected for approximately. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
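The trajectory codes above amount to integrating the water drop equations of motion with drag and gravity settling. A minimal explicit-Euler sketch follows; the constant drag coefficient and fluid properties are illustrative stand-ins for the experimental drag relations the codes actually use:

```python
import math

def drop_trajectory(v_air, diameter_m, dt=1e-3, steps=2000):
    """Explicit-Euler integration of a water drop released at rest in a
    uniform airflow, with quadratic drag and gravity settling.
    Constants (air/water density, Cd = 0.5) are illustrative."""
    rho_air, rho_water, g, cd = 1.225, 1000.0, 9.81, 0.5
    r = diameter_m / 2.0
    area = math.pi * r * r
    mass = rho_water * (4.0 / 3.0) * math.pi * r ** 3
    x = y = vx = vy = 0.0
    for _ in range(steps):
        rvx, rvy = v_air[0] - vx, v_air[1] - vy   # velocity relative to air
        speed = math.hypot(rvx, rvy)
        fd = 0.5 * rho_air * cd * area * speed    # drag force per unit relative speed
        ax = fd * rvx / mass
        ay = fd * rvy / mass - g                  # drag plus gravity settling
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y, vx, vy
```

A production icing code would use a flow field from the potential-flow solver in place of the uniform `v_air`, and size-dependent experimental drag data in place of the constant Cd.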
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
Keywords: code debugging; computer programmers; debugging; programming. ...Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that...of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve...
A COTS-Based Replacement Strategy for Aging Avionics Computers
2001-12-01
(Figure fragment: Communication Control Unit; COTS microprocessor; real-time operating system; new native code objects and threads; legacy function; virtual component environment; context switch thunk; add-in; replace.)
PARAVT: Parallel Voronoi tessellation code
NASA Astrophysics Data System (ADS)
González, R. E.
2016-10-01
In this study, we present a new open source code for massive parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is focused for astrophysical purposes where VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes, however no open source and parallel implementations are available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI and VT using Qhull library. Domain decomposition takes into account consistent boundary computation between tasks, and includes periodic conditions. In addition, the code computes neighbors list, Voronoi density, Voronoi cell volume, density gradient for each particle, and densities on a regular grid. Code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
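A brute-force serial sketch of the Voronoi quantities PARAVT computes (cell assignment and Voronoi density) is shown below. It assigns grid cells to their nearest seed instead of calling the Qhull library, and all function names are hypothetical:

```python
def discrete_voronoi(seeds, nx, ny):
    """Assign each cell of an nx-by-ny grid over the unit square to its
    nearest seed -- a brute-force stand-in for a true tessellation."""
    return [[min(range(len(seeds)),
                 key=lambda s: (seeds[s][0] - (i + 0.5) / nx) ** 2
                             + (seeds[s][1] - (j + 0.5) / ny) ** 2)
             for j in range(ny)] for i in range(nx)]

def voronoi_densities(seeds, nx=64, ny=64):
    # Voronoi density of a seed ~ 1 / (area of its Voronoi cell).
    owner = discrete_voronoi(seeds, nx, ny)
    cell_area = 1.0 / (nx * ny)
    counts = [0] * len(seeds)
    for row in owner:
        for s in row:
            counts[s] += 1
    return [1.0 / (c * cell_area) if c else float("inf") for c in counts]
```

In PARAVT the same quantities come from the exact tessellation, with MPI domain decomposition handling boundary consistency between tasks.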
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Brogan, F. A.
1978-01-01
Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.
Holonomic surface codes for fault-tolerant quantum computation
NASA Astrophysics Data System (ADS)
Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco
2018-02-01
Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.
NASA Technical Reports Server (NTRS)
Chima, R. V.; Strazisar, A. J.
1982-01-01
Two and three dimensional inviscid solutions for the flow in a transonic axial compressor rotor at design speed are compared with probe and laser anemometer measurements at near-stall and maximum-flow operating points. Experimental details of the laser anemometer system and computational details of the two dimensional axisymmetric code and three dimensional Euler code are described. Comparisons are made between relative Mach number and flow angle contours, shock location, and shock strength. A procedure for using an efficient axisymmetric code to generate downstream pressure input for computationally expensive Euler codes is discussed. A film supplement shows the calculations of the two operating points with the time-marching Euler code.
Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution
NASA Astrophysics Data System (ADS)
Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi
2015-05-01
In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distribution is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector is selected for simulation in the present study. The proposed algorithm for simulation includes four main steps. The first step is the modeling of the neutron/gamma particle transport and their interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons due to charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and lightguide is simulated. Finally, the resolution corresponding to the experiment is considered in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed computer code is applicable to both neutron and gamma sources. Hence, the discrimination of neutron and gamma in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and the results obtained from similar computer codes like SCINFUL, NRESP7 and Geant4.
The simulated gamma pulse height distribution for a 137Cs source is also compared with the experimental data.
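The final "resolution" step can be sketched as a Gaussian smearing of the ideal pulse-height spectrum. The FWHM value and kernel handling below are illustrative assumptions, not MCNPX-ESUT's actual broadening model:

```python
import math

def gaussian_kernel(fwhm, bin_width, half):
    # Normalized Gaussian weights on bins -half..+half.
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # FWHM -> sigma
    w = [math.exp(-0.5 * ((k * bin_width) / sigma) ** 2)
         for k in range(-half, half + 1)]
    s = sum(w)
    return [x / s for x in w]

def apply_resolution(spectrum, fwhm, bin_width=1.0):
    """Smear an ideal pulse-height spectrum with a Gaussian detector
    response (the 'resolution' step); counts are conserved away from
    the spectrum edges because the kernel is normalized."""
    n = len(spectrum)
    kern = gaussian_kernel(fwhm, bin_width, n)
    out = [0.0] * n
    for i, counts in enumerate(spectrum):
        if counts:
            for k, w in zip(range(-n, n + 1), kern):
                j = i + k
                if 0 <= j < n:
                    out[j] += counts * w
    return out
```

A monoenergetic line (a single populated bin) broadens into a Gaussian peak of the requested FWHM, which is how a simulated spectrum is matched to the experimental one.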
EAC: A program for the error analysis of STAGS results for plates
NASA Technical Reports Server (NTRS)
Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.
1989-01-01
A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented and instructions on its usage on the Cyber and the VAX machines have been provided.
CFD Modeling of Free-Piston Stirling Engines
NASA Technical Reports Server (NTRS)
Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.
2001-01-01
NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.
On the error statistics of Viterbi decoding and the performance of concatenated codes
NASA Technical Reports Server (NTRS)
Miller, R. L.; Deutsch, L. J.; Butman, S. A.
1981-01-01
Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.
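Why concatenation helps can be seen with byte-level bookkeeping: Viterbi decoder errors arrive in short bursts, a burst of consecutive bit errors touches only a few Reed-Solomon symbols, and the (255, 223) outer code corrects up to t = (255-223)/2 = 16 symbol errors. A sketch, judging decoding success purely by symbol-error count:

```python
def rs_symbol_errors(bit_error_positions, symbol_bits=8):
    # Distinct Reed-Solomon symbols (bytes) touched by the bit errors.
    return len({p // symbol_bits for p in bit_error_positions})

def rs_decodes(bit_error_positions, n=255, k=223, symbol_bits=8):
    # An (n, k) RS code corrects up to t = (n - k) // 2 symbol errors.
    t = (n - k) // 2
    return rs_symbol_errors(bit_error_positions, symbol_bits) <= t
```

A burst of 40 consecutive bit errors lands in at most 6 bytes and is easily corrected, while the same 40 errors scattered one per byte overwhelm the code; this is why modeling the Viterbi decoder's burst-error statistics matters for predicting concatenated performance.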
New double-byte error-correcting codes for memory systems
NASA Technical Reports Server (NTRS)
Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.
1996-01-01
Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacMillan, D.B.
1960-06-01
A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)
Computer-assisted coding and clinical documentation: first things first.
Tully, Melinda; Carmichael, Angela
2012-10-01
Computer-assisted coding tools have the potential to drive improvements in seven areas: Transparency of coding. Productivity (generally by 20 to 25 percent for inpatient claims). Accuracy (by improving specificity of documentation). Cost containment (by reducing overtime expenses, audit fees, and denials). Compliance. Efficiency. Consistency.
The CF6 engine performance improvement
NASA Technical Reports Server (NTRS)
Fasching, W. A.
1982-01-01
As part of the NASA-sponsored Engine Component Improvement (ECI) Program, a feasibility analysis of performance improvement and retention concepts for the CF6-6 and CF6-50 engines was conducted and seven concepts were identified for development and ground testing: new fan, new front mount, high pressure turbine aerodynamic performance improvement, high pressure turbine roundness, high pressure turbine active clearance control, low pressure turbine active clearance control, and short core exhaust nozzle. The development work and ground testing are summarized, and the major test results and an economic analysis for each concept are presented.
1986-07-01
demonstrated that plasmid pLS20 of B. subtilis (natto) is capable of promoting the transfer of pBC16 from B. subtilis to a variety of Bacillus species...anthracis. However, results of recent experiments demonstrate that pLS20, a 34-megadalton plasmid of B. subtilis (natto), is capable of promoting the...plasmid in Bacillus subtilis (natto). IV. Determination of the size of pXO2 by restriction analysis... V. Transfer of pXO1 by the B...
PWAS EMIS-ECIS Active Carbon Filter Residual Life Estimation Methodology
2013-09-23
change in the EMIS spectrum. This method is similar to the full width at half maximum (FWHM) method implemented in the fiber Bragg grating (FBG), where the intensity of the light reflected by the FBG at the half-peak frequency is used to detect the strain change in the FBG. A brief...
2014-03-01
not provide effective control. Most tropical fish commercially available to hobbyists have recommended temperature requirements > 20 °C (e.g...at 10-12 °C (RR-C). This suggests lower lethal temperatures of 12-14 °C, but laboratory studies by the Florida Fish and Wildlife Conservation...55(5): 58-60. Flecker, A.S. 1992. Fish trophic guilds and the structure of a tropical stream: Weak vs. strong indirect effects. Ecology 73...
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.
1987-01-01
The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube with 128 megabytes of memory for a 32-node configuration with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
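The speedup measure used in evaluations like this is simply the serial-to-parallel runtime ratio, often compared against the Amdahl's-law bound; a minimal sketch with illustrative timings:

```python
def speedup(t_serial, t_parallel):
    # Observed speedup: how many times faster the parallel run is.
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, nodes):
    # Fraction of ideal linear speedup actually achieved.
    return speedup(t_serial, t_parallel) / nodes

def amdahl_speedup(serial_fraction, nodes):
    # Amdahl's-law upper bound for a code with a fixed serial fraction.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / nodes)
```

Even a 10% serial fraction caps a 32-node machine well below 10x, which is why both hypercube ports had to parallelize the dominant numerical kernels.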
Bistatic radar cross section of a perfectly conducting rhombus-shaped flat plate
NASA Astrophysics Data System (ADS)
Fenn, Alan J.
1990-05-01
The bistatic radar cross section of a perfectly conducting flat plate that has a rhombus shape (equilateral parallelogram) is investigated. The Ohio State University electromagnetic surface patch code (ESP version 4) is used to compute the theoretical bistatic radar cross section of a 35- x 27-in rhombus plate at 1.3 GHz over the bistatic angles 15 deg to 142 deg. The ESP-4 computer code is a method of moments FORTRAN-77 program which can analyze general configurations of plates and wires. This code has been installed and modified at Lincoln Laboratory on a SUN 3 computer network. Details of the code modifications are described. Comparisons of the method of moments simulations and measurements of the rhombus plate are made. It is shown that the ESP-4 computer code provides a high degree of accuracy in the calculation of copolarized and cross-polarized bistatic radar cross section patterns.
ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
Navier-Stokes Simulation of Homogeneous Turbulence on the CYBER 205
NASA Technical Reports Server (NTRS)
Wu, C. T.; Ferziger, J. H.; Chapman, D. R.; Rogallo, R. S.
1984-01-01
A computer code which solves the Navier-Stokes equations for three dimensional, time-dependent, homogeneous turbulence has been written for the CYBER 205. The code has options for both 64-bit and 32-bit arithmetic. With 32-bit computation, mesh sizes up to 64^3 are contained within core of a 2 million 64-bit word memory. Computer speed timing runs were made for various vector lengths up to 6144. With this code, speeds a little over 100 Mflops have been achieved on a 2-pipe CYBER 205. Several problems encountered in the coding are discussed.
The investigation of tethered satellite system dynamics
NASA Technical Reports Server (NTRS)
Lorenzini, E.
1985-01-01
The tether control law to retrieve the satellite was modified in order to have a smooth retrieval trajectory of the satellite that minimizes the thruster activation. The satellite thrusters were added to the rotational dynamics computer code and a preliminary control logic was implemented to simulate them during the retrieval maneuver. The high resolution computer code for modelling the three dimensional dynamics of untensioned tether, SLACK3, was made fully operative and a set of computer simulations of possible tether breakages was run. The distribution of the electric field around an electrodynamic tether in vacuo severed at some length from the shuttle was computed with a three dimensional electrodynamic computer code.
Experimental and computational surface and flow-field results for an all-body hypersonic aircraft
NASA Technical Reports Server (NTRS)
Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.
1990-01-01
The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.
Computer search for binary cyclic UEP codes of odd length up to 65
NASA Technical Reports Server (NTRS)
Lin, Mao-Chao; Lin, Chi-Chang; Lin, Shu
1990-01-01
Using an exhaustive computation, the unequal error protection capabilities of all binary cyclic codes of odd length up to 65 that have minimum distances at least 3 are found. For those codes that can only have upper bounds on their unequal error protection capabilities computed, an analytic method developed by Dynkin and Togonidze (1976) is used to show that the upper bounds meet the exact unequal error protection capabilities.
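For small parameters, the exhaustive approach can be sketched directly: enumerate every codeword c(x) = m(x)g(x) of a cyclic code and take the minimum Hamming weight (here without the unequal-error-protection refinement the paper computes):

```python
def poly_mul_mod2(a, b):
    # Multiply two GF(2) polynomials given as 0/1 coefficient lists (LSB first).
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

def cyclic_code_min_distance(gen, n):
    """Minimum Hamming weight over all nonzero codewords c(x) = m(x)g(x)
    of the length-n cyclic code generated by g(x), by exhaustive search."""
    k = n - (len(gen) - 1)
    best = n + 1
    for m in range(1, 1 << k):
        msg = [(m >> i) & 1 for i in range(k)]
        best = min(best, sum(poly_mul_mod2(msg, gen)))
    return best
```

For linear codes the minimum distance equals the minimum nonzero weight, so enumerating the 2^k message polynomials suffices; for length up to 65 this is exactly the kind of exhaustive computation the abstract describes, extended there to per-position protection levels.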
A Combinatorial Geometry Computer Description of the MEP-021A Generator Set
1979-02-01
Keywords: generator computer description; gasoline generator; GIFT; MEP-021A. This...GIFT code is also stored on magnetic tape for future vulnerability analysis...the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack...
Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)
NASA Technical Reports Server (NTRS)
Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.
1972-01-01
A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes in the CDC 6600 computer.
Unaligned instruction relocation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
Unaligned instruction relocation
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.
2018-01-23
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
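The relocation mechanics in these linking passes can be sketched as patching resolved symbol addresses into assembled bytes. The toy linker below uses an invented relocation-table format and 4-byte little-endian addresses, not the patent's actual formats:

```python
import struct

def link(code, relocations, symbols):
    """One toy linking pass: patch each relocation target with the
    resolved 4-byte little-endian address of the symbol it refers to.
    `relocations` is a list of (byte offset, symbol name) pairs."""
    buf = bytearray(code)
    for offset, symbol in relocations:
        struct.pack_into("<I", buf, offset, symbols[symbol])
    return bytes(buf)
```

A two-pass flow in the spirit of the abstract would call `link` once with the table of accelerator-code references to outside objects, and again with the table of outside references into the accelerator code.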
Computer algorithm for coding gain
NASA Technical Reports Server (NTRS)
Dodd, E. E.
1974-01-01
Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
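At a fixed bit-error rate, coding gain is the reduction in required Eb/N0 that the code buys relative to uncoded transmission. A minimal sketch of that definition (not the empirical formula the abstract refers to):

```python
import math

def coding_gain_db(ebn0_uncoded, ebn0_coded):
    """Coding gain at a fixed bit-error rate: the reduction in required
    Eb/N0, with linear ratios in and decibels out."""
    return 10.0 * math.log10(ebn0_uncoded) - 10.0 * math.log10(ebn0_coded)
```

Halving the required Eb/N0 yields about 3 dB of gain; an automated link-design system can fold such a gain directly into its link budget.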
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
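The metamodeling idea can be sketched end to end: run the "expensive" analysis code at a handful of sample points, fit a quadratic response surface by least squares (normal equations), and use the cheap surrogate thereafter. All names below are illustrative:

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def quadratic_metamodel(expensive, xs):
    """Fit a quadratic response surface y ~ a + b*x + c*x^2 by least
    squares to a handful of runs of an 'expensive' analysis code."""
    ys = [expensive(x) for x in xs]
    phi = [[1.0, x, x * x] for x in xs]          # design matrix rows
    # Normal equations: (X^T X) beta = X^T y.
    A = [[sum(p[i] * p[j] for p in phi) for j in range(3)] for i in range(3)]
    rhs = [sum(p[i] * y for p, y in zip(phi, ys)) for i in range(3)]
    a, b, c = solve3(A, rhs)
    return lambda x: a + b * x + c * x * x
```

In practice the sample points come from a designed experiment and the surrogate may be a kriging or neural-network model rather than a polynomial, but the workflow, and the danger of trusting the fit outside its sampled region, is the same.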
Design and optimization of a portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processor Units (GPUs), exploiting aggressive data-parallelism and delivering higher performances for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent, making it tedious and prone to error to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance-portability can be reached.
Development of a thermal and structural analysis procedure for cooled radial turbines
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Deanna, Russell G.
1988-01-01
A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi-three-dimensional code computes the external free-stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous one-dimensional internal flow code that solves the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure, and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.
COMPUTATION OF GLOBAL PHOTOCHEMISTRY WITH SMVGEAR II (R823186)
A computer model was developed to simulate global gas-phase photochemistry. The model solves chemical equations with SMVGEAR II, a sparse-matrix, vectorized Gear-type code. To obtain SMVGEAR II, the original SMVGEAR code was modified to allow computation of different sets of chem...
NASA Technical Reports Server (NTRS)
Weed, Richard Allen; Sankar, L. N.
1994-01-01
An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has led to research into procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1979-01-01
The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance, and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.
A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle
1984-12-01
program requires as input the M9 target descriptions as processed by the Geometric Information for Targets (GIFT) computer code. The first step is...model of the target. This COM-GEOM target description is used as input to the Geometric Information for Targets (GIFT) computer code. Among other...things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit
Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.
1999-01-01
A micromechanics-based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. This computer code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber which had been chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on the bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and good agreement is achieved.
Fault tolerant computing: A preamble for assuring viability of large computer systems
NASA Technical Reports Server (NTRS)
Lim, R. S.
1977-01-01
The need for fault-tolerant computing is addressed from the viewpoints of (1) why it is needed, (2) how to apply it in the current state of technology, and (3) what it means in the context of the Phoenix computer system and other related systems. To this end, the value of concurrent error detection and correction is described. User protection, program retry, and repair are among the factors considered. The technology of algebraic codes to protect memory systems and arithmetic codes to protect arithmetic operations is discussed.
The Advanced Software Development and Commercialization Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallopoulos, E.; Canfield, T.R.; Minkoff, M.
1990-09-01
This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities: COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used both for nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.
Source Code Plagiarism--A Student Perspective
ERIC Educational Resources Information Center
Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.
2011-01-01
This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2004-01-01
This viewgraph presentation provides samples of computer code which have characteristics of poetic verse, and addresses the theoretical underpinnings of artistic coding, as well as how computer language influences software style, and the possible style of future coding.
Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing
NASA Astrophysics Data System (ADS)
Salamone, Joseph A., III
Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and also for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was numerically implemented using a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Visual Computing Environment Workshop
NASA Technical Reports Server (NTRS)
Lawrence, Charles (Compiler)
1998-01-01
The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.
Force user's manual: A portable, parallel FORTRAN
NASA Technical Reports Server (NTRS)
Jordan, Harry F.; Benten, Muhammad S.; Arenstorf, Norbert S.; Ramanan, Aruna V.
1990-01-01
The use of Force, a parallel, portable FORTRAN on shared memory parallel computers is described. Force simplifies writing code for parallel computers and, once the parallel code is written, it is easily ported to computers on which Force is installed. Although Force is nearly the same for all computers, specific details are included for the Cray-2, Cray Y-MP, Convex 220, Flex/32, Encore, Sequent, and Alliant computers on which it is installed.
Monte Carlo simulation of Ising models by multispin coding on a vector computer
NASA Astrophysics Data System (ADS)
Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus
1984-11-01
Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
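The bit-packing idea behind multispin coding can be sketched as follows. This illustrates only the storage and simultaneous-flip step (many spins stored one-per-bit and updated with a single bitwise operation), not Rebbi's full vectorized Metropolis update:

```python
# Bit-packed Ising spins: one spin per bit (1 = up, 0 = down), so dozens of
# spins are flipped at once with a single bitwise operation.
N = 64
word = (1 << N) - 1              # all N spins up

def magnetization(w, n):
    """Sum of +/-1 spins encoded in the n low bits of w."""
    ups = bin(w & ((1 << n) - 1)).count("1")
    return ups - (n - ups)       # ups*(+1) + downs*(-1)

# Flip spins 0, 3, and 10 simultaneously with one XOR.
flip_mask = (1 << 0) | (1 << 3) | (1 << 10)
word ^= flip_mask
```

On a vector machine like the Cyber 205 the same XOR-style update is applied to long vectors of packed words, which is where the quoted update rates come from.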
NASA Technical Reports Server (NTRS)
Chan, J. S.; Freeman, J. A.
1984-01-01
The viscous, axisymmetric flow in the thrust chamber of the space shuttle main engine (SSME) was computed on the CRAY 205 computer using the general interpolants method (GIM) code. Results show that the Navier-Stokes codes can be used for these flows to study trends and viscous effects as well as determine flow patterns; but further research and development is needed before they can be used as production tools for nozzle performance calculations. The GIM formulation, numerical scheme, and computer code are described. The actual SSME nozzle computation showing grid points, flow contours, and flow parameter plots is discussed. The computer system and run times/costs are detailed.
The {sup 18}O(d,p){sup 19}O reaction and the ANC method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burjan, V.; Hons, Z.; Kroha, V.
2014-05-09
The neutron capture rate {sup 18}O(n,γ){sup 19}O is important for analysis of nucleosynthesis in inhomogeneous Big Bang models and also for models of processes in massive red giant stars and AGB stars. Angular distributions of the {sup 18}O(d,p){sup 19}O reaction were measured at a deuteron energy of 16.3 MeV at NPI in Řež, Czech Republic, with the aim to determine Asymptotic Normalization Coefficients (ANCs), which can then be used for indirect determination of the direct contribution to the {sup 18}O(n,γ){sup 19}O process. In the experiment, a gas target with {sup 18}O isotope of 99.9% purity was used, thus eliminating any contaminating reactions. Reaction products were measured by a set of 8 ΔE-E telescopes consisting of thin and thick silicon surface-barrier detectors. Angular distributions of proton transfers corresponding to 6 levels of {sup 19}O up to the 4.1093 MeV excitation energy were determined. The analysis of angular distributions in the angular range from 6 to 64 degrees, including also the angular distribution of elastically scattered deuterons, was carried out by means of the ECIS and DWUCK codes. From the determined ANCs the direct contribution to the radiative capture {sup 18}O(n,γ){sup 19}O was deduced and compared with existing direct measurements.
Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials
NASA Technical Reports Server (NTRS)
Luebbers, Raymond J.; Beggs, John H.
1991-01-01
Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. The extension of FDTD to more complicated materials was made. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far zone time domain results in two dimensions. Also the capability to model nonlinear materials was incorporated into FDTD and validated.
Multitasking the code ARC3D. [for computational fluid dynamics
NASA Technical Reports Server (NTRS)
Barton, John T.; Hsiung, Christopher C.
1986-01-01
The CRAY multitasking system was developed in order to utilize all four processors and sharply reduce the wall clock run time. This paper describes the techniques used to modify the computational fluid dynamics code ARC3D for this run and analyzes the achieved speedup. The ARC3D code solves either the Euler or thin-layer N-S equations using an implicit approximate factorization scheme. Results indicate that multitask processing can be used to achieve wall clock speedup factors of over three times, depending on the nature of the program code being used. Multitasking appears to be particularly advantageous for large-memory problems running on multiple CPU computers.
Addressing the challenges of standalone multi-core simulations in molecular dynamics
NASA Astrophysics Data System (ADS)
Ocaya, R. O.; Terblans, J. J.
2017-07-01
Computational modelling in materials science involves mathematical abstractions of force fields between particles with the aim to postulate, develop and understand materials by simulation. The aggregated pairwise interactions of the material's particles lead to a deduction of its macroscopic behaviours. For practically meaningful macroscopic scales, a large amount of data is generated, leading to vast execution times. Simulation times of hours, days or weeks for moderately sized problems are not uncommon. The reduction of simulation times, improved result accuracy and the associated software and hardware engineering challenges are the main motivations for much of the ongoing research in the computational sciences. This contribution is concerned mainly with simulations that can be done on a "standalone" computer based on Message Passing Interface (MPI) parallel code running on hardware platforms with wide specifications, such as single/multi-processor, multi-core machines with minimal reconfiguration for upward scaling of computational power. The widely available, documented and standardized MPI library provides this functionality through the MPI_Comm_size(), MPI_Comm_rank() and MPI_Reduce() functions. A survey of the literature shows that relatively little is written with respect to the efficient extraction of the inherent computational power in a cluster. In this work, we discuss the main avenues available to tap into this extra power without compromising computational accuracy. We also present methods to overcome the high inertia encountered in single-node-based computational molecular dynamics. We begin by surveying the current state of the art and discuss what it takes to achieve parallelism, efficiency and enhanced computational accuracy through program threads and message passing interfaces. Several code illustrations are given. The pros and cons of writing raw code as opposed to using heuristic, third-party code are also discussed.
The growing trend towards graphical processor units and virtual computing clouds for high-performance computing is also discussed. Finally, we present the comparative results of vacancy formation energy calculations using our own parallelized standalone code called Verlet-Stormer velocity (VSV) operating on 30,000 copper atoms. The code is based on the Sutton-Chen implementation of the Finnis-Sinclair pairwise embedded atom potential. A link to the code is also given.
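The partition/reduce pattern that MPI_Comm_rank() and MPI_Reduce() support can be sketched in miniature. Here Python threads on one node stand in for MPI ranks, and a toy 1/r² pair interaction stands in for the Sutton-Chen potential used in the authors' VSV code; the particle layout is invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

# Invented particle layout (200 distinct points), purely for illustration.
positions = [(float(i), float(i % 7), 0.0) for i in range(200)]

def pair_energy(i):
    # Partial sum owned by "rank" i: interactions of particle i with all j > i,
    # so every pair is counted exactly once across workers.
    xi, yi, zi = positions[i]
    e = 0.0
    for j in range(i + 1, len(positions)):
        xj, yj, zj = positions[j]
        r2 = (xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2
        e += 1.0 / r2            # toy pair interaction, not Sutton-Chen
    return e

# Each worker computes its partial sums (cf. one MPI rank per slice) ...
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(pair_energy, range(len(positions))))

# ... and the partials are combined (cf. MPI_Reduce with MPI_SUM).
total_energy = sum(partials)
```

With true MPI ranks or processes the same decomposition sidesteps Python's GIL and scales across nodes; the structure of the code is unchanged.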
Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing
2008-01-01
complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...AFRL-RI-RS-TR-2007-288 Final Technical Report January 2008 SUPERIMPOSED CODE THEORETIC ANALYSIS OF DNA CODES AND DNA COMPUTING
Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D
NASA Technical Reports Server (NTRS)
Carle, Alan; Fagan, Mike; Green, Lawrence L.
1998-01-01
This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
NASA Astrophysics Data System (ADS)
Lee, Jangho; Kim, Kwang-Yul
2018-02-01
CSEOF analysis is applied to the springtime (March, April, May) daily PM10 concentrations measured at 23 Ministry of Environment stations in Seoul, Korea for the period 2003-2012. Six meteorological variables at 12 pressure levels are also acquired from the ERA-Interim reanalysis datasets. CSEOF analysis is conducted for each meteorological variable over East Asia. Regression analysis is conducted in CSEOF space between the PM10 concentrations and individual meteorological variables to identify the associated atmospheric conditions for each CSEOF mode. By adding the regressed loading vectors to the mean meteorological fields, the daily atmospheric conditions are obtained for the first five CSEOF modes. Then, the HYSPLIT model is run with the atmospheric conditions for each CSEOF mode in order to back-trace the air parcels and dust reaching Seoul. The K-means clustering algorithm is applied to identify major source regions for each CSEOF mode of the PM10 concentrations in Seoul. Three main source regions identified based on the mean fields are: (1) the northern Taklamakan Desert (NTD), (2) the Gobi Desert (GD), and (3) the East China industrial area (ECI). The main source regions for the mean meteorological fields are consistent with those of a previous study; 41% of the source locations are located in GD, followed by ECI (37%) and NTD (21%). Back trajectory calculations based on CSEOF analysis of meteorological variables identify distinct source characteristics associated with each CSEOF mode and greatly facilitate the interpretation of the PM10 variability in Seoul in terms of transportation route and meteorological conditions, including the source area.
Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Garg, Vijay; Ameri, Ali
2005-01-01
The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate, and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.
Development of a 3-D upwind PNS code for chemically reacting hypersonic flowfields
NASA Technical Reports Server (NTRS)
Tannehill, J. C.; Wadawadigi, G.
1992-01-01
Two new parabolized Navier-Stokes (PNS) codes were developed to compute the three-dimensional, viscous, chemically reacting flow of air around hypersonic vehicles such as the National Aero-Space Plane (NASP). The first code (TONIC) solves the gas dynamic and species conservation equations in a fully coupled manner using an implicit, approximately-factored, central-difference algorithm. This code was upgraded to include shock fitting and the capability of computing the flow around complex body shapes. The revised TONIC code was validated by computing the chemically-reacting (M(sub infinity) = 25.3) flow around a 10 deg half-angle cone at various angles of attack and the Ames All-Body model at 0 deg angle of attack. The results of these calculations were in good agreement with the results from the UPS code. One of the major drawbacks of the TONIC code is that the central-differencing of fluxes across interior flowfield discontinuities tends to introduce errors into the solution in the form of local flow property oscillations. The second code (UPS), originally developed for a perfect gas, has been extended to permit either perfect gas, equilibrium air, or nonequilibrium air computations. The code solves the PNS equations using a finite-volume, upwind TVD method based on Roe's approximate Riemann solver that was modified to account for real gas effects. The dissipation term associated with this algorithm is sufficiently adaptive to flow conditions that, even when attempting to capture very strong shock waves, no additional smoothing is required. For nonequilibrium calculations, the code solves the fluid dynamic and species continuity equations in a loosely-coupled manner. This code was used to calculate the hypersonic, laminar flow of chemically reacting air over cones at various angles of attack. 
In addition, the flow around the McDonnell Douglas generic option blended-wing-body was computed and comparisons were made between the perfect gas, equilibrium air, and nonequilibrium air results.
Linear chirp phase perturbing approach for finding binary phased codes
NASA Astrophysics Data System (ADS)
Li, Bing C.
2017-05-01
Binary phased codes have many applications in communication and radar systems. These applications require binary phased codes to have low sidelobes in order to reduce interference and false detection. Barker codes satisfy these requirements and have the lowest maximum sidelobes. However, Barker codes have very limited code lengths (13 or less), while many applications, including low-probability-of-intercept radar and spread-spectrum communication, require much longer codes. The conventional techniques for finding binary phased codes in the literature include exhaustive search, neural networks, and evolutionary methods, and they all require very expensive computation for large code lengths. These techniques are therefore limited to finding binary phased codes with small code lengths (less than 100). In this paper, by analyzing Barker code, linear chirp, and P3 phases, we propose a new approach to find binary codes. Experiments show that the proposed method is able to find long, low-sidelobe binary phased codes (code length >500) with reasonable computational cost.
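The low-sidelobe property attributed to Barker codes in the abstract above is easy to check directly. A minimal sketch for the length-13 Barker code (peak autocorrelation 13, every sidelobe of magnitude at most 1):

```python
# Length-13 Barker code (+1/-1 phases) and its aperiodic autocorrelation.
barker13 = [+1, +1, +1, +1, +1, -1, -1, +1, +1, -1, +1, -1, +1]

def autocorrelation(code):
    """Aperiodic autocorrelation at non-negative lags."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]

acf = autocorrelation(barker13)
peak = acf[0]                           # zero-lag peak: the code length, 13
max_sidelobe = max(abs(v) for v in acf[1:])   # at most 1 for a Barker code
```

The same `autocorrelation` routine is what an exhaustive or heuristic search would evaluate for each candidate code, which is why the cost grows so quickly with code length.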
Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Bartels, Robert E.
2002-01-01
A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
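The step of transforming impulse responses into state-space form can be illustrated with the Eigensystem Realization Algorithm (ERA); the abstract does not say which identification method the ROM uses, and the 2-state system standing in for the unsteady aerodynamic system here is purely illustrative:

```python
import numpy as np

# Toy 2-state SISO system; its Markov parameters h_k = C A^k B play the role
# of the sampled impulse response produced by the CFD code.
A = np.array([[0.5, 0.0], [0.0, -0.3]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 1.0]])
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(20)]

# Hankel matrices assembled from the impulse-response samples.
m = 8
H0 = np.array([[h[i + j] for j in range(m)] for i in range(m)])
H1 = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])

# ERA: a truncated SVD of H0 yields a minimal balanced realization.
U, s, Vt = np.linalg.svd(H0)
r = 2                                  # model order (known for this toy case)
sq = np.sqrt(s[:r])
Ur, Vr = U[:, :r], Vt[:r, :].T
Ar = (Ur / sq).T @ H1 @ (Vr / sq)      # S^{-1/2} U^T H1 V S^{-1/2}
Br = (np.diag(sq) @ Vr.T)[:, :1]       # first column of controllability factor
Cr = (Ur @ np.diag(sq))[:1, :]         # first row of observability factor

# The identified (Ar, Br, Cr) reproduce the original impulse response.
h_rom = [(Cr @ np.linalg.matrix_power(Ar, k) @ Br).item() for k in range(20)]
```

Once in state-space form, the aerodynamic model can be coupled to a structural state-space model in SIMULINK exactly as the abstract describes, with each transient costing only a small linear simulation.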
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1994-01-01
A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
Manual for obscuration code with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Takacs, L.
1986-01-01
The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far-zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multi-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to describe the method used to obtain the shadow map, provide an overall view of the operation of the computer code, instruct a user in how to model structures, and give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Johnson, J.D.; Blond, R.M.
The CRAC2 computer code is a revision of CRAC, the Calculation of Reactor Accident Consequences computer code developed for the Reactor Safety Study. CRAC2 incorporates significant modeling improvements in the areas of weather-sequence sampling and emergency response, as well as refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.
Progressive fracture of fiber composites
NASA Technical Reports Server (NTRS)
Irvin, T. B.; Ginty, C. A.
1983-01-01
Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.
Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1994-01-01
An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.
Design geometry and design/off-design performance computer codes for compressors and turbines
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1995-01-01
This report summarizes some NASA Lewis (i.e., government owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses. Losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.
PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations
NASA Astrophysics Data System (ADS)
Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.
2017-12-01
Peta-op SupErcomputing Unconventional System (PerSEUS) aims to explore the use of ultra-low-power mixed-signal unconventional computational elements developed by Johns Hopkins University (JHU) for High Performance Scientific Computing (HPC), and to demonstrate that capability on both fluid and particle plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE), and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code and a UCLA general-purpose relativistic Particle-In-Cell (PIC) code.
Multiple grid problems on concurrent-processing computers
NASA Technical Reports Server (NTRS)
Eberhardt, D. S.; Baganoff, D.
1986-01-01
Three computer codes were studied which make use of concurrent processing computer architectures in computational fluid dynamics (CFD). The three parallel codes were tested on a two processor multiple-instruction/multiple-data (MIMD) facility at NASA Ames Research Center, and are suggested for efficient parallel computations. The first code is a well-known program which makes use of the Beam and Warming, implicit, approximate factored algorithm. This study demonstrates the parallelism found in a well-known scheme and it achieved speedups exceeding 1.9 on the two processor MIMD test facility. The second code studied made use of an embedded grid scheme which is used to solve problems having complex geometries. The particular application for this study considered an airfoil/flap geometry in an incompressible flow. The scheme eliminates some of the inherent difficulties found in adapting approximate factorization techniques onto MIMD machines and allows the use of chaotic relaxation and asynchronous iteration techniques. The third code studied is an application of overset grids to a supersonic blunt body problem. The code addresses the difficulties encountered when using embedded grids on a compressible, and therefore nonlinear, problem. The complex numerical boundary system associated with overset grids is discussed and several boundary schemes are suggested. A boundary scheme based on the method of characteristics achieved the best results.
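The chaotic relaxation and asynchronous iteration techniques mentioned in the second code's description can be sketched in a few lines: interior points of a 1D Laplace problem are relaxed one at a time in random order, using whatever neighbor values are currently in memory, mimicking asynchronous updates on a MIMD machine. This toy problem is our own illustration, not one of the three codes in the study:

```python
import random

# "Chaotic relaxation" sketch for the 1D Laplace equation u'' = 0 with
# boundary conditions u(0) = 0 and u(1) = 1: interior points are updated
# in random order, so no fixed sweep schedule is required, which is what
# makes the method attractive on asynchronous MIMD hardware.

def chaotic_relax(n=20, updates=40000, seed=0):
    rng = random.Random(seed)
    u = [0.0] * (n + 1)
    u[n] = 1.0                          # boundary condition u(1) = 1
    interior = list(range(1, n))
    for _ in range(updates):
        i = rng.choice(interior)        # pick any point, in no fixed order
        u[i] = 0.5 * (u[i - 1] + u[i + 1])
    return u

u = chaotic_relax()
# the converged solution is the linear profile u_i = i/n
err = max(abs(u[i] - i / 20) for i in range(21))
print(err < 1e-3)
```

Because the Laplace update is a contraction, the iteration converges regardless of update order, which is the property asynchronous schemes exploit.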
Binary weight distributions of some Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Pollara, F.; Arnold, S.
1992-01-01
The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered, and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
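The MacWilliams identities relate a code's weight distribution to that of its dual. As a small illustration (using the binary [7,4] Hamming code as a stand-in for the RS codes in the abstract), the sketch below computes a weight distribution, applies the MacWilliams transform via Krawtchouk polynomials, and checks the result against brute-force enumeration of the dual:

```python
from itertools import product
from math import comb

# MacWilliams-identity demo on the binary [7,4] Hamming code (our choice
# of example; the paper works with Reed-Solomon codes instead).

G = [[1,0,0,0,0,1,1],
     [0,1,0,0,1,0,1],
     [0,0,1,0,1,1,0],
     [0,0,0,1,1,1,1]]
n = 7

def span(rows):
    """All GF(2) linear combinations of the given rows."""
    words = []
    for coeffs in product([0, 1], repeat=len(rows)):
        w = [0] * n
        for c, row in zip(coeffs, rows):
            if c:
                w = [a ^ b for a, b in zip(w, row)]
        words.append(tuple(w))
    return words

def weight_dist(words):
    A = [0] * (n + 1)
    for w in words:
        A[sum(w)] += 1
    return A

code = span(G)
A = weight_dist(code)        # 1 + 7x^3 + 7x^4 + x^7

# MacWilliams transform: B_j = (1/|C|) * sum_i A_i * K_j(i), where
# K_j(i) = sum_s (-1)^s C(i,s) C(n-i, j-s) is a Krawtchouk polynomial.
def krawtchouk(j, i):
    return sum((-1) ** s * comb(i, s) * comb(n - i, j - s)
               for s in range(j + 1))

B = [sum(A[i] * krawtchouk(j, i) for i in range(n + 1)) // len(code)
     for j in range(n + 1)]

# Brute-force dual: all vectors orthogonal to every row of G.
dual = [v for v in product([0, 1], repeat=n)
        if all(sum(a * b for a, b in zip(v, row)) % 2 == 0 for row in G)]
print(B, weight_dist(dual) == B)
```

The same transform applied to a Reed-Solomon code's binary image is what yields the distributions the paper tabulates.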
NASA Technical Reports Server (NTRS)
Mcgaw, Michael A.; Saltsman, James F.
1993-01-01
A recently developed high-temperature fatigue life prediction computer code is presented and an example of its usage given. The code discussed is based on the Total Strain version of Strainrange Partitioning (TS-SRP). Included in this code are procedures for characterizing the creep-fatigue durability behavior of an alloy according to TS-SRP guidelines and predicting cyclic life for complex cycle types for both isothermal and thermomechanical conditions. A reasonably extensive materials properties database is included with the code.
Turbomachinery Heat Transfer and Loss Modeling for 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Ameri, Ali
2005-01-01
This report focuses on the use of NASA Glenn on-site computational facilities to develop, validate, and apply models for advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes, enhancing the capability to compute heat transfer and losses in turbomachinery.
Real-time computer treatment of THz passive device images with the high image quality
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2012-06-01
We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not restricted to a single passive THz device: it can be applied to any such device, as well as to active THz imaging systems. We applied the code to computer processing of images captured by four passive THz imaging devices manufactured by different companies; processing images from different devices usually requires different spatial filters. The current version of the code processes more than one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 output images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel image-processing algorithms. We developed original spatial filters that make it possible to see objects smaller than 2 cm in imagery produced by passive THz devices that capture images of objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise during computer processing and yields good image quality. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate, and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are a very promising solution to the security problem.
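As a generic illustration of spatial filtering for impulse-noise suppression (the paper's actual filters are proprietary, so this 3x3 median filter is only an assumed stand-in):

```python
# Minimal spatial-filter sketch in the spirit of the image processing the
# abstract describes: a 3x3 median filter removing impulse noise from a
# small intensity map. Illustrative only; not the paper's filters.

def median_filter3(img):
    """Apply a 3x3 median filter; edge pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]       # median of the 9 window values
    return out

img = [[10] * 5 for _ in range(5)]
img[2][2] = 255                         # a single "hot" noise pixel
clean = median_filter3(img)
print(clean[2][2])  # 10: the impulse is suppressed
```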
Fingerprinting Communication and Computation on HPC Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean
2010-06-02
How do we identify what is actually running on high-performance computing systems? Names of binaries, dynamic libraries loaded, or other elements in a submission to a batch queue can give clues, but binary names can be changed, and libraries provide limited insight and resolution on the code being run. In this paper, we present a method for "fingerprinting" code running on HPC machines using elements of communication and computation. We then discuss how that fingerprint can be used to determine whether the code is consistent with certain other types of codes, with what a user usually runs, or with what the user requested an allocation to do. In some cases, our techniques enable us to fingerprint HPC codes using runtime MPI data with a high degree of accuracy.
Practices in source code sharing in astrophysics
NASA Astrophysics Data System (ADS)
Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly
2013-02-01
While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.
Development of V/STOL methodology based on a higher order panel method
NASA Technical Reports Server (NTRS)
Bhateley, I. C.; Howell, G. A.; Mann, H. W.
1983-01-01
The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential flow code. The PAN AIR aerodynamic code, which is a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive capability for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, requiring no changes to the basic code and allowing easy replacement of updated modules.
Lattice surgery on the Raussendorf lattice
NASA Astrophysics Data System (ADS)
Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco
2018-07-01
Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that ease physical hardware implementation. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable to braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice, giving a viable alternative to computation using braiding in measurement-based implementations of topological codes.
40 CFR 1033.110 - Emission diagnostics-general requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...
Airfoil Vibration Dampers program
NASA Technical Reports Server (NTRS)
Cook, Robert M.
1991-01-01
The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.
Computer optimization of reactor-thermoelectric space power systems
NASA Technical Reports Server (NTRS)
Maag, W. L.; Finnegan, P. M.; Fishbach, L. H.
1973-01-01
A computer simulation and optimization code that has been developed for nuclear space power systems is described. The results of using this code to analyze two reactor-thermoelectric systems are presented.
A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.
1994-01-01
Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven year effort was established in 1990 by NASA's Office of Aeronautics Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.
Ascent Aerodynamic Pressure Distributions on WB001
NASA Technical Reports Server (NTRS)
Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.
1996-01-01
To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamic (CFD) codes at several flow conditions, with comparisons made code to code, against the aerodynamic database, and against available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual designs. These CFD results have also been provided to the structures group for wing loading analysis.
NASA Technical Reports Server (NTRS)
Kumar, A.; Graves, R. A., Jr.; Weilmuenster, K. J.
1980-01-01
A vectorized code, EQUIL, was developed for calculating the equilibrium chemistry of a reacting gas mixture on the Control Data STAR-100 computer. The code provides species mole fractions, mass fractions, and thermodynamic and transport properties of the mixture for given temperature, pressure, and elemental mass fractions. The code is set up for a system of elements consisting of electrons, H, He, C, O, and N. In all, 24 chemical species are included.
Computer code for charge-exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Kaufman, H. R.
1981-01-01
The propagation of the charge-exchange plasma from an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, are described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
Self-Scheduling Parallel Methods for Multiple Serial Codes with Application to WOPWOP
NASA Technical Reports Server (NTRS)
Long, Lyle N.; Brentner, Kenneth S.
2000-01-01
This paper presents a scheme for efficiently running a large number of serial jobs on parallel computers. Two examples are given of computer programs that run relatively quickly, but often they must be run numerous times to obtain all the results needed. It is very common in science and engineering to have codes that are not massive computing challenges in themselves, but due to the number of instances that must be run, they do become large-scale computing problems. The two examples given here represent common problems in aerospace engineering: aerodynamic panel methods and aeroacoustic integral methods. The first example simply solves many systems of linear equations. This is representative of an aerodynamic panel code where someone would like to solve for numerous angles of attack. The complete code for this first example is included in the appendix so that it can be readily used by others as a template. The second example is an aeroacoustics code (WOPWOP) that solves the Ffowcs Williams Hawkings equation to predict the far-field sound due to rotating blades. In this example, one quite often needs to compute the sound at numerous observer locations, hence parallelization is utilized to automate the noise computation for a large number of observers.
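The self-scheduling idea described in this abstract can be sketched with a shared task queue from which workers pull the next independent serial job as soon as they finish their current one. The toy job here (a small dense linear solve, echoing the many-angles-of-attack panel-code example) and all names are our own illustration, not the paper's code:

```python
import queue
import threading

# Self-scheduling sketch: worker threads repeatedly grab the next
# independent serial job from a shared queue until none remain, so fast
# workers naturally take on more jobs (dynamic load balancing).

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]    # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c]
                              for c in range(k + 1, n))) / M[k][k]
    return x

def run_self_scheduled(jobs, n_workers=4):
    tasks = queue.Queue()
    for j, job in enumerate(jobs):
        tasks.put((j, job))
    results = [None] * len(jobs)

    def worker():
        while True:
            try:
                j, (A, b) = tasks.get_nowait()  # claim the next free job
            except queue.Empty:
                return
            results[j] = solve(A, b)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# e.g. many right-hand sides, as with many angles of attack
jobs = [([[2.0, 1.0], [1.0, 3.0]], [1.0, float(k)]) for k in range(8)]
xs = run_self_scheduled(jobs)
```

On a cluster the queue would be replaced by MPI messages, but the scheduling logic is the same.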
Computer Code for Transportation Network Design and Analysis
DOT National Transportation Integrated Search
1977-01-01
This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...
Current and anticipated uses of the thermal hydraulics codes at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The focus of thermal-hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from thermal-hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to support these needs by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.
Analyzing Pulse-Code Modulation On A Small Computer
NASA Technical Reports Server (NTRS)
Massey, David E.
1988-01-01
System for analysis of pulse-code modulation (PCM) comprises personal computer, computer program, and peripheral interface adapter on circuit board that plugs into expansion bus of computer. Functions essentially as "snapshot" PCM decommutator, which accepts and stores thousands of frames of PCM data, then sifts through them repeatedly, processing them according to routines specified by operator. Enables faster testing and involves less equipment than older testing systems.
A fast technique for computing syndromes of BCH and RS codes. [deep space network
NASA Technical Reports Server (NTRS)
Reed, I. S.; Truong, T. K.; Miller, R. L.
1979-01-01
A combination of the Chinese Remainder Theorem and Winograd's algorithm is used to compute transforms of odd length over GF(2^m). Such transforms are used to compute the syndromes needed for decoding BCH and RS codes. The present scheme requires substantially fewer multiplications and additions than the conventional method of computing the syndromes directly.
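For contrast with the fast transform approach, the conventional direct syndrome computation the abstract refers to can be sketched as follows. The field construction, coefficient convention, and received word below are illustrative assumptions, not details from the paper:

```python
# Direct ("conventional") syndrome computation for an RS(15,9) code over
# GF(2^4) with primitive polynomial x^4 + x + 1. Each syndrome is an
# evaluation of the received polynomial at a power of the primitive
# element alpha; this is the O(n) multiplications-per-syndrome baseline
# that transform-based methods improve on.

# Build GF(16) exponential/log tables for the primitive element alpha.
EXP, LOG = [0] * 30, [0] * 16
x = 1
for i in range(15):
    EXP[i] = EXP[i + 15] = x
    LOG[x] = i
    x <<= 1
    if x & 0b10000:
        x ^= 0b10011            # reduce modulo x^4 + x + 1

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def syndromes(received, n_syn=6):
    """S_j = r(alpha^j) for j = 1..n_syn, evaluated by Horner's rule.
    Coefficients are listed highest degree first."""
    syn = []
    for j in range(1, n_syn + 1):
        s = 0
        for coeff in received:
            s = gf_mul(s, EXP[j]) ^ coeff
        syn.append(s)
    return syn

codeword = [0] * 15             # the all-zero codeword is always valid
error = codeword[:]
error[4] ^= 0b0011              # inject a single symbol error
print(syndromes(codeword), all(s != 0 for s in syndromes(error)))
```

All syndromes vanish for a valid codeword and become nonzero once an error is injected, which is what a decoder tests for.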
Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs
NASA Technical Reports Server (NTRS)
Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.
1977-01-01
The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.
Numerical computation of space shuttle orbiter flow field
NASA Technical Reports Server (NTRS)
Tannehill, John C.
1988-01-01
A new parabolized Navier-Stokes (PNS) code has been developed to compute the hypersonic, viscous chemically reacting flow fields around 3-D bodies. The flow medium is assumed to be a multicomponent mixture of thermally perfect but calorically imperfect gases. The new PNS code solves the gas dynamic and species conservation equations in a coupled manner using a noniterative, implicit, approximately factored, finite difference algorithm. The space-marching method is made well-posed by special treatment of the streamwise pressure gradient term. The code has been used to compute hypersonic laminar flow of chemically reacting air over cones at angle of attack. The results of the computations are compared with the results of reacting boundary-layer computations and show excellent agreement.
NASA Technical Reports Server (NTRS)
Warren, Gary
1988-01-01
The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and, separately, the transient behavior of the same devices. A code, DOT, was created to compute appropriate dot products of the time-domain and frequency-domain results; the transient behavior of individual modes in the device is then plotted. Modes excited by a beam in a coupled-cavity traveling-wave tube (CCTWT) section are analyzed in separate simulations. Mode energy versus time and mode phase versus time are computed, and it is determined whether the transient waves are forward or backward waves in each case. Finally, the hot-test mode frequencies of the CCTWT section are computed.
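The dot-product step can be illustrated by projecting a time-domain field snapshot onto orthogonal mode shapes to recover per-mode amplitudes; the 1D cosine modes below are an invented stand-in for the SOS mode data, not the actual DOT code:

```python
import math

# Modal projection sketch: take the dot product of a field snapshot with
# each (orthogonal) spatial mode shape to obtain that mode's amplitude.
# Repeating this at each time step gives mode amplitude versus time.

N = 64                              # grid points
modes = [[math.cos(math.pi * m * (i + 0.5) / N) for i in range(N)]
         for m in (1, 2)]           # two standing cosine modes

def mode_amplitude(snapshot, mode):
    """Normalized dot product of a field snapshot with a mode shape."""
    num = sum(s * p for s, p in zip(snapshot, mode))
    return num / sum(p * p for p in mode)

# A synthetic snapshot: mode 1 with amplitude 2.0 plus mode 2 with 0.5.
snapshot = [2.0 * modes[0][i] + 0.5 * modes[1][i] for i in range(N)]
a1 = mode_amplitude(snapshot, modes[0])
a2 = mode_amplitude(snapshot, modes[1])
print(round(a1, 6), round(a2, 6))  # recovers 2.0 and 0.5
```

Because the mode shapes are orthogonal, each dot product isolates one mode's contribution exactly.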
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, D.G.; Watkins, J.C.
This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental database for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the calculations, the computational performance of the code during the simulations was assessed. It was concluded that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvement in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji
A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport code it is used with, and is connected to the code by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code; it can be used with other codes such as PHITS, FLUKA, and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS, and can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
Prediction of sound radiated from different practical jet engine inlets
NASA Technical Reports Server (NTRS)
Zinn, B. T.; Meyer, W. L.
1980-01-01
Existing computer codes for calculating the far-field radiation patterns surrounding various practical jet engine inlet configurations under different excitation conditions were upgraded. The computer codes were refined and expanded so that they are now more computationally efficient by a factor of about three and are capable of producing accurate results up to nondimensional wave numbers of twenty. Computer programs were also developed to help generate accurate geometrical representations of the inlets to be investigated; this data is required as input for the programs that calculate the sound fields. The new geometry-generating program considerably reduces the time required to generate the input data, which was one of the most time-consuming steps in the process. The results of sample runs using the NASA Lewis QCSEE inlet are presented, and comparisons of run times and accuracy are made between the old and upgraded computer codes. The overall accuracy of the computations is determined by comparing the results with simple source solutions.
Error Suppression for Hamiltonian-Based Quantum Computation Using Subsystem Codes
NASA Astrophysics Data System (ADS)
Marvian, Milad; Lidar, Daniel A.
2017-01-01
We present general conditions for quantum error suppression for Hamiltonian-based quantum computation using subsystem codes. This involves encoding the Hamiltonian performing the computation using an error detecting subsystem code and the addition of a penalty term that commutes with the encoded Hamiltonian. The scheme is general and includes the stabilizer formalism of both subspace and subsystem codes as special cases. We derive performance bounds and show that complete error suppression results in the large penalty limit. To illustrate the power of subsystem-based error suppression, we introduce fully two-local constructions for protection against local errors of the swap gate of adiabatic gate teleportation and the Ising chain in a transverse field.
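The construction summarized above admits a compact schematic form (our notation, not drawn from the paper itself): the total Hamiltonian is the encoded computational Hamiltonian plus a penalty term that commutes with it,

```latex
H(t) = H_{\mathrm{enc}}(t) + E_P H_P, \qquad [H_P,\, H_{\mathrm{enc}}(t)] = 0,
```

where $H_P$ assigns an energy penalty $E_P$ to states outside the codespace of the error-detecting subsystem code, so local errors cost energy, and, consistent with the abstract, complete error suppression is recovered in the large-penalty limit $E_P \to \infty$.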
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ), and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to the different VERA codes.
Navier-Stokes and Comprehensive Analysis Performance Predictions of the NREL Phase VI Experiment
NASA Technical Reports Server (NTRS)
Duque, Earl P. N.; Burklund, Michael D.; Johnson, Wayne
2003-01-01
A vortex lattice code, CAMRAD II, and a Reynolds-averaged Navier-Stokes code, OVERFLOW-D2, were used to predict the aerodynamic performance of a two-bladed horizontal-axis wind turbine. All computations were compared with experimental data collected at the NASA Ames Research Center 80- by 120-Foot Wind Tunnel. Computations were performed for both axial and yawed operating conditions. Various stall delay models and dynamic stall models were used by the CAMRAD II code. Comparisons between the experimental data and computed aerodynamic loads show that the OVERFLOW-D2 code can accurately predict the power and spanwise loading of a wind turbine rotor.
Fault-tolerance in Two-dimensional Topological Systems
NASA Astrophysics Data System (ADS)
Anderson, Jonas T.
This thesis is a collection of ideas with the general goal of building, at least in the abstract, a local fault-tolerant quantum computer. The connection between quantum information and topology has proven to be an active area of research in several fields. The introduction of the toric code by Alexei Kitaev demonstrated the usefulness of topology for quantum memory and quantum computation. Many quantum codes used for quantum memory are modeled by spin systems on a lattice, with operators that extract syndrome information placed on vertices or faces of the lattice. It is natural to wonder whether the useful codes in such systems can be classified. This thesis presents work that leverages ideas from topology and graph theory to explore the space of such codes. Homological stabilizer codes are introduced and it is shown that, under a set of reasonable assumptions, any qubit homological stabilizer code is equivalent to either a toric code or a color code. Additionally, the toric code and the color code correspond to distinct classes of graphs. Many systems have been proposed as candidate quantum computers. It is very desirable to design quantum computing architectures with two-dimensional layouts and low complexity in parity-checking circuitry. Kitaev's surface codes provided the first example of codes satisfying this property. They provided a new route to fault tolerance with more modest overheads and thresholds approaching 1%. The recently discovered color codes share many properties with the surface codes, such as the ability to perform syndrome extraction locally in two dimensions. Some families of color codes admit a transversal implementation of the entire Clifford group. This work investigates color codes on the 4.8.8 lattice known as triangular codes. I develop a fault-tolerant error-correction strategy for these codes in which repeated syndrome measurements on this lattice generate a three-dimensional space-time combinatorial structure. 
I then develop an integer program that analyzes this structure and determines the most likely set of errors consistent with the observed syndrome values. I implement this integer program to find the threshold for depolarizing noise on small versions of these triangular codes. Because the threshold for magic-state distillation is likely to be higher than this value and because logical
System, methods and apparatus for program optimization for multi-threaded processor architectures
Bastoul, Cedric; Lethin, Richard A; Leung, Allen K; Meister, Benoit J; Szilagyi, Peter; Vasilache, Nicolas T; Wohlford, David E
2015-01-06
Methods, apparatus and computer software product for source code optimization are provided. In an exemplary embodiment, a first custom computing apparatus is used to optimize the execution of source code on a second computing apparatus. In this embodiment, the first custom computing apparatus contains a memory, a storage medium and at least one processor with at least one multi-stage execution unit. The second computing apparatus contains at least two multi-stage execution units that allow for parallel execution of tasks. The first custom computing apparatus optimizes the code for parallelism, locality of operations and contiguity of memory accesses on the second computing apparatus. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims.
NASA Technical Reports Server (NTRS)
Liu, D. D.; Kao, Y. F.; Fung, K. Y.
1989-01-01
A transonic equivalent strip (TES) method was further developed for unsteady flow computations of arbitrary wing planforms. The TES method consists of two consecutive correction steps applied to a given nonlinear code such as LTRAN2: the chordwise mean flow correction and the spanwise phase correction. The computation procedure requires direct pressure input from other computed or measured data; otherwise, it does not require the airfoil shape or grid generation for given planforms. To validate the computed results, four swept wings of various aspect ratios, including wings with control surfaces, are selected as computational examples. Overall trends in unsteady pressures are established against those obtained with the XTRAN3S code, Isogai's full potential code, and measured data from NLR and RAE. In comparison with these methods, TES achieves considerable savings in computer time with reasonable accuracy, which suggests immediate industrial applications.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
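The patent's formula is a straightforward per-period sum; a one-function sketch (the dollar figures are invented for illustration):

```python
def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
    # Per the abstract: future facility conditions equal the time-period-specific
    # maintenance cost plus the modernization factor plus the backlog factor.
    return maintenance_cost + modernization_factor + backlog_factor

# Hypothetical figures for a single time period.
estimate = future_facility_conditions(120_000.0, 35_000.0, 18_000.0)
```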
Development Of A Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Yoon, Seokkwan; Kwak, Dochan
1993-01-01
Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.
A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
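The bookkeeping such coverage metrics require is easiest to see with a toy condition-outcome tracker (a simplification we wrote for illustration; it uses instrumentation rather than the paper's non-intrusive preprocessing, and real MC/DC additionally requires showing that each condition independently affects the decision):

```python
class CoverageProbe:
    """Record which boolean outcomes each labeled condition takes at runtime."""
    def __init__(self):
        self.seen = {}

    def cond(self, label, value):
        # Record the outcome, then pass the value through unchanged.
        self.seen.setdefault(label, set()).add(bool(value))
        return value

    def uncovered(self):
        # Conditions that have not yet evaluated to both True and False.
        return sorted(l for l, s in self.seen.items() if len(s) < 2)

probe = CoverageProbe()

def decision(a, b):
    return probe.cond("a>0", a > 0) and probe.cond("b>0", b > 0)

decision(1, 2)    # a>0 is True, b>0 is True
decision(-1, 2)   # a>0 is False; b>0 is short-circuited, so never seen False
```

After these two calls, `b>0` remains uncovered: short-circuit evaluation is exactly the kind of subtlety that makes condition-level coverage more expensive to track than simple statement coverage.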
"SMART": A Compact and Handy FORTRAN Code for the Physics of Stellar Atmospheres
NASA Astrophysics Data System (ADS)
Sapar, A.; Poolamäe, R.
2003-01-01
A new computer code SMART (Spectra from Model Atmospheres by Radiative Transfer) for computing stellar spectra forming in plane-parallel atmospheres has been compiled by us and A. Aret. To guarantee wide compatibility of the code with the shell environment, we chose FORTRAN-77 as the programming language and tried to confine ourselves to the common part of its numerous versions in both WINDOWS and LINUX. SMART can be used for studies of several processes in stellar atmospheres. The current version of the programme is undergoing rapid changes due to our goal to elaborate a simple, handy and compact code. Instead of linearisation (a mathematical method of recurrent approximations), we propose to use physical evolutionary changes, in other words the relaxation of quantum state populations from LTE to NLTE, studied using a small number of NLTE states. This computational scheme is essentially simpler and more compact than linearisation. The relaxation scheme enables using, instead of the Λ-iteration procedure, a physically changing emissivity (or source function) which incorporates the changing Menzel coefficients for NLTE quantum state populations. However, light scattering on free electrons is, in terms of Feynman graphs, a real second-order quantum process and cannot be reduced to consecutive processes of absorption and emission as in the case of radiative transfer in spectral lines. With duly chosen input parameters, the code SMART enables computing the radiative acceleration of the matter of the stellar atmosphere in turbulence clumps. This also makes it possible to connect the model atmosphere in more detail with the problem of stellar wind triggering. Another problem incorporated into the computer code SMART is the diffusion of chemical elements and their isotopes in the atmospheres of chemically peculiar (CP) stars due to the usual radiative acceleration and the essential additional acceleration generated by the light-induced drift. 
As a special case, using duly chosen pixels on the stellar disk, the spectrum of a rotating star can be computed. No instrumental broadening has been incorporated in the code of SMART. To facilitate the study of stellar spectra, a GUI (Graphical User Interface) with selection of labels by ions has been compiled to study the spectral lines of different elements and ions in the computed emergent flux. An amazing feature of SMART is that its code is very short: it occupies only 4 two-sided two-column A4 sheets in landscape format. In addition, if well commented, it is quite easily readable and understandable. We have used the tactic of writing the comments on the right-side margin (columns starting from 73). Such a short code has been composed by widely using unified input physics (for example, the ionisation cross-sections for bound-free transitions and the electron and ion collision rates). A current restriction on the application area of the present version of SMART is that molecules are so far ignored; thus, it can be used only for warm and hot stellar atmospheres. In the computer code we have tried to avoid bulky, often over-optimised methods, primarily meant to spare computation time. For instance, we compute the continuous absorption coefficient at every wavelength. Nevertheless, within an hour on the personal computer at our disposal (AMD Athlon XP 1700+, 512 MB DDRAM), a stellar spectrum with spectral resolution λ/dλ = 100,000 for the spectral interval 700 -- 30,000 Å is computed. The model input data and the line data used by us are both computed and compiled by R. Kurucz. In order to follow the presence and representability of quantum states and to enumerate them for NLTE studies, a C++ code transforming the needed data to LaTeX has been compiled. Thus we have composed a quantum state list for all neutrals and ions in the Kurucz file 'gfhyperall.dat'. 
The list enables composing the concept of super-states, including partly correlating super-states, more adequately. We are grateful to R. Kurucz for making his computer codes ATLAS and SYNTHE, used by us as a starting point in composing the new computer code, available on CD-ROM and via the Internet. We are also grateful to the Estonian Science Foundation for grant ESF-4701.
Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold
1997-01-01
The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.
Transferring ecosystem simulation codes to supercomputers
NASA Technical Reports Server (NTRS)
Skiles, J. W.; Schulbach, C. H.
1995-01-01
Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectorizing and 'in-line' capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.
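The reported gains compound multiplicatively: a further 30 percent on top of the 30-fold port speedup gives roughly a 39-fold overall speedup relative to the VAX, a back-of-the-envelope calculation worth making explicit:

```python
def combined_speedup(base_speedup, extra_gain):
    # A fractional improvement applied to an already-sped-up code multiplies:
    # 30x from porting, then +30% from vectorizing/in-lining -> 30 * 1.3 = 39x.
    return base_speedup * (1.0 + extra_gain)

overall = combined_speedup(30.0, 0.30)
```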
Duct flow nonuniformities for Space Shuttle Main Engine (SSME)
NASA Technical Reports Server (NTRS)
1987-01-01
A three-duct Space Shuttle Main Engine (SSME) Hot Gas Manifold geometry code was developed. The methodology of the program is described, recommendations on its implementation are made, and an input guide, an input deck listing, and a source code listing are provided. The code listing includes extensive comments to assist the user in following its development and logic. A working source deck will be provided. A thorough analysis was made of the proper boundary conditions and chemistry kinetics necessary for an accurate computational analysis of the flow environment in the SSME fuel-side preburner chamber during the initial startup transient. Pertinent results were presented to facilitate incorporation of these findings into an appropriate CFD code. The computation must be a turbulent computation, since turbulent mixing in the flow field will have a profound effect on the chemistry. Because of the additional equations demanded by the chemistry model, it is recommended that, for expediency, a simple algebraic mixing-length model be adopted. Performing this computation for all or selected time intervals of the startup will require an abundance of computer CPU time regardless of the specific CFD code selected.
War of Ontology Worlds: Mathematics, Computer Code, or Esperanto?
Rzhetsky, Andrey; Evans, James A.
2011-01-01
The use of structured knowledge representations—ontologies and terminologies—has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies. PMID:21980276
Verifying a computational method for predicting extreme ground motion
Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.
2011-01-01
In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.
An evaluation of four single element airfoil analytic methods
NASA Technical Reports Server (NTRS)
Freuler, R. J.; Gregorek, G. M.
1979-01-01
A comparison of four computer codes for the analysis of two-dimensional single element airfoil sections is presented for three classes of section geometries. Two of the computer codes utilize vortex singularities methods to obtain the potential flow solution. The other two codes solve the full inviscid potential flow equation using finite differencing techniques, allowing results to be obtained for transonic flow about an airfoil including weak shocks. Each program incorporates boundary layer routines for computing the boundary layer displacement thickness and boundary layer effects on aerodynamic coefficients. Computational results are given for a symmetrical section represented by an NACA 0012 profile, a conventional section illustrated by an NACA 65A413 profile, and a supercritical type section for general aviation applications typified by a NASA LS(1)-0413 section. The four codes are compared and contrasted in the areas of method of approach, range of applicability, agreement among each other and with experiment, individual advantages and disadvantages, computer run times and memory requirements, and operational idiosyncrasies.
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
40 CFR 1048.110 - How must my engines diagnose malfunctions?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., the MIL may stay off during later engine operation. (d) Store trouble codes in computer memory. Record and store in computer memory any diagnostic trouble codes showing a malfunction that should illuminate...
Recent applications of the transonic wing analysis computer code, TWING
NASA Technical Reports Server (NTRS)
Subramanian, N. R.; Holst, T. L.; Thomas, S. D.
1982-01-01
An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of the code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.
Response surface method in geotechnical/structural analysis, phase 1
NASA Astrophysics Data System (ADS)
Wong, F. S.
1981-02-01
In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments on model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
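The approach can be sketched in a few lines: sample the expensive code at a handful of points, fit a low-order polynomial surface by least squares, then evaluate the cheap surface in place of the code. Our one-parameter quadratic example is purely illustrative (the study used five random soil parameters, and the stand-in "expensive" function below is invented):

```python
def fit_quadratic(xs, ys):
    # Least-squares fit of y ~ a + b*x + c*x^2 via the 3x3 normal equations
    # A^T A coeffs = A^T y for the design matrix with columns [1, x, x^2].
    def s(p):
        return sum(x ** p for x in xs)
    A = [[s(0), s(1), s(2)],
         [s(1), s(2), s(3)],
         [s(2), s(3), s(4)]]
    b = [sum(y * x ** p for x, y in zip(xs, ys)) for p in range(3)]
    # Gaussian elimination with partial pivoting on the small system.
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back-substitution
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, 3))) / A[i][i]
    return coeffs  # [a, b, c]

def expensive_code(x):
    # Stand-in for a long-running geotechnical simulation.
    return 2.0 + 0.5 * x - 0.1 * x * x

xs = [0.0, 1.0, 2.0, 3.0]               # four code calculations, as in the study
ys = [expensive_code(x) for x in xs]
surface = fit_quadratic(xs, ys)          # cheap replacement for expensive_code
```

Once fitted, the surface can be evaluated millions of times in a Monte Carlo or reliability analysis at negligible cost, which is the source of the efficiency the abstract reports.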
User's Manual for FEMOM3DS. Version 1.0
NASA Technical Reports Server (NTRS)
Reddy, C.J.; Deshpande, M. D.
1997-01-01
FEMOM3DS is a computer code written in FORTRAN 77 to compute the electromagnetic (EM) scattering characteristics of a three-dimensional object with complex materials using a combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code uses tetrahedral elements with vector edge basis functions for the FEM in the volume of the cavity, and triangular elements with basis functions similar to those described for the MoM at the outer boundary. By virtue of the FEM, this code can handle arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.
Performance measures for transform data coding.
NASA Technical Reports Server (NTRS)
Pearl, J.; Andrews, H. C.; Pratt, W. K.
1972-01-01
This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
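The energy-compaction idea behind transform coding is easy to demonstrate with one level of the Haar transform, one of the bases the paper evaluates (a toy illustration of the principle, not the paper's basis-restricted model):

```python
def haar_step(v):
    # One level of the orthonormal Haar transform: scaled pairwise
    # averages (low-pass) followed by scaled pairwise differences (high-pass).
    s = 2 ** 0.5
    avg = [(v[i] + v[i + 1]) / s for i in range(0, len(v), 2)]
    dif = [(v[i] - v[i + 1]) / s for i in range(0, len(v), 2)]
    return avg + dif

signal = [4.0, 4.1, 4.2, 4.1, 8.0, 8.2, 8.1, 8.0]  # slowly varying samples
coeffs = haar_step(signal)

# The orthonormal transform preserves total energy, but concentrates it in the
# average coefficients; discarding the small difference coefficients therefore
# yields low distortion at a reduced rate, which is the basis of transform coding.
energy_in = sum(x * x for x in signal)
energy_out = sum(c * c for c in coeffs)
```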
ERIC Educational Resources Information Center
Holbrook, M. Cay; MacCuspie, P. Ann
2010-01-01
Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…
ERIC Educational Resources Information Center
Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela
2015-01-01
Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
NASA Technical Reports Server (NTRS)
Rathjen, K. A.
1977-01-01
A digital computer code, CAVE (Conduction Analysis Via Eigenvalues), which finds application in the analysis of two-dimensional transient heating of hypersonic vehicles, is described. CAVE is written in FORTRAN IV and is operational on both IBM 360-67 and CDC 6600 computers. The method of solution is a hybrid analytical-numerical technique that is inherently stable, permitting large time steps even with the best conductors and the finest mesh sizes. The aerodynamic heating boundary conditions are calculated by the code from an input flight trajectory, or can optionally be calculated external to the code and entered as input data. The code computes the network conduction and convection links, as well as capacitance values, given basic geometrical and mesh sizes, for four configurations (leading edges, cooled panels, X-24C structure, and slabs). Input and output formats are presented and explained. Sample problems are included. A brief summary of the hybrid analytical-numerical technique, which utilizes eigenvalues (thermal frequencies) and eigenvectors (thermal mode vectors), is given, along with the aerodynamic heating equations incorporated in the code and flow charts.
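The eigenvalue technique that gives CAVE its name can be sketched for a linear conduction network dT/dt = A·T: diagonalizing the conduction matrix yields thermal frequencies (eigenvalues) and thermal mode vectors (eigenvectors), and the resulting modal solution is exact in time, which is why very large steps remain stable. This is a minimal illustration under those assumptions, not the CAVE implementation:

```python
import numpy as np

def advance_exact(A, T0, dt):
    """Advance dT/dt = A @ T exactly over a step dt using the
    eigendecomposition of the conduction matrix A."""
    lam, V = np.linalg.eig(A)           # eigenvalues = thermal frequencies
    c = np.linalg.solve(V, T0)          # expand initial state in thermal modes
    return (V @ (np.exp(lam * dt) * c)).real

# Two-node conduction network: heat flows between the nodes until they
# equalize at the mean temperature, regardless of step size.
A = np.array([[-1.0,  1.0],
              [ 1.0, -1.0]])
T0 = np.array([100.0, 0.0])
T = advance_exact(A, T0, dt=10.0)       # one very large, still-stable step
```

With an explicit finite-difference scheme, a step this large would be wildly unstable; the modal solution simply damps each thermal mode by its exact factor.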
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem because of the extensive validation and investment they represent. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code; however, this could require extensive investment, such as rewriting in modern languages and adopting new data constructs, and would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.
Method for rapid high-frequency seismogram calculation
NASA Astrophysics Data System (ADS)
Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo
2009-02-01
We present a method for rapid, high-frequency seismogram calculation that uses an algorithm to automatically generate an exhaustive set of seismic phases with appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account existing constraints on ray paths as well as physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete-wavenumber method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude over the whole seismogram is negligible. Moreover, the computation time for the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3- to 4-fold less than that needed by the AXITRA code (up to a frequency of 25 Hz).
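The time-frequency misfit idea can be illustrated with a toy comparison: take a continuous wavelet transform of each signal and measure a normalised envelope difference over the time-frequency plane. This is a sketch of the general concept only, not the specific criterion or codes used in the study:

```python
import numpy as np

def morlet_cwt(sig, dt, freqs, w0=6.0):
    """Continuous wavelet transform with a Morlet wavelet, evaluated
    per centre frequency via the frequency domain."""
    n = len(sig)
    f = np.fft.fftfreq(n, dt)
    S = np.fft.fft(sig)
    out = np.empty((len(freqs), n), complex)
    for i, fc in enumerate(freqs):
        s = w0 / (2 * np.pi * fc)                           # scale for fc
        psi = np.exp(-0.5 * (s * 2 * np.pi * f - w0) ** 2)  # Morlet spectrum
        out[i] = np.fft.ifft(S * psi)
    return out

def tf_misfit(sig, ref, dt, freqs):
    """Global time-frequency envelope misfit, normalised by the reference."""
    W1, W2 = morlet_cwt(sig, dt, freqs), morlet_cwt(ref, dt, freqs)
    return np.linalg.norm(np.abs(W1) - np.abs(W2)) / np.linalg.norm(np.abs(W2))

dt = 0.01
t = np.arange(0.0, 4.0, dt)
ref = np.sin(2 * np.pi * 5 * t) * np.exp(-t)     # reference seismogram
close = 1.05 * ref                               # 5% amplitude error
far = np.sin(2 * np.pi * 12 * t) * np.exp(-t)    # wrong frequency content
freqs = np.linspace(1.0, 20.0, 20)
```

A small amplitude error yields a small misfit, while a signal with the wrong frequency content scores far worse, which is what makes such criteria useful for judging whether dropped phases matter.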
Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor
NASA Technical Reports Server (NTRS)
Ajmani, Kumud; Liu, Nan-Suey
2005-01-01
The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, through comparisons with experimental data and with solutions from the FPVortex code. The code was then used to study fuel-air mixing in the combustor of a candidate rocket-based combined-cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.
A supersonic three-dimensional code for flow over blunt bodies: Program documentation and test cases
NASA Technical Reports Server (NTRS)
Chaussee, D. S.; Mcmillan, O. J.
1980-01-01
The use of a computer code for the calculation of steady, supersonic, three-dimensional, inviscid flow over blunt bodies is illustrated. Input and output are given and explained for two cases: a pointed cone of 20 deg half-angle at 15 deg angle of attack in a free stream with M∞ = 7, and a cone-ogive-cylinder at 10 deg angle of attack with M∞ = 2.86. A source listing of the computer code is provided.
PLASIM: A computer code for simulating charge exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Deininger, W. D.; Winder, D. R.; Kaufman, H. R.
1982-01-01
The propagation of the charge exchange plasma for an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, is described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
Influence of Roads on the Surrounding Natural Environment - Vegetation, Soil, and Ground Water
1980-02-01
[Record body is OCR residue of roadside vegetation survey tables from Swedish study sites (e.g., Gundlatorp; Karlshamn - Ronneby), listing species such as Poa annua and Poa nemoralis with counts of species per square and in total.]
Development of a Decision Taxonomy for the Marine Command and Control Environment. Volume I
1979-06-01
Effects of the Reciprocity of Self-Disclosure upon Attributions of Attraction
1982-01-01
[Air Force Institute of Technology report AD-A126 925, P. D. Fisher, Wright-Patterson AFB, OH; the remainder of the record is illegible scan residue.]
India’s Nuclear Energy Program and U.S. Policies Today. Revision
1980-02-01
they were telling the truth. This might appear to be troubling on some of the familiar standards of Western logic--that the CIRUS should have been… [remainder of the record is illegible scan residue, including a fragment of W. H. Auden's "September 1, 1939"]
The Programs of Japan in the Philippines
1944-07-29
regarding the appointment and territorial jurisdiction of notaries public. The measure provides that judges of courts of the first instance and the… [remainder largely illegible; a readable fragment reports that postal savings of Filipinos increased from 10 pesos to just under 20 pesos (11/5/43 Tok. Jap.)]
Extension, validation and application of the NASCAP code
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.
1979-01-01
Numerous extensions were made to the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was to the SCATHA satellite, for which shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.
NASA Technical Reports Server (NTRS)
Weilmuenster, K. J.; Hamilton, H. H., II
1983-01-01
A computer code, HALIS, designed to compute the three-dimensional flow about shuttle-like configurations at angles of attack greater than 25 deg, is described. Results from HALIS are compared where possible with an existing flow-field code; such comparisons show excellent agreement. HALIS results are also compared with experimental pressure distributions on shuttle models over a wide range of angles of attack, again with excellent agreement. It is demonstrated that the HALIS code can incorporate equilibrium air chemistry in flow-field computations.
Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.
Henry, R; Tiselj, I; Snoj, L
2015-03-01
A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP model. The same modelling assumptions were used in order to check the differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed through analyses of normalized reaction rates and computations of kinetic parameters for various core configurations.
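The keff differences above are quoted in pcm (per cent mille, 1 pcm = 1e-5). One common convention, assumed here, expresses the gap between two keff estimates as a reactivity difference with ρ = (k - 1)/k:

```python
def delta_rho_pcm(k1, k2):
    """Reactivity difference between two keff estimates, in pcm
    (1 pcm = 1e-5), using the convention rho = (k - 1) / k."""
    rho1 = (k1 - 1.0) / k1
    rho2 = (k2 - 1.0) / k2
    return (rho1 - rho2) * 1e5

# A 100 pcm spread around keff ~ 1, the scale reported for TRIPOLI vs. MCNP:
d = delta_rho_pcm(1.00100, 1.00000)
```

Near criticality (k ≈ 1) the reactivity difference in pcm is nearly identical to the raw keff difference times 1e5, which is why the two usages are often interchanged.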
A comparison of two central difference schemes for solving the Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Maksymiuk, C. M.; Swanson, R. C.; Pulliam, T. H.
1990-01-01
Five viscous transonic airfoil cases were computed by two significantly different computational fluid dynamics codes: an explicit finite-volume algorithm with multigrid, and an implicit finite-difference approximate-factorization method with eigenvector diagonalization. Both methods are described in detail, and their performance on the test cases is compared. The codes used the same grids, turbulence model, and computer to provide the truest test of the algorithms. The two approaches produce very similar results, which, for attached flows, also agree well with experimental results; however, the explicit code is considerably faster.
Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D; Volz, Kerstin
2017-06-01
We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen-lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed- and shared-memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen-lattice approximation, as well as fine-grained sweeps of parameter space.
Evaluation of the efficiency and fault density of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1993-01-01
Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. Software development requires the generation of a considerable amount of code; the engineers who write it make mistakes, and producing a large body of highly reliable code takes considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces; these tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while others also allow checking of individual modules and combined sets of modules. Given NASA's reliability requirements, in-house verification of such generated code is needed. Furthermore, automatically generated code is reputed to execute as efficiently as the best manually generated code; this claim, too, warrants in-house verification.
NASA Technical Reports Server (NTRS)
White, P. R.; Little, R. R.
1985-01-01
A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.
1984-11-01
TERRA is a computer code which calculates concentrations of radionuclides and ingrowing daughters in surface and root-zone soil, produce and feed, beef, and milk from a given deposition rate at any location in the conterminous United States. The code is fully integrated with seven other computer codes which together comprise the Computerized Radiological Risk Investigation System, CRRIS. Output from either the long-range (>100 km) atmospheric dispersion code RETADD-II or the short-range (<80 km) atmospheric dispersion code ANEMOS, in the form of radionuclide air concentrations and ground deposition rates by downwind location, serves as input to TERRA. User-defined deposition rates and air concentrations may also be provided as input to TERRA through use of the PRIMUS computer code. The environmental concentrations of radionuclides predicted by TERRA serve as input to the ANDROS computer code, which calculates population and individual intakes, exposures, doses, and risks. TERRA incorporates models to calculate uptake from soil and atmospheric deposition on four groups of produce for human consumption and four groups of livestock feeds. During the environmental transport simulation, intermediate calculations of the interception fraction for leafy vegetables, produce directly exposed to atmospherically depositing material, pasture, hay, and silage are made based on location-specific estimates of standing crop biomass. Pasture productivity is estimated by a model which considers the number and types of cattle and sheep, pasture area, and annual production of other forages (hay and silage) at a given location. Calculations are made of the fraction of grain imported from outside the assessment area. TERRA output includes the above calculations and estimated radionuclide concentrations in plant produce, milk, and a beef composite by location.
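As a rough illustration of the interception-fraction step, a Chamberlain-type exponential dependence on standing biomass is a commonly used form in terrestrial transport models. Both the functional form and the constant μ below are assumptions for illustration, not necessarily TERRA's actual model or parameter values:

```python
import math

def interception_fraction(biomass_kg_m2, mu=2.8):
    """Chamberlain-type interception fraction f = 1 - exp(-mu * B),
    where B is standing crop biomass (kg dry weight per m^2) and mu is
    an assumed foliar interception constant (m^2/kg)."""
    return 1.0 - math.exp(-mu * biomass_kg_m2)

def foliar_concentration(dep_rate_bq_m2_d, biomass_kg_m2, days):
    """Radionuclide concentration in vegetation (Bq/kg dry weight) from a
    constant deposition rate, ignoring weathering and radioactive decay."""
    f = interception_fraction(biomass_kg_m2)
    return dep_rate_bq_m2_d * f * days / biomass_kg_m2

# Toy numbers: 10 Bq/m^2/day onto 0.5 kg/m^2 of pasture for 30 days.
c = foliar_concentration(dep_rate_bq_m2_d=10.0, biomass_kg_m2=0.5, days=30.0)
```

The location-specific biomass estimates the abstract mentions matter because the interception fraction saturates with biomass while the per-kilogram concentration is diluted by it.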
A survey to identify the clinical coding and classification systems currently in use across Europe.
de Lusignan, S; Minmagh, C; Kennedy, J; Zeimet, M; Bommezijn, H; Bryant, J
2001-01-01
This is a survey to identify the clinical coding systems currently in use across the European Union and the states seeking membership of it. We sought to identify which systems are currently used and to what extent they are subject to local adaptation. Clinical coding should facilitate identifying key medical events in a computerised medical record and aggregating information across groups of records. The emerging new driver is its role as the enabler of the life-long computerised medical record. A prerequisite for this level of functionality is the transfer of information between different computer systems, which can be facilitated either by working on the interoperability problems between disparate systems or by harmonising the underlying data. This paper examines the extent to which the latter has occurred across Europe. Methods: literature and Internet searches, and requests for information via electronic mail to pan-European mailing lists of health informatics professionals. Coding systems are now a de facto part of health information systems across Europe, and relatively few coding systems are in use: ICD-9 and ICD-10, ICPC, and Read were the most established. However, the local adaptation of these classification systems, whether by country or by computer software manufacturer, significantly reduces the ability for the meaning coded within patients' computer records to be transferred easily from one medical record system to another. There is no longer any debate as to whether a coding or classification system should be used. Convergence of the different classification systems should be encouraged, and countries and computer manufacturers within the EU should be encouraged to stop making local modifications to coding and classification systems, as this practice risks significantly slowing progress towards easy transfer of records between computer systems.
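The transfer problem the survey describes amounts to cross-mapping codes between classification systems (and between local adaptations of the same system), with information lost wherever no mapping exists. A minimal sketch: the single ICD-9-CM to ICD-10 pair below is a widely published equivalence used purely as an example, and real cross-maps are many-to-many rather than a simple dictionary:

```python
# Illustrative one-way cross-map between ICD-9-CM and ICD-10 codes.
# Real mappings (e.g. the published General Equivalence Mappings) are
# many-to-many; this single-entry table is a toy example.
ICD9_TO_ICD10 = {
    "250.00": "E11.9",   # type 2 diabetes mellitus without complications
}

def translate(code, table=ICD9_TO_ICD10):
    """Return the target-system code, or None when no mapping exists --
    the gap that local adaptations create between otherwise-shared systems."""
    return table.get(code)
```

A locally adapted code absent from the shared table translates to None, i.e. its meaning cannot follow the record to another system, which is the practice the authors argue should stop.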
PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)
NASA Astrophysics Data System (ADS)
Vincenti, Henri
2016-03-01
The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that were previously out of reach of current petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption, and will imply hardware developments that directly impact the way PIC codes are implemented. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce the energy cost of data movement by using more and more cores on each compute node ('fat nodes'), with a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that can process multiple data with one arithmetic operation per clock cycle; SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process multiple instructions on multiple data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and good vectorization (for multicore/manycore CPUs) to take full advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high-performance skeleton PIC code PICSAR to achieve good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba Python compiler.
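The memory-locality and vectorization concerns above can be illustrated with a structure-of-arrays particle layout, in which each coordinate is contiguous in memory so that a particle push maps naturally onto SIMD units. This is a generic sketch of the idea, not PICSAR's actual implementation:

```python
import numpy as np

# Structure-of-arrays particle storage: each coordinate occupies its own
# contiguous array, so whole-array arithmetic below vectorizes well,
# unlike an array-of-structs layout that interleaves x, y, z per particle.
n = 100_000
rng = np.random.default_rng(0)
x, y, z = (rng.random(n) for _ in range(3))
vx, vy, vz = (np.zeros(n) for _ in range(3))

def push(pos, vel, E, q_over_m, dt):
    """Vectorized particle push for one coordinate: one arithmetic
    operation sweeps the whole contiguous array (E assumed uniform)."""
    vel += q_over_m * E * dt    # accelerate
    pos += vel * dt             # advance positions
    return pos, vel

x, vx = push(x, vx, E=1.0, q_over_m=1.0, dt=1e-3)
```

The same layout choice carries over to compiled PIC kernels, where contiguous per-component arrays let the compiler emit SIMD loads and stores instead of gathers.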
Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers
NASA Technical Reports Server (NTRS)
Skiles, J. W.; Schulbach, C. H.
1994-01-01
Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on mainframe computers, then minicomputers, and more recently, on microcomputers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer, since there are significant differences between the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To match the application to the architecture appropriately (necessary to achieve reasonable performance), the parallelism of the original application, if it exists, must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in FORTRAN) to a Cray Y-MP. We describe the Cray's shared-memory vector architecture and our rationale for selecting the Cray, then describe porting the model, executing and verifying a baseline version, and the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to place subroutines and functions in-line in the code. Even with these modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
User's manual: Subsonic/supersonic advanced panel pilot code
NASA Technical Reports Server (NTRS)
Moran, J.; Tinoco, E. N.; Johnson, F. T.
1978-01-01
Sufficient instructions for running the subsonic/supersonic advanced panel pilot code are provided. This software was developed as a vehicle for numerical experimentation and should not be construed as a finished production program. The pilot code is based on a higher-order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given, as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses an overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.
Experimental aerothermodynamic research of hypersonic aircraft
NASA Technical Reports Server (NTRS)
Cleary, Joseph W.
1987-01-01
The 2-D and 3-D advanced computer codes being developed for use in the design of hypersonic aircraft such as the National Aero-Space Plane require comparison of computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. The objective, therefore, is to provide the hypersonic experimental database required for validation of advanced computational fluid dynamics (CFD) codes and for developing a more thorough understanding of the flow physics these codes must capture. This is being done through a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel over a broad range of test conditions, to obtain pertinent surface and flow-field data. Results from the flow-visualization portion of the investigation are presented.
Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis
NASA Technical Reports Server (NTRS)
Ingraham, Daniel; Hixon, Ray
2015-01-01
The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
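The Method of Manufactured Solutions mentioned above can be illustrated on a simple operator: choose an exact solution, derive the corresponding exact result analytically, and confirm that the discrete error shrinks at the scheme's formal order as the grid is refined. A minimal sketch of MMS-style order verification (not the EVA procedure or the CAA code under test):

```python
import numpy as np

def observed_order(ns=(50, 100)):
    """MMS-style order check: apply the second-order central difference
    for u''(x) to the manufactured solution u = sin(x), whose exact
    second derivative is -sin(x), and estimate the convergence order
    from the error ratio on two grids."""
    errs, hs = [], []
    for n in ns:
        x = np.linspace(0.0, np.pi, n)
        h = x[1] - x[0]
        u = np.sin(x)
        d2u = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2    # discrete operator
        errs.append(np.max(np.abs(d2u + np.sin(x[1:-1]))))  # vs. exact -sin(x)
        hs.append(h)
    return np.log(errs[0] / errs[1]) / np.log(hs[0] / hs[1])

p = observed_order()   # should be close to 2 for a second-order scheme
```

Matching the observed order to the formal order is the core evidence in both MMS and EVA verification; a bug typically shows up as a degraded order rather than just a larger error.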
Additional extensions to the NASCAP computer code, volume 3
NASA Technical Reports Server (NTRS)
Mandell, M. J.; Cooke, D. L.
1981-01-01
The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.
Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers
NASA Astrophysics Data System (ADS)
Ogino, T.
High Performance Fortran (HPF) is one of the modern, common techniques for achieving high-performance parallel computation. We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code proved almost comparable to that of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, an efficiency of 76.5% of the VPP5000/56's catalog peak in vector and parallel computation. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
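As a quick sanity check on the quoted figures, an efficiency is the ratio of achieved to peak performance, so 400 Gflops at 76.5% implies an aggregate catalog peak near 523 Gflops for the 56-PE machine:

```python
def implied_peak_gflops(achieved_gflops, efficiency):
    """Aggregate theoretical peak implied by an achieved rate and an
    efficiency figure, using efficiency = achieved / peak."""
    return achieved_gflops / efficiency

# 400 Gflops at 76.5% efficiency:
peak = implied_peak_gflops(400.0, 0.765)
```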
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... analysis and design, and computer software design and coding. Given the fact that over $500 million were... acoustic algorithms, computer architecture, and source code that dated to the 1970s. Since that time... 2012. Version 3.0 is an entirely new, state-of-the-art computer program used for predicting noise...
ERIC Educational Resources Information Center
Good, Jonathon; Keenan, Sarah; Mishra, Punya
2016-01-01
The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…
NASA Technical Reports Server (NTRS)
Athavale, Mahesh; Przekwas, Andrzej
2004-01-01
The objectives of the program were to develop computational fluid dynamics (CFD) codes and simpler industrial codes for analyzing and designing advanced seals for air-breathing and space propulsion engines. The CFD code SCISEAL is capable of producing full three-dimensional flow-field information for a variety of cylindrical configurations. An implicit multidomain capability allows complex flow domains to be divided for optimum use of computational cells. SCISEAL also has the unique capability to produce cross-coupled stiffness and damping coefficients for rotordynamic computations. The industrial codes consist of a series of separate stand-alone modules designed for expeditious parametric analyses and optimization of a wide variety of cylindrical and face seals. Coupled through a Knowledge-Based System (KBS) that provides a user-friendly Graphical User Interface (GUI), the industrial codes are PC-based, running under the OS/2 operating system. These codes were designed to treat film seals, in which a clearance exists between the rotating and stationary components. Leakage is inhibited by surface roughness, small but stiff clearance films, and viscous pumping devices. The codes have proven to be a valuable resource for seal development for future air-breathing and space propulsion engines.
Establishment of a Beta Test Center for the NPARC Code at Central State University
NASA Technical Reports Server (NTRS)
Okhio, Cyril B.
1996-01-01
Central State University has received a supplementary award to purchase computer workstations for the NPARC (National Project for Applications-oriented Research in CFD) computational fluid dynamics code Beta Test Center. The computational code has also been acquired for installation on the workstations. The acquisition of this code is an initial step for CSU in joining an alliance composed of NASA, AEDC, the aerospace industry, and academia. A postdoctoral research fellow from a neighboring university will assist the PI in preparing a template for tutorial documents for the Beta Test Center. The major objective of the alliance is to establish a national applications-oriented CFD capability centered on the NPARC code. By joining the alliance, the Beta Test Center at CSU will allow the PI, as well as undergraduate and postgraduate students, to test the capability of the NPARC code in predicting the physics of aerodynamic/geometric configurations of interest to the alliance. Currently, CSU is developing a once-a-year, hands-on conference/workshop based upon the experience acquired from running other codes similar to the NPARC code in the first year of this grant.
Microgravity computing codes. User's guide
NASA Astrophysics Data System (ADS)
1982-01-01
Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes compatible with the floppy disk drives of the Apple II. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any standard Apple II 48K system with a single floppy disk drive. The programs are protected against wrong commands given by the operator. The programs are described step by step, in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.
Computer access security code system
NASA Technical Reports Server (NTRS)
Collins, Earl R., Jr. (Inventor)
1990-01-01
A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
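The rectangle-completion handshake described in this abstract can be sketched in a few lines. The sketch below is a hypothetical illustration, not the patented implementation; the matrix size, character set, and all function names are assumptions introduced here.

```python
import random

def make_matrix(rows, cols, seed=0):
    # Random arrangement of distinct characters, one per cell
    # (rows * cols must stay <= 26 for this A..Z alphabet).
    rng = random.Random(seed)
    chars = [chr(ord('A') + i) for i in range(rows * cols)]
    rng.shuffle(chars)
    return [chars[r * cols:(r + 1) * cols] for r in range(rows)]

def challenge(matrix, seed=1):
    # Pick two cells in different rows AND different columns.
    rng = random.Random(seed)
    r1, r2 = rng.sample(range(len(matrix)), 2)
    c1, c2 = rng.sample(range(len(matrix[0])), 2)
    return matrix[r1][c1], matrix[r2][c2]

def expected_response(matrix, chal):
    # The valid reply names the two opposite corners of the rectangle
    # whose first two corners the computer transmitted.
    pos = {matrix[r][c]: (r, c)
           for r in range(len(matrix)) for c in range(len(matrix[0]))}
    (r1, c1), (r2, c2) = pos[chal[0]], pos[chal[1]]
    return matrix[r1][c2], matrix[r2][c1]
```

Because each character subset is discarded after one use, an eavesdropper who records a full exchange learns nothing reusable, which is the property the abstract emphasizes.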
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
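The first technique named above, replacing a linear search with a binary version, is easy to illustrate. The sketch below is illustrative rather than actual ITS source; it assumes a sorted table of boundaries, as in a cross-section or geometry-zone lookup.

```python
import bisect

def linear_lookup(boundaries, x):
    # Original-style O(n) scan: index of the first boundary exceeding x.
    for i, b in enumerate(boundaries):
        if x < b:
            return i
    return len(boundaries)

def binary_lookup(boundaries, x):
    # Accelerated O(log n) version: same result via binary search.
    return bisect.bisect_right(boundaries, x)
```

For a table consulted millions of times per history, the change from O(n) to O(log n) per lookup is exactly the kind of hot-subroutine saving the profiling described above would surface.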
Quantum computing with Majorana fermion codes
NASA Astrophysics Data System (ADS)
Litinski, Daniel; von Oppen, Felix
2018-05-01
We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.
Computational Nuclear Physics and Post Hartree-Fock Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietz, Justin; Sam, Novario; Hjorth-Jensen, M.
We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory, and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10, and 11 and serve as input for the correlation functions employed in the Monte Carlo calculations of chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10, and the Green's function approach of chapter 11. We provide extensive code examples and benchmark calculations, thereby allowing the reader to start writing his or her own codes. We start with an object-oriented serial code and end with discussions of strategies for porting the code to present and planned high-performance computing facilities.
Computer codes for thermal analysis of a solid rocket motor nozzle
NASA Technical Reports Server (NTRS)
Chauhan, Rajinder Singh
1988-01-01
A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform a one-dimensional gas expansion to determine the state of the gas at each location in a nozzle. The ACE outputs can be used as input to a computer program called the Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can in turn be used as input to the Aerotherm Charring Material Thermal Response and Ablation Program (CMA), which calculates the ablation or decomposition response of the nozzle material. A code called the Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for thermal analysis of solid rocket motor nozzles once the program is duly verified. Part of the verification work on FANTASTIC was done using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved using ACE, MEIT, and CMA; the same problem will then be solved using FANTASTIC, and the results will be compared for verification of FANTASTIC.
New Parallel computing framework for radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostin, M.A.; /Michigan State U., NSCL; Mokhov, N.V.
A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework is implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it is used with and is connected to them through a small number of interface functions. The framework has been integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime, and several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known scheduling and load-balancing problems found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
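Merging several checkpoint files into one, as described above, amounts to combining the sufficient statistics of independent runs. The sketch below assumes a simplified tally format (history count plus first and second moments); the real framework's checkpoint layout is not documented here, so the field names are invented for illustration.

```python
import math

def merge_checkpoints(checkpoints):
    # Combine independent Monte Carlo runs by summing their raw
    # sufficient statistics: history count, sum, and sum of squares.
    total = {"histories": 0, "sum": 0.0, "sum_sq": 0.0}
    for cp in checkpoints:
        for key in total:
            total[key] += cp[key]
    n, s, q = total["histories"], total["sum"], total["sum_sq"]
    mean = s / n
    # Variance of the combined mean: sample variance divided by n.
    var_of_mean = max(q / n - mean * mean, 0.0) / n
    total["mean"] = mean
    total["rel_err"] = math.sqrt(var_of_mean) / mean if mean else 0.0
    return total
```

Because sums and sums of squares are additive across independent runs, the merged estimate is identical to what one long run over all histories would have produced.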
Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.
2005-01-01
In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures, and we also discuss the development of a new Green's function code. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation, using marching procedures and Green's function techniques, are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models is particularly important at this time for President Bush's initiative to develop infrastructure for human exploration, with a first target of demonstrating the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.
User's manual for CBS3DS, version 1.0
NASA Astrophysics Data System (ADS)
Reddy, C. J.; Deshpande, M. D.
1995-10-01
CBS3DS is a computer code written in FORTRAN 77 to compute the backscattering radar cross section of cavity-backed apertures in an infinite ground plane and of slots in a thick infinite ground plane. CBS3DS implements hybrid Finite Element Method (FEM) and Method of Moments (MoM) techniques. The code uses tetrahedral elements with vector edge basis functions for the FEM in the volume of the cavity/slot, and triangular elements with the corresponding basis functions for the MoM at the apertures. By virtue of the FEM, the code can handle arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials; owing to the MoM, the apertures can be of any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computer the code is intended to run on.
Program optimizations: The interplay between power, performance, and energy
Leon, Edgar A.; Karlin, Ian; Grant, Ryan E.; ...
2016-05-16
Practical considerations for future supercomputer designs will impose limits on both instantaneous power consumption and total energy consumption. Working within these constraints while providing the maximum possible performance, application developers will need to optimize their code for speed alongside power and energy concerns. This paper analyzes the effectiveness of several code optimizations, including loop fusion, data structure transformations, and global allocations. A per-component measurement and analysis of different architectures is performed, enabling the examination of code optimizations on different compute subsystems. Using LULESH, an explicit hydrodynamics proxy application from the U.S. Department of Energy, we show how code optimizations impact different computational phases of the simulation. This provides insight for simulation developers into the best optimizations to use during particular simulation compute phases when optimizing code for future supercomputing platforms. Here, we examine and contrast both x86 and Blue Gene architectures with respect to these optimizations.
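Loop fusion, one of the optimizations this paper studies, combines two traversals of the same data into one so each element is touched while still in cache and no temporary array is written. A minimal sketch (not LULESH code; the arrays and operation are invented for illustration):

```python
def unfused(a, b):
    # Pass 1 writes a full temporary array; pass 2 re-reads it,
    # doubling the memory traffic over the data.
    tmp = [x * 2.0 for x in a]
    return [t + y for t, y in zip(tmp, b)]

def fused(a, b):
    # Single pass: same arithmetic, better locality, no temporary.
    return [x * 2.0 + y for x, y in zip(a, b)]
```

On bandwidth-limited hardware the fused form saves both time and energy per element moved, which is why the paper evaluates it per compute subsystem.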
Computational simulation of progressive fracture in fiber composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.
NASA Technical Reports Server (NTRS)
Stack, S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
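The relationship between the two theories can be seen in a one-dimensional Gaussian toy model: iterating on precision-weighted prediction errors (the predictive coding motif) converges to the precision-weighted posterior mean (the Bayesian answer). The sketch below is a didactic illustration under those assumptions, not a model from the paper.

```python
def pc_estimate(x, mu_prior, var_prior, var_lik, lr=0.1, steps=500):
    # Predictive-coding style inference: gradient ascent on the log
    # posterior of a latent mu, driven by two precision-weighted
    # prediction errors (data vs. prediction, prediction vs. prior).
    mu = mu_prior
    for _ in range(steps):
        err_data = (x - mu) / var_lik
        err_prior = (mu_prior - mu) / var_prior
        mu += lr * (err_data + err_prior)
    return mu

def bayes_posterior_mean(x, mu_prior, var_prior, var_lik):
    # Exact conjugate-Gaussian answer: precision-weighted average.
    w = (1.0 / var_prior) / (1.0 / var_prior + 1.0 / var_lik)
    return w * mu_prior + (1.0 - w) * x
```

The fixed point of the error-driven iteration is exactly the Bayesian posterior mean, illustrating the article's point that predictive coding is one algorithmic route to (but not synonymous with) Bayesian inference.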
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wemhoff, A P; Burnham, A K
2006-04-05
Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins model were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and the isoconversional kinetic model gives very similar results to the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 C/hr, and further exploration of that effect is warranted.
Ducted-Fan Engine Acoustic Predictions using a Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Rumsey, C. L.; Biedron, R. T.; Farassat, F.; Spence, P. L.
1998-01-01
A Navier-Stokes computer code is used to predict one of the ducted-fan engine acoustic modes that results from rotor-wake/stator-blade interaction. A patched sliding-zone interface is employed to pass information between the moving rotor row and the stationary stator row. The code produces averaged aerodynamic results downstream of the rotor that agree well with a widely used average-passage code. The acoustic mode of interest is generated successfully by the code and is propagated well upstream of the rotor; temporal and spatial numerical resolution are fine enough that attenuation of the signal is small. Two acoustic codes are used to find the far-field noise. Near-field propagation is computed by using Eversman's wave envelope code, which is based on a finite-element model. Propagation to the far field is accomplished by using the Kirchhoff formula for moving surfaces with the results of the wave envelope code as input data. Comparison of measured and computed far-field noise levels shows fair agreement in the range of directivity angles where the peak radiation lobes from the inlet are observed. Although only a single acoustic mode is targeted in this study, the main conclusion is a proof of concept: Navier-Stokes codes can be used both to generate and to propagate rotor/stator acoustic modes forward through an engine, where the results can be coupled to other far-field noise prediction codes.
NASA Technical Reports Server (NTRS)
Bonhaus, Daryl L.; Wornom, Stephen F.
1991-01-01
Two codes which solve the 3-D Thin Layer Navier-Stokes (TLNS) equations are used to compute the steady state flow for two test cases representing typical finite wings at transonic conditions. Several grids of C-O topology and varying point densities are used to determine the effects of grid refinement. After a description of each code and test case, standards for determining code efficiency and accuracy are defined and applied to determine the relative performance of the two codes in predicting turbulent transonic wing flows. Comparisons of computed surface pressure distributions with experimental data are made.
Computer Simulation of the VASIMR Engine
NASA Technical Reports Server (NTRS)
Garrison, David
2005-01-01
The goal of this project is to develop a magnetohydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open-source framework. The development of the code will be done in stages, starting with a basic fluid dynamics simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers to further test and improve the VASIMR engine.
Procedures for the computation of unsteady transonic flows including viscous effects
NASA Technical Reports Server (NTRS)
Rizzetta, D. P.
1982-01-01
Modifications of the code LTRAN2, developed by Ballhaus and Goorjian, which account for viscous effects in the computation of planar unsteady transonic flows are presented. Two models are considered and their theoretical development and numerical implementation is discussed. Computational examples employing both models are compared with inviscid solutions and with experimental data. Use of the modified code is described.
ERIC Educational Resources Information Center
Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey
2017-01-01
A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... analysis and design, and computer software design and coding. Given the fact that over $500 million were... acoustic algorithms, computer architecture, and source code that dated to the 1970s. Since that time... towards the end of 2012. Version 3.0 is an entirely new, state-of-the-art computer program used for...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administration COURT SERVICES AND OFFENDER SUPERVISION AGENCY FOR THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS... proprietary interest in the information. (e) Computer software means tools by which records are created, stored, and retrieved. Normally, computer software, including source code, object code, and listings of...
Error threshold for color codes and random three-body Ising models.
Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A
2009-08-28
We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical-mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p_c = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.
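The Monte Carlo machinery used for such phase-diagram studies is the standard Metropolis algorithm. The sketch below samples the ordinary two-body 2D Ising model (without the disorder or three-body interactions of the paper's model) simply to show the phase-dependent magnetization such simulations probe; lattice size and sweep counts are arbitrary choices.

```python
import math
import random

def metropolis_ising(L, T, sweeps, seed=0):
    # Metropolis sampling of the 2D nearest-neighbour Ising model
    # (ferromagnetic, J = 1, periodic boundaries), ordered start.
    rng = random.Random(seed)
    s = [[1] * L for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2 * s[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    m = sum(sum(row) for row in s) / (L * L)
    return abs(m)  # magnetization per spin of the final configuration
```

Below the critical temperature the sampled configurations stay magnetized; above it they disorder. In the error-correction mapping, the analogous order/disorder boundary of the disordered model locates the threshold p_c.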
Photoionization and High Density Gas
NASA Technical Reports Server (NTRS)
Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)
2002-01-01
We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code which has been available for public use for some time. However it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and improved physical description of ionization/excitation. In particular, it now is applicable to high density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high density situations.
The mathematical theory of signal processing and compression-designs
NASA Astrophysics Data System (ADS)
Feria, Erlan H.
2006-05-01
The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, also known as source coding. Source coding is concerned with compressing signal-source memory space, while processor coding deals with compressing signal-processor computational time. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.
Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code
NASA Technical Reports Server (NTRS)
Weinberg, B. C.; Mcdonald, H.
1980-01-01
There is considerable interest in developing a numerical scheme for solving the time-dependent viscous compressible three-dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three-dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations for several Cartesian test cases are presented. The computer code can be applied to more complex flow fields such as those encountered on rotating airfoils.
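In linearized block implicit schemes of this kind, each implicit sweep reduces to banded (block-tridiagonal) linear systems along grid lines. The scalar analogue is solved by the Thomas algorithm, sketched below as a general illustration (the actual code solves the block form, and this sketch is not taken from it):

```python
def thomas_solve(a, b, c, d):
    # Tridiagonal solver: a = sub-diagonal (n-1), b = diagonal (n),
    # c = super-diagonal (n-1), d = right-hand side (n).
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):            # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

The O(n) cost per line is what makes implicit time stepping affordable compared with a general dense solve.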
NASA Technical Reports Server (NTRS)
Teske, M. E.
1984-01-01
This is a user manual for the computer code AGDISP (AGricultural DISPersal), which has been developed to predict the deposition of material released from fixed- and rotary-wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are predicted simultaneously. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy, and terrain on the deposition pattern.
HOMAR: A computer code for generating homotopic grids using algebraic relations: User's manual
NASA Technical Reports Server (NTRS)
Moitra, Anutosh
1989-01-01
A computer code for fast automatic generation of quasi-three-dimensional grid systems for aerospace configurations is described. The code employs a homotopic method to algebraically generate two-dimensional grids in cross-sectional planes, which are stacked to produce a three-dimensional grid system. Implementation of the algebraic equivalents of the homotopic relations for generating body geometries and grids are explained. Procedures for controlling grid orthogonality and distortion are described. Test cases with description and specification of inputs are presented in detail. The FORTRAN computer program and notes on implementation and use are included.
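The algebraic homotopy at the heart of the method blends an inner (body) curve continuously into an outer boundary curve within each cross-sectional plane. A minimal sketch of a linear homotopy follows; HOMAR's actual blending relations, orthogonality and distortion controls are more elaborate, so this is only an illustration of the idea.

```python
def homotopic_grid(inner, outer, n_levels):
    # inner, outer: lists of (x, y) points with matching indices.
    # Level k interpolates the two curves at parameter t in [0, 1]:
    # H(t) = (1 - t) * inner + t * outer.
    grid = []
    for k in range(n_levels):
        t = k / (n_levels - 1)
        level = [((1 - t) * xi + t * xo, (1 - t) * yi + t * yo)
                 for (xi, yi), (xo, yo) in zip(inner, outer)]
        grid.append(level)
    return grid
```

Stacking such planar grids along the body axis then yields the quasi-three-dimensional grid system the abstract describes.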
Advances in Computational Capabilities for Hypersonic Flows
NASA Technical Reports Server (NTRS)
Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip
1997-01-01
The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.
[Microbiological characterisation of Listeria monocytogenes isolates from human cases in Andalusia].
Lepe, José A; Torres, María José; Liró, Julia; Luque, Rafael; Aznar, Javier
2012-12-01
The aim of this study was to perform a retrospective genotyping study of 154 isolates from human listeriosis cases that occurred in the region of Andalusia (southern Spain) in the period 2005-2009. Serotyping was performed for the 1 and 4 somatic antigens using commercial Listeria antisera, and by multiplex-PCR serogrouping according to the method described by Doumith et al. (2004). Antimicrobial susceptibility was determined by Epsilon test and interpreted by CLSI criteria. PFGE was performed according to the PulseNet protocol with the ApaI enzyme. The similarity of PFGE profiles was evaluated using the Bionumerics software. The multiplex PCR protocol described by Chen and Knabel (2007) was used to identify isolates belonging to the L. monocytogenes ECI, ECII, and ECIII epidemic clones. The 154 isolates were grouped into four serotypes: 4b [94 (61%)] strains, 1/2b [30 (19%)] strains, 1/2a [27 (18%)] strains, and 1/2c [3 (2%)] strains, with 100% susceptibility to ampicillin and cotrimoxazole. Sixty-two distinct ApaI pulsotypes were recognized. Thirty-seven isolates (24%) showed unique ApaI pulsotypes, and the remaining 117 strains (76%) were assigned to 25 ApaI clusters (60% in clusters of more than two isolates). The EC markers were found in 62 (40.3%) of the L. monocytogenes isolates tested. The ECI marker was present in 43 (46.2%) serotype 4b isolates, ECII in 10 (10.7%) serotype 4b isolates, and ECIII in 9 (33.3%) serotype 1/2a isolates. A large proportion of the human listeriosis cases under investigation could be grouped into molecular subtype clusters, and our cases could be related to international food-borne outbreaks. Copyright © 2011 Elsevier España, S.L. All rights reserved.
Yang, Zujun; Zhang, Tao; Bolshoy, Alexander; Beharav, Alexander; Nevo, Eviatar
2009-05-01
'Evolution Canyon' (ECI) at Lower Nahal Oren, Mount Carmel, Israel, is an optimal natural microscale model for unravelling evolution in action, highlighting the twin evolutionary processes of adaptation and speciation. A major model organism in ECI is wild barley, Hordeum spontaneum, the progenitor of cultivated barley, which displays dramatic interslope adaptive and speciational divergence on the 'African' dry slope (AS) and the 'European' humid slope (ES), separated on average by 200 m. Here we examined interslope single nucleotide polymorphism (SNP) sequences and the expression diversity of the drought-resistance gene dehydrin 1 (Dhn1) between the opposite slopes. We analysed 47 plants (genotypes), 4-10 individuals in each of seven stations (populations) in an area of 7000 m(2), for Dhn1 sequence diversity located in the 5' upstream flanking region of the gene. We found significant levels of Dhn1 genic diversity represented by 29 haplotypes, derived from 45 SNPs in a total of 708 bp sites. Most of the haplotypes, 25 out of 29 (86.2%), were represented by one genotype and hence unique to one population. Only a single haplotype was common to both slopes. Genetic divergence of sequence and haplotype diversity was generally and significantly different among the populations and slopes. Nucleotide diversity was higher on the AS, whereas haplotype diversity was higher on the ES. Interslope divergence was significantly higher than intraslope divergence. Tajima's D test rejected neutrality of the SNP diversity. Dhn1 expression under dehydration indicated interslope divergent expression between AS and ES genotypes, reinforcing the association of Dhn1 with drought resistance of wild barley at 'Evolution Canyon'. These results are inexplicable by mutation, gene flow, or chance effects, and support adaptive natural microclimatic selection as the major evolutionary divergent driving force.
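The diversity statistics contrasted between the slopes (nucleotide diversity and haplotype diversity) can be computed directly from aligned sequences. A sketch with toy haplotypes follows, using the standard formulas rather than the authors' actual pipeline:

```python
from itertools import combinations

def nucleotide_diversity(seqs):
    # pi: mean pairwise proportion of differing sites across all
    # unordered pairs of aligned, equal-length sequences.
    pairs = list(combinations(seqs, 2))
    if not pairs:
        return 0.0
    diffs = [sum(a != b for a, b in zip(s1, s2)) / len(s1)
             for s1, s2 in pairs]
    return sum(diffs) / len(pairs)

def haplotype_diversity(seqs):
    # Nei's haplotype diversity: (n / (n - 1)) * (1 - sum p_i^2),
    # where p_i is the frequency of haplotype i.
    n = len(seqs)
    counts = {}
    for s in seqs:
        counts[s] = counts.get(s, 0) + 1
    return (n / (n - 1)) * (1 - sum((c / n) ** 2 for c in counts.values()))
```

The two measures can disagree in direction, as they do between the slopes here: many rare haplotypes raise haplotype diversity, while deep sequence differences between haplotypes raise nucleotide diversity.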
Development and application of computational aerothermodynamics flowfield computer codes
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj
1994-01-01
Research was performed in the area of computational modeling and application of hypersonic, high-enthalpy, thermo-chemical nonequilibrium flow (aerothermodynamics) problems. A number of computational fluid dynamics (CFD) codes were developed and applied to simulate high-altitude rocket plumes, the Aeroassist Flight Experiment (AFE), hypersonic base flow for planetary probes, the single expansion ramp nozzle (SERN) connected with the National Aerospace Plane, hypersonic drag devices, hypersonic ramp flows, ballistic range models, shock tunnel facility nozzles, transient and steady flows in the shock tunnel facility, arc-jet flows, thermochemical nonequilibrium flows around simple and complex bodies, axisymmetric ionized flows of interest to re-entry, unsteady shock-induced combustion phenomena, high-enthalpy pulsed facility simulations, and unsteady shock boundary layer interactions in shock tunnels. Computational modeling involved developing appropriate numerical schemes for the flows of interest and developing, applying, and validating appropriate thermochemical processes. As part of improving the accuracy of the numerical predictions, adaptive grid algorithms were explored, and a user-friendly, self-adaptive code (SAGE) was developed. Aerothermodynamic flows of interest included energy transfer due to strong radiation, and a significant level of effort was spent in developing computational codes for calculating radiation and radiation modeling. In addition, computational tools were developed and applied to predict the radiative heat flux and spectra that reach the model surface.
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Scullin, V. J.
1984-01-01
A general chemical kinetics code is described for complex, homogeneous ideal-gas reactions in any chemical system. The main features of the GCKP84 code are flexibility, convenience, and speed of computation for many different reaction conditions. The code, which replaces the previously published GCKP code, numerically solves the differential equations for complex reactions in a batch system or one-dimensional inviscid flow. It also numerically solves the nonlinear algebraic equations describing the well-stirred reactor. A new state-of-the-art numerical integration method is used for greatly increased speed in handling systems of stiff differential equations. The theory and the computer program, including details of input preparation and a guide to using the code, are given.
A Multiple Sphere T-Matrix Fortran Code for Use on Parallel Computer Clusters
NASA Technical Reports Server (NTRS)
Mackowski, D. W.; Mishchenko, M. I.
2011-01-01
A general-purpose Fortran-90 code for calculation of the electromagnetic scattering and absorption properties of multiple sphere clusters is described. The code can calculate the efficiency factors and scattering matrix elements of the cluster for either fixed or random orientation with respect to the incident beam and for plane wave or localized-approximation Gaussian incident fields. In addition, the code can calculate maps of the electric field both interior and exterior to the spheres. The code is written with message passing interface instructions to enable use on distributed-memory compute clusters, and for such platforms the code can make feasible the calculation of absorption, scattering, and general EM characteristics of systems containing several thousand spheres.
Computation of neutron fluxes in clusters of fuel pins arranged in hexagonal assemblies (2D and 3D)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabha, H.; Marleau, G.
2012-07-01
For computations of fluxes, we have used Carlvik's method of collision probabilities. This method requires tracking algorithms. An algorithm to compute tracks (in 2D and 3D) has been developed for seven hexagonal geometries with clusters of fuel pins. This has been implemented in the NXT module of the code DRAGON. The flux distribution in clusters of pins has been computed using this code. For testing, the results are compared where possible with the EXCELT module of the code DRAGON. Tracks are plotted from the NXT module using MATLAB; these plots are also presented here. Results are presented with increasing numbers of lines to show their convergence. We have numerically computed volumes, surface areas, and the percentage errors in these computations. These results show that 2D results converge faster than 3D results. Accuracy in the computed fluxes up to the second decimal is achieved with fewer lines. (authors)
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
ASTEC—the Aarhus STellar Evolution Code
NASA Astrophysics Data System (ADS)
Christensen-Dalsgaard, Jørgen
2008-08-01
The Aarhus code is the result of a long development, starting in 1974, and still ongoing. A novel feature is the integration of the computation of adiabatic oscillations for specified models as part of the code. It offers substantial flexibility in terms of microphysics and has been carefully tested for the computation of solar models. However, considerable development is still required in the treatment of nuclear reactions, diffusion and convective mixing.
Instrumentation for Verification of Bomb Damage Repair Computer Code.
1981-09-01
record the data, a conventional 14-track FM analog tape recorder was retained. The unknown factors of signal duration, test duration, and signal ...Kirtland Air Force Base computer centers for more detailed analyses. In addition to the analog recorder, signal conditioning equipment and amplifiers were...necessary to allow high quality data to be recorded. An Interrange Instrumentation Group (IRIG) code generator/reader placed a coded signal on the tape
Feasibility of a computer-assisted feedback system between dispatch centre and ambulances.
Lindström, Veronica; Karlsten, Rolf; Falk, Ann-Charlotte; Castrèn, Maaret
2011-06-01
The aim of the study was to evaluate the feasibility of a newly developed computer-assisted feedback system between the dispatch centre and ambulances in Stockholm, Sweden. A computer-assisted feedback system based on a Finnish model was designed to fit the Swedish emergency medical system. Feedback codes were identified and divided into three categories: assessment of the patient's primary condition when the ambulance arrives at the scene, no transport by the ambulance, and level of priority. Two ambulances and one emergency medical communication centre (EMCC) in Stockholm participated in the study. A sample of 530 feedback codes sent through the computer-assisted feedback system was reviewed. The information on the ambulance medical records was compared with the feedback codes used, and 240 assignments were further analyzed. The feedback codes sent from the ambulance to the EMCC were correct in 92% of the assignments. The most commonly used feedback code sent to the emergency medical dispatchers was 'agree with the dispatchers' assessment'. In addition, in 160 assignments there was a mismatch between the emergency medical dispatchers' and ambulance nurses' assessments. Our results have shown a high agreement between medical dispatcher and ambulance nurse assessments. The feasibility of the feedback codes seems to be acceptable based on the small margin of error. The computer-assisted feedback system may, when used on a daily basis, make it possible for the medical dispatchers to receive feedback in a structured way. The EMCC organization can directly evaluate any changes in the assessment protocol through structured feedback sent from the ambulance.
NASA Technical Reports Server (NTRS)
Lilley, D. G.; Rhode, D. L.
1982-01-01
A primitive pressure-velocity variable finite difference computer code was developed to predict swirling recirculating inert turbulent flows in axisymmetric combustors in general, and for application to a specific idealized combustion chamber with sudden or gradual expansion. The technique involves a staggered grid system for axial and radial velocities, a line relaxation procedure for efficient solution of the equations, a two-equation k-epsilon turbulence model, a stairstep boundary representation of the expansion flow, and realistic accommodation of swirl effects. A user's manual, dealing with the computational problem, showing how the mathematical basis and computational scheme may be translated into a computer program is presented. A flow chart, FORTRAN IV listing, notes about various subroutines and a user's guide are supplied as an aid to prospective users of the code.
Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing
NASA Technical Reports Server (NTRS)
Ozguner, Fusun
1996-01-01
Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time, T_par, of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
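The speedup ceiling described above is the classical Amdahl's-law bound. A minimal sketch of that relationship, with invented serial fractions and processor counts for illustration:

```python
def amdahl_speedup(serial_fraction, processors):
    """Amdahl's law: the speedup attainable when a fraction of the
    code is inherently sequential and only the rest parallelizes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# Even a modest serial fraction caps the achievable speedup:
# with 25% serial code, 8 processors give under 3x, and adding
# hundreds more processors barely helps.
print(round(amdahl_speedup(0.25, 8), 2))     # ≈ 2.91
print(round(amdahl_speedup(0.25, 1024), 2))  # ≈ 3.99
```

This is why the experiments above saw low speedups despite the hypercube's many nodes: the bound depends on the serial fraction far more than on the processor count.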
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J
Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute the behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.
NASA Astrophysics Data System (ADS)
Delogu, A.; Furini, F.
1991-09-01
Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphical techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code that evaluates the RCS of arbitrarily shaped metallic objects generated by computer-aided design (CAD), and its validation against measurements carried out using ALENIA RCS test facilities, are presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to keep the computer time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.
Los Alamos radiation transport code system on desktop computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines.
McKenzie, Kirsten; Walker, Sue; Tong, Shilu
It remains unclear whether the change from a manual to an automated coding system (ACS) for deaths has significantly affected the consistency of Australian mortality data. The underlying causes of 34,000 deaths registered in 1997 in Australia were dual-coded in ICD-9: manually, and by using an automated computer coding program. The diseases most affected by the change from manual coding to the ACS were senile/presenile dementia and pneumonia. The most common disease to which a manually assigned underlying cause of senile dementia was coded with the ACS was unspecified psychoses (37.2%). Only 12.5% of codes assigned by the ACS as senile dementia were coded the same by manual coders. This study indicates some important differences in mortality rates when comparing mortality data that have been coded manually with those coded using an automated computer coding program. These differences may be related both to the different interpretation of ICD coding rules between manual and automated coding, and to different co-morbidities or co-existing conditions among demographic groups.
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
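The divisive normalization computation described above can be sketched in a few lines. This is a toy illustration of the relative value code, not the authors' model; the semisaturation constant `sigma` and the option values are invented:

```python
def divisive_normalization(values, sigma=1.0):
    """Relative value code: each option's coded value is its raw value
    divided by a semisaturation term plus the summed value of the whole
    choice set, so every code depends on the context of other options."""
    total = sum(values)
    return [v / (sigma + total) for v in values]

# Context dependence: adding a third alternative lowers the coded values
# of the original two options even though their raw values are unchanged.
pair = divisive_normalization([10.0, 5.0])         # [0.625, 0.3125]
triple = divisive_normalization([10.0, 5.0, 5.0])  # first entry shrinks
```

The context dependence visible here is the mechanism the review links to behavioral violations of classical normative theory, such as irrelevant-alternative effects.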
STAR adaptation of two algorithms used on serial computers
NASA Technical Reports Server (NTRS)
Howser, L. M.; Lambiotte, J. J., Jr.
1974-01-01
Two representative serial algorithms, presently executed on the Control Data Corporation 6000 computer, were adapted to execute efficiently on the Control Data STAR-100 computer. Gaussian elimination for the solution of simultaneous linear equations and the Gauss-Legendre quadrature formula for the approximation of an integral are the two algorithms discussed. A description is given of how the programs were adapted for STAR and why these adaptations were necessary to obtain an efficient STAR program. Some points to consider when adapting an algorithm for STAR are discussed. Program listings of the 6000 version coded in 6000 FORTRAN, the adapted STAR version coded in 6000 FORTRAN, and the STAR version coded in STAR FORTRAN are presented in the appendices.
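For reference, the Gauss-Legendre quadrature formula mentioned above can be sketched in its simplest two-point form (a generic textbook rule, not the report's FORTRAN implementation):

```python
import math

def gauss_legendre_2pt(f, a, b):
    """Two-point Gauss-Legendre quadrature on [a, b].
    Nodes are +/- 1/sqrt(3) on [-1, 1], both with weight 1;
    the rule is exact for polynomials up to degree 3."""
    mid, half = (a + b) / 2.0, (b - a) / 2.0
    x = 1.0 / math.sqrt(3.0)
    return half * (f(mid - half * x) + f(mid + half * x))

# Exact for a cubic: the integral of t**3 over [0, 1] is 0.25.
approx = gauss_legendre_2pt(lambda t: t**3, 0.0, 1.0)
```

On a vector machine like the STAR-100, the appeal of such rules is that all node evaluations are independent and can be computed as one vector operation.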
Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?
Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon
2007-01-01
Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly-used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.
NASA Astrophysics Data System (ADS)
Moon, Hongsik
What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced.
This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
ACON: a multipurpose production controller for plasma physics codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snell, C.
1983-01-01
ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save Mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include the ability to retry after Mass failures, backup options for saving files, startup messages for the various codes, and the ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.
NASA Astrophysics Data System (ADS)
Lidar, Daniel A.; Brun, Todd A.
2013-09-01
Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. 
Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and Harold Baranger; 26. Critique of fault-tolerant quantum information processing Robert Alicki; References; Index.
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Putt, Charles W.
1997-01-01
The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. 
Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
Efficient Helicopter Aerodynamic and Aeroacoustic Predictions on Parallel Computers
NASA Technical Reports Server (NTRS)
Wissink, Andrew M.; Lyrintzis, Anastasios S.; Strawn, Roger C.; Oliker, Leonid; Biswas, Rupak
1996-01-01
This paper presents parallel implementations of two codes used in a combined CFD/Kirchhoff methodology to predict the aerodynamic and aeroacoustic properties of helicopters. The rotorcraft Navier-Stokes code, TURNS, computes the aerodynamic flowfield near the helicopter blades, and the Kirchhoff acoustics code computes the noise in the far field, using the TURNS solution as input. The overall parallel strategy adds MPI message passing calls to the existing serial codes to allow for communication between processors. As a result, the total code modifications required for parallel execution are relatively small. The biggest bottleneck in running the TURNS code in parallel comes from the LU-SGS algorithm that solves the implicit system of equations. We use a new hybrid domain decomposition implementation of LU-SGS to obtain good parallel performance on the SP-2. TURNS demonstrates excellent parallel speedups for quasi-steady and unsteady three-dimensional calculations of a helicopter blade in forward flight. The execution rate attained by the code on 114 processors is six times faster than the same cases run on one processor of the Cray C-90. The parallel Kirchhoff code also shows excellent parallel speedups and fast execution rates. As a performance demonstration, unsteady acoustic pressures are computed at 1886 far-field observer locations for a sample acoustics problem. The calculation requires over two hundred hours of CPU time on one C-90 processor but takes only a few hours on 80 processors of the SP-2. The resultant far-field acoustic field is analyzed with state-of-the-art audio and video rendering of the propagating acoustic signals.
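Speedup and efficiency figures like those quoted above follow from a simple timing comparison. A sketch with illustrative numbers in the spirit of the C-90 versus SP-2 comparison (not the authors' exact timings):

```python
def speedup_and_efficiency(t_serial, t_parallel, processors):
    """Parallel speedup S = T1/Tp and efficiency E = S/p
    from measured serial and parallel wall-clock times."""
    speedup = t_serial / t_parallel
    return speedup, speedup / processors

# Hypothetical example: 200 CPU-hours serially vs. 4 hours on 80 processors.
s, e = speedup_and_efficiency(200.0, 4.0, 80)  # S = 50.0, E = 0.625
```

Note that cross-machine comparisons such as "six times faster than one C-90 processor" mix architecture differences with parallel scaling, so they are throughput ratios rather than pure speedups.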
Computer code for the optimization of performance parameters of mixed explosive formulations.
Muthurajan, H; Sivabalan, R; Talawar, M B; Venugopalan, S; Gandhe, B R
2006-08-25
LOTUSES is a novel computer code which has been developed for the prediction of various thermodynamic properties such as heat of formation, heat of explosion, volume of gaseous explosion products, and other related performance parameters. In this paper, we report the LOTUSES (Version 1.4) code, which has been utilized for the optimization of various high explosives in different combinations to obtain the maximum possible velocity of detonation. LOTUSES (Version 1.4) will vary the composition of mixed explosives automatically in the range of 1-100% and computes the oxygen balance as well as the velocity of detonation for the various compositions in preset steps. Further, the code suggests the compositions for which the most favorable oxygen balance and the highest velocity of detonation could be achieved. Presently, the code can be applied to two-component explosive compositions. The code has been validated with well-known explosives like TNT, HNS, HNF, TATB, RDX, HMX, AN, DNA, CL-20 and TNAZ in different combinations. The new algorithm incorporated in LOTUSES (Version 1.4) enhances its efficiency and makes it a more powerful tool for scientists/researchers working in the field of high energy materials/hazardous materials.
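The oxygen-balance scan that such a code automates can be sketched with the standard textbook formula (my illustration, not the LOTUSES source; the mixture example is invented):

```python
def oxygen_balance(c, h, o, mol_weight):
    """Oxygen balance (%) of an explosive CxHyNwOz via the standard formula
    OB% = -1600 * (2c + h/2 - o) / MW. Nitrogen is released as N2 and
    consumes no oxygen, so it does not appear in the numerator."""
    return -1600.0 * (2 * c + h / 2.0 - o) / mol_weight

def mixture_oxygen_balance(components):
    """Mass-weighted oxygen balance of a mixed formulation, the quantity
    scanned while stepping the composition from 1% to 100%.
    components = [(mass_fraction, ob_percent), ...]."""
    return sum(frac * ob for frac, ob in components)

# TNT (C7H5N3O6, MW ~227.13) has the well-known oxygen balance near -74%.
tnt_ob = oxygen_balance(7, 5, 6, 227.13)
# A hypothetical 60/40 mixture of TNT with an oxygen-neutral component:
mix_ob = mixture_oxygen_balance([(0.6, tnt_ob), (0.4, 0.0)])
```

Scanning `mixture_oxygen_balance` over mass fractions in preset steps and keeping the composition nearest the optimum mirrors the search the abstract describes, though LOTUSES also computes the velocity of detonation at each step.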
Upgrades of Two Computer Codes for Analysis of Turbomachinery
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; Liou, Meng-Sing
2005-01-01
Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by the addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include the addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, the addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and a modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and the addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.
NASA Technical Reports Server (NTRS)
Goodwin, Sabine A.; Raj, P.
1999-01-01
Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing-body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are performed on a wing undergoing rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.
Computer programs for predicting supersonic and hypersonic interference flow fields and heating
NASA Technical Reports Server (NTRS)
Morris, D. J.; Keyes, J. W.
1973-01-01
This report describes computer codes which calculate two-dimensional shock interference patterns. These codes compute the six types of interference flows defined by Edney (Aeronautical Research Institute of Sweden, FFA Report 115). Results include properties of the inviscid flow field and the inviscid-viscous interaction at the surface, along with peak pressure and peak heating at the impingement point.
A Model Code of Ethics for the Use of Computers in Education.
ERIC Educational Resources Information Center
Shere, Daniel T.; Cannings, Terence R.
Two Delphi studies were conducted by the Ethics and Equity Committee of the International Council for Computers in Education (ICCE) to obtain the opinions of experts on areas that should be covered by ethical guides for the use of computers in education and for software development, and to develop a model code of ethics for each of these areas.…
Nonlinear Computational Aeroelasticity: Formulations and Solution Algorithms
2003-03-01
problem is proposed. Fluid-structure coupling algorithms are then discussed with some emphasis on distributed computing strategies. Numerical results...the structure and the exchange of structure motion to the fluid. The computational fluid dynamics code PFES is our finite element code for the numerical ...unstructured meshes). It was numerically demonstrated [1-3] that EBS can be less diffusive than SUPG [4-6] and the standard Finite Volume schemes
MIADS2 ... an alphanumeric map information assembly and display system for a large computer
Elliot L. Amidon
1966-01-01
A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...
Parallelisation study of a three-dimensional environmental flow model
NASA Astrophysics Data System (ADS)
O'Donncha, Fearghal; Ragnoli, Emanuele; Suits, Frank
2014-03-01
There are many simulation codes in the geosciences that are serial and cannot take advantage of the parallel computational resources commonly available today. One model important for our work in coastal ocean current modelling is EFDC, a Fortran 77 code configured for optimal deployment on vector computers. In order to take advantage of our cache-based, blade computing system we restructured EFDC from serial to parallel, thereby allowing us to run existing models more quickly, and to simulate larger and more detailed models that were previously impractical. Since the source code for EFDC is extensive and involves detailed computation, it is important to do such a port in a manner that limits changes to the files, while achieving the desired speedup. We describe a parallelisation strategy involving surgical changes to the source files to minimise error-prone alteration of the underlying computations, while allowing load-balanced domain decomposition for efficient execution on a commodity cluster. The conjugate gradient solver posed particular challenges: its implicit non-local communication hinders standard domain partitioning schemes, and a number of techniques are discussed to address this in a feasible, computationally efficient manner. The parallel implementation demonstrates good scalability in combination with a novel domain partitioning scheme that specifically handles mixed water/land regions commonly found in coastal simulations. The approach presented here represents a practical methodology to rejuvenate legacy code on a commodity blade cluster with reasonable effort; our solution has direct application to other similar codes in the geosciences.
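The mixed water/land partitioning idea can be illustrated with a toy scheme (a hypothetical sketch, not EFDC's actual decomposition): since land cells carry no computation, the column boundaries of each rank's subdomain are chosen to equalize the number of water cells per rank rather than the raw column count.

```python
# Hypothetical load-balanced 1D column partition for a mixed water/land grid.
# Land cells cost nothing, so ranges are closed once a rank has accumulated
# its fair share of *water* cells.

def partition_columns(water_mask, n_ranks):
    """water_mask: list of columns, each a list of booleans (True = water).
    Returns a list of (start, end) column ranges, one per rank."""
    col_work = [sum(col) for col in water_mask]   # water cells per column
    total = sum(col_work)
    target = total / n_ranks                      # ideal work per rank
    ranges, start, acc = [], 0, 0
    for i, w in enumerate(col_work):
        acc += w
        # close this rank's range once it reaches its cumulative share
        if acc >= target * (len(ranges) + 1) and len(ranges) < n_ranks - 1:
            ranges.append((start, i + 1))
            start = i + 1
    ranges.append((start, len(col_work)))
    return ranges
```

Under this assumption, a rank whose range is mostly land receives more columns than one that is all water, keeping per-rank work roughly even.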
Analysis of internal flows relative to the space shuttle main engine
NASA Technical Reports Server (NTRS)
1987-01-01
Cooperative efforts between the Lockheed-Huntsville Computational Mechanics Group and the NASA-MSFC Computational Fluid Dynamics staff have resulted in improved capabilities for numerically simulating incompressible flows generic to the Space Shuttle Main Engine (SSME). A well established and documented CFD code was obtained, modified, and applied to laminar and turbulent flows of the type occurring in the SSME Hot Gas Manifold. The INS3D code was installed on the NASA-MSFC CRAY-XMP computer system and is currently being used by NASA engineers. Studies to perform a transient analysis of the fuel preburner (FPB) were conducted. The COBRA/TRAC code is recommended for simulating the transient flow of oxygen into the LOX manifold. Property data for modifying the code to represent LOX/GOX flow were collected. The ALFA code was developed and recommended for representing the transient combustion in the preburner. These two codes will couple through the transient boundary conditions to simulate the startup and/or shutdown of the fuel preburner. A study, NAS8-37461, is currently being conducted to implement this modeling effort.
Near Zone: Basic scattering code user's manual with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Silvestro, J. W.
1989-01-01
The Electromagnetic Code - Basic Scattering Code, Version 3, is a user-oriented computer code to analyze near and far zone patterns of antennas in the presence of scattering structures, to provide coupling between antennas in a complex environment, and to determine radiation hazard calculations at UHF and above. The analysis is based on uniform asymptotic techniques formulated in terms of the Uniform Geometrical Theory of Diffraction (UTD). Complicated structures can be simulated by arbitrarily oriented flat plates and an infinite ground plane that can be perfectly conducting or dielectric. Also, perfectly conducting finite elliptic cylinder, elliptic cone frustum sections, and finite composite ellipsoids can be used to model the superstructure of a ship, the body of a truck, an airplane, a satellite, etc. This manual gives special consideration to space station modeling applications. It is a user manual designed to give an overall view of the operation of the computer code, to instruct a user in how to model structures, and to show the validity of the code by comparing various computed results against measured and alternative calculations, such as method of moments, whenever available.
ASHMET: A computer code for estimating insolation incident on tilted surfaces
NASA Technical Reports Server (NTRS)
Elkin, R. F.; Toelle, R. G.
1980-01-01
A computer code, ASHMET, was developed by MSFC to estimate the amount of solar insolation incident on the surfaces of solar collectors. Both tracking and fixed-position collectors were included. Climatological data for 248 U.S. locations are built into the code. The basic methodology used by ASHMET is the ASHRAE clear-day insolation relationships, modified by a clearness index derived from SOLMET-measured solar radiation data on a horizontal surface.
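The ASHRAE clear-day relationship the abstract refers to has the well-known form I_dn = A * exp(-B / sin(beta)), where beta is the solar altitude. The sketch below evaluates it and projects the beam onto a tilted collector; the coefficient values and the simplified geometry (sun assumed in the surface's azimuth plane) are illustrative assumptions, and ASHMET's built-in climatology and clearness-index correction are not reproduced.

```python
import math

# Hedged sketch of the ASHRAE clear-day beam irradiance relation.
# A and B below are illustrative single values, not ASHMET's monthly tables.

def clear_day_irradiance(solar_altitude_deg, tilt_deg,
                         A=1088.0,   # W/m^2, apparent extraterrestrial irradiance (illustrative)
                         B=0.205):   # dimensionless extinction coefficient (illustrative)
    """Clear-day beam irradiance on a tilted surface, assuming the sun lies
    in the surface's azimuth plane, so cos(theta) = sin(beta + tilt)."""
    beta = math.radians(solar_altitude_deg)
    if beta <= 0.0:
        return 0.0                                   # sun below the horizon
    i_dn = A * math.exp(-B / math.sin(beta))         # direct normal irradiance
    cos_theta = math.sin(beta + math.radians(tilt_deg))
    return i_dn * max(cos_theta, 0.0)
```

A clearness index of the kind the abstract describes would simply scale `i_dn` before the projection.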
Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code
1979-06-01
The dose rate was then integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs). A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine.
NASA Technical Reports Server (NTRS)
Collins, Earl R., Jr.
1990-01-01
Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...
DOE Office of Scientific and Technical Information (OSTI.GOV)
HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.
2000-04-01
Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.
A proposed study of multiple scattering through clouds up to 1 THz
NASA Technical Reports Server (NTRS)
Gerace, G. C.; Smith, E. K.
1992-01-01
A rigorous computation of the electromagnetic field scattered from an atmospheric liquid water cloud is proposed. The recent development of a fast recursive algorithm (Chew algorithm) for computing the fields scattered from numerous scatterers now makes a rigorous computation feasible. A method is presented for adapting this algorithm to a general case where there are an extremely large number of scatterers. It is also proposed to extend a new binary PAM channel coding technique (El-Khamy coding) to multiple levels with non-square pulse shapes. The Chew algorithm can be used to compute the transfer function of a cloud channel. Then the transfer function can be used to design an optimum El-Khamy code. In principle, these concepts can be applied directly to the realistic case of a time-varying cloud (adaptive channel coding and adaptive equalization). A brief review is included of some preliminary work on cloud dispersive effects on digital communication signals and on cloud liquid water spectra and correlations.
NASA Technical Reports Server (NTRS)
Sulyma, P. R.
1980-01-01
Fundamental equations and similarity definition and application are described as well as the computational steps of a computer program developed to design model nozzles for wind tunnel tests conducted to define power-on aerodynamic characteristics of the space shuttle over a range of ascent trajectory conditions. The computer code capabilities, a user's guide for the model nozzle design program, and the output format are examined. A program listing is included.
NASA Technical Reports Server (NTRS)
Anderson, B. H.; Putt, C. W.; Giamati, C. C.
1981-01-01
Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.
CARES/LIFE Software Commercialization
NASA Technical Reports Server (NTRS)
1995-01-01
The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.
Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.; Wilson, J.H.; Arwood, P.C.
The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.
Electronic curriculum implementation at North American dental schools.
Hendricson, William D; Panagakos, Fotinos; Eisenberg, Elise; McDonald, James; Guest, Gary; Jones, Pamela; Johnson, Lynn; Cintron, Laura
2004-10-01
Electronic curriculum, or E-curriculum, refers to computer-based learning including educational materials available on CD or DVD, online courses, electronic mechanisms to search the literature, email, and various applications of instructional technology including providing laptops to students, multimedia projection systems, and Internet-compatible classrooms. In spite of enthusiasm about the potential for E-curriculum to enhance dental education, there is minimal guidance in the literature to assist schools with implementation. The study objectives were: 1) identify U.S. and Canadian dental schools that have initiated mandatory laptop programs and assess cost, faculty development issues, extent of curricular use, problems, and qualitative perceptions; 2) determine the extent to which twenty-two other E-curriculum resources were available and used at North American dental schools; and 3) identify factors that influenced E-curriculum implementation. A twenty-six item questionnaire, known as the Electronic Curriculum Implementation Survey (ECIS), was mailed to all sixty-six North American dental schools (ten Canadian and fifty-six U.S. schools) during 2002-03 with a response rate of 100 percent. Twenty-five of the twenty-six ECIS questions employed a menu-driven, forced choice format, but respondents could provide amplifying comments. Fifty-three questionnaires were completed by associate deans for academic affairs, three by deans, and ten by instructional technology (IT) managers, IT committee chairs, or directors of dental informatics departments. The survey found that E-curriculum implementation among North American dental schools is following the classic innovation pattern in which a few early adopting institutions proceed rapidly while the majority of potential adopters make modifications slowly. Fourteen U.S. dental schools have established mandatory laptop programs for students. 
Ten of these laptop programs were created in the past two years; respondents reported numerous growing pains but were generally pleased with their progress. Other E-curriculum capabilities were incorporated into courses more frequently at laptop schools than the fifty-two non-laptop schools including websites, online course evaluations, and instructor use of email to communicate with students. Few dental schools use online courses, and at most schools, few faculty have received training in online instructional techniques. Virtually all North American dental schools have provided substantial instructional technology resources to their faculty, but use of twenty-two components and capabilities of E-curriculum was limited, especially at schools without laptop programs. Various faculty-related issues were reported as implementation barriers including lack of time, skill, and incentive to develop educational software. We conclude that many North American dental schools, especially those with laptop programs, are functioning at the "learn by doing" phase of initial implementation in a four-stage innovation adoption model. E-curriculum planners should pay close attention to implementation problems that occur at this stage where many innovation efforts break down.
Using concatenated quantum codes for universal fault-tolerant quantum gates.
Jochym-O'Connor, Tomas; Laflamme, Raymond
2014-01-10
We propose a method for universal fault-tolerant quantum computation using concatenated quantum error correcting codes. The concatenation scheme exploits the transversal properties of two different codes, combining them to provide a means to protect against low-weight arbitrary errors. We give the required properties of the error correcting codes to ensure universal fault tolerance and discuss a particular example using the 7-qubit Steane and 15-qubit Reed-Muller codes. Namely, other than computational basis state preparation as required by the DiVincenzo criteria, our scheme requires no special ancillary state preparation to achieve universality, as opposed to schemes such as magic state distillation. We believe that optimizing the codes used in such a scheme could provide a useful alternative to state distillation schemes that exhibit high overhead costs.
Proceedings of the 14th International Conference on the Numerical Simulation of Plasmas
NASA Astrophysics Data System (ADS)
Partial Contents are as follows: Numerical Simulations of the Vlasov-Maxwell Equations by Coupled Particle-Finite Element Methods on Unstructured Meshes; Electromagnetic PIC Simulations Using Finite Elements on Unstructured Grids; Modelling Travelling Wave Output Structures with the Particle-in-Cell Code CONDOR; SST--A Single-Slice Particle Simulation Code; Graphical Display and Animation of Data Produced by Electromagnetic, Particle-in-Cell Codes; A Post-Processor for the PEST Code; Gray Scale Rendering of Beam Profile Data; A 2D Electromagnetic PIC Code for Distributed Memory Parallel Computers; 3-D Electromagnetic PIC Simulation on the NRL Connection Machine; Plasma PIC Simulations on MIMD Computers; Vlasov-Maxwell Algorithm for Electromagnetic Plasma Simulation on Distributed Architectures; MHD Boundary Layer Calculation Using the Vortex Method; and Eulerian Codes for Plasma Simulations.
Lean coding machine. Facilities target productivity and job satisfaction with coding automation.
Rollins, Genna
2010-07-01
Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Andrew; Lawrence, Earl
The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
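The first two steps of the pipeline described above can be sketched in miniature: a Latin Hypercube Sample over the parameter space, followed by a Gaussian-process fit that serves as the response surface. The kernel, length scale, and the stand-in "drag" function below are illustrative assumptions, not the RSM Tool Suite's internals.

```python
import numpy as np

# Minimal LHS + GP-regression sketch (illustrative, not the actual tool suite).

def latin_hypercube(n_samples, n_dims, rng):
    """One point per stratum in every dimension, strata visited in shuffled order."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):
        rng.shuffle(u[:, d])          # decorrelate the dimensions
    return u

def gp_fit_predict(x_train, y_train, x_test, length=0.2, noise=1e-6):
    """Zero-mean GP regression with an RBF kernel: the empirical interpolant."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = k(x_train, x_train) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return k(x_test, x_train) @ alpha

rng = np.random.default_rng(0)
x = latin_hypercube(200, 2, rng)               # ensemble over the unit square
y = np.sin(3.0 * x[:, 0]) + x[:, 1] ** 2       # stand-in for TPMC drag output
pred = gp_fit_predict(x, y, np.array([[0.5, 0.5]]))[0]
```

The MCMC step of the real suite, which samples the GP's non-analytic predictive distribution, is omitted here.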
Scheduling Operations for Massive Heterogeneous Clusters
NASA Technical Reports Server (NTRS)
Humphrey, John; Spagnoli, Kyle
2013-01-01
High-performance computing (HPC) programming has become increasingly difficult with the advent of hybrid supercomputers consisting of multicore CPUs and accelerator boards such as the GPU. Manual tuning of software to achieve high performance on this type of machine has been performed by programmers. This is needlessly difficult and prone to being invalidated by new hardware, new software, or changes in the underlying code. A system was developed for task-based representation of programs, which when coupled with a scheduler and runtime system, allows for many benefits, including higher performance and utilization of computational resources, easier programming and porting, and adaptations of code during runtime. The system consists of a method of representing computer algorithms as a series of data-dependent tasks. The series forms a graph, which can be scheduled for execution on many nodes of a supercomputer efficiently by a computer algorithm. The schedule is executed by a dispatch component, which is tailored to understand all of the hardware types that may be available within the system. The scheduler is informed by a cluster mapping tool, which generates a topology of available resources and their strengths and communication costs. Software is decoupled from its hardware, which aids in porting to future architectures. A computer algorithm schedules all operations, which for systems of high complexity (i.e., most NASA codes), cannot be performed optimally by a human. The system aids in reducing repetitive code, such as communication code, and aids in the reduction of redundant code across projects. It adds new features to code automatically, such as recovering from a lost node or the ability to modify the code while running. In this project, the innovators at the time of this reporting intend to develop two distinct technologies that build upon each other and both of which serve as building blocks for more efficient HPC usage. 
First is the scheduling and dynamic execution framework, and the second is scalable linear algebra libraries that are built directly on the former.
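The task-based representation described above can be reduced to a toy form: tasks form a data-dependency graph, and the scheduler dispatches any task whose inputs are ready. This is an illustrative sketch using Kahn's algorithm, not the system developed in this work.

```python
from collections import defaultdict, deque

# Toy dependency-driven scheduler: compute an execution order in which every
# task runs only after all of its data dependencies have completed.

def schedule(tasks):
    """tasks: {name: set_of_dependency_names}. Returns a valid execution order,
    raising on cyclic dependencies (Kahn's algorithm)."""
    indegree = {t: len(deps) for t, deps in tasks.items()}
    dependents = defaultdict(list)
    for t, deps in tasks.items():
        for d in deps:
            dependents[d].append(t)
    ready = deque(t for t, n in indegree.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()           # a real runtime would dispatch to a free node here
        order.append(t)
        for nxt in dependents[t]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(tasks):
        raise ValueError("cyclic dependency")
    return order
```

In the full system, the dispatch step would also weigh hardware type and communication cost from the cluster map rather than simply popping the next ready task.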
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.; Jones, Scott M.
1991-01-01
This analysis and this computer code apply to full, split, and dual expander cycles. Heat regeneration from the turbine exhaust to the pump exhaust is allowed. The combustion process is modeled as one of chemical equilibrium in an infinite-area or a finite-area combustor. Gas composition in the nozzle may be either equilibrium or frozen during expansion. This report, which serves as a users guide for the computer code, describes the system, the analysis methodology, and the program input and output. Sample calculations are included to show effects of key variables such as nozzle area ratio and oxidizer-to-fuel mass ratio.
Development of a cryogenic mixed fluid J-T cooling computer code, 'JTMIX'
NASA Technical Reports Server (NTRS)
Jones, Jack A.
1991-01-01
An initial study was performed for analyzing and predicting the temperatures and cooling capacities when mixtures of fluids are used in Joule-Thomson coolers and in heat pipes. A computer code, JTMIX, was developed for mixed gas J-T analysis for any fluid combination of neon, nitrogen, various hydrocarbons, argon, oxygen, carbon monoxide, carbon dioxide, and hydrogen sulfide. When used in conjunction with the NIST computer code, DDMIX, it has accurately predicted order-of-magnitude increases in J-T cooling capacities when various hydrocarbons are added to nitrogen, and it predicts nitrogen normal boiling point depressions to as low as 60 K when neon is added.
Particle Hydrodynamics with Material Strength for Multi-Layer Orbital Debris Shield Design
NASA Technical Reports Server (NTRS)
Fahrenthold, Eric P.
1999-01-01
Three dimensional simulation of oblique hypervelocity impact on orbital debris shielding places extreme demands on computer resources. Research to date has shown that particle models provide the most accurate and efficient means for computer simulation of shield design problems. In order to employ a particle based modeling approach to the wall plate impact portion of the shield design problem, it is essential that particle codes be augmented to represent strength effects. This report describes augmentation of a Lagrangian particle hydrodynamics code developed by the principal investigator, to include strength effects, allowing for the entire shield impact problem to be represented using a single computer code.
Laser Signature Prediction Using The VALUE Computer Program
NASA Astrophysics Data System (ADS)
Akerman, Alexander; Hoffman, George A.; Patton, Ronald
1989-09-01
A variety of enhancements are being made to the 1976-vintage LASERX computer code. These include: surface characterization with BRDF tabular data; specular reflection from transparent surfaces; generation of glint direction maps; generation of relative range imagery; an interface to the LOWTRAN atmospheric transmission code; an interface to the LEOPS laser sensor code; and user-friendly menu prompting for easy setup. Versions of VALUE have been written for both VAX/VMS and PC/DOS computer environments. Outputs have also been revised to be user friendly and include tables, plots, and images for (1) intensity, (2) cross section, (3) reflectance, (4) relative range, (5) region type, and (6) silhouette.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clouse, C. J.; Edwards, M. J.; McCoy, M. G.
2015-07-07
Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help insure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.
Experimental and analytical comparison of flowfields in a 110 N (25 lbf) H2/O2 rocket
NASA Technical Reports Server (NTRS)
Reed, Brian D.; Penko, Paul F.; Schneider, Steven J.; Kim, Suk C.
1991-01-01
A gaseous hydrogen/gaseous oxygen 110 N (25 lbf) rocket was examined through the RPLUS code using the full Navier-Stokes equations with finite rate chemistry. Performance tests were conducted on the rocket in an altitude test facility. Preliminary parametric analyses were performed for a range of mixture ratios and fuel film cooling percentages. It is shown that the computed values of specific impulse and characteristic exhaust velocity follow the trend of the experimental data. Specific impulse computed by the code is lower than the comparable test values by about two to three percent. The computed characteristic exhaust velocity values are lower than the comparable test values by three to four percent. Thrust coefficients computed by the code are found to be within two percent of the measured values. It is concluded that the discrepancy between computed and experimental performance values could not be attributed to experimental uncertainty.
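The three performance measures compared above are linked by their standard definitions (F = mdot * Isp * g0 = C_F * p_c * A_t), so the reported percent discrepancies must be mutually consistent. This sketch evaluates the definitions for illustrative numbers, not the actual test conditions of the 110 N thruster.

```python
# Standard rocket performance definitions and the identity that ties them together.

G0 = 9.80665                                  # standard gravity, m/s^2

def specific_impulse(thrust, mdot):
    """Isp = F / (mdot * g0), in seconds."""
    return thrust / (mdot * G0)

def char_exhaust_velocity(p_c, a_t, mdot):
    """c* = p_c * A_t / mdot, in m/s."""
    return p_c * a_t / mdot

def thrust_coefficient(thrust, p_c, a_t):
    """C_F = F / (p_c * A_t), dimensionless."""
    return thrust / (p_c * a_t)

# Illustrative operating point (thrust, flow rate, chamber pressure, throat area):
F, mdot, p_c, a_t = 110.0, 0.025, 8.0e5, 1.0e-4
isp = specific_impulse(F, mdot)
cstar = char_exhaust_velocity(p_c, a_t, mdot)
cf = thrust_coefficient(F, p_c, a_t)
# Identity: Isp * g0 == C_F * c*, since both sides equal F / mdot.
```

Because of this identity, an Isp deficit of two to three percent alongside a c* deficit of three to four percent implies the computed C_F sits slightly above the measured value, consistent with the two-percent agreement reported.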
Reliability model of a monopropellant auxiliary propulsion system
NASA Technical Reports Server (NTRS)
Greenberg, J. S.
1971-01-01
A mathematical model and an associated computer code have been developed to compute the reliability of a monopropellant blowdown hydrazine spacecraft auxiliary propulsion system as a function of time. The propulsion system is used to adjust or modify the spacecraft orbit over an extended period of time. The multiple orbit corrections are the objectives the auxiliary propulsion system is designed to achieve, so the reliability model computes the probability of successfully accomplishing each of the desired orbit corrections. To accomplish this, the reliability model interfaces with a computer code that models the performance of a blowdown (unregulated) monopropellant auxiliary propulsion system. This code acts as a performance model and as such gives an accurate time history of the system operating parameters. The basic timing and status information is passed to and utilized by the reliability model, which establishes the probability of successfully accomplishing the orbit corrections.
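The bookkeeping such a reliability model performs can be sketched simply: the system must survive each dormant coast, survive each burn, and actuate on command at every correction, so the cumulative reliability is a running product over the mission timeline. The constant failure rates and actuation reliability below are illustrative assumptions, not values from this work.

```python
import math

# Hedged sketch of cumulative mission reliability over successive orbit
# corrections, under assumed exponential failure models.

def mission_reliability(corrections, lam_dormant=1e-6, lam_firing=1e-4,
                        p_actuation=0.999):
    """corrections: list of (time_h, burn_h) pairs in mission order.
    Returns the cumulative probability of completing each correction."""
    r, t_prev, out = 1.0, 0.0, []
    for t, burn_h in corrections:
        r *= math.exp(-lam_dormant * (t - t_prev))   # survive the dormant coast
        r *= math.exp(-lam_firing * burn_h)          # survive the burn itself
        r *= p_actuation                             # thruster valve opens on command
        out.append(r)
        t_prev = t
    return out
```

In the actual system, the timing and burn-duration inputs would come from the blowdown performance model's time history rather than being specified by hand.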
A Computational Study of an Oscillating VR-12 Airfoil with a Gurney Flap
NASA Technical Reports Server (NTRS)
Rhee, Myung
2004-01-01
Computations of the flow over an oscillating airfoil with a Gurney flap are performed using a Reynolds-averaged Navier-Stokes code and compared with recent experimental data. The experimental results were generated for several sizes of Gurney flap; the computations focus mainly on one configuration. The baseline airfoil without a Gurney flap is computed and compared with the experiments in both steady and unsteady cases as an initial test of the code's performance. The computations are carried out with different turbulence models. Effects of grid refinement are also examined in steady and unsteady cases, in addition to an assessment of solver effects. Comparisons of the steady lift and drag computations indicate that the code is reasonably accurate for attached flow under steady conditions but largely overpredicts the lift and underpredicts the drag in higher-angle steady flow.