An Investigation of the Flow Physics of Acoustic Liners by Direct Numerical Simulation
NASA Technical Reports Server (NTRS)
Watson, Willie R. (Technical Monitor); Tam, Christopher
2004-01-01
This report describes the effort and status of work on three-dimensional (3-D) simulation of a multi-hole resonator in an impedance tube. This work is coordinated with a parallel experimental effort to be carried out at the NASA Langley Research Center. The outline of this report is as follows: 1. Preliminary considerations. 2. Computational model. 3. Mesh design and parallel computing. 4. Visualization. 5. Status of computer code development.
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1988-01-01
The initial effort was concentrated on developing the quasi-analytical approach for two-dimensional transonic flow. To keep the problem computationally efficient and straightforward, only the two-dimensional flow was considered and the problem was modeled using the transonic small perturbation equation.
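For context, the transonic small perturbation equation referred to above is commonly written (assuming the standard 2-D form; the report's exact notation may differ) as

$$\bigl[1 - M_\infty^2 - (\gamma+1)\,M_\infty^2\,\phi_x\bigr]\,\phi_{xx} + \phi_{yy} = 0,$$

where $\phi$ is the perturbation potential and $M_\infty$ the freestream Mach number. In a quasi-analytical approach, the discretized residual $R(Q,D)=0$ is differentiated with respect to a design variable $D$, giving the linear sensitivity system

$$\frac{\partial R}{\partial Q}\,\frac{dQ}{dD} = -\frac{\partial R}{\partial D}, \qquad \frac{dF}{dD} = \frac{\partial F}{\partial D} + \frac{\partial F}{\partial Q}\,\frac{dQ}{dD},$$

so that sensitivity coefficients of an objective $F$ are obtained without repeated flow solutions.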
Application of computational physics within Northrop
NASA Technical Reports Server (NTRS)
George, M. W.; Ling, R. T.; Mangus, J. F.; Thompkins, W. T.
1987-01-01
An overview of Northrop programs in computational physics is presented. These programs depend on access to today's supercomputers, such as the Numerical Aerodynamical Simulator (NAS), and future growth on the continuing evolution of computational engines. Descriptions here are concentrated on the following areas: computational fluid dynamics (CFD), computational electromagnetics (CEM), computer architectures, and expert systems. Current efforts and future directions in these areas are presented. The impact of advances in the CFD area is described, and parallels are drawn to analogous developments in CEM. The relationship between advances in these areas and the development of advanced (parallel) architectures and expert systems is also presented.
Louisiana: a model for advancing regional e-Research through cyberinfrastructure.
Katz, Daniel S; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott
2009-06-28
Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date.
Computation of Large Turbulence Structures and Noise of Supersonic Jets
NASA Technical Reports Server (NTRS)
Tam, Christopher
1996-01-01
Our research effort concentrated on obtaining an understanding of the generation mechanisms and the prediction of the three components of supersonic jet noise. In addition, we also developed a computational method for calculating the mean flow of turbulent high-speed jets. Below is a short description of the highlights of our contributions in each of these areas: (a) Broadband shock associated noise, (b) Turbulent mixing noise, (c) Screech tones and impingement tones, (d) Computation of the mean flow of turbulent jets.
Louisiana: a model for advancing regional e-Research through cyberinfrastructure
Katz, Daniel S.; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D.; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott
2009-01-01
Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date. PMID:19451102
NASA Technical Reports Server (NTRS)
1979-01-01
Eastman Kodak Company, Rochester, New York is a broad-based firm which produces photographic apparatus and supplies, fibers, chemicals and vitamin concentrates. Much of the company's research and development effort is devoted to photographic science and imaging technology, including laser technology. Eastman Kodak is using a COSMIC computer program called LACOMA in the analysis of laser optical systems and camera design studies. The company reports that use of the program has provided development time savings and reduced computer service fees.
Computerization of Library and Information Services in Mainland China.
ERIC Educational Resources Information Center
Lin, Sharon Chien
1994-01-01
Describes two phases of the automation of library and information services in mainland China. From 1974-86, much effort was concentrated on developing computer systems, databases, online retrieval, and networking. From 1986 to the present, practical progress became possible largely because of CD-ROM technology; and large scale networking for…
Computer Network Security- The Challenges of Securing a Computer Network
NASA Technical Reports Server (NTRS)
Scotti, Vincent, Jr.
2011-01-01
This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.
Evaluation of hydrothermal resources of North Dakota. Phase II. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, K.L.; Howell, F.L.; Winczewski, L.M.
1981-06-01
The Phase II activities dealt with three main topical areas: geothermal gradient and heat-flow studies, stratigraphic studies, and water quality studies. Efforts were concentrated on Mesozoic and Cenozoic rocks. The geothermal gradient and heat-flow studies involved running temperature logs in groundwater observation holes in areas of interest, and locating, obtaining access to, and casing holes of convenience to be used as heat-flow determination sites. The stratigraphic and water quality studies involved two main efforts: updating and expanding WELLFILE and assembling a computer library system (WELLCAT) for all water wells drilled in the state. WATERCAT combines data from the United States Geological Survey Water Resources Division's WATSTOR and GWST computer libraries, and includes physical, stratigraphic, and water quality data. Goals, methods, and results are presented.
Simulation of the Dropping Mercury Electrode by Orthogonal Collocation.
1982-08-18
Mass transport to a dropping mercury electrode is simulated by orthogonal collocation. Accurate values for the concentration profiles and current are obtained with minimal computational effort; the results are accurate to within 0.4% of the values calculated by Koutecky (14), which are corrected for spherical diffusion.
Solar energy concentrator system for crystal growth and zone refining in space
NASA Technical Reports Server (NTRS)
Mcdermit, J. H.
1975-01-01
An assessment of the technological feasibility of using solar concentrators for crystal growth and zone refining in space has been performed. Previous studies of space-deployed solar concentrators were reviewed for their applicability to materials processing and a new state-of-the-art concentrator-receiver radiation analysis was developed. The radiation analysis is in the form of a general purpose computer program. It was concluded from this effort that the technology for fabricating, orbiting and deploying large solar concentrators has been developed. It was also concluded that the technological feasibility of space processing materials in the focal region of a solar concentrator depends primarily on two factors: (1) the ability of a solar concentrator to provide sufficient thermal energy for the process and (2) the ability of a solar concentrator to provide a thermal environment that is conducive to the processes of interest. The analyses indicate that solar concentrators can satisfactorily provide both of these factors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuels, Alex C.; Boele, Cherilynn A.; Bennett, Kevin T.
2014-12-01
A combined experimental and theoretical approach has been used to investigate the complex speciation of Rh(III) in hydrochloric and nitric acid media as a function of acid concentration. This work is relevant to the separation and isolation of Rh(III) from dissolved spent nuclear fuel, which is an emergent and attractive alternative source of platinum group metals relative to traditional mining efforts.
A preliminary design study for a cosmic X-ray spectrometer
NASA Technical Reports Server (NTRS)
1972-01-01
The results are described of theoretical and experimental investigations aimed at the development of a curved crystal cosmic X-ray spectrometer to be used at the focal plane of the large orbiting X-ray telescope on the third High Energy Astronomical Observatory. The effort was concentrated on the development of spectrometer concepts and their evaluation by theoretical analysis, computer simulation, and laboratory testing with breadboard arrangements of crystals and detectors. In addition, a computer-controlled facility for precision testing and evaluation of crystals in air and vacuum was constructed. A summary of research objectives and results is included.
Multiple Concentric Cylinder Model (MCCM) user's guide
NASA Technical Reports Server (NTRS)
Williams, Todd O.; Pindera, Marek-Jerzy
1994-01-01
A user's guide for the computer program mccm.f is presented. The program is based on a recently developed solution methodology for the inelastic response of an arbitrarily layered, concentric cylinder assemblage under thermomechanical loading which is used to model the axisymmetric behavior of unidirectional metal matrix composites in the presence of various microstructural details. These details include the layered morphology of certain types of ceramic fibers, as well as multiple fiber/matrix interfacial layers recently proposed as a means of reducing fabrication-induced, and in-service, residual stress. The computer code allows efficient characterization and evaluation of new fibers and/or new coating systems on existing fibers with a minimum of effort, taking into account inelastic and temperature-dependent properties and different morphologies of the fiber and the interfacial region. It also facilitates efficient design of engineered interfaces for unidirectional metal matrix composites.
NASA Astrophysics Data System (ADS)
Suresh, Deivarajan
Secondary concentrators operate in the focal plane of a point-focusing system such as a paraboloidal dish or a tower and, when properly designed, are capable of enhancing the overall concentration ratio of the optical system by a factor of two to five. The viability of using different shapes, including Compound Parabolic Concentrators (CPCs) of circular cross section and 'trumpets' as secondaries, was demonstrated both analytically and experimentally in recent years. The current research effort is centered on a HCPC (Hexagonal CPC). Major areas addressed include an overview of the state of development of secondary concentrators, some background information related to the design of a HCPC, the results of an analytical study on the thermal behavior of this HCPC under concentrated flux conditions, and a computer model for assessing the possible thermal interactions between the secondary and a high temperature receiver.
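As background to the factor-of-two-to-five enhancement quoted above, the ideal concentration limit of a CPC secondary with acceptance half-angle $\theta_a$ (a standard result, not specific to the HCPC studied here) is

$$C_{\max,\,3D} = \frac{1}{\sin^2\theta_a}, \qquad C_{\max,\,2D} = \frac{1}{\sin\theta_a},$$

and the overall system concentration is approximately the product of the primary and secondary ratios, $C_{\mathrm{sys}} \approx C_{\mathrm{primary}}\,C_{\mathrm{secondary}}$.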
Code of Federal Regulations, 2014 CFR
2014-01-01
... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...
Code of Federal Regulations, 2012 CFR
2012-01-01
... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...
Code of Federal Regulations, 2013 CFR
2013-01-01
... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...
Code of Federal Regulations, 2011 CFR
2011-01-01
... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...
Code of Federal Regulations, 2010 CFR
2010-01-01
... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...
Combining Thermal And Structural Analyses
NASA Technical Reports Server (NTRS)
Winegar, Steven R.
1990-01-01
Computer code makes programs compatible so stresses and deformations can be calculated. Paper describes computer code combining thermal analysis with structural analysis. Called SNIP (for SINDA-NASTRAN Interfacing Program), code provides interface between finite-difference thermal model of system and finite-element structural model when there is no node-to-element correlation between models. Eliminates much manual work in converting temperature results of SINDA (Systems Improved Numerical Differencing Analyzer) program into thermal loads for NASTRAN (NASA Structural Analysis) program. Used to analyze concentrating reflectors for solar generation of electric power. Large thermal and structural models needed to predict distortion of surface shapes, and SNIP saves considerable time and effort in combining models.
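A minimal sketch of the kind of node-to-grid-point mapping an interface program such as SNIP must perform is shown below; it uses inverse-distance interpolation from thermal-model nodes onto structural grid points. The function and array names are illustrative assumptions, not the actual SNIP interface.

```python
import numpy as np

def map_temperatures(thermal_xyz, thermal_T, struct_xyz, n_near=4, eps=1e-12):
    """Interpolate finite-difference node temperatures onto structural grid
    points by inverse-distance weighting of the n_near nearest thermal nodes
    (an illustrative stand-in for a SNIP-style thermal/structural interface)."""
    mapped = np.empty(len(struct_xyz))
    for i, p in enumerate(struct_xyz):
        d = np.linalg.norm(thermal_xyz - p, axis=1)   # distances to all thermal nodes
        idx = np.argsort(d)[:n_near]                  # nearest n_near nodes
        w = 1.0 / (d[idx] + eps)                      # inverse-distance weights
        mapped[i] = np.dot(w, thermal_T[idx]) / w.sum()
    return mapped

# Example: three thermal nodes, two structural grid points (hypothetical data)
thermal_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
thermal_T = np.array([300.0, 350.0, 320.0])           # node temperatures, K
struct_xyz = np.array([[0.5, 0.0, 0.0], [0.2, 0.8, 0.0]])
print(map_temperatures(thermal_xyz, thermal_T, struct_xyz, n_near=2))
```

The mapped temperatures would then be written out as thermal load entries for the structural model.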
Reflection Effects in Multimode Fiber Systems Utilizing Laser Transmitters
NASA Technical Reports Server (NTRS)
Bates, Harry E.
1991-01-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. Now, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of a LED. It has been suggested that the noise is caused by back reflections created at connector fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the Lab View software recently acquired by the lab group. The main goal was to interface the Anritsu Optical and RF spectrum analyzers to the MacIntosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single and multimode fiber is installed at Kennedy. Since most are multimode, this effort concentrated on multimode systems.
Reflection effects in multimode fiber systems utilizing laser transmitters
NASA Astrophysics Data System (ADS)
Bates, Harry E.
1991-11-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. Now, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of a LED. It has been suggested that the noise is caused by back reflections created at connector fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the Lab View software recently acquired by the lab group. The main goal was to interface the Anritsu Optical and RF spectrum analyzers to the MacIntosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single and multimode fiber is installed at Kennedy. Since most are multimode, this effort concentrated on multimode systems.
Recrystallization and Grain Growth Kinetics in Binary Alpha Titanium-Aluminum Alloys
NASA Astrophysics Data System (ADS)
Trump, Anna Marie
Titanium alloys are used in a variety of important naval and aerospace applications and often undergo thermomechanical processing which leads to recrystallization and grain growth. Both of these processes have a significant impact on the mechanical properties of the material. Therefore, understanding the kinetics of these processes is crucial to being able to predict the final properties. Three alloys are studied with varying concentrations of aluminum which allows for the direct quantification of the effect of aluminum content on the kinetics of recrystallization and grain growth. Aluminum is the most common alpha stabilizing alloying element used in titanium alloys; however, the effect of aluminum on these processes has not been previously studied. This work is also part of a larger Integrated Computational Materials Engineering (ICME) effort whose goal is to combine both computational and experimental efforts to develop computationally efficient models that predict materials microstructure and properties based on processing history. The static recrystallization kinetics are measured using an electron backscatter diffraction (EBSD) technique and a significant retardation in the kinetics is observed with increasing aluminum concentration. An analytical model is then used to capture these results and is able to successfully predict the effect of solute concentration on the time to 50% recrystallization. The model reveals that this solute effect is due to a combination of a decrease in grain boundary mobility and a decrease in driving force with increasing aluminum concentration. The effect of microstructural inhomogeneities is also experimentally quantified and the results are validated with a phase field model for recrystallization. These microstructural inhomogeneities explain the experimentally measured Avrami exponent, which is lower than the theoretical value calculated by the JMAK model. Similar to the effect seen in recrystallization, the addition of aluminum also significantly slows down the grain growth kinetics. This is generally attributed to the solute drag effect due to segregation of solute atoms at the grain boundaries; however, aluminum segregation is not observed in these alloys. The mechanism for this result is explained and is used to validate the prediction of an existing model for solute drag.
NASA Astrophysics Data System (ADS)
Stewart, Iris T.; Loague, Keith
2003-12-01
Groundwater vulnerability assessments of nonpoint source agrochemical contamination at regional scales are either qualitative in nature or require prohibitively costly computational efforts. By contrast, the type transfer function (TTF) modeling approach for vadose zone pesticide leaching presented here estimates solute concentrations at a depth of interest, only uses available soil survey, climatic, and irrigation information, and requires minimal computational cost for application. TTFs are soil texture based travel time probability density functions that describe a characteristic leaching behavior for soil profiles with similar soil hydraulic properties. Seven sets of TTFs, representing different levels of upscaling, were developed for six loam soil textural classes with the aid of simulated breakthrough curves from synthetic data sets. For each TTF set, TTFs were determined from a group or subgroup of breakthrough curves for each soil texture by identifying the effective parameters of the function that described the average leaching behavior of the group. The grouping of the breakthrough curves was based on the TTF index, a measure of the magnitude of the peak concentration, the peak arrival time, and the concentration spread. Comparison to process-based simulations show that the TTFs perform well with respect to mass balance, concentration magnitude, and the timing of concentration peaks. Sets of TTFs based on individual soil textures perform better for all the evaluation criteria than sets that span all textures. As prediction accuracy and computational cost increase with the number of TTFs in a set, the selection of a TTF set is determined by a given application.
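The general form of a transfer-function leaching estimate of this kind can be sketched as a travel-time convolution (a generic form, not necessarily the exact parameterization used in the study):

$$C(z^{*}, t) = \int_0^{t} f(\tau)\, C_{\mathrm{in}}(t - \tau)\, d\tau,$$

where $C_{\mathrm{in}}$ is the solute concentration entering the profile and $f(\tau)$ is the travel-time probability density for the depth of interest $z^{*}$; a lognormal density, $f(\tau) = \frac{1}{\tau\sigma\sqrt{2\pi}}\exp\!\bigl[-(\ln\tau-\mu)^2/(2\sigma^2)\bigr]$, is one common choice whose parameters would be fitted per soil textural class.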
Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T
2012-01-01
This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.
Ant colony system algorithm for the optimization of beer fermentation control.
Xiao, Jie; Zhou, Ze-Kui; Zhang, Guang-Xin
2004-12-01
Beer fermentation is a dynamic process that must be guided along a temperature profile to obtain the desired results. An ant colony system algorithm was applied to optimize the kinetic model of this process. During a fixed period of fermentation time, a series of different temperature profiles of the mixture were constructed, and an optimal one was chosen. The optimal temperature profile maximized the final ethanol production and minimized the byproduct concentration and spoilage risk. The satisfactory results obtained did not require much computational effort.
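The generic ant colony system (ACS) update rules underlying such an optimization are, in their standard form (the paper's adaptation to the temperature-profile search is not reproduced here):

$$\tau(r,s) \leftarrow (1-\rho)\,\tau(r,s) + \rho\,\tau_0 \quad \text{(local update after each move)},$$
$$\tau(r,s) \leftarrow (1-\alpha)\,\tau(r,s) + \alpha\,\Delta\tau(r,s), \quad \Delta\tau(r,s) = 1/J_{\mathrm{best}} \quad \text{(global update on the best solution)},$$

with the next move chosen greedily as $\arg\max_u \{\tau(r,u)\,\eta(r,u)^\beta\}$ with probability $q_0$ and sampled proportionally to $\tau\,\eta^\beta$ otherwise; here $J_{\mathrm{best}}$ would be the cost of the best temperature profile found so far.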
Quality assurance for health and environmental chemistry: 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gautier, M.A.; Gladney, E.S.; Koski, N.L.
1991-10-01
This report documents the continuing quality assurance efforts of the Health and Environmental Chemistry Group (HSE-9) at the Los Alamos National Laboratory. The philosophy, methodology, computing resources, and laboratory information management system used by the quality assurance program to encompass the diversity of analytical chemistry practiced in the group are described. Included in the report are all quality assurance reference materials used, along with their certified or consensus concentrations, and all analytical chemistry quality assurance measurements made by HSE-9 during 1990.
Using MODIS Terra 250 m Imagery to Map Concentrations of Total Suspended Matter in Coastal Waters
NASA Technical Reports Server (NTRS)
Miller, Richard L.; McKee, Brent A.
2004-01-01
High concentrations of suspended particulate matter in coastal waters directly affect or govern numerous water column and benthic processes. The concentration of suspended sediments derived from bottom sediment resuspension or discharge of sediment-laden rivers is highly variable over a wide range of time and space scales. Although there has been considerable effort to use remotely sensed images to provide synoptic maps of suspended particulate matter, there are limited routine applications of this technology due in part to the low spatial resolution, long revisit period, or cost of most remotely sensed data. In contrast, near daily coverage of medium-resolution data is available from the MODIS Terra instrument without charge from several data distribution gateways. Equally important, several display and processing programs are available that operate on low cost computers.
Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort
NASA Technical Reports Server (NTRS)
Bao, Han P.
1991-01-01
The goal of this project is to supplement the footwear design system of North Carolina State University (NCSU) with a software module to design and manufacture a combination sole. The four areas of concentration were: customization of NASCAD (NASA Computer Aided Design) to the footwear project; use of CENCIT data; computer aided manufacturing activities; and beginning work for the bottom elements of shoes. The task of generating a software module for producing a sole was completed with a demonstrated product realization. The software written in C was delivered to NCSU for inclusion in their design system for custom footwear known as LASTMOD. The machining process of the shoe last was improved using a spiral tool path approach.
System-Level Virtualization Research at Oak Ridge National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Stephen L; Vallee, Geoffroy R; Naughton, III, Thomas J
2010-01-01
System-level virtualization, originally a technique for effectively sharing what were then considered large computing resources, faded from the spotlight as individual workstations gained in popularity with a one machine - one user approach; today it is enjoying a rebirth. One reason for this resurgence is that the simple workstation has grown in capability to rival that of anything available in the past. Thus, computing centers are again looking at the price/performance benefit of sharing that single computing box via server consolidation. However, industry is concentrating only on the benefits of using virtualization for server consolidation (enterprise computing), whereas our interest is in leveraging virtualization to advance high-performance computing (HPC). While these two interests may appear to be orthogonal, one consolidating multiple applications and users on a single machine while the other requires all the power from many machines to be dedicated solely to its purpose, we propose that virtualization does provide attractive capabilities that may be exploited to the benefit of HPC interests. This raises two fundamental questions: is the concept of virtualization (a machine-sharing technology) really suitable for HPC, and if so, how does one go about leveraging these virtualization capabilities for the benefit of HPC? To address these questions, this document presents ongoing studies on the usage of system-level virtualization in an HPC context. These studies include an analysis of the benefits of system-level virtualization for HPC, a presentation of research efforts based on virtualization for system availability, and a presentation of research efforts for the management of virtual systems. The basis for this document was material presented by Stephen L. Scott at the Collaborative and Grid Computing Technologies meeting held in Cancun, Mexico on April 12-14, 2007.
NASA Astrophysics Data System (ADS)
Peruchena, Carlos M. Fernández; García-Barberena, Javier; Guisado, María Vicenta; Gastón, Martín
2016-05-01
The design of Concentrating Solar Thermal Power (CSTP) systems requires a detailed knowledge of the dynamic behavior of the meteorology at the site of interest. Meteorological series are often condensed into one representative year, defined as a Typical Meteorological Year (TMY), with the aim of reducing data volume and speeding up energy system simulations. This approach is appropriate for rather detailed simulations of a specific plant; however, in earlier stages of the design of a power plant, especially during the optimization of the large number of plant parameters before a final design is reached, a huge number of simulations are needed. Even with today's technology, the computational effort to simulate solar energy system performance with one year of data at high frequency (such as 1-min) may become colossal if a multivariable optimization has to be performed. This work presents a simple and efficient methodology for selecting a reduced number of individual days able to represent the electrical production of the plant throughout the complete year. To achieve this objective, a new procedure for determining a reduced set of typical weather data for evaluating the long-term performance of a solar energy system is proposed. The proposed methodology is based on cluster analysis and permits a drastic reduction of the computational effort related to the calculation of a CSTP plant energy yield by simulating a reduced number of days from a high frequency TMY.
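A minimal sketch of such a cluster-based day selection is shown below, assuming 1-min DNI data and a k-means grouping of daily profiles; the library call, the number of clusters and the synthetic data are illustrative assumptions, not the authors' procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical input: one year of 1-min DNI, reshaped to (365 days, 1440 samples).
rng = np.random.default_rng(0)
dni = np.clip(rng.normal(500, 250, size=(365, 1440)), 0, None)  # stand-in data, W/m^2

k = 12  # number of representative days to retain
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(dni)

# Pick the real day closest to each cluster centroid and weight it by cluster size,
# so the reduced set can approximate the annual behavior.
rep_days, weights = [], []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    d = np.linalg.norm(dni[members] - km.cluster_centers_[c], axis=1)
    rep_days.append(int(members[np.argmin(d)]))
    weights.append(len(members) / 365.0)

print(list(zip(rep_days, [round(w, 3) for w in weights])))
```

Each retained day would then be simulated at full temporal resolution and its energy yield weighted by the fraction of the year its cluster represents.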
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
Qi, Xingliang; Zhang, Jing; Liu, Yapeng; Ji, Shuang; Chen, Zheng; Sluiter, Judith K; Deng, Huihua
2014-04-01
The present study aims to investigate the relationship between effort-reward imbalance and hair cortisol concentration among teachers to examine whether hair cortisol can be a biomarker of chronic work stress. Hair samples were collected from 39 female teachers from three kindergartens. Cortisol was extracted from the hair samples with methanol, and cortisol concentrations were measured with high performance liquid chromatography-tandem mass spectrometry. Work stress was measured using the effort-reward imbalance scale. The ratio of effort to reward showed significantly positive association with hair cortisol concentration. The cortisol concentration in the system increases with the effort-reward imbalance. Measurement of hair cortisol can become a useful biomarker of chronic work stress. Copyright © 2014 Elsevier Inc. All rights reserved.
High level cognitive information processing in neural networks
NASA Technical Reports Server (NTRS)
Barnden, John A.; Fields, Christopher A.
1992-01-01
Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.
Using Computational Toxicology to Enable Risk-Based ...
Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.
EPA’s National Center for Computational Toxicology is engaged in high-profile research efforts to improve the ability to more efficiently and effectively prioritize and screen thousands of environmental chemicals for potential toxicity. A central component of these efforts invol...
Bayesian spatio-temporal modeling of particulate matter concentrations in Peninsular Malaysia
NASA Astrophysics Data System (ADS)
Manga, Edna; Awang, Norhashidah
2016-06-01
This article presents an application of a Bayesian spatio-temporal Gaussian process (GP) model on particulate matter concentrations from Peninsular Malaysia. We analyze daily PM10 concentration levels from 35 monitoring sites in June and July 2011. The spatio-temporal model, set in a Bayesian hierarchical framework, allows for inclusion of informative covariates, meteorological variables and spatio-temporal interactions. Posterior density estimates of the model parameters are obtained by Markov chain Monte Carlo methods. Preliminary data analysis indicates that information on PM10 levels at sites classified as industrial locations could explain part of the space-time variations. We include the site-type indicator in our modeling efforts. Results of the parameter estimates for the fitted GP model show significant spatio-temporal structure and a positive effect of the location-type explanatory variable. We also compute some validation criteria for the out-of-sample sites that show the adequacy of the model for predicting PM10 at unmonitored sites.
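A typical hierarchical specification for such a spatio-temporal GP model can be sketched as follows (a generic form; the covariates and covariance choices in the paper may differ):

$$Z(s_i, t) = \mathbf{x}(s_i, t)^{\top}\boldsymbol{\beta} + w(s_i, t) + \varepsilon(s_i, t), \qquad \varepsilon(s_i, t) \sim N(0, \sigma_\varepsilon^2),$$
$$\mathrm{Cov}\bigl(w(s_i, t),\, w(s_j, t)\bigr) = \sigma_w^2 \exp\bigl(-\phi\,\lVert s_i - s_j \rVert\bigr),$$

where $Z$ is (possibly transformed) daily PM10, $\mathbf{x}$ collects meteorological covariates and the site-type indicator, and $w$ is a zero-mean Gaussian process; priors are placed on $(\boldsymbol{\beta}, \sigma_w^2, \sigma_\varepsilon^2, \phi)$ and the posterior is explored by MCMC.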
Advanced ballistic range technology
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1993-01-01
Experimental interferograms, schlieren, and shadowgraphs are used for quantitative and qualitative flow-field studies. These images are created by passing light through a flow field, and the recorded intensity patterns are functions of the phase shift and angular deflection of the light. As part of the grant NCC2-583, techniques and software have been developed for obtaining phase shifts from finite-fringe interferograms and for constructing optical images from Computational Fluid Dynamics (CFD) solutions. During the period from 1 Nov. 1992 - 30 Jun. 1993, research efforts have been concentrated on improving these techniques.
NASA Astrophysics Data System (ADS)
Gerjuoy, Edward
2005-06-01
The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
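The classical reduction that Shor's algorithm exploits can be illustrated with a brute-force toy; the order-finding step below is done classically and is exactly the part a quantum computer performs efficiently, so the code is a sketch for tiny N only.

```python
from math import gcd
from random import randrange

def order(a, N):
    """Brute-force the multiplicative order r of a mod N (the step Shor's
    period-finding performs efficiently on a quantum computer)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_toy(N, attempts=20):
    """Classical illustration of the reduction from order finding to factoring."""
    for _ in range(attempts):
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g                      # lucky draw: a already shares a factor with N
        r = order(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            f = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < f < N:
                return f                  # non-trivial factor of N
    return None                           # several runs may be needed

print(shor_classical_toy(15))   # prints 3 or 5
```

As the abstract notes, a single run may fail (odd $r$, or $a^{r/2} \equiv -1 \pmod N$), so several runs are generally needed before a non-trivial factor is found.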
Mahmoudi, Babak; Erfanian, Abbas
2006-11-01
Mental imagination is an essential part of most EEG-based communication systems. Thus, the quality of mental rehearsal, the degree of imagined effort, and mind controllability should have a major effect on the performance of electroencephalogram (EEG) based brain-computer interfaces (BCI). It is now well established that mental practice using motor imagery improves motor skills, and the effects of mental practice on motor skill learning are the result of practice on central motor programming. According to this view, it seems logical that mental practice should modify the neuronal activity in the primary sensorimotor areas and consequently change the performance of EEG-based BCI. For developing a practical BCI system, recognizing the resting state with eyes open and the imagined voluntary movement is important. For this purpose, the mind should be able to focus on a single goal for a period of time, without deviation to another context. In this work, we examine the role of mental practice and concentration skills in EEG control during imagined hand movements. The results show that mental practice and concentration can generally improve the classification accuracy of the EEG patterns. It is found that mental training has a significant effect on the classification accuracy over the primary motor cortex and frontal area.
Proposed Directions for Research in Computer-Based Education.
ERIC Educational Resources Information Center
Waugh, Michael L.
Several directions for potential research efforts in the field of computer-based education (CBE) are discussed. (For the purposes of this paper, CBE is defined as any use of computers to promote learning with no intended inference as to the specific nature or organization of the educational application under discussion.) Efforts should be directed…
NASA Astrophysics Data System (ADS)
Yoon, S.
2016-12-01
To define a geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data need to be reprocessed regularly. Reprocessing the GPS data collected by up to 2,000 CORS sites over the last two decades requires substantial computational resources. At the National Geodetic Survey (NGS), the first reprocessing was completed in 2011, and the second reprocessing is currently under way. For the first reprocessing effort, in-house computing resources were utilized; in the current second reprocessing effort, an outsourced cloud computing platform is being utilized. In this presentation, the data processing strategy at NGS is outlined, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of cloud computing. The time and cost savings realized by utilizing the cloud computing approach will also be discussed.
Instability Mechanisms of Thermally-Driven Interfacial Flows in Liquid-Encapsulated Crystal Growth
NASA Technical Reports Server (NTRS)
Haj-Hariri, Hossein; Borhan, Ali
1997-01-01
During the past year, a great deal of effort was focused on the enhancement and refinement of the computational tools developed as part of our previous NASA grant. In particular, the interface mollification algorithm developed earlier was extended to incorporate the effects of surface-rheological properties in order to allow the study of thermocapillary flows in the presence of surface contamination. These tools will be used in the computational component of the proposed research in the remaining years of this grant. A detailed description of the progress made in this area is provided elsewhere. Briefly, the method developed allows for the convection and diffusion of bulk-insoluble surfactants on a moving and deforming interface. The novelty of the method is its grid independence: there is no need for front tracking, surface reconstruction, body-fitted grid generation, or metric evaluations; these are all very expensive computational tasks in three dimensions. For small local radii of curvature there is a need for local grid adaption so that the smearing thickness remains a small fraction of the radius of curvature. A special Neumann boundary condition was devised and applied so that the calculated surfactant concentration has no variations normal to the interface, and it is hence truly a surface-defined quantity. The discretized governing equations are solved subsequently using a time-split integration scheme which updates the concentration and the shape successively. Results demonstrate excellent agreement between the computed and exact solutions.
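The surface transport equation handled by such a scheme is, in its standard form for a bulk-insoluble surfactant of surface concentration $\Gamma$ on an interface moving with velocity $\mathbf{u}$ (a generic statement, not the mollified discretization itself):

$$\frac{\partial \Gamma}{\partial t} + \nabla_s \cdot (\Gamma\,\mathbf{u}_s) + \Gamma\,(\nabla_s \cdot \mathbf{n})(\mathbf{u}\cdot\mathbf{n}) = D_s\,\nabla_s^2 \Gamma,$$

where $\nabla_s$ is the surface gradient, $\mathbf{u}_s$ the tangential velocity, $\mathbf{n}$ the unit normal and $D_s$ the surface diffusivity; the resulting nonuniform $\Gamma$ feeds back on the flow through the Marangoni stress $\nabla_s \sigma(\Gamma)$.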
ERIC Educational Resources Information Center
Ashcraft, Catherine
2015-01-01
To date, girls and women are significantly underrepresented in computer science and technology. Concerns about this underrepresentation have sparked a wealth of educational efforts to promote girls' participation in computing, but these programs have demonstrated limited impact on reversing current trends. This paper argues that this is, in part,…
NASA Technical Reports Server (NTRS)
Ferzali, Wassim; Zacharakis, Vassilis; Upadhyay, Triveni; Weed, Dennis; Burke, Gregory
1995-01-01
The ICAO Aeronautical Mobile Communications Panel (AMCP) completed the drafting of the Aeronautical Mobile Satellite Service (AMSS) Standards and Recommended Practices (SARP's) and the associated Guidance Material and submitted these documents to the ICAO Air Navigation Commission (ANC) for ratification in May 1994. This effort encompassed an extensive, multi-national SARP's validation. As part of this activity, the US Federal Aviation Administration (FAA) sponsored an effort to validate the SARP's via computer simulation. This paper provides a description of this effort. Specifically, it describes: (1) the approach selected for the creation of a high-fidelity AMSS computer model; (2) the test traffic generation scenarios; and (3) the resultant AMSS performance assessment. More recently, the AMSS computer model was also used to provide AMSS performance statistics in support of the RTCA standardization activities. This paper describes this effort as well.
Kinetic Monte Carlo Simulation of Oxygen Diffusion in Ytterbium Disilicate
NASA Astrophysics Data System (ADS)
Good, Brian
2015-03-01
Ytterbium disilicate is of interest as a potential environmental barrier coating for aerospace applications, notably for use in next generation jet turbine engines. In such applications, the diffusion of oxygen and water vapor through these coatings is undesirable if high temperature corrosion is to be avoided. In an effort to understand the diffusion process in these materials, we have performed kinetic Monte Carlo simulations of vacancy-mediated oxygen diffusion in Ytterbium Disilicate. Oxygen vacancy site energies and diffusion barrier energies are computed using Density Functional Theory. We find that many potential diffusion paths involve large barrier energies, but some paths have barrier energies smaller than one electron volt. However, computed vacancy formation energies suggest that the intrinsic vacancy concentration is small in the pure material, with the result that the material is unlikely to exhibit significant oxygen permeability.
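A minimal residence-time (BKL-style) kinetic Monte Carlo sketch of the hop selection described above is given below; the attempt frequency, temperature and barrier values are illustrative assumptions, and a real simulation would track vacancy positions on the disilicate lattice.

```python
import numpy as np

KB = 8.617e-5          # Boltzmann constant, eV/K
NU = 1.0e13            # attempt frequency, 1/s (typical assumed value)

def kmc_hops(barriers_eV, T=1600.0, n_steps=100000, rng=np.random.default_rng(1)):
    """Residence-time kinetic Monte Carlo over a fixed set of candidate vacancy
    hops with DFT-style barrier energies. Returns elapsed time and hop counts."""
    rates = NU * np.exp(-np.asarray(barriers_eV) / (KB * T))  # Arrhenius rates
    t, counts = 0.0, np.zeros(len(rates), dtype=int)
    for _ in range(n_steps):
        total = rates.sum()
        i = rng.choice(len(rates), p=rates / total)   # pick a hop proportional to its rate
        counts[i] += 1
        t += -np.log(rng.random()) / total            # exponential waiting time
    return t, counts

t, counts = kmc_hops([0.6, 0.9, 1.4])   # hypothetical barriers, eV
print(f"simulated time: {t:.3e} s, hops per path: {counts}")
```

The example illustrates why paths with barriers below about one electron volt dominate the dynamics at a given temperature, while high-barrier paths are rarely selected.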
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.
An insight into cyanobacterial genomics--a perspective.
Lakshmi, Palaniswamy Thanga Velan
2007-05-20
At the turn of the millennium, cyanobacteria deserve attention and review to understand their past, present and future. The advent of post-genomic research, which encompasses functional genomics, structural genomics, transcriptomics, pharmacogenomics, proteomics and metabolomics, allows a systematic, wide-ranging approach to biological system studies. Thus, by exploiting genomic and associated protein information through computational analyses, the fledgling information generated by biotechnological analyses can be extrapolated to fill the lacuna of scarce information on cyanobacteria; to that end, this paper attempts to highlight the perspectives available and to encourage researchers to concentrate on the field of cyanobacterial informatics.
A study of Minnesota land and water resources using remote sensing, volume 13
NASA Technical Reports Server (NTRS)
1980-01-01
Progress in the use of LANDSAT data to classify wetlands in the Upper Mississippi River Valley and efforts to evaluate stress in corn and soybean crops are described. Satellite remote sensing data was used to measure particle concentrations in Lake Superior and several different kinds of remote sensing data were synergistically combined in order to identify near surface bedrock in Minnesota. Data analysis techniques which separate those activities requiring extensive computing from those involving a great deal of user interaction were developed to allow the latter to be done in the user's office or in the field.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eyler, L L; Trent, D S; Budden, M J
During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.
42 CFR 441.182 - Maintenance of effort: Computation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... SERVICES Inpatient Psychiatric Services for Individuals Under Age 21 in Psychiatric Facilities or Programs § 441.182 Maintenance of effort: Computation. (a) For expenditures for inpatient psychiatric services... total State Medicaid expenditures in the current quarter for inpatient psychiatric services and...
London, Nir; Ambroggio, Xavier
2014-02-01
Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration. Copyright © 2013 Elsevier Inc. All rights reserved.
Turbulence modeling of free shear layers for high performance aircraft
NASA Technical Reports Server (NTRS)
Sondak, Douglas
1993-01-01
In many flowfield computations, accuracy of the turbulence model employed is frequently a limiting factor in the overall accuracy of the computation. This is particularly true for complex flowfields such as those around full aircraft configurations. Free shear layers such as wakes, impinging jets (in V/STOL applications), and mixing layers over cavities are often part of these flowfields. Although flowfields have been computed for full aircraft, the memory and CPU requirements for these computations are often excessive. Additional computer power is required for multidisciplinary computations such as coupled fluid dynamics and conduction heat transfer analysis. Massively parallel computers show promise in alleviating this situation, and the purpose of this effort was to adapt and optimize CFD codes to these new machines. The objective of this research effort was to compute the flowfield and heat transfer for a two-dimensional jet impinging normally on a cool plate. The results of this research effort were summarized in an AIAA paper titled 'Parallel Implementation of the k-epsilon Turbulence Model'. Appendix A contains the full paper.
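For reference, the standard high-Reynolds-number k-epsilon model implemented in such efforts is (the specific variant and constants in the cited paper may differ):

$$\nu_t = C_\mu \frac{k^2}{\varepsilon},$$
$$\frac{Dk}{Dt} = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right] + P_k - \varepsilon,$$
$$\frac{D\varepsilon}{Dt} = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right] + C_{\varepsilon 1}\frac{\varepsilon}{k}P_k - C_{\varepsilon 2}\frac{\varepsilon^2}{k},$$

with the usual constants $C_\mu = 0.09$, $C_{\varepsilon 1} = 1.44$, $C_{\varepsilon 2} = 1.92$, $\sigma_k = 1.0$, $\sigma_\varepsilon = 1.3$ and production $P_k = \nu_t\bigl(\partial_j U_i + \partial_i U_j\bigr)\partial_j U_i$.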
Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter
2013-01-01
Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032
NASA Astrophysics Data System (ADS)
Stout, Jane G.; Blaney, Jennifer M.
2017-10-01
Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.
Vassena, Eliana; Deraeve, James; Alexander, William H
2017-10-01
Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO model based on hierarchical error prediction, developed to explain MPFC-DLPFC interactions. We derive behavioral predictions that describe how effort and reward information is coded in PFC and how changing the configuration of such environmental information might affect decision-making and task performance involving motivation.
Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.
2016-01-01
A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clouse, C. J.; Edwards, M. J.; McCoy, M. G.
2015-07-07
Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high-performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.
GPS interferometric attitude and heading determination: Initial flight test results
NASA Technical Reports Server (NTRS)
Vangraas, Frank; Braasch, Michael
1991-01-01
Attitude and heading determination using GPS interferometry is a well-understood concept. However, efforts have been concentrated mainly on the development of robust algorithms and applications for low-dynamic, rigid platforms (e.g., shipboard). This paper presents results of what is believed by the authors to be the first real-time flight test of a GPS attitude and heading determination system. The system is installed in Ohio University's Douglas DC-3 research aircraft. Signals from four antennas are processed by an Ashtech 3DF 24-channel GPS receiver. Data from the receiver are sent to a microcomputer for storage and further computations. Attitude and heading data are sent to a second computer for display on a software-generated artificial horizon. Demonstration of this technique proves its candidacy for augmentation of aircraft state estimation for flight control and navigation as well as for numerous other applications.
RANS modeling of scalar dispersion from localized sources within a simplified urban-area model
NASA Astrophysics Data System (ADS)
Rossi, Riccardo; Capra, Stefano; Iaccarino, Gianluca
2011-11-01
The dispersion of a passive scalar downstream of a localized source within a simplified urban-like geometry is examined by means of RANS scalar flux models. The computations are conducted under conditions of neutral stability and for three different incoming wind directions (0°, 45°, 90°) at a roughness Reynolds number of Ret = 391. A Reynolds stress transport model is used to close the flow governing equations, whereas both the standard eddy-diffusivity closure and algebraic flux models are employed to close the transport equation for the passive scalar. The comparison with a DNS database shows improved reliability of algebraic scalar flux models in predicting both the mean concentration and the plume structure. Since algebraic flux models do not substantially increase the computational effort, the results indicate that the use of a tensorial diffusivity can be a promising tool for dispersion simulations in the urban environment.
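For reference, the two closure families contrasted in the abstract above can be written schematically as follows; these are standard textbook forms (simple gradient diffusion and a Daly-Harlow type generalized gradient diffusion), with generic constants rather than the specific ones used in the paper.

```latex
% Standard eddy-diffusivity (gradient-diffusion) closure for the turbulent scalar flux:
\overline{u_i' c'} \;=\; -\frac{\nu_t}{Sc_t}\,\frac{\partial \overline{C}}{\partial x_i}
% Algebraic closure of generalized gradient-diffusion (Daly-Harlow) type, which uses
% the Reynolds stresses and therefore acts as a tensorial diffusivity:
\overline{u_i' c'} \;=\; -C_\theta\,\tau\,\overline{u_i' u_j'}\,\frac{\partial \overline{C}}{\partial x_j}
```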
New Mexico district work-effort analysis computer program
Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.
1972-01-01
The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation 6600 computer system. Central processing computer time has seldom exceeded 5 minutes on the longest year-to-date runs.
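The overhead-apportionment rule described above (prorating hours not chargeable to a project by that project's share of total district hours) is a simple ratio computation. The sketch below restates it in Python with hypothetical project names and hour totals; it is not the original FORTRAN IV code.

```python
# Prorate overhead hours to projects in proportion to their share of directly
# charged hours, in the manner described for the CAN 2 cost-analysis reports.
# Project names and hour totals are hypothetical.

direct_hours = {"Surface Water": 620.0,
                "General Investigations": 410.0,
                "Buffer account A": 170.0}
overhead_hours = 300.0

total_direct = sum(direct_hours.values())
for project, hours in direct_hours.items():
    share = hours / total_direct                 # project's fraction of district effort
    allocated = overhead_hours * share
    print(f"{project:>24}: direct {hours:7.1f} h, overhead {allocated:6.1f} h, "
          f"total {hours + allocated:7.1f} h")
```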
Practices of excellent companies in the drug industry.
Pringle, F; Kleiner, B H
1997-01-01
Examines excellence in three major pharmaceutical companies: Merck, Lilly, and Glaxo. Provides an overview of recent trends in the health care industry. Shows that although all three companies are facing tough competition and strict cost-containment pressures, they continue to develop innovative strategies for increasing the quality of their product offering. Analyses Merck's recent acquisition of Medco and its implications; also highlights Merck's "Vital Interests" programme. Discusses Lilly's recent purchase of PCS from McKesson Drug and Lilly's recent efforts to concentrate on its core business. Introduces Helix, Glaxo's new computer network for pharmacists and explains the benefits of this unique service, both to its users and the sponsor.
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
Earth Science Technology Office's Computational Technologies Project
NASA Technical Reports Server (NTRS)
Fischer, James (Technical Monitor); Merkey, Phillip
2005-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we could predict the applicability of these technologies to the scientific community represented by the CT project and formulate long-term strategies to provide the computational resources necessary to attain the project's anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations and algorithmic requirements, and to assess the capabilities of high-performance computers to satisfy this anticipated need.
Participation in multilateral effort to develop high performance integrated CPC evacuated collectors
NASA Astrophysics Data System (ADS)
Winston, R.; Ogallagher, J. J.
1992-05-01
The University of Chicago Solar Energy Group has had a continuing program and commitment to develop an advanced evacuated solar collector integrating nonimaging concentration into its design. During the period from 1985-1987, some of our efforts were directed toward designing and prototyping a manufacturable version of an Integrated Compound Parabolic Concentrator (ICPC) evacuated collector tube as part of an international cooperative effort involving six organizations in four different countries. This 'multilateral' project made considerable progress towards a commercially practical collector. One of two basic designs considered employed a heat pipe and an internal metal reflector CPC. We fabricated and tested two large diameter (125 mm) borosilicate glass collector tubes to explore this concept. The other design also used a large diameter (125 mm) glass tube but with a specially configured internal shaped mirror CPC coupled to a U-tube absorber. Performance projections in a variety of systems applications using the computer design tools developed by the International Energy Agency (IEA) task on evacuated collectors were used to optimize the optical and thermal design. The long-term goal of this work continues to be the development of a high efficiency, low cost solar collector to supply solar thermal energy at temperatures up to 250 C. Some experience and perspectives based on our work are presented and reviewed. Despite substantial progress, the stability of research support and the market for commercial solar thermal collectors were such that the project could not be continued. A cooperative path involving university, government, and industrial collaboration remains the most attractive near term option for developing a commercial ICPC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xuefang; Hecht, Ethan S.; Christopher, David M.
Much effort has been made to model hydrogen releases from leaks during potential failures of hydrogen storage systems. A reduced-order jet model can be used to quickly characterize these flows at low computational cost. Notional nozzle models are often used to avoid modeling the complex shock structures produced by underexpanded jets, by determining an "effective" source that reproduces the observed downstream trends. In our work, the mean hydrogen concentration fields were measured in a series of subsonic and underexpanded jets using a planar laser Rayleigh scattering system. Furthermore, we compared the experimental data to a reduced-order jet model for subsonic flows and a notional nozzle model coupled to the jet model for underexpanded jets. The values of some key model parameters were determined by comparisons with the experimental data. Finally, the coupled model was also validated against hydrogen concentration measurements for 100 and 200 bar hydrogen jets, with the predictions agreeing well with data in the literature.
Computational Methods for Stability and Control (COMSAC): The Time Has Come
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.
2005-01-01
Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. The general motivation and backdrop for these efforts will be summarized, along with examples of current applications.
Schenk, Liam N.; Anderson, Chauncey W.; Diaz, Paul; Stewart, Marc A.
2016-12-22
Executive Summary: Suspended-sediment and total phosphorus loads were computed for two sites in the Upper Klamath Basin on the Wood and Williamson Rivers, the two main tributaries to Upper Klamath Lake. High temporal resolution turbidity and acoustic backscatter data were used to develop surrogate regression models to compute instantaneous concentrations and loads on these rivers. Regression models for the Williamson River site showed strong correlations of turbidity with total phosphorus and suspended-sediment concentrations (adjusted coefficients of determination [Adj R2]=0.73 and 0.95, respectively). Regression models for the Wood River site had relatively poor, although statistically significant, relations of turbidity with total phosphorus, and turbidity and acoustic backscatter with suspended sediment concentration, with high prediction uncertainty. Total phosphorus loads for the partial 2014 water year (excluding October and November 2013) were 39 and 28 metric tons for the Williamson and Wood Rivers, respectively. These values are within the low range of phosphorus loads computed for these rivers from prior studies using water-quality data collected by the Klamath Tribes. The 2014 partial year total phosphorus loads on the Williamson and Wood Rivers are assumed to be biased low because of the absence of data from the first 2 months of water year 2014, and the drought conditions that were prevalent during that water year. Therefore, total phosphorus and suspended-sediment loads in this report should be considered as representative of a low-water year for the two study sites. Comparing loads from the Williamson and Wood River monitoring sites for November 2013–September 2014 shows that the Williamson and Sprague Rivers combined, as measured at the Williamson River site, contributed substantially more suspended sediment to Upper Klamath Lake than the Wood River, with 4,360 and 1,450 metric tons measured, respectively. Surrogate techniques have proven useful at the two study sites, particularly in using turbidity to compute suspended-sediment concentrations in the Williamson River. This proof-of-concept effort for computing total phosphorus concentrations using turbidity at the Williamson and Wood River sites also has shown that with additional samples over a wide range of flow regimes, high-temporal-resolution total phosphorus loads can be estimated on a daily, monthly, and annual basis, along with uncertainties for total phosphorus and suspended-sediment concentrations computed using regression models. Sediment-corrected backscatter at the Wood River has potential for estimating suspended-sediment loads from the Wood River Valley as well, with additional analysis of the variable streamflow measured at that site. Suspended-sediment and total phosphorus loads with a high level of temporal resolution will be useful to water managers, restoration practitioners, and scientists in the Upper Klamath Basin working toward the common goal of decreasing nutrient and sediment loads in Upper Klamath Lake.
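A minimal version of the surrogate-regression idea (ordinary least squares on log-transformed turbidity and concentration, with a simple smearing bias correction on retransformation) is sketched below. The numbers are placeholders rather than data from the Williamson or Wood Rivers, and the published models may use a different functional form.

```python
# Sketch of a turbidity surrogate regression for suspended-sediment concentration
# (SSC): fit log10(SSC) = b0 + b1*log10(turbidity), then retransform with a
# Duan smearing bias-correction factor. Sample values are placeholders only.
import numpy as np

turbidity = np.array([2.1, 3.5, 6.0, 12.0, 25.0, 48.0])   # FNU
ssc = np.array([4.0, 7.5, 12.0, 26.0, 60.0, 110.0])       # mg/L

x, y = np.log10(turbidity), np.log10(ssc)
b1, b0 = np.polyfit(x, y, 1)                  # slope, intercept
residuals = y - (b0 + b1 * x)
bcf = float(np.mean(10.0 ** residuals))       # smearing bias-correction factor

def predict_ssc(turb_fnu):
    return bcf * 10.0 ** (b0 + b1 * np.log10(turb_fnu))

print(f"fit: b0={b0:.3f}, b1={b1:.3f}, bias correction={bcf:.3f}")
print(f"predicted SSC at 20 FNU: {predict_ssc(20.0):.1f} mg/L")
```

Instantaneous loads then follow by multiplying the predicted concentration by streamflow and a unit-conversion constant.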
A compendium of computational fluid dynamics at the Langley Research Center
NASA Technical Reports Server (NTRS)
1980-01-01
Through numerous summary examples, the scope and general nature of the computational fluid dynamics (CFD) effort at Langley are identified. These summaries will help inform researchers in CFD and line management at Langley of the overall effort. In addition to the in-house efforts, out-of-house CFD work supported by Langley through industrial contracts and university grants is included. Researchers were encouraged to include summaries of work in preliminary and tentative states of development as well as current research approaching definitive results.
Motivational Beliefs, Student Effort, and Feedback Behaviour in Computer-Based Formative Assessment
ERIC Educational Resources Information Center
Timmers, Caroline F.; Braber-van den Broek, Jannie; van den Berg, Stephanie M.
2013-01-01
Feedback can only be effective when students seek feedback and process it. This study examines the relations between students' motivational beliefs, effort invested in a computer-based formative assessment, and feedback behaviour. Feedback behaviour is represented by whether a student seeks feedback and the time a student spends studying the…
Establishing a K-12 Circuit Design Program
ERIC Educational Resources Information Center
Inceoglu, Mustafa M.
2010-01-01
Outreach, as defined by Wikipedia, is an effort by an organization or group to connect its ideas or practices to the efforts of other organizations, groups, specific audiences, or the general public. This paper describes a computer engineering outreach project of the Department of Computer Engineering at Ege University, Izmir, Turkey, to a local…
ERIC Educational Resources Information Center
Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.
2009-01-01
Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…
Serrano, M A; Salvador, A; González-Bono, E G; Sanchís, C; Suay, F
2001-06-01
Relationships between perceived exertion and blood lactate have usually been studied in laboratory or training contexts but not in competition, the most important setting in which sports performance is evaluated. The purpose of this study was to examine the relationships between psychological and physiological indices of physical effort in a competition setting, taking into account the duration of effort. For this, we employed two Ratings of Perceived Exertion (RPE and CR-10) and lactic acid plasma concentration as a biological marker of the effort performed. Thirteen male judo fighters who participated in a sports club competition provided capillary blood samples to assay lactate concentrations and indicated on a scale their Recall of Perceived Exertion for the total competition and again for just the Last Fight, to compare the usefulness of RPE and CR-10 in assessing discrete bouts of effort and a whole session. Analysis showed that perceived exertion, or the effort made during the whole competition, was positively and significantly related to maximal lactate concentration and lactate increase in competition, thus extending the validity of this scale to sports contests. The Recall of Perceived Exertion scores were not significantly correlated with the duration of effort.
Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, P.; Madnia, C. K.; Steinberger, C. J.; Frankel, S. H.
1992-01-01
The basic objective of this research is to extend the capabilities of Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) for the computational analyses of high-speed reacting flows. In the efforts related to LES, we were primarily involved with assessing the performance of various modern Probability Density Function (PDF) methods for providing closures for treating the subgrid fluctuation correlations of scalar quantities in reacting turbulent flows. In the work on DNS, we concentrated on understanding some of the relevant physics of compressible reacting flows by means of statistical analysis of the data generated by DNS of such flows. In the research conducted in the second year of this program, our efforts focused on the modeling of homogeneous compressible turbulent flows by PDF methods, and on DNS of non-equilibrium reacting high-speed mixing layers. Some preliminary work is also in progress on PDF modeling of shear flows and on LES of such flows.
Effects of the 2008 flood on economic performance and food security in Yemen: a simulation analysis.
Breisinger, Clemens; Ecker, Olivier; Thiele, Rainer; Wiebelt, Manfred
2016-04-01
Extreme weather events such as floods and droughts can have devastating consequences for individual well being and economic development, in particular in poor societies with limited availability of coping mechanisms. Combining a dynamic computable general equilibrium model of the Yemeni economy with a household-level calorie consumption simulation model, this paper assesses the economy-wide, agricultural and food security effects of the 2008 tropical storm and flash flood that hit the Hadramout and Al-Mahrah governorates. The estimation results suggest that agricultural value added, farm household incomes and rural food security deteriorated long term in the flood-affected areas. Due to economic spillover effects, significant income losses and increases in food insecurity also occurred in areas that were unaffected by flooding. This finding suggests that while most relief efforts are typically concentrated in directly affected areas, future efforts should also consider surrounding areas and indirectly affected people. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Impact of remote sensing upon the planning, management, and development of water resources
NASA Technical Reports Server (NTRS)
Loats, H. L.; Fowler, T. R.; Frech, S. L.
1974-01-01
A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.
An opportunity cost model of subjective effort and task performance
Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus
2013-01-01
Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
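A toy numerical reading of the opportunity-cost account (our illustration, not the authors' formal model): the felt effort signal scales with the value of the next-best use of limited executive mechanisms, and deployment on the current task is sustained only while its benefit exceeds that cost.

```python
# Toy illustration of an opportunity-cost account of subjective effort.
# Functional forms and values are assumptions made for illustration only.

def effort_and_net_benefit(current_task_value, alternative_values, scale=1.0):
    """Felt effort grows with the value of the best forgone alternative."""
    opportunity_cost = max(alternative_values, default=0.0)
    effort_signal = scale * opportunity_cost
    return effort_signal, current_task_value - opportunity_cost

alternatives = [0.3, 0.7, 0.5]   # values of other possible uses of executive resources
for task_value in (1.0, 0.6):
    effort, net = effort_and_net_benefit(task_value, alternatives)
    decision = "persist" if net > 0 else "reduce deployment"
    print(f"task value {task_value:.1f}: effort signal {effort:.1f}, "
          f"net benefit {net:+.1f} -> {decision}")
```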
An efficient method for hybrid density functional calculation with spin-orbit coupling
NASA Astrophysics Data System (ADS)
Wang, Maoyuan; Liu, Gui-Bin; Guo, Hong; Yao, Yugui
2018-03-01
In first-principles calculations, hybrid functionals are often used to improve accuracy over local exchange-correlation functionals. A drawback is that evaluating the hybrid functional requires significantly more computing effort. When spin-orbit coupling (SOC) is taken into account, the non-collinear spin structure increases computing effort by at least eight times. As a result, hybrid functional calculations with SOC are intractable in most cases. In this paper, we present an approximate solution to this problem by developing an efficient method based on a mixed linear combination of atomic orbitals (LCAO) scheme. We demonstrate the power of this method using several examples and we show that the results compare very well with those of direct hybrid functional calculations with SOC, yet the method only requires a computing effort similar to that without SOC. The presented technique provides a good balance between computing efficiency and accuracy, and it can be extended to magnetic materials.
Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.
Vassena, Eliana; Holroyd, Clay B; Alexander, William H
2017-01-01
In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
Earth and Space Sciences Project Services for NASA HPCC
NASA Technical Reports Server (NTRS)
Merkey, Phillip
2002-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we could predict the applicability of these technologies to the scientific community represented by the CT project and formulate long-term strategies to provide the computational resources necessary to attain the project's anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations and algorithmic requirements, and to assess the capabilities of high-performance computers to satisfy this anticipated need.
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is an automated software-development estimation tool. It yields a significant reduction in the risk of cost overruns and failed projects. It accepts a description of the software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of a defined set of development life-cycle phases. It is written for IBM PC(R)-compatible computers.
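For orientation, a generic COCOMO-style relation of the kind such estimation tools implement is sketched below. The coefficients are the published basic-COCOMO organic-mode values, used here for illustration; they are not necessarily the model or calibration used inside COSTMODL.

```python
# Generic basic-COCOMO (organic mode) effort and schedule estimate.
# Illustrative only; COSTMODL's internal models and calibrations may differ.

def cocomo_organic(ksloc):
    effort_pm = 2.4 * ksloc ** 1.05          # effort in person-months
    schedule_mo = 2.5 * effort_pm ** 0.38    # development schedule in months
    avg_staff = effort_pm / schedule_mo      # implied average staffing level
    return effort_pm, schedule_mo, avg_staff

for size_ksloc in (10, 50, 200):             # thousands of source lines of code
    effort, schedule, staff = cocomo_organic(size_ksloc)
    print(f"{size_ksloc:4d} KSLOC: {effort:7.1f} person-months, "
          f"{schedule:5.1f} months, ~{staff:4.1f} average staff")
```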
Transformed Fourier and Fick equations for the control of heat and mass diffusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guenneau, S.; Petiteau, D.; Zerrad, M.
We review recent advances in the control of diffusion processes in thermodynamics and life sciences through geometric transforms in the Fourier and Fick equations, which govern heat and mass diffusion, respectively. We propose to further encompass transport properties in the transformed equations, whereby the temperature is governed by a three-dimensional, time-dependent, anisotropic heterogeneous convection-diffusion equation, a parabolic partial differential equation combining the diffusion equation and the advection equation. We perform two-dimensional finite element computations for cloaks, concentrators and rotators of complex shape in the transient regime. We note that, in contrast to invisibility cloaks for waves, the temperature (or mass concentration) inside a diffusion cloak crucially depends upon time, its distance from the source, and the diffusivity of the invisibility region. However, heat (or mass) diffusion outside cloaks, concentrators and rotators is unaffected by their presence, whatever their shape or position. Finally, we propose simplified designs of layered cylindrical and spherical diffusion cloaks that might foster experimental efforts in thermal and biochemical metamaterials.
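The governing equation referred to above can be written in the following standard form (our notation, not copied from the paper): a transient convection-diffusion equation for temperature with an anisotropic, heterogeneous conductivity tensor obtained from the geometric transform.

```latex
\rho(\mathbf{x})\,c(\mathbf{x})\,\frac{\partial T}{\partial t}
  \;=\; \nabla\cdot\bigl(\underline{\underline{\kappa}}(\mathbf{x})\,\nabla T\bigr)
  \;-\; \rho(\mathbf{x})\,c(\mathbf{x})\,\mathbf{v}\cdot\nabla T \;+\; s(\mathbf{x},t)
```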
NASA Astrophysics Data System (ADS)
Daniels, R. M.; Jacobs, J. M.; Paranjpye, R.; Lanerolle, L. W.
2016-02-01
The Pathogens group of the NOAA Ecological Forecasting Roadmap has begun a range of efforts to monitor and predict potential pathogen occurrences in shellfish and in U.S. coastal waters. NOAA/NCOSS along with NMFS/NWFSC have led the Pathogens group and the development of web-based tools and forecasts for both Vibrio vulnificus and Vibrio parahaemolyticus. A strong relationship with FDA has allowed the team to develop forecasts that will serve U.S. shellfish harvesters and consumers. NOAA/NOS/CSDL has provided modeling expertise to help the group use the hydrodynamic models and their forecasts of physical variables that drive the ecological predictions. The NOAA/NWS/Ocean Prediction Center has enabled these ecological forecasting efforts by providing the infrastructure, computing knowledge and experience in an operational culture. Daily forecasts have been demonstrated and are available from the web for the Chesapeake Bay, Delaware Bay, Northern Gulf of Mexico, Tampa Bay, Puget Sound and Long Island Sound. The forecast systems run on a daily basis, fed by NOS model data from the NWS/NCEP supercomputers. New forecast tools, including V. parahaemolyticus post-harvest growth and doubling time at ambient air temperature, will also be described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David
The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
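The Grid Convergence Index evaluation mentioned above follows the standard three-grid Roache/Celik procedure; the sketch below implements that textbook form for a constant refinement ratio, with placeholder solution values rather than results from the 5 by 5 bundle simulations.

```python
# Standard three-grid Grid Convergence Index (GCI) estimate for a constant
# refinement ratio r (Roache/Celik procedure). Solution values are placeholders.
import math

def gci_fine(f_fine, f_medium, f_coarse, r, safety_factor=1.25):
    # Observed order of accuracy from the three solutions.
    p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
    e21 = abs((f_fine - f_medium) / f_fine)      # relative fine-medium error
    gci = safety_factor * e21 / (r ** p - 1.0)   # uncertainty band on the fine grid
    return p, gci

p, gci = gci_fine(f_fine=1.000, f_medium=1.012, f_coarse=1.050, r=2.0)
print(f"observed order p = {p:.2f}, fine-grid GCI = {100.0 * gci:.2f}%")
```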
Viscous flow computations for elliptical two-duct version of the SSME hot gas manifold
NASA Technical Reports Server (NTRS)
Roger, R. P.
1986-01-01
The objective of the effort was to numerically simulate viscous subsonic flow in a proposed elliptical two-duct version of the fuel side Hot Gas Manifold (HGM) for the Space Shuttle Main Engine (SSME). The numerical results were to complement both water flow and air flow experiments in the two-duct geometry performed at NASA-MSFC and Rocketdyne. The three-dimensional character of the HGM consists of two essentially different geometries. The first part of the construction is a concentric shell duct structure which channels the gases from a turbine exit into the second part comprised of two cylindrically shaped transfer ducts. The initial concentric shell portion can be further subdivided into a turnaround section and a bowl section. The turnaround duct (TAD) changes the direction of the mean flow by 180 degrees from a smaller radius to a larger radius duct which discharges into the bowl. The cylindrical transfer ducts are attached to the bowl on one side thus providing a plane of symmetry midway between the two. Centerline flow distance from the TAD inlet to the transfer duct exit is approximately two feet. Details of the approach used to numerically simulate laminar or turbulent flow in the HGM geometry are presented. Computational results are presented and discussed.
A Survey of Techniques for Approximate Computing
Mittal, Sparsh
2016-03-18
Approximate computing trades off computation quality against the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.
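One of the simplest techniques covered by surveys of this kind is loop perforation, which skips a fraction of loop iterations and accepts a bounded loss of output quality in exchange for reduced work; the sketch below is a generic illustration, not code from the survey.

```python
# Loop perforation: trade output quality for reduced work by processing only
# every `skip`-th element. Generic illustration of an approximate-computing idea.

def mean_exact(values):
    return sum(values) / len(values)

def mean_perforated(values, skip=4):
    """Process a strided subset, cutting the work roughly by a factor of `skip`."""
    sampled = values[::skip]
    return sum(sampled) / len(sampled)

data = [float(i % 97) for i in range(100_000)]
exact = mean_exact(data)
approx = mean_perforated(data, skip=4)           # roughly 75% of iterations skipped
rel_err = abs(approx - exact) / exact
print(f"exact={exact:.3f}  approx={approx:.3f}  relative error={rel_err:.2%}")
```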
NASA Astrophysics Data System (ADS)
Gedeon, M.; Vandersteen, K.; Rogiers, B.
2012-04-01
Radionuclide concentrations in aquifers represent an important indicator in estimating the impact of a planned surface disposal for low and medium level short-lived radioactive waste in Belgium, developed by the Belgian Agency for Radioactive Waste and Enriched Fissile Materials (ONDRAF/NIRAS), who also coordinates and leads the corresponding research. Estimating aquifer concentrations for individual radionuclides represents a computational challenge because (a) different retardation values are applied to different hydrogeologic units and (b) sequential decay reactions with radionuclides of various sorption characteristics cause long computational times until a steady-state is reached. The presented work proposes a methodology that substantially reduces the computational effort by postprocessing the results of a prior non-reactive tracer simulation. These advective transport results represent the steady-state concentration-to-source-flux ratio and the breakthrough time at each modelling cell. These two variables are further used to estimate the individual radionuclide concentrations by (a) scaling the steady-state concentrations to the source fluxes of individual radionuclides; (b) applying the radioactive decay and ingrowth in a decay chain; (c) scaling the travel time by the retardation factor and (d) applying linear sorption. While all steps except (b) require solving simple linear equations, applying ingrowth of individual radionuclides in decay chains requires solving the differential Bateman equation. This equation needs to be solved once for a unit radionuclide activity at all arrival times found in the numerical grid. The ratios between the parent nuclide activity and the progeny activities are then used in the postprocessing. Results are presented for discrete points and examples of radioactive plume maps are given. These results compare well to the results achieved using a full numerical simulation including the respective chemical reaction processes. Although the proposed method represents a fast way to estimate the radionuclide concentrations without performing time-consuming simulations, its applicability has some limits. The radionuclide source needs to be assumed constant during the period of achieving a steady-state in the model. Otherwise, the source variability of individual radionuclides needs to be modelled using a numerical simulation. However, such a situation only occurs when the source varies during the period before steady state is reached, and such a simulation takes a relatively short time. The proposed method enables an effective estimation of individual radionuclide concentrations in the framework of the performance assessment of a radioactive waste disposal. Reducing the calculation time to a minimum enables performing sensitivity and uncertainty analyses, testing alternative models, etc., thus enhancing the overall quality of the modelling analysis.
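For a single radionuclide (no chain ingrowth), steps (a), (c), and (d) above plus radioactive decay reduce to a short formula applied cell by cell to the tracer results; chain members would additionally require evaluating the Bateman equation at the retarded travel time. The sketch below illustrates the single-nuclide case with assumed values, not results from the study.

```python
# Postprocess a unit-source tracer result into a radionuclide concentration at one
# model cell: scale to the nuclide's source flux and decay it over the retarded
# travel time. Values are assumed for illustration; decay chains are not handled.
import math

def nuclide_concentration(conc_per_unit_flux, travel_time_y,
                          source_flux, half_life_y, retardation):
    decay_const = math.log(2.0) / half_life_y
    # Decay acts over the sorption-retarded travel time to this cell.
    return conc_per_unit_flux * source_flux * math.exp(-decay_const * retardation * travel_time_y)

# Example cell: tracer gives 1.0e-3 (Bq/m3 per Bq/y of source) and a 250-year advective travel time.
c = nuclide_concentration(conc_per_unit_flux=1.0e-3, travel_time_y=250.0,
                          source_flux=2.0e6, half_life_y=1.6e3, retardation=5.0)
print(f"estimated steady-state concentration: {c:.3e} Bq/m3")
```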
NASA Technical Reports Server (NTRS)
Deepak, A.; Becher, J.
1979-01-01
Advanced remote sensing techniques and inversion methods for the measurement of characteristics of aerosol and gaseous species in the atmosphere were investigated. Of particular interest were the physical and chemical properties of aerosols, such as their size distribution, number concentration, and complex refractive index, and the vertical distribution of these properties on a local as well as global scale. Remote sensing techniques for monitoring of tropospheric aerosols were developed as well as satellite monitoring of upper tropospheric and stratospheric aerosols. Computer programs were developed for solving multiple scattering and radiative transfer problems, as well as inversion/retrieval problems. A necessary aspect of these efforts was to develop models of aerosol properties.
34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION ... awarded for the year after the year of the waiver by comparing the amount spent for adult education from ...
34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION ... awarded for the year after the year of the waiver by comparing the amount spent for adult education from ...
34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION ... awarded for the year after the year of the waiver by comparing the amount spent for adult education from ...
34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION ... awarded for the year after the year of the waiver by comparing the amount spent for adult education from ...
34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION ... awarded for the year after the year of the waiver by comparing the amount spent for adult education from ...
NASA Astrophysics Data System (ADS)
Chonacky, Norman; Winch, David
2008-04-01
There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.
Micro-video display with ocular tracking and interactive voice control
NASA Technical Reports Server (NTRS)
Miller, James E.
1993-01-01
In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.
Coupled Hydrogeophysical Inversion and Hydrogeological Data Fusion
NASA Astrophysics Data System (ADS)
Cirpka, O. A.; Schwede, R. L.; Li, W.
2012-12-01
Tomographic geophysical monitoring methods give the opportunity to observe hydrogeological tests at higher spatial resolution than is possible with classical hydraulic monitoring tools. This has been demonstrated in a substantial number of studies in which electrical resistivity tomography (ERT) has been used to monitor salt-tracer experiments. It is now accepted that inversion of such data sets requires a fully coupled framework, explicitly accounting for the hydraulic processes (groundwater flow and solute transport), the relationship between solute and geophysical properties (a petrophysical relationship such as Archie's law), and the governing equations of the geophysical surveying techniques (e.g., the Poisson equation) as a consistent coupled system. These data sets can be amended with data from other, more direct, hydrogeological tests to infer the distribution of hydraulic aquifer parameters. In the inversion framework, meaningful condensation of data not only contributes to inversion efficiency but also increases the stability of the inversion. In particular, transient concentration data themselves only weakly depend on hydraulic conductivity, and model improvement using gradient-based methods is only possible when a substantial agreement between measurements and model output already exists. The latter also holds when concentrations are monitored by ERT. Tracer arrival times, by contrast, show high sensitivity and a more monotonic dependence on hydraulic conductivity than concentrations themselves. Thus, even without using temporal-moment generating equations, inverting travel times rather than concentrations or related geoelectrical signals themselves is advantageous. We have applied this approach to concentrations measured directly or via ERT, and to heat-tracer data. We present a consistent inversion framework including temporal moments of concentrations, geoelectrical signals obtained during salt-tracer tests, drawdown data from hydraulic tomography and flowmeter measurements to identify mainly the hydraulic-conductivity distribution. By stating the inversion as a geostatistical conditioning problem, we obtain parameter sets together with their correlated uncertainty. While we have applied the quasi-linear geostatistical approach as the inverse kernel, other methods, such as ensemble Kalman methods, may suit the same purpose, particularly when many data points are to be included. In order to identify 3-D fields, discretized with about 50 million grid points, we use the high-performance-computing framework DUNE to solve the involved partial differential equations on a midrange computer cluster. We have quantified the worth of different data types in these inference problems. In practical applications, the constitutive relationships between geophysical, thermal, and hydraulic properties can pose a problem, requiring additional inversion. However, poorly constrained transient boundary conditions may call inversion efforts on larger (e.g., regional) scales even more into question. We envision that future hydrogeophysical inversion efforts will target boundary conditions, such as groundwater recharge rates, in conjunction with, or instead of, aquifer parameters. By this, the distinction between data assimilation and parameter estimation will gradually vanish.
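The temporal-moment condensation mentioned above reduces a breakthrough curve c(t) to a few integrals: the zeroth moment measures total mass passage and the ratio of the first to the zeroth moment gives the mean arrival time. A minimal numerical version, using a synthetic curve rather than field or ERT-derived data, is sketched below.

```python
# Condense a concentration breakthrough curve c(t) into temporal moments
# m_k = integral of t**k * c(t) dt, using the trapezoidal rule on a uniform grid.
# The synthetic Gaussian curve stands in for measured or ERT-derived data.
import numpy as np

t = np.linspace(0.0, 200.0, 401)                       # time in hours
c = np.exp(-((t - 60.0) ** 2) / (2.0 * 15.0 ** 2))     # synthetic breakthrough curve
dt = t[1] - t[0]

def moment(order):
    weights = np.ones_like(t)
    weights[0] = weights[-1] = 0.5                     # trapezoidal end weights
    return float(np.sum(weights * t ** order * c) * dt)

m0, m1, m2 = moment(0), moment(1), moment(2)
mean_arrival = m1 / m0
variance = m2 / m0 - mean_arrival ** 2
print(f"m0={m0:.2f}, mean arrival={mean_arrival:.1f} h, variance={variance:.1f} h^2")
```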
Measuring and Modeling Change in Examinee Effort on Low-Stakes Tests across Testing Occasions
ERIC Educational Resources Information Center
Sessoms, John; Finney, Sara J.
2015-01-01
Because schools worldwide use low-stakes tests to make important decisions, value-added indices computed from test scores must accurately reflect student learning, which requires equal test-taking effort across testing occasions. Evaluating change in effort assumes effort is measured equivalently across occasions. We evaluated the longitudinal…
MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank Mueller
2009-02-05
MOLAR is a multi-institution research effort that concentrates on adaptive, reliable, and efficient operating and runtime system solutions for ultra-scale high-end scientific computing on the next generation of supercomputers. This research addresses the challenges outlined by the FAST-OS (forum to address scalable technology for runtime and operating systems) and HECRTF (high-end computing revitalization task force) activities by providing a modular Linux and adaptable runtime support for high-end computing operating and runtime systems. The MOLAR research has the following goals to address these issues. (1) Create a modular and configurable Linux system that allows customized changes based on the requirements of the applications, runtime systems, and cluster management software. (2) Build runtime systems that leverage the OS modularity and configurability to improve efficiency, reliability, scalability, ease-of-use, and provide support to legacy and promising programming models. (3) Advance computer reliability, availability and serviceability (RAS) management systems to work cooperatively with the OS/R to identify and preemptively resolve system issues. (4) Explore the use of advanced monitoring and adaptation to improve application performance and predictability of system interruptions. The overall goal of the research conducted at NCSU is to develop scalable algorithms for high-availability without single points of failure and without single points of control.
NASA Technical Reports Server (NTRS)
Vickers, John
2015-01-01
The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.
Multiphysics Thrust Chamber Modeling for Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Cheng, Gary; Chen, Yen-Sen
2006-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation. A two-pronged approach is employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of heat transfer on thrust performance. Preliminary results on both aspects are presented.
Advances in Quantum Trajectory Approaches to Dynamics
NASA Astrophysics Data System (ADS)
Askar, Attila
2001-03-01
The quantum fluid dynamics (QFD) formulation is based on the separation of the amplitude and phase of the complex wave function in Schrodinger's equation. The approach leads to conservation laws for an equivalent "gas continuum". The Lagrangian [1] representation corresponds to following the particles of the fluid continuum, i. e. calculating "quantum trajectories". The Eulerian [2] representation on the other hand, amounts to observing the dynamics of the gas continuum at the points of a fixed coordinate frame. The combination of several factors leads to a most encouraging computational efficiency. QFD enables the numerical analysis to deal with near monotonic amplitude and phase functions. The Lagrangian description concentrates the computation effort to regions of highest probability as an optimal adaptive grid. The Eulerian representation allows the study of multi-coordinate problems as a set of one-dimensional problems within an alternating direction methodology. An explicit time integrator limits the increase in computational effort with the number of discrete points to linear. Discretization of the space via local finite elements [1,2] and global radial functions [3] will be discussed. Applications include wave packets in four-dimensional quadratic potentials and two coordinate photo-dissociation problems for NOCl and NO2. [1] "Quantum fluid dynamics (QFD) in the Lagrangian representation with applications to photo-dissociation problems", F. Sales, A. Askar and H. A. Rabitz, J. Chem. Phys. 11, 2423 (1999) [2] "Multidimensional wave-packet dynamics within the fluid dynamical formulation of the Schrodinger equation", B. Dey, A. Askar and H. A. Rabitz, J. Chem. Phys. 109, 8770 (1998) [3] "Solution of the quantum fluid dynamics equations with radial basis function interpolation", Xu-Guang Hu, Tak-San Ho, H. A. Rabitz and A. Askar, Phys. Rev. E. 61, 5967 (2000)
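For reference, the amplitude-phase separation underlying the QFD formulation is the standard Madelung form: writing the wave function as an amplitude times a phase factor yields a continuity equation for the probability density and a quantum Hamilton-Jacobi equation containing the quantum potential Q (standard equations, our notation).

```latex
\psi = R\,e^{iS/\hbar}, \qquad \rho = R^{2}, \qquad \mathbf{v} = \frac{\nabla S}{m}
\frac{\partial \rho}{\partial t} + \nabla\cdot\bigl(\rho\,\mathbf{v}\bigr) = 0
\frac{\partial S}{\partial t} + \frac{|\nabla S|^{2}}{2m} + V + Q = 0,
\qquad Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}R}{R}
```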
Billones, Junie B; Carrillo, Maria Constancia O; Organo, Voltaire G; Sy, Jamie Bernadette A; Clavio, Nina Abigail B; Macalino, Stephani Joy Y; Emnacen, Inno A; Lee, Alexandra P; Ko, Paul Kenny L; Concepcion, Gisela P
2017-01-01
Computer-aided drug discovery and development approaches such as virtual screening, molecular docking, and in silico drug property calculations have been utilized in this effort to discover new lead compounds against tuberculosis. The enzyme 7,8-diaminopelargonic acid aminotransferase (BioA) in Mycobacterium tuberculosis (Mtb), primarily involved in the lipid biosynthesis pathway, was chosen as the drug target due to the fact that humans are not capable of synthesizing biotin endogenously. The computational screening of 4.5 million compounds from the Enamine REAL database has ultimately yielded 45 high-scoring, high-affinity compounds with desirable in silico absorption, distribution, metabolism, excretion, and toxicity properties. Seventeen of the 45 compounds were subjected to bioactivity validation using the resazurin microtiter assay. Among the 4 actives, compound 7 ((Z)-N-(2-isopropoxyphenyl)-2-oxo-2-((3-(trifluoromethyl)cyclohexyl)amino)acetimidic acid) displayed inhibitory activity up to 83% at 10 μg/mL concentration against the growth of the Mtb H37Ra strain. PMID:28280303
A CAD Approach to Integrating NDE With Finite Element
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Downey, James; Ghosn, Louis J.; Baaklini, George Y.
2004-01-01
Nondestructive evaluation (NDE) is one of several technologies applied at NASA Glenn Research Center to determine atypical deformities, cracks, and other anomalies experienced by structural components. NDE consists of applying high-quality imaging techniques (such as x-ray imaging and computed tomography (CT)) to discover hidden manufactured flaws in a structure. Efforts are in progress to integrate NDE with the finite element (FE) computational method to perform detailed structural analysis of a given component. This report presents the core outlines for an in-house technical procedure that incorporates this combined NDE-FE interrelation. An example is presented to demonstrate the applicability of this analytical procedure. FE analysis of a test specimen is performed, and the resulting von Mises stresses and the stress concentrations near the anomalies are observed, which indicates the fidelity of the procedure. Additional information elaborating on the steps needed to perform such an analysis is clearly presented in the form of mini step-by-step guidelines.
NASA Technical Reports Server (NTRS)
Wey, Thomas; Liu, Nan-Suey
2008-01-01
This paper first describes the fluid network approach recently implemented into the National Combustion Code (NCC) for simulating the transport of aerosols (volatile particles and soot) in particulate sampling systems. This network-based approach complements the two approaches already in the NCC, namely, the lower-order temporal approach and the CFD-based approach. The accuracy and computational costs of these three approaches are then investigated in terms of their application to the prediction of particle losses through sample transmission and distribution lines. Their predictive capabilities are assessed by comparing the computed results with experimental data. The present work will help establish standard methodologies for measuring the size and concentration of particles in high-temperature, high-velocity jet engine exhaust. Furthermore, it represents the first step of a long-term effort to validate physics-based tools for the prediction of aircraft particulate emissions.
Field measurements and modeling of dilution in the wake of a US navy frigate.
Katz, C N; Chadwick, D B; Rohr, J; Hyman, M; Ondercin, D
2003-08-01
A field measurement and computer modeling effort was made to assess the dilution field of pulped waste materials discharged into the wake of a US Navy frigate. Pulped paper and fluorescein dye were discharged from the frigate's pulper at known rates. The subsequent particle and dye concentration field was then measured throughout the wake by a following vessel using multiple independent measures. Minimum dilution of the pulped paper reached 3.2 x 10^5 within 1900 m behind the frigate, or about 8 min after discharge. Independent measures typically agreed within 25% of one another and within 20% of model predictions. Minimum dilution of dye reached 2.3 x 10^5 at a down-wake distance of approximately 3500 m, or roughly 15 min; comparisons to model predictions were again within 20%. The field test not only characterized wake dilution under one set of at-sea conditions, but also validated the computer model used for assessing a wide range of ships and conditions.
34 CFR 403.185 - How does the Secretary compute maintenance of effort in the event of a waiver?
Code of Federal Regulations, 2010 CFR
2010-07-01
34 Education 3 (2010-07-01), Education Regulations of the Offices of the Department...: State Vocational and Applied Technology Education Program, What Financial Conditions Must Be Met by a State?, § 403.185, How does the Secretary compute maintenance of effort in the event of a waiver?
Computational Fluid Dynamics Technology for Hypersonic Applications
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2003-01-01
Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.
ERIC Educational Resources Information Center
Kolata, Gina
1984-01-01
Examines social influences which discourage women from pursuing studies in computer science, including monopoly of computer time by boys at the high school level, sexual harassment in college, movies, and computer games. Describes some initial efforts to encourage females of all ages to study computer science. (JM)
Robust peptide bundles designed computationally
NASA Astrophysics Data System (ADS)
Haider, Michael; Zhang, Huixi Violet; Kiick, Kristi; Saven, Jeffery; Pochan, Darrin
Peptides are ideal candidates for the design and controlled assembly of nanoscale materials due to their potential to assemble with atomistic precision as in biological systems. Unlike other work utilizing natural proteins and structural motifs, this effort is completely de novo in order to build arbitrary structures of desired size for the specific placement and separation of functional groups. We have successfully designed, computationally, soluble coiled-coil peptide tetramer bundles that are robust and stable. Using circular dichroism we demonstrated the thermal stability of these bundles and confirmed their alpha-helical, coiled-coil nature. The stability of these bundles arises from the computational design of the coiled-coil interior core residues. The coiled-coil tetramer was confirmed to be the dominant species by analytical ultracentrifugation sedimentation studies. We also established how these bundles behave in solution using small-angle neutron scattering. The form factor of the bundles is well represented by a cylinder model, and their behavior at high concentrations is modeled using a structure factor for aggregates of the cylinders. All of these experiments support our claim that the designed coiled-coil bundles were achieved in solution. NSF DMREF 1234161.
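To make the scattering analysis above concrete, the following Python sketch evaluates the orientation-averaged form factor P(q) of a rigid cylinder, the standard model used to represent such bundles in small-angle scattering; the radius, length, and q-range are invented for illustration, and no concentration-dependent structure factor is included.

```python
# Orientation-averaged form factor P(q) of a rigid cylinder (radius R, length L),
# the model said above to represent the bundle SANS data. Dimensions are invented
# for illustration and no concentration-dependent structure factor is included.
import numpy as np
from scipy.special import j1
from scipy.integrate import quad

def cylinder_pq(q, radius, length):
    def amplitude(alpha):
        qa = q * length * np.cos(alpha) / 2.0
        qr = q * radius * np.sin(alpha)
        term_len = np.sinc(qa / np.pi)                 # sin(x)/x via numpy's sinc convention
        term_rad = 2.0 * j1(qr) / qr if qr != 0 else 1.0
        return term_len * term_rad
    integrand = lambda a: amplitude(a) ** 2 * np.sin(a)
    val, _ = quad(integrand, 0.0, np.pi / 2.0)
    return val

q_values = np.logspace(-2, 0, 5)                       # 1/Angstrom (hypothetical range)
for q in q_values:
    print("q = %.3f 1/A  P(q) = %.4e" % (q, cylinder_pq(q, radius=12.0, length=40.0)))
```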
Combining Computational and Social Effort for Collaborative Problem Solving
Wagy, Mark D.; Bongard, Josh C.
2015-01-01
Rather than replacing human labor, there is growing evidence that networked computers create opportunities for collaborations of people and algorithms to solve problems beyond the reach of either alone. In this study, we demonstrate the conditions under which such synergy can arise. We show that, for a design task, three elements are sufficient: humans apply intuitions to the problem, algorithms automatically determine and report back on the quality of designs, and humans observe and innovate on others' designs to focus creative and computational effort on good designs. This study suggests how such collaborations should be composed for other domains, as well as how social and computational dynamics mutually influence one another during collaborative problem solving. PMID:26544199
NASA Astrophysics Data System (ADS)
Lele, Sanjiva K.
2002-08-01
Funds were received in April 2001 under the Department of Defense DURIP program for construction of a 48 processor high performance computing cluster. This report details the hardware which was purchased and how it has been used to enable and enhance research activities directly supported by, and of interest to, the Air Force Office of Scientific Research and the Department of Defense. The report is divided into two major sections. The first section after this summary describes the computer cluster, its setup, and some cluster performance benchmark results. The second section explains ongoing research efforts which have benefited from the cluster hardware, and presents highlights of those efforts since installation of the cluster.
Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.
2009-01-01
In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
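As a hedged illustration of the two-step model-selection procedure described above (not the USGS implementation itself), the sketch below fits a simple linear regression of suspended-sediment concentration on turbidity, then a multiple regression adding streamflow, and keeps the multiple model only if the streamflow term is significant and the percent standard error improves; the calibration data and the significance threshold are synthetic assumptions.

```python
# Sketch of the two-step regression selection for computing suspended-sediment
# concentration (SSC) from turbidity, optionally adding streamflow.
# Synthetic calibration data and the significance criterion are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
turbidity = rng.uniform(5, 500, 120)            # FNU (hypothetical)
streamflow = rng.uniform(1, 80, 120)            # m^3/s (hypothetical)
ssc = 1.8 * turbidity + 0.9 * streamflow + rng.normal(0, 25, 120)  # mg/L

def percent_standard_error(model, y):
    """Residual standard error expressed as a percentage of the mean concentration."""
    return 100.0 * np.sqrt(model.mse_resid) / y.mean()

# Step 1: simple linear regression, SSC ~ turbidity
simple = sm.OLS(ssc, sm.add_constant(turbidity)).fit()

# Step 2: multiple linear regression, SSC ~ turbidity + streamflow
X2 = sm.add_constant(np.column_stack([turbidity, streamflow]))
multiple = sm.OLS(ssc, X2).fit()

use_multiple = (multiple.pvalues[2] < 0.05 and
                percent_standard_error(multiple, ssc) < percent_standard_error(simple, ssc))
chosen = multiple if use_multiple else simple
print("model:", "turbidity + streamflow" if use_multiple else "turbidity only")
print("percent standard error: %.1f%%" % percent_standard_error(chosen, ssc))
```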
...contributes to the research efforts for commercial buildings. This effort is dedicated to studying commercial-sector whole-building energy simulation, scientific computing, and software configuration...
Modeling and control for closed environment plant production systems
NASA Technical Reports Server (NTRS)
Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)
2002-01-01
A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
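A minimal sketch of the kind of cost function described above, weighted control effort plus squared tracking error, minimized over set-point adjustments; the linear crop-response model, weights, horizon, and bounds are hypothetical stand-ins for the MPR crop models and controller used in the study.

```python
# Minimal model-predictive set-point adjustment sketch: minimize weighted control
# effort plus squared error between predicted response and a reference trajectory.
# The linear response model, weights, and bounds are hypothetical stand-ins.
import numpy as np
from scipy.optimize import minimize

horizon = 6                                    # prediction steps
reference = np.full(horizon, 1.0)              # desired relative growth trajectory
disturbance = np.linspace(0.0, -0.2, horizon)  # e.g. effect of a temperature drop

B = np.array([0.4, 0.3, 0.2])                  # assumed sensitivity to [light, temp, CO2]
R = np.diag([1.0, 0.5, 0.2])                   # control-effort weights
Q = 10.0                                       # tracking-error weight

def predicted_response(u_flat):
    u = u_flat.reshape(horizon, 3)
    return 1.0 + disturbance + u @ B

def cost(u_flat):
    u = u_flat.reshape(horizon, 3)
    tracking = Q * np.sum((predicted_response(u_flat) - reference) ** 2)
    effort = np.sum(np.einsum('ij,jk,ik->i', u, R, u))   # u_i^T R u_i at each step
    return tracking + effort

u0 = np.zeros(horizon * 3)
result = minimize(cost, u0, method="L-BFGS-B", bounds=[(-0.5, 0.5)] * (horizon * 3))
print("optimal set-point adjustments (first step):", result.x[:3].round(3))
```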
Wang, Yi; Hess, Tamara Noelle; Jones, Victoria; Zhou, Joe Zhongxiang; McNeil, Michael R.; McCammon, J. Andrew
2011-01-01
The complex and highly impermeable cell wall of Mycobacterium tuberculosis (Mtb) is largely responsible for the ability of the mycobacterium to resist the action of chemical therapeutics. An L-rhamnosyl residue, which occupies an important anchoring position in the Mtb cell wall, is an attractive target for novel anti-tuberculosis drugs. In this work, we report a virtual screening (VS) study targeting Mtb dTDP-deoxy-L-lyxo-4-hexulose reductase (RmlD), the last enzyme in the L-rhamnosyl synthesis pathway. Through two rounds of VS, we have identified four RmlD inhibitors with half-maximal inhibitory concentrations of 0.9-25 μM and whole-cell minimum inhibitory concentrations of 20-200 μg/ml. Compared with our previous high-throughput screening targeting another enzyme involved in L-rhamnosyl synthesis, virtual screening produced higher hit rates, supporting the use of computational methods in future anti-tuberculosis drug discovery efforts. PMID:22014548
Validation of a reduced-order jet model for subsonic and underexpanded hydrogen jets
Li, Xuefang; Hecht, Ethan S.; Christopher, David M.
2016-01-01
Much effort has been made to model hydrogen releases from leaks during potential failures of hydrogen storage systems. A reduced-order jet model can be used to quickly characterize these flows with low computational cost. Notional nozzle models are often used to avoid modeling the complex shock structures produced by underexpanded jets, by determining an "effective" source that reproduces the observed downstream trends. In our work, the mean hydrogen concentration fields were measured in a series of subsonic and underexpanded jets using a planar laser Rayleigh scattering system. Furthermore, we compared the experimental data to a reduced-order jet model for subsonic flows and a notional nozzle model coupled to the jet model for underexpanded jets. The values of some key model parameters were determined by comparisons with the experimental data. Finally, the coupled model was also validated against hydrogen concentration measurements for 100 and 200 bar hydrogen jets, with the predictions agreeing well with data in the literature.
Pharmacokinetic modeling of a gel-delivered dapivirine microbicide in humans.
Halwes, Michael E; Steinbach-Rankins, Jill M; Frieboes, Hermann B
2016-10-10
Although a number of drugs have been developed for the treatment and prevention of human immunodeficiency virus (HIV) infection, it has proven difficult to optimize the drug and dosage parameters. The vaginal tissue, comprising epithelial, stromal, and blood compartments, presents a complex system that challenges evaluation of drug kinetics solely through empirical effort. To provide insight into the underlying processes, mathematical modeling and computational simulation have been applied to the study of retroviral microbicide pharmacokinetics. Building upon previous pioneering work that modeled the delivery of Tenofovir (TFV) via topical delivery to the vaginal environment, here we computationally evaluate the performance of the retroviral inhibitor dapivirine released from a microbicide gel. We adapt the TFV model to simulate the multicompartmental diffusion and uptake of dapivirine into the blood plasma and vaginal compartments. The results show that dapivirine is expected to accumulate at the interface between the gel and epithelium compartments due to its hydrophobic characteristics. Hydrophobicity also results in decreased diffusivity, which may impact distribution by up to 2 orders of magnitude compared to TFV. Maximum concentrations of dapivirine in the epithelium, stroma, and blood were 9.9 × 10⁷, 2.45 × 10⁶, and 119 pg/mL, respectively. This suggests that greater initial doses or longer time frames are required to obtain higher drug concentrations in the epithelium. These observations may have important ramifications if a specific time frame is required for efficacy, or if a minimum/maximum concentration is needed in the mucus, epithelium, or stroma based on combined efficacy and safety data. Copyright © 2016 Elsevier B.V. All rights reserved.
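A minimal sketch of a multicompartment model of the general type adapted here, with first-order transfer from gel to epithelium to stroma to blood and first-order elimination; all rate constants, the time span, and the initial dose are invented and are not the fitted dapivirine parameters.

```python
# Compartmental sketch (gel -> epithelium -> stroma -> blood) with first-order
# transfer and elimination, solved with scipy. All rate constants and the initial
# gel dose are illustrative assumptions, not fitted dapivirine values.
import numpy as np
from scipy.integrate import solve_ivp

k_ge, k_es, k_sb, k_elim = 0.30, 0.05, 0.02, 0.10   # 1/h (hypothetical)

def rhs(t, y):
    gel, epi, stroma, blood = y
    return [-k_ge * gel,
            k_ge * gel - k_es * epi,
            k_es * epi - k_sb * stroma,
            k_sb * stroma - k_elim * blood]

y0 = [1.0e6, 0.0, 0.0, 0.0]          # initial drug amount in the gel (arbitrary units)
sol = solve_ivp(rhs, (0.0, 72.0), y0, dense_output=True, max_step=0.5)

t = np.linspace(0.0, 72.0, 145)
gel, epi, stroma, blood = sol.sol(t)
print("peak epithelium amount: %.3g at t = %.1f h" % (epi.max(), t[epi.argmax()]))
print("peak blood amount:      %.3g at t = %.1f h" % (blood.max(), t[blood.argmax()]))
```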
Study of the Use of Time-Mean Vortices to Generate Lift for MAV Applications
2011-05-31
A suspended microplate was fabricated via MEMS technology and driven to in-plane resonance via a Lorentz force. Computational effort centers on optimization of a range of parameters (geometry, frequency, amplitude of oscillation, etc.).
A General Approach to Measuring Test-Taking Effort on Computer-Based Tests
ERIC Educational Resources Information Center
Wise, Steven L.; Gao, Lingyun
2017-01-01
There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…
Computational strategies for three-dimensional flow simulations on distributed computer systems
NASA Technical Reports Server (NTRS)
Sankar, Lakshmi N.; Weed, Richard A.
1995-01-01
This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
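The task-queue (manager-worker) load-balancing idea mentioned above can be sketched as follows, with Python's multiprocessing standing in for PVM and a trivial solve_block function standing in for a flow-solver subdomain; this is an illustration of the strategy, not the TEAM solver.

```python
# Task-queue (manager-worker) sketch: a manager hands out grid blocks to workers
# as they become free, so faster workers naturally take more work. solve_block is
# a hypothetical work unit standing in for a flow-solver subdomain.
import multiprocessing as mp
import random
import time

def solve_block(block_id):
    time.sleep(random.uniform(0.05, 0.2))      # pretend per-block cost varies
    return block_id, "residual=%.2e" % random.random()

def worker(task_q, result_q):
    while True:
        block = task_q.get()
        if block is None:                      # poison pill: no more work
            break
        result_q.put(solve_block(block))

if __name__ == "__main__":
    n_workers, n_blocks = 4, 20
    task_q, result_q = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(task_q, result_q)) for _ in range(n_workers)]
    for p in procs:
        p.start()
    for block in range(n_blocks):
        task_q.put(block)
    for _ in procs:
        task_q.put(None)
    results = [result_q.get() for _ in range(n_blocks)]
    for p in procs:
        p.join()
    print(sorted(results)[:3], "...")
```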
Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron
2016-10-01
The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort, which was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with an international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools (VTK, ITK, CMake, CDash, DCMTK) were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and internationally funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Bennett, Jerome (Technical Monitor)
2002-01-01
The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.
NASA Astrophysics Data System (ADS)
Wimer, N. T.; Mackoweicki, A. S.; Poludnenko, A. Y.; Hoffman, C.; Daily, J. W.; Rieker, G. B.; Hamlington, P.
2017-12-01
Results are presented from a joint computational and experimental research effort focused on understanding and characterizing wildland fire spread at small scales (roughly 1m-1mm) using direct numerical simulations (DNS) with chemical kinetics mechanisms that have been calibrated using data from high-speed laser diagnostics. The simulations are intended to directly resolve, with high physical accuracy, all small-scale fluid dynamic and chemical processes relevant to wildland fire spread. The high fidelity of the simulations is enabled by the calibration and validation of DNS sub-models using data from high-speed laser diagnostics. These diagnostics have the capability to measure temperature and chemical species concentrations, and are used here to characterize evaporation and pyrolysis processes in wildland fuels subjected to an external radiation source. The chemical kinetics code CHEMKIN-PRO is used to study and reduce complex reaction mechanisms for water removal, pyrolysis, and gas phase combustion during solid biomass burning. Simulations are then presented for a gaseous pool fire coupled with the resulting multi-step chemical reaction mechanisms, and the results are connected to the fundamental structure and spread of wildland fires. It is anticipated that the combined computational and experimental approach of this research effort will provide unprecedented access to information about chemical species, temperature, and turbulence during the entire pyrolysis, evaporation, ignition, and combustion process, thereby permitting more complete understanding of the physics that must be represented by coarse-scale numerical models of wildland fire spread.
A stratospheric aerosol model with perturbations induced by the space shuttle particulate effluents
NASA Technical Reports Server (NTRS)
Rosen, J. M.; Hofmann, D. J.
1977-01-01
A one dimensional steady state stratospheric aerosol model is developed that considers the subsequent perturbations caused by including the expected space shuttle particulate effluents. Two approaches to the basic modeling effort were made: in one, enough simplifying assumptions were introduced so that a more or less exact solution to the descriptive equations could be obtained; in the other approach very few simplifications were made and a computer technique was used to solve the equations. The most complex form of the model contains the effects of sedimentation, diffusion, particle growth and coagulation. Results of the perturbation calculations show that there will probably be an immeasurably small increase in the stratospheric aerosol concentration for particles larger than about 0.15 micrometer radius.
Space Life Support Engineering Program
NASA Technical Reports Server (NTRS)
Seagrave, Richard C.
1993-01-01
This report covers the second year of research relating to the development of closed-loop, long-term life support systems. Emphasis was directed toward the development of dynamic simulation techniques and software and toward a thermodynamic systems analysis aimed at beginning to optimize the system needed for water purification. Four appendices are attached. The first covers the ASPEN modeling of the closed-loop Environmental Control Life Support System (ECLSS) and its thermodynamic analysis. The second is a report on the dynamic model development for water regulation in humans. The third concerns the development of an interactive computer-based model for determining exercise limitations. The fourth attachment is an estimate of the second-law thermodynamic efficiency of the various units comprising an ECLSS.
Computerizing the Accounting Curriculum.
ERIC Educational Resources Information Center
Nash, John F.; England, Thomas G.
1986-01-01
Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)
Computers in Schools: White Boys Only?
ERIC Educational Resources Information Center
Hammett, Roberta F.
1997-01-01
Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)
Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.
ERIC Educational Resources Information Center
Balajthy, Ernest
1988-01-01
Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
Computers for the Faculty: How on a Limited Budget.
ERIC Educational Resources Information Center
Arman, Hal; Kostoff, John
An informal investigation of the use of computers at Delta College (DC) in Michigan revealed reasonable use of computers by faculty in disciplines such as mathematics, business, and technology, but very limited use in the humanities and social sciences. In an effort to increase faculty computer usage, DC decided to make computers available to any…
Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming
Philip A. Araman
1990-01-01
This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...
Biomechanics of Head, Neck, and Chest Injury Prevention for Soldiers: Phase 2 and 3
2016-08-01
...understanding of the biomechanics of the head and brain. Task 2.3 details the computational modeling efforts conducted to evaluate the response of the cervical spine and the effects of cervical arthrodesis and arthroplasty during... The section also details the progress made on the development of a testing apparatus to evaluate cervical spine implants in survivable loading scenarios.
Limits on fundamental limits to computation.
Markov, Igor L
2014-08-14
An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.
Office workers' computer use patterns are associated with workplace stressors.
Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J
2014-11-01
This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s-5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Mass balance assessment for mercury in Lake Champlain
Gao, N.; Armatas, N.G.; Shanley, J.B.; Kamman, N.C.; Miller, E.K.; Keeler, G.J.; Scherbatskoy, T.; Holsen, T.M.; Young, T.; McIlroy, L.; Drake, S.; Olsen, Bill; Cady, C.
2006-01-01
A mass balance model for mercury in Lake Champlain was developed in an effort to understand the sources, inventories, concentrations, and effects of mercury (Hg) contamination in the lake ecosystem. To construct the mass balance model, air, water, and sediment were sampled as a part of this project and other research/monitoring projects in the Lake Champlain Basin. This project produced a STELLA-based computer model and quantitative apportionments of the principal input and output pathways of Hg for each of 13 segments in the lake. The model Hg concentrations in the lake were consistent with measured concentrations. Specifically, the modeling identified surface water inflows as the largest direct contributor of Hg into the lake. Direct wet deposition to the lake was the second largest source of Hg followed by direct dry deposition. Volatilization and sedimentation losses were identified as the two major removal mechanisms. This study significantly improves previous estimates of the relative importance of Hg input pathways and of wet and dry deposition fluxes of Hg into Lake Champlain. It also provides new estimates of volatilization fluxes across different lake segments and sedimentation loss in the lake. © 2006 American Chemical Society.
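A minimal single-segment sketch of the kind of mass balance built here: inflow and direct wet and dry deposition as inputs, with volatilization, sedimentation, and outflow treated as first-order losses; every flux, rate constant, and the segment volume is invented for illustration and none are Lake Champlain values.

```python
# One-segment mercury mass-balance sketch: dM/dt = inputs - outputs, with
# volatilization, sedimentation, and flushing treated as first-order losses.
# All fluxes, rate constants, and the segment volume are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

V = 5.0e9            # segment volume, m^3 (hypothetical)
Q_in = 60.0          # inflow, m^3/s
C_in = 2.0e-6        # inflow Hg concentration, g/m^3 (= 2 ng/L)
wet_dep = 0.8        # g/day, direct wet deposition
dry_dep = 0.3        # g/day, direct dry deposition
k_volat = 0.004      # 1/day, first-order volatilization
k_sed = 0.006        # 1/day, first-order sedimentation
k_out = Q_in * 86400.0 / V          # 1/day, hydraulic flushing

def dmass_dt(t, m):
    inputs = Q_in * 86400.0 * C_in + wet_dep + dry_dep   # g/day
    losses = (k_volat + k_sed + k_out) * m[0]            # g/day
    return [inputs - losses]

sol = solve_ivp(dmass_dt, (0.0, 3650.0), [0.0], max_step=5.0)
steady_mass = sol.y[0, -1]                               # g, near steady state
print("steady-state Hg mass: %.2f kg" % (steady_mass / 1000.0))
print("steady-state concentration: %.2f ng/L" % (steady_mass / V * 1e6))
```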
Neurocomputational mechanisms underlying subjective valuation of effort costs
Giehl, Kathrin; Sillence, Annie
2017-01-01
In everyday life, we have to decide whether it is worth exerting effort to obtain rewards. Effort can be experienced in different domains, with some tasks requiring significant cognitive demand and others being more physically effortful. The motivation to exert effort for reward is highly subjective and varies considerably across the different domains of behaviour. However, very little is known about the computational or neural basis of how different effort costs are subjectively weighed against rewards. Is there a common, domain-general system of brain areas that evaluates all costs and benefits? Here, we used computational modelling and functional magnetic resonance imaging (fMRI) to examine the mechanisms underlying value processing in both the cognitive and physical domains. Participants were trained on two novel tasks that parametrically varied either cognitive or physical effort. During fMRI, participants indicated their preferences between a fixed low-effort/low-reward option and a variable higher-effort/higher-reward offer for each effort domain. Critically, reward devaluation by both cognitive and physical effort was subserved by a common network of areas, including the dorsomedial and dorsolateral prefrontal cortex, the intraparietal sulcus, and the anterior insula. Activity within these domain-general areas also covaried negatively with reward and positively with effort, suggesting an integration of these parameters within these areas. Additionally, the amygdala appeared to play a unique, domain-specific role in processing the value of rewards associated with cognitive effort. These results are the first to reveal the neurocomputational mechanisms underlying subjective cost–benefit valuation across different domains of effort and provide insight into the multidimensional nature of motivation. PMID:28234892
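A hedged sketch of the kind of subjective-value model typically fitted in such studies, with reward devalued by a quadratic effort cost and choices generated by a softmax rule; the functional form, parameters, and simulated choices are illustrative assumptions, not the model reported in the paper.

```python
# Effort-discounting sketch: subjective value SV = R - k*E^2, choices between a
# fixed low-effort/low-reward baseline and a variable offer via a softmax rule.
# Functional form, parameters, and simulated data are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 200
reward = rng.uniform(2, 10, n)       # variable offer reward
effort = rng.uniform(0.1, 1.0, n)    # variable offer effort (normalized)
base_sv = 1.0                        # value of the fixed low-effort/low-reward option

def simulate(k_true=4.0, beta_true=2.0):
    sv = reward - k_true * effort ** 2
    p_accept = 1.0 / (1.0 + np.exp(-beta_true * (sv - base_sv)))
    return (rng.random(n) < p_accept).astype(float)

choices = simulate()

def neg_log_lik(params):
    k, beta = params
    sv = reward - k * effort ** 2
    p = 1.0 / (1.0 + np.exp(-beta * (sv - base_sv)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(choices * np.log(p) + (1 - choices) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[1.0, 1.0], bounds=[(0, 20), (0.01, 20)])
print("recovered discount rate k = %.2f, inverse temperature = %.2f" % tuple(fit.x))
```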
Parallel computing for automated model calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.
2002-07-29
Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data/models, allowing scientists to focus effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null-cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data, and output only a small amount of statistical information for each calibration run. A typical auto-calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two auto-calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed-computing, cross-platform environment. They allow incorporation of 'smart' calibration parameter generation (using artificial intelligence processing techniques). Null-cycle computing, similar to SETI@home, has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
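A hedged sketch of distributing many calibration runs and scoring each against two objectives (peak magnitude and peak timing), in the spirit of the multi-objective tool described; run_model is a trivial stand-in for a natural-resources model, and the objectives, weights, and parameter ranges are arbitrary.

```python
# Parallel parameter sweep with multi-objective scoring (peak magnitude and peak
# timing of a simulated hydrograph). run_model is a trivial stand-in for a real
# natural-resources model; objectives, weights, and parameter ranges are illustrative.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

t = np.linspace(0, 10, 200)
observed = 5.0 * np.exp(-((t - 4.0) ** 2) / 1.5)    # synthetic "observed" flow

def run_model(params):
    peak, center = params
    simulated = peak * np.exp(-((t - center) ** 2) / 1.5)
    mag_err = abs(simulated.max() - observed.max())
    time_err = abs(t[simulated.argmax()] - t[observed.argmax()])
    score = 1.0 * mag_err + 2.0 * time_err           # weighted multi-objective score
    return score, params

def main():
    rng = np.random.default_rng(2)
    candidates = [(rng.uniform(1, 10), rng.uniform(0, 10)) for _ in range(10_000)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_model, candidates, chunksize=256))
    best_score, best_params = min(results)
    print("best score %.3f with params %s" % (best_score, np.round(best_params, 2)))

if __name__ == "__main__":
    main()
```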
2010-07-01
Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through the Internet... cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to...
Validation of hydrogen gas stratification and mixing models
Wu, Hsingtzu; Zhao, Haihua
2015-05-26
Two validation benchmarks confirm that the BMIX++ code is capable of simulating unintended hydrogen release scenarios efficiently. The BMIX++ (UC Berkeley mechanistic MIXing code in C++) code has been developed to accurately and efficiently predict the fluid mixture distribution and heat transfer in large stratified enclosures for accident analyses and design optimizations. The BMIX++ code uses a scaling-based one-dimensional method to achieve a large reduction in computational effort compared to a 3-D computational fluid dynamics (CFD) simulation. Two BMIX++ benchmark models have been developed. One is for a single buoyant jet in an open space and the other is for a large sealed enclosure with both a jet source and a vent near the floor. Both have been validated by comparisons with experimental data, and excellent agreement is observed. Entrainment coefficients of 0.09 and 0.08 best fit the experimental data for hydrogen leaks with Froude numbers of 99 and 268, respectively. In addition, the BMIX++ simulation results for the average helium concentration in an enclosure with a vent and a single jet agree with the experimental data within a margin of about 10% for jet flow rates ranging from 1.21 × 10⁻⁴ to 3.29 × 10⁻⁴ m³/s. In conclusion, the computing time for each BMIX++ model on a normal desktop computer is less than 5 min.
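The scaling-based entrainment idea can be sketched with a simple top-hat round-jet integral model in which the volume flux grows at a rate set by an entrainment coefficient and tracer dilution follows from volume-flux growth; the source conditions are invented and this is not the BMIX++ formulation itself.

```python
# Entrainment-based round-jet sketch (top-hat profiles, no buoyancy): volume flux
# grows as ambient fluid is entrained, and tracer dilution follows from Q*C = Q0*C0.
# Source conditions and the coefficient value are illustrative, not the BMIX++ model.
import numpy as np
from scipy.integrate import solve_ivp

alpha = 0.08            # entrainment coefficient (order of the fitted values above)
d0, u0 = 0.01, 20.0     # source diameter (m) and velocity (m/s), hypothetical
Q0 = np.pi * (d0 / 2) ** 2 * u0          # source volume flux
M0 = Q0 * u0                              # momentum flux (conserved for a pure jet)

def rhs(z, y):
    Q = y[0]
    u = M0 / Q                            # top-hat velocity from momentum conservation
    b = np.sqrt(Q / (np.pi * u))          # top-hat radius
    return [2.0 * np.pi * alpha * b * u]  # dQ/dz = entrainment

sol = solve_ivp(rhs, (0.0, 2.0), [Q0], max_step=0.005)
dilution = sol.y[0] / Q0                  # mean concentration ~ 1 / dilution
for z in (0.5, 1.0, 2.0):
    i = np.searchsorted(sol.t, z) - 1
    print("z = %.1f m: dilution ~ %.0f, mean concentration ~ %.3f%% of source"
          % (z, dilution[i], 100.0 / dilution[i]))
```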
Future Computer Requirements for Computational Aerodynamics
NASA Technical Reports Server (NTRS)
1978-01-01
Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.
2010-06-01
Emerging Neuromorphic Computing Architectures and Enabling... (dates covered: April 2009 to January 2010). The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications... belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored...
Overview 1993: Computational applications
NASA Technical Reports Server (NTRS)
Benek, John A.
1993-01-01
Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.
Computing at the speed limit (supercomputers)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernhard, R.
1982-07-01
The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers, about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.
Computer Augmented Video Education.
ERIC Educational Resources Information Center
Sousa, M. B.
1979-01-01
Describes project CAVE (Computer Augmented Video Education), an ongoing effort at the U.S. Naval Academy to present lecture material on videocassette tape, reinforced by drill and practice through an interactive computer system supported by a 12 channel closed circuit television distribution and production facility. (RAO)
Computer Guided Instructional Design.
ERIC Educational Resources Information Center
Merrill, M. David; Wood, Larry E.
1984-01-01
Describes preliminary efforts to create the Lesson Design System, a computer-guided instructional design system written in Pascal for Apple microcomputers. Its content outline, strategy, display, and online lesson editors correspond roughly to instructional design phases of content and strategy analysis, display creation, and computer programing…
CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY
The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...
Langley's Computational Efforts in Sonic-Boom Softening of the Boeing HSCT
NASA Technical Reports Server (NTRS)
Fouladi, Kamran
1999-01-01
NASA Langley's computational efforts in the sonic-boom softening of the Boeing high-speed civil transport are discussed in this paper. In these efforts, an optimization process using a higher order Euler method for analysis was employed to reduce the sonic boom of a baseline configuration through fuselage camber and wing dihedral modifications. Fuselage modifications did not provide any improvements, but the dihedral modifications were shown to be an important tool for the softening process. The study also included aerodynamic and sonic-boom analyses of the baseline and some of the proposed "softened" configurations. Comparisons of two Euler methodologies and two propagation programs for sonic-boom predictions are also discussed in the present paper.
The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L F; Garaizar, F X; Henson, V E
2009-10-12
In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.
Development and Validation of a Computational Model for Androgen Receptor Activity
2016-01-01
Testing thousands of chemicals to identify potential androgen receptor (AR) agonists or antagonists would cost millions of dollars and take decades to complete using current validated methods. High-throughput in vitro screening (HTS) and computational toxicology approaches can more rapidly and inexpensively identify potential androgen-active chemicals. We integrated 11 HTS ToxCast/Tox21 in vitro assays into a computational network model to distinguish true AR pathway activity from technology-specific assay interference. The in vitro HTS assays probed perturbations of the AR pathway at multiple points (receptor binding, coregulator recruitment, gene transcription, and protein production) and multiple cell types. Confirmatory in vitro antagonist assay data and cytotoxicity information were used as additional flags for potential nonspecific activity. Validating such alternative testing strategies requires high-quality reference data. We compiled 158 putative androgen-active and -inactive chemicals from a combination of international test method validation efforts and semiautomated systematic literature reviews. Detailed in vitro assay information and results were compiled into a single database using a standardized ontology. Reference chemical concentrations that activated or inhibited AR pathway activity were identified to establish a range of potencies with reproducible reference chemical results. Comparison with existing Tier 1 AR binding data from the U.S. EPA Endocrine Disruptor Screening Program revealed that the model identified binders at relevant test concentrations (<100 μM) and was more sensitive to antagonist activity. The AR pathway model based on the ToxCast/Tox21 assays had balanced accuracies of 95.2% for agonist (n = 29) and 97.5% for antagonist (n = 28) reference chemicals. Out of 1855 chemicals screened in the AR pathway model, 220 chemicals demonstrated AR agonist or antagonist activity and an additional 174 chemicals were predicted to have potential weak AR pathway activity. PMID:27933809
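For reference, balanced accuracy, the metric quoted above for the agonist and antagonist reference sets, is simply the mean of sensitivity and specificity; the confusion-matrix counts in this sketch are invented and do not reproduce the paper's results.

```python
# Balanced accuracy = (sensitivity + specificity) / 2, the metric quoted above for
# the reference-chemical evaluation. The labels and predictions below are invented
# and do not reproduce the paper's confusion matrices.
def balanced_accuracy(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return 0.5 * (sensitivity + specificity)

# Hypothetical reference-chemical calls: 1 = active, 0 = inactive
y_true = [1] * 20 + [0] * 9
y_pred = [1] * 19 + [0] + [0] * 8 + [1]
print("balanced accuracy: %.3f" % balanced_accuracy(y_true, y_pred))
```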
Craven, Stephen; Shirsat, Nishikant; Whelan, Jessica; Glennon, Brian
2013-01-01
A Monod kinetic model, logistic equation model, and statistical regression model were developed for a Chinese hamster ovary cell bioprocess operated under three different modes of operation (batch, bolus fed-batch, and continuous fed-batch) and grown on two different bioreactor scales (3 L bench-top and 15 L pilot-scale). The Monod kinetic model was developed for all modes of operation under study and predicted cell density and glucose, glutamine, lactate, and ammonia concentrations well for the bioprocess. However, it was computationally demanding due to the large number of parameters necessary to produce a good model fit. The transferability of the Monod kinetic model structure and parameter set across bioreactor scales and modes of operation was investigated and a parameter sensitivity analysis performed. The experimentally determined parameters had the greatest influence on model performance. They changed with scale and mode of operation, but were easily calculated. The remaining parameters, which were fitted using a differential evolutionary algorithm, were not as crucial. Logistic equation and statistical regression models were investigated as alternatives to the Monod kinetic model. They were less computationally intensive to develop due to the absence of a large parameter set. However, modeling of the nutrient and metabolite concentrations proved to be troublesome due to the logistic equation model structure and the inability of both models to incorporate a feed. The complexity, computational load, and effort required for model development have to be balanced with the necessary level of model sophistication when choosing which model type to develop for a particular application. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
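A reduced Monod-type batch sketch of the general kind described, with growth saturating in glucose and lactate produced in proportion to glucose consumption; it uses far fewer states and parameters than the paper's model, and all values are illustrative.

```python
# Reduced Monod-type batch sketch: viable cell density X grows at a rate that
# saturates in glucose S; glucose is consumed and lactate L produced in proportion
# to growth. Far fewer states/parameters than the paper's model; values illustrative.
import numpy as np
from scipy.integrate import solve_ivp

mu_max = 0.04      # 1/h, maximum specific growth rate (hypothetical)
Ks = 1.0           # mM, Monod half-saturation constant for glucose
Yxs = 2.0e8        # cells produced per mmol glucose consumed
Yls = 1.5          # mmol lactate produced per mmol glucose consumed
kd = 0.005         # 1/h, death rate

def rhs(t, y):
    X, S, L = y
    mu = mu_max * S / (Ks + S)
    dX = (mu - kd) * X
    dS = -mu * X / Yxs
    dL = -Yls * dS
    return [dX, dS, dL]

y0 = [2.0e8, 30.0, 0.0]            # cells/L, mM glucose, mM lactate
sol = solve_ivp(rhs, (0.0, 200.0), y0, max_step=1.0)
print("final cell density: %.2e cells/L" % sol.y[0, -1])
print("final glucose: %.1f mM, final lactate: %.1f mM" % (sol.y[1, -1], sol.y[2, -1]))
```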
Overview of NASA/OAST efforts related to manufacturing technology
NASA Technical Reports Server (NTRS)
Saunders, N. T.
1976-01-01
An overview of some of NASA's current efforts related to manufacturing technology and some possible directions for the future are presented. The topics discussed are: computer-aided design, composite structures, and turbine engine components.
Data Network Weather Service Reporting - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Frey
2012-08-30
A final report is made of a three-year effort to develop a new forecasting paradigm for computer network performance. This effort was made in coordination with Fermilab's construction of the e-Weather Center.
Lane, Andrew M.; Totterdell, Peter; MacDonald, Ian; Devonport, Tracey J.; Friesen, Andrew P.; Beedie, Christopher J.; Stanley, Damian; Nevill, Alan
2016-01-01
In conjunction with BBC Lab UK, the present study developed 12 brief psychological skill interventions for online delivery. A protocol was designed that captured data via self-report measures, used video recordings to deliver interventions, involved a competitive concentration task against an individually matched computer opponent, and provided feedback on the effects of the interventions. Three psychological skills were used: imagery, self-talk, and if-then planning, with each skill directed to one of four different foci: outcome goal, process goal, instruction, or arousal-control. This resulted in 12 different intervention participant groups (randomly assigned) with a 13th group acting as a control. Participants (n = 44,742) completed a competitive task four times: practice, baseline, following an intervention, and again after repeating the intervention. Results revealed that performance improved following practice, with incremental effects for imagery-outcome, imagery-process, self-talk-outcome, and self-talk-process over the control group; the same interventions also increased the intensity of effort invested, arousal, and pleasant emotion. Arousal-control interventions were associated with pleasant emotions, low arousal, and low effort invested in performance. Instructional interventions were not effective. Results offer support for the utility of online interventions in teaching psychological skills and suggest that brief interventions focused on increasing motivation, arousal, effort invested, and pleasant emotions were the most effective. PMID:27065904
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
Implementing Equal Access Computer Labs.
ERIC Educational Resources Information Center
Clinton, Janeen; And Others
This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…
Mentoring the Next Generation of Science Gateway Developers and Users
NASA Astrophysics Data System (ADS)
Hayden, L. B.; Jackson-Ward, F.
2016-12-01
The Science Gateway Institute (SGW-I) for the Democratization and Acceleration of Science was an SI2-SSE Collaborative Research conceptualization award funded by NSF in 2012. From 2012 through 2015, we engaged interested members of the science and engineering community in a planning process for a Science Gateway Community Institute (SGCI). Science Gateways provide Web interfaces to some of the most sophisticated cyberinfrastructure resources. They interact with remotely executing science applications on supercomputers; connect to remote scientific data collections, instruments, and sensor streams; and support large collaborations. Gateways allow scientists to concentrate on the most challenging science problems while underlying components such as computing architectures and interfaces to data collections change. The goal of our institute was to provide coordinating activities across the National Science Foundation, eventually providing services more broadly to projects funded by other agencies. SGW-I has succeeded in identifying two underrepresented communities of future gateway designers and users. The Association of Computer and Information Science/Engineering Departments at Minority Institutions (ADMI) was identified as a source of future gateway designers. The National Organization for the Professional Advancement of Black Chemists and Chemical Engineers (NOBCChE) was identified as a community of future science gateway users. SGW-I efforts to engage NOBCChE and ADMI faculty and students are now woven into the workforce development component of SGCI. SGCI (ScienceGateways.org) is a collaboration of six universities, led by the San Diego Supercomputer Center. The workforce development component is led by Elizabeth City State University (ECSU). ECSU efforts focus on: producing a model of engagement; integrating research into education; and mentoring students while aggressively addressing diversity. This paper documents the outcome of the SGW-I conceptualization project and describes the extensive Workforce Development effort going forward into the 5-year SGCI project recently funded by NSF.
The Effort Paradox: Effort Is Both Costly and Valued.
Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y
2018-04-01
According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.
76 FR 28443 - President's National Security Telecommunications Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
... Government's use of cloud computing; the Federal Emergency Management Agency's NS/EP communications... Commercial Satellite Mission Assurance; and the way forward for the committee's cloud computing effort. The...
System Award for developing a tool that has had a lasting influence on computing. Project Jupyter evolved from IPython, an effort pioneered by Fernando Pérez.
Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts
NASA Technical Reports Server (NTRS)
Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky
2012-01-01
We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.
NASA Astrophysics Data System (ADS)
Johnson, K. S.; Plant, J. N.; Sakamoto, C.; Coletti, L. J.; Sarmiento, J. L.; Riser, S.; Talley, L. D.
2016-12-01
Sixty profiling floats with ISUS and SUNA nitrate sensors have been deployed in the Southern Ocean (south of 30 degrees S) as part of the SOCCOM (Southern Ocean Carbon and Climate Observations and Modeling) program and earlier efforts. These floats have produced detailed records of the annual cycle of nitrate concentration throughout the region from the surface to depths near 2000 m. In surface waters, there are clear cycles in nitrate concentration that result from uptake of nitrate during austral spring and summer. These changes in nitrate concentration were used to compute the annual net community production over this region. NCP was computed using a simplified version of the approach detailed by Plant et al. (2016, Global Biogeochemical Cycles, 30, 859-879, DOI: 10.1002/2015GB005349). At the time the abstract was written 41 complete annual cycles were available from floats deployed before the austral summer of 2015/2016. After filtering the data to remove floats that crossed distinct frontal boundaries, floats with other anomalies, and floats in sub-tropical waters, 23 cycles were available. A preliminary assessment of the data yields an NCP of 2.8 +/- 0.95 (1 SD) mol C/m2/y after integrating to 100 m depth and converting nitrate uptake to carbon using the Redfield ratio. This preliminary assessment ignores vertical transport across the nitracline and is, therefore, a minimum estimate. The number of cycles available for analysis will increase rapidly, as 32 of the floats were deployed in the austral summer of 2015/2016 and have not yet been analyzed.
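The core of the NCP estimate above is an integration of the seasonal nitrate drawdown over the upper 100 m, followed by a Redfield-ratio conversion to carbon. A minimal sketch of that arithmetic in Python, using made-up profile values rather than SOCCOM float data and the standard 106:16 C:N ratio, might look like this:

```python
# Minimal sketch: annual NCP from seasonal nitrate drawdown (hypothetical values).
# Assumes nitrate profiles in mmol/m^3 on a depth grid in m; C:N from Redfield (106:16).

C_TO_N = 106.0 / 16.0  # Redfield ratio, mol C per mol N

def ncp_from_drawdown(depths_m, no3_winter, no3_summer, z_max=100.0):
    """Integrate the winter-minus-summer nitrate difference to z_max (m) using
    the trapezoidal rule and convert to carbon (mol C m^-2 y^-1)."""
    drawdown_mol_m2 = 0.0
    for i in range(len(depths_m) - 1):
        z0, z1 = depths_m[i], depths_m[i + 1]
        if z0 >= z_max:
            break
        z1 = min(z1, z_max)
        d0 = no3_winter[i] - no3_summer[i]          # mmol/m^3
        d1 = no3_winter[i + 1] - no3_summer[i + 1]
        drawdown_mol_m2 += 0.5 * (d0 + d1) * (z1 - z0) / 1000.0  # -> mol N/m^2
    return drawdown_mol_m2 * C_TO_N  # mol C m^-2 y^-1 (ignores vertical supply)

# Illustrative profiles (not SOCCOM data): drawdown concentrated near the surface.
depths = [0, 20, 40, 60, 80, 100]
winter = [24.0, 24.0, 24.5, 25.0, 25.5, 26.0]   # mmol NO3 m^-3
summer = [18.0, 19.0, 21.5, 24.0, 25.3, 26.0]
print(f"NCP ~ {ncp_from_drawdown(depths, winter, summer):.2f} mol C m^-2 y^-1")
```

As in the abstract, such an estimate is a minimum because it neglects nitrate resupplied across the nitracline during the growing season.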
A neuronal model of a global workspace in effortful cognitive tasks.
Dehaene, S; Kerszberg, M; Changeux, J P
1998-11-24
A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.
A specific role for serotonin in overcoming effort cost.
Meyniel, Florent; Goodwin, Guy M; Deakin, Jf William; Klinge, Corinna; MacFadyen, Christine; Milligan, Holly; Mullings, Emma; Pessiglione, Mathias; Gaillard, Raphaël
2016-11-08
Serotonin is implicated in many aspects of behavioral regulation. Theoretical attempts to unify the multiple roles assigned to serotonin proposed that it regulates the impact of costs, such as delay or punishment, on action selection. Here, we show that serotonin also regulates other types of action costs such as effort. We compared behavioral performance in 58 healthy humans treated during 8 weeks with either placebo or the selective serotonin reuptake inhibitor escitalopram. The task involved trading handgrip force production against monetary benefits. Participants in the escitalopram group produced more effort and thereby achieved a higher payoff. Crucially, our computational analysis showed that this effect was underpinned by a specific reduction of effort cost, and not by any change in the weight of monetary incentives. This specific computational effect sheds new light on the physiological role of serotonin in behavioral regulation and on the clinical effect of drugs for depression. ISRCTN75872983.
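One common way to formalize an effort cost of this kind (a generic illustration, not necessarily the authors' exact model) is an effort-discounted value function in which net value equals expected reward minus a cost term that grows with force; lowering the effort-cost parameter then shifts the value-maximizing force upward, consistent with the escitalopram result. A hedged sketch with hypothetical parameters:

```python
# Illustrative effort-discounting sketch (hypothetical parameters, not the study's model).
# Net value of producing force f (fraction of maximal grip force) for reward R:
#     V(f) = R * p_win(f) - k_effort * f**2
# where p_win(f) rises with force and k_effort scales the subjective cost of effort.

def net_value(f, reward, k_effort):
    p_win = min(1.0, f)          # toy mapping: harder squeeze, better payoff odds
    return reward * p_win - k_effort * f ** 2

def optimal_force(reward, k_effort, grid=201):
    forces = [i / (grid - 1) for i in range(grid)]
    return max(forces, key=lambda f: net_value(f, reward, k_effort))

reward = 1.0
for label, k in [("placebo (higher effort cost)", 1.5),
                 ("SSRI (lower effort cost)", 0.8)]:
    print(f"{label}: optimal force = {optimal_force(reward, k):.2f}")
```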
DARPA-funded efforts in the development of novel brain-computer interface technologies.
Miranda, Robbin A; Casebeer, William D; Hein, Amy M; Judy, Jack W; Krotkov, Eric P; Laabs, Tracy L; Manzo, Justin E; Pankratz, Kent G; Pratt, Gill A; Sanchez, Justin C; Weber, Douglas J; Wheeler, Tracey L; Ling, Geoffrey S F
2015-04-15
The Defense Advanced Research Projects Agency (DARPA) has funded innovative scientific research and technology developments in the field of brain-computer interfaces (BCI) since the 1970s. This review highlights some of DARPA's major advances in the field of BCI, particularly those made in recent years. Two broad categories of DARPA programs are presented with respect to the ultimate goals of supporting the nation's warfighters: (1) BCI efforts aimed at restoring neural and/or behavioral function, and (2) BCI efforts aimed at improving human training and performance. The programs discussed are synergistic and complementary to one another, and, moreover, promote interdisciplinary collaborations among researchers, engineers, and clinicians. Finally, this review includes a summary of some of the remaining challenges for the field of BCI, as well as the goals of new DARPA efforts in this domain. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Kinetic Monte Carlo Simulation of Oxygen Diffusion in Ytterbium Disilicate
NASA Technical Reports Server (NTRS)
Good, Brian S.
2015-01-01
Ytterbium disilicate is of interest as a potential environmental barrier coating for aerospace applications, notably for use in next generation jet turbine engines. In such applications, the transport of oxygen and water vapor through these coatings to the ceramic substrate is undesirable if high temperature oxidation is to be avoided. In an effort to understand the diffusion process in these materials, we have performed kinetic Monte Carlo simulations of vacancy-mediated and interstitial oxygen diffusion in Ytterbium disilicate. Oxygen vacancy and interstitial site energies, vacancy and interstitial formation energies, and migration barrier energies were computed using Density Functional Theory. We have found that, in the case of vacancy-mediated diffusion, many potential diffusion paths involve large barrier energies, but some paths have barrier energies smaller than one electron volt. However, computed vacancy formation energies suggest that the intrinsic vacancy concentration is small. In the case of interstitial diffusion, migration barrier energies are typically around one electron volt, but the interstitial defect formation energies are positive, with the result that the disilicate is unlikely to exhibit significant oxygen permeability except at very high temperature.
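In a kinetic Monte Carlo scheme of this kind, each candidate hop is typically assigned an Arrhenius rate from its migration barrier, and one event per step is chosen with the rejection-free (residence-time) algorithm. The sketch below illustrates that selection step; the barriers and attempt frequency are placeholder values, not the DFT results described above:

```python
import math
import random

# Minimal rejection-free KMC step (illustrative; barriers below are placeholders,
# not the DFT values from the paper).
KB_EV = 8.617e-5          # Boltzmann constant, eV/K
ATTEMPT_FREQ = 1.0e13     # attempt frequency, 1/s (common assumed prefactor)

def hop_rate(barrier_ev, temp_k):
    """Arrhenius rate for a single migration barrier."""
    return ATTEMPT_FREQ * math.exp(-barrier_ev / (KB_EV * temp_k))

def kmc_step(barriers_ev, temp_k, rng=random):
    """Pick one hop with probability proportional to its rate; return (index, dt)."""
    rates = [hop_rate(b, temp_k) for b in barriers_ev]
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r <= acc:
            break
    dt = -math.log(rng.random()) / total   # time advanced by this event
    return i, dt

# Example: four candidate oxygen hops at 1600 K with assumed barriers (eV).
chosen, dt = kmc_step([0.8, 1.0, 1.2, 0.9], temp_k=1600.0)
print(f"chose hop {chosen}, time step {dt:.3e} s")
```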
Kuhn, Stefan; Egert, Björn; Neumann, Steffen; Steinbeck, Christoph
2008-09-25
Current efforts in Metabolomics, such as the Human Metabolome Project, collect structures of biological metabolites as well as data for their characterisation, such as spectra for identification of substances and measurements of their concentration. Still, only a fraction of existing metabolites and their spectral fingerprints are known. Computer-Assisted Structure Elucidation (CASE) of biological metabolites will be an important tool to leverage this lack of knowledge. Indispensable for CASE are modules to predict spectra for hypothetical structures. This paper evaluates different statistical and machine learning methods to perform predictions of proton NMR spectra based on data from our open database NMRShiftDB. A mean absolute error of 0.18 ppm was achieved for the prediction of proton NMR shifts ranging from 0 to 11 ppm. Random forest, J48 decision tree and support vector machines achieved similar overall errors. HOSE codes being a notably simple method achieved a comparatively good result of 0.17 ppm mean absolute error. NMR prediction methods applied in the course of this work delivered precise predictions which can serve as a building block for Computer-Assisted Structure Elucidation for biological metabolites.
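The HOSE-code approach mentioned above amounts to a lookup: a proton's shift is predicted as the mean of the database shifts whose atomic-environment code matches, and accuracy is reported as a mean absolute error. The following is a schematic sketch of that idea, with invented codes and shift values standing in for NMRShiftDB entries:

```python
from statistics import mean

# Schematic HOSE-code-style predictor: predict a proton shift as the mean of the
# database shifts whose (hypothetical) environment code matches exactly.
# Codes and shifts below are made up for illustration, not NMRShiftDB entries.
database = {
    "C-3;*C*C,H": [7.25, 7.31, 7.18],   # aromatic-like environment
    "C-4;CCH,H":  [1.20, 1.35, 1.28],   # aliphatic-like environment
}

def predict_shift(code, fallback=4.0):
    shifts = database.get(code)
    return mean(shifts) if shifts else fallback  # fallback if code unseen

def mean_absolute_error(predicted, observed):
    return mean(abs(p - o) for p, o in zip(predicted, observed))

test_codes = ["C-3;*C*C,H", "C-4;CCH,H"]
test_observed = [7.20, 1.31]
test_predicted = [predict_shift(c) for c in test_codes]
print(f"MAE = {mean_absolute_error(test_predicted, test_observed):.2f} ppm")
```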
CICE, The Los Alamos Sea Ice Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunke, Elizabeth; Lipscomb, William; Jones, Philip
The Los Alamos sea ice model (CICE) is the result of an effort to develop a computationally efficient sea ice component for a fully coupled atmosphere–land–ocean–ice global climate model. It was originally designed to be compatible with the Parallel Ocean Program (POP), an ocean circulation model developed at Los Alamos National Laboratory for use on massively parallel computers. CICE has several interacting components: a vertical thermodynamic model that computes local growth rates of snow and ice due to vertical conductive, radiative and turbulent fluxes, along with snowfall; an elastic-viscous-plastic model of ice dynamics, which predicts the velocity field of the ice pack based on a model of the material strength of the ice; an incremental remapping transport model that describes horizontal advection of the areal concentration, ice and snow volume and other state variables; and a ridging parameterization that transfers ice among thickness categories based on energetic balances and rates of strain. It also includes a biogeochemical model that describes evolution of the ice ecosystem. The CICE sea ice model is used for climate research as one component of complex global earth system models that include atmosphere, land, ocean and biogeochemistry components. It is also used for operational sea ice forecasting in the polar regions and in numerical weather prediction models.
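CICE's actual transport scheme is incremental remapping, which is considerably more involved; as a much simpler stand-in, the sketch below advects an ice area concentration field with a first-order upwind scheme in one dimension, purely to illustrate what the horizontal transport component does:

```python
# Not CICE's incremental remapping: a minimal first-order upwind sketch of how an
# ice area concentration field is advected by a prescribed velocity in 1-D.
def advect_upwind(conc, velocity, dx, dt, steps):
    """Advect 'conc' (fraction, 0..1) with constant 'velocity' (m/s) on a periodic grid."""
    c = list(conc)
    n = len(c)
    courant = velocity * dt / dx
    assert abs(courant) <= 1.0, "CFL condition violated"
    for _ in range(steps):
        if velocity >= 0.0:
            c = [c[i] - courant * (c[i] - c[i - 1]) for i in range(n)]
        else:
            c = [c[i] - courant * (c[(i + 1) % n] - c[i]) for i in range(n)]
    return c

# Example: a patch of 80% ice cover drifting to the right at 0.2 m/s.
field = [0.8 if 3 <= i <= 6 else 0.0 for i in range(20)]
print([round(x, 3) for x in advect_upwind(field, velocity=0.2, dx=1000.0, dt=2000.0, steps=10)])
```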
Study of Fluid Experiment System (FES)/CAST/Holographic Ground System (HGS)
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Cummings, Rick; Jones, Brian
1992-01-01
Holographic and schlieren optical techniques for studying concentration gradients in solidification processes have been used by several investigators over the years. The HGS facility at MSFC has been a primary resource in researching this capability. Consequently, scientific personnel have been able to utilize these techniques in both ground-based research and space experiments. An important event in the scientific utilization of the HGS facilities was the TGS Crystal Growth and the casting and solidification technology (CAST) experiments that were flown on the International Microgravity Laboratory (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to best determine how the experiment worked in space. The current capabilities in computer-based systems for image processing and numerical computation have certainly assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.
Computing the proton aurora at early Mars
NASA Astrophysics Data System (ADS)
Lovato, K.; Gronoff, G.; Curry, S.; Simon Wedlund, C.; Moore, W. B.
2017-12-01
In the early Solar System (about 4 Gyr ago), our Sun was only about 70% as luminous as it is today but much more active. Indeed, for young stars, solar flares occur more frequently, and therefore so do coronal mass ejections and solar energetic particle events. With an increase in solar events, the flux of protons becomes extremely high and affects planetary atmospheres more strongly than it does today. Proton precipitation on planets has an impact on the energy balance of their upper atmospheres, can affect the photochemistry, and can create auroral emissions. Understanding proton precipitation at early Mars can help in understanding the chemical processes at work as well as atmospheric evolution and escape. We concentrated our effort on protons up to 1 MeV, since they have the most important influence on the upper atmosphere. Using scaling laws, we estimated the proton flux at early Mars up to 1 MeV. A kinetic 1D code, validated for present-day Mars, was used to compute the effects of low-energy proton precipitation on early Mars. This model solves the coupled H+/H multi-stream dissipative transport equation as well as the transport of the secondary electrons. For early Mars, it allowed us to compute the magnitude of the proton aurora as well as the corresponding upward H flux.
Numerical Optimization Using Desktop Computers
1980-09-11
concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical, and economic analyses were used to compute a life-cycle... third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical... The objective of this thesis was to develop a system of interactive programs for the Hewlett
Computer Technology and Social Issues.
ERIC Educational Resources Information Center
Garson, G. David
Computing involves social issues and political choices. Issues such as privacy, computer crime, gender inequity, disemployment, and electronic democracy versus "Big Brother" are addressed in the context of efforts to develop a national public policy for information technology. A broad range of research and case studies are examined in an…
Advances in computational design and analysis of airbreathing propulsion systems
NASA Technical Reports Server (NTRS)
Klineberg, John M.
1989-01-01
The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feltus, M.A.
1996-12-31
Previously, nuclear utilities have been considered "deep pockets" for university research; however, in the current cost-cutting competitive environment, most utilities have drastically reduced or eliminated research. Any collaboration with universities requires that any research effort have a focused objective, short-term duration, and tangible payback. Furthermore, the research must concentrate on solving operating problems, rather than on long-term general concerns. Although practical studies may seem mundane, untheoretical, and uninteresting for most academics, such pragmatic topics can provide interesting research for students and helpful results for the utilities. This paper provides examples of the author's research funded by utilities. Each project has a specific objective involving a particular utility need or computer code analysis tool.
Development of the engineering design integration (EDIN) system: A computer aided design development
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Hirsch, G. N.
1977-01-01
The EDIN (Engineering Design Integration) System, which provides a collection of hardware and software enabling the engineer to perform man-in-the-loop interactive evaluation of aerospace vehicle concepts, was considered. Study efforts were concentrated in the following areas: (1) integration of hardware with the Univac Exec 8 System; (2) development of interactive software for the EDIN System; (3) upgrading of the EDIN technology module library to an interactive status; (4) verification of the soundness of the developing EDIN System; (5) support of NASA in design analysis studies using the EDIN System; (6) training and documentation in the use of the EDIN System; and (7) an implementation plan for the next phase of development and recommendations for meeting long-range objectives.
Geophysical parameters from the analysis of laser ranging to Starlette
NASA Technical Reports Server (NTRS)
Schutz, B. E.; Shum, C. K.; Tapley, B. D.
1991-01-01
The University of Texas Center for Space Research (UT/CSR) research efforts covering the time period from August 1, 1990 through January 31, 1991 have concentrated on the following areas: (1) Laser Data Processing (more than 15 years of Starlette data (1975-90) have been processed and cataloged); (2) Seasonal Variation of Zonal Tides (the observed Starlette time series has been compared with meteorological data-derived time series); (3) Ocean Tide Solutions (error analysis has been performed using Starlette and other tide solutions); and (4) Lunar Deceleration (the formulation to compute theoretical lunar deceleration has been verified and applied to several tidal solutions). Concise descriptions of research achievements for each of the above areas are given. Copies of abstracts for some of the publications and conference presentations are included in the appendices.
Song, Tianqi; Garg, Sudhanshu; Mokhtar, Reem; Bui, Hieu; Reif, John
2018-01-19
A main goal in DNA computing is to build DNA circuits to compute designated functions using a minimal number of DNA strands. Here, we propose a novel architecture to build compact DNA strand displacement circuits to compute a broad scope of functions in an analog fashion. A circuit by this architecture is composed of three autocatalytic amplifiers, and the amplifiers interact to perform computation. We show DNA circuits to compute functions sqrt(x), ln(x) and exp(x) for x in tunable ranges with simulation results. A key innovation in our architecture, inspired by Napier's use of logarithm transforms to compute square roots on a slide rule, is to make use of autocatalytic amplifiers to do logarithmic and exponential transforms in concentration and time. In particular, we convert from the input that is encoded by the initial concentration of the input DNA strand, to time, and then back again to the output encoded by the concentration of the output DNA strand at equilibrium. This combined use of strand-concentration and time encoding of computational values may have impact on other forms of molecular computation.
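The concentration-to-time logarithmic encoding described above can be illustrated with a toy kinetic model: an autocatalytic species growing exponentially crosses a fixed threshold at a time proportional to the logarithm of its initial concentration. The sketch below is only that toy model, not the paper's DNA reaction network:

```python
import math

# Toy illustration (not the paper's DNA strand displacement network): an autocatalytic
# species growing as dx/dt = k*x crosses a fixed threshold at t = (1/k) * ln(threshold / x0),
# so the *time* of crossing encodes the logarithm of the initial concentration.
def crossing_time(x0, k, threshold, dt=1e-3, t_max=100.0):
    """Numerically integrate dx/dt = k*x until x reaches threshold."""
    x, t = x0, 0.0
    while x < threshold and t < t_max:
        x += k * x * dt
        t += dt
    return t

k, threshold = 0.5, 10.0
for x0 in (0.1, 1.0, 5.0):
    t_num = crossing_time(x0, k, threshold)
    t_analytic = math.log(threshold / x0) / k
    print(f"x0={x0:4.1f}  numeric t={t_num:6.3f}  analytic t={t_analytic:6.3f}")
```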
Chen, Wei-Yu; Jou, Li-John; Chen, Suz-Hsin; Liao, Chung-Min
2012-05-01
Arsenic (As) is the element of greatest ecotoxicological concern in aquatic environments. Effective monitoring and diagnosis of As pollution via a biological early warning system is a great challenge for As-affected regions. The purpose of this study was to synthesize water chemistry-based bioavailability and valve daily rhythm in Corbicula fluminea to design a biomonitoring system for detecting waterborne As. We integrated valve daily rhythm dynamic patterns and a water chemistry-based Hill dose-response model into a programmatic mechanism for an inductance-based valvometry technique, providing a rapid and cost-effective dynamic detection system. A LabVIEW graphic control program on a personal computer was employed to demonstrate the full functionality of the present dynamic system. We verified the simulated dissolved As concentrations based on the valve daily rhythm behavior against published experimental data. Generally, the performance of this proposed biomonitoring system demonstrates fairly good applicability for detecting waterborne As concentrations when the field As concentrations are less than 1 mg/L. We also revealed that the detection times were dependent on As exposure concentrations. This biomonitoring system could provide real-time transmitted information on waterborne As activity under various aquatic environments. This parsimonious C. fluminea valve rhythm behavior-based real-time biomonitoring system represents a valuable effort to promote automated biomonitoring and offers early warnings of potential ecotoxicological risks in regions with elevated As exposure concentrations.
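The Hill dose-response model referred to above has a standard sigmoidal form; a minimal sketch of that curve, with hypothetical EC50 and Hill-coefficient values rather than the study's fitted parameters, is:

```python
# Illustrative Hill dose-response curve (hypothetical parameters, not the study's fit):
# response fraction = C^n / (EC50^n + C^n) for a waterborne As concentration C (mg/L).
def hill_response(conc_mg_l, ec50=0.2, n=2.0):
    return conc_mg_l ** n / (ec50 ** n + conc_mg_l ** n)

for c in (0.05, 0.1, 0.2, 0.5, 1.0):
    print(f"As = {c:4.2f} mg/L -> predicted response = {hill_response(c):.2f}")
```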
Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen
2006-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.
A two-dimensional analytical model of vapor intrusion involving vertical heterogeneity.
Yao, Yijun; Verginelli, Iason; Suuberg, Eric M
2017-05-01
In this work, we present an analytical chlorinated vapor intrusion (CVI) model that can estimate source-to-indoor air concentration attenuation by simulating the two-dimensional (2-D) vapor concentration profile in vertically heterogeneous soils overlying a homogeneous vapor source. The analytical solution describing the 2-D soil gas transport was obtained by applying a modified Schwarz-Christoffel mapping method. A partial field validation showed that the developed model provides results (especially in terms of indoor emission rates) in line with the measured data from a case involving a building overlying a layered soil. In further testing, it was found that the new analytical model can very closely replicate the results of three-dimensional (3-D) numerical models at steady state in scenarios involving layered soils overlying homogeneous groundwater sources. By contrast, by adopting a two-layer approach (capillary fringe and vadose zone) as employed in the EPA implementation of the Johnson and Ettinger model, the spatially and temporally averaged indoor concentrations in the case of groundwater sources can be up to two orders of magnitude higher than those estimated by the numerical model. In short, the model proposed in this work can represent an easy-to-use tool that can simulate the subsurface soil gas concentration in layered soils overlying a homogeneous vapor source while keeping the simplicity of an analytical approach that requires much less computational effort.
Experimental Evaluation and Workload Characterization for High-Performance Computer Architectures
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.
1995-01-01
This research is conducted in the context of the Joint NSF/NASA Initiative on Evaluation (JNNIE). JNNIE is an inter-agency research program that goes beyond typical benchmarking to provide in-depth evaluations and an understanding of the factors that limit the scalability of high-performance computing systems. Many NSF and NASA centers have participated in the effort. Our research effort was an integral part of implementing JNNIE in the NASA ESS grand challenge applications context. Our research work under this program was composed of three distinct, but related, activities: the evaluation of NASA ESS high-performance computing testbeds using the wavelet decomposition application; the evaluation of NASA ESS testbeds using astrophysical simulation applications; and the development of an experimental model for workload characterization for understanding workload requirements. In this report, we provide a summary of findings that covers all three parts, a list of the publications that resulted from this effort, and three appendices with the details of each of the studies, drawing on a key publication developed under the respective work.
Quadratic Programming for Allocating Control Effort
NASA Technical Reports Server (NTRS)
Singh, Gurkirpal
2005-01-01
A computer program calculates an optimal allocation of control effort in a system that includes redundant control actuators. The program implements an iterative (but otherwise single-stage) algorithm of the quadratic-programming type. In general, in the quadratic-programming problem, one seeks the values of a set of variables that minimize a quadratic cost function, subject to a set of linear equality and inequality constraints. In this program, the cost function combines control effort (typically quantified in terms of energy or fuel consumed) and control residuals (differences between commanded and sensed values of variables to be controlled). In comparison with prior control-allocation software, this program offers approximately equal accuracy but much greater computational efficiency. In addition, this program offers flexibility, robustness to actuation failures, and a capability for selective enforcement of control requirements. The computational efficiency of this program makes it suitable for such complex, real-time applications as controlling redundant aircraft actuators or redundant spacecraft thrusters. The program is written in the C language for execution in a UNIX operating system.
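In outline, the allocation problem is a quadratic program: minimize a weighted combination of control residual and control effort subject to actuator limits. The sketch below is a generic projected-gradient illustration of that formulation in Python, not the flight software described above:

```python
import numpy as np

# Minimal sketch of quadratic-programming control allocation (not the flight code):
# minimize  ||B u - v||^2 + lam * ||u||^2   subject to  u_min <= u <= u_max,
# solved here with a simple projected gradient iteration.
def allocate(B, v, u_min, u_max, lam=1e-3, iters=500):
    B, v = np.asarray(B, float), np.asarray(v, float)
    u = np.zeros(B.shape[1])
    H = B.T @ B + lam * np.eye(B.shape[1])
    step = 1.0 / np.linalg.norm(H, 2)               # safe step size from largest eigenvalue
    for _ in range(iters):
        grad = H @ u - B.T @ v                      # (scaled) gradient of the quadratic cost
        u = np.clip(u - step * grad, u_min, u_max)  # project onto actuator limits
    return u

# Example: 3 commanded moments, 4 redundant actuators with saturation limits.
B = [[1.0, 0.5, 0.0, -0.5],
     [0.0, 1.0, 1.0,  0.0],
     [0.3, 0.0, 0.7,  1.0]]
v = [0.8, -0.4, 0.2]
u = allocate(B, v, u_min=-1.0, u_max=1.0)
print("allocation:", np.round(u, 3), " residual:", np.round(np.array(B) @ u - v, 3))
```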
Topical perspective on massive threading and parallelism.
Farber, Robert M
2011-09-01
Unquestionably, computer architectures have undergone a recent and noteworthy paradigm shift that now delivers multi- and many-core systems with tens to many thousands of concurrent hardware processing elements per workstation or supercomputer node. GPGPU (General Purpose Graphics Processor Unit) technology in particular has attracted significant attention as new software development capabilities, namely CUDA (Compute Unified Device Architecture) and OpenCL™, have made it possible for students as well as small and large research organizations to achieve excellent speedup for many applications over more conventional computing architectures. The current scientific literature reflects this shift with numerous examples of GPGPU applications that have achieved one, two, and in some special cases, three orders of magnitude increases in computational performance through the use of massive threading to exploit parallelism. Multi-core architectures are also evolving quickly to exploit both massive threading and massive parallelism, such as the 1.3-million-thread Blue Waters supercomputer. The challenge confronting scientists in planning future experimental and theoretical research efforts (be they individual efforts with one computer or collaborative efforts proposing to use the largest supercomputers in the world) is how to capitalize on these new massively threaded computational architectures, especially as not all computational problems will scale to massive parallelism. In particular, the costs associated with restructuring software (and potentially redesigning algorithms) to exploit the parallelism of these multi- and many-threaded machines must be considered along with application scalability and lifespan. This perspective is an overview of the current state of threading and parallelism with some insight into the future. Published by Elsevier Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johanna H Oxstrand; Katya L Le Blanc
The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups, sharing procedures between fellow coworkers, the use of multiple procedures at once, etc. were considered. The model describes which affordances associated with paper-based procedures should be transferred to computer-based procedures as well as what features should not be incorporated. The model also provides a means to identify what new features not present in paper-based procedures need to be added to the computer-based procedures to further enhance performance. The next step is to use the requirements and specifications to develop concepts and prototypes of computer-based procedures. User tests and other data collection efforts will be conducted to ensure that the real issues with field procedures and their usage are being addressed and solved in the best manner possible. This paper describes the baseline study, the construction of the model of procedure use, and the requirements and specifications for computer-based procedures that were developed based on the model. It also addresses how the model and the insights gained from it were used to develop concepts and prototypes for computer-based procedures.
Near-Source Modeling Updates: Building Downwash & Near-Road
The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...
76 FR 17424 - President's National Security Telecommunications Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-29
... discuss and vote on the Communications Resiliency Report and receive an update on the cloud computing... Communications Resiliency Report III. Update on the Cloud Computing Scoping Effort IV. Closing Remarks Dated...
Conjugate Gradient Algorithms For Manipulator Simulation
NASA Technical Reports Server (NTRS)
Fijany, Amir; Scheid, Robert E.
1991-01-01
Report discusses applicability of conjugate-gradient algorithms to computation of forward dynamics of robotic manipulators. Rapid computation of forward dynamics essential to teleoperation and other advanced robotic applications. Part of continuing effort to find algorithms meeting requirements for increased computational efficiency and speed. Method used for iterative solution of systems of linear equations.
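Forward dynamics ultimately requires solving a symmetric positive-definite linear system (the joint-space inertia matrix times the joint accelerations equals the applied generalized forces), which is exactly where conjugate-gradient iteration applies. A minimal sketch of the iteration, illustrative only and not the report's algorithm:

```python
import numpy as np

# Minimal conjugate-gradient solver for M * qdd = rhs, where M is the symmetric
# positive-definite joint-space inertia matrix (illustrative, not the report's code).
def conjugate_gradient(M, rhs, tol=1e-10, max_iter=100):
    M, rhs = np.asarray(M, float), np.asarray(rhs, float)
    x = np.zeros_like(rhs)
    r = rhs - M @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Mp = M @ p
        alpha = rs_old / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Example: a toy 3-joint inertia matrix and generalized force vector.
M = np.array([[4.0, 1.0, 0.2],
              [1.0, 3.0, 0.5],
              [0.2, 0.5, 2.0]])
tau_minus_bias = np.array([1.0, 0.0, -0.5])
qdd = conjugate_gradient(M, tau_minus_bias)
print("joint accelerations:", np.round(qdd, 4))
```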
Using Information Technology in Mathematics Education.
ERIC Educational Resources Information Center
Tooke, D. James, Ed.; Henderson, Norma, Ed.
This collection of essays examines the history and impact of computers in mathematics and mathematics education from the early, computer-assisted instruction efforts through LOGO, the constructivist educational software for K-9 schools developed in the 1980s, to MAPLE, the computer algebra system for mathematical problem solving developed in the…
Cooperation Support in Computer-Aided Authoring and Learning.
ERIC Educational Resources Information Center
Muhlhauser, Max; Rudebusch, Tom
This paper discusses the use of Computer Supported Cooperative Work (CSCW) techniques for computer-aided learning (CAL); the work was started in the context of project Nestor, a joint effort of German universities about cooperative multimedia authoring/learning environments. There are four major categories of cooperation for CAL: author/author,…
2016 Annual Report - Argonne Leadership Computing Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, Jim; Papka, Michael E.; Cerny, Beth A.
The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.
Computer-Based Training: Capitalizing on Lessons Learned
ERIC Educational Resources Information Center
Bedwell, Wendy L.; Salas, Eduardo
2010-01-01
Computer-based training (CBT) is a methodology for providing systematic, structured learning; a useful tool when properly designed. CBT has seen a resurgence given the serious games movement, which is at the forefront of integrating primarily entertainment computer-based games into education and training. This effort represents a multidisciplinary…
"Computer Science Can Feed a Lot of Dreams"
ERIC Educational Resources Information Center
Educational Horizons, 2014
2014-01-01
Pat Yongpradit is the director of education at Code.org. He leads all education efforts, including professional development and curriculum creation, and he builds relationships with school districts. Pat joined "Educational Horizons" to talk about why it is important to teach computer science--even for non-computer science teachers. This…
ERIC Educational Resources Information Center
Oblinger, Diana
The Internet is an international network linking hundreds of smaller computer networks in North America, Europe, and Asia. Using the Internet, computer users can connect to a variety of computers with little effort or expense. The potential for use by college faculty is enormous. The largest problem faced by most users is understanding what such…
"Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.
ERIC Educational Resources Information Center
Brown, John Seely; And Others
Interim work is documented describing efforts to modify computer techniques used to recognize and process English language requests to an instructional simulator. The conversion from a hand-coded to a table-driven technique is described in detail. Other modifications to a simulation-based computer assisted instruction program to allow a gaming…
NASA Astrophysics Data System (ADS)
Foo, Kam Keong
A two-dimensional dual-mode scramjet flowpath is developed and evaluated using the ANSYS Fluent density-based flow solver with various computational grids. Results are obtained for fuel-off, fuel-on non-reacting, and fuel-on reacting cases at different equivalence ratios. A one-step global chemical kinetics hydrogen-air model is used in conjunction with the eddy-dissipation model. Coarse, medium and fine computational grids are used to evaluate grid sensitivity and to investigate a lack of grid independence. Different grid adaptation strategies are performed on the coarse grid in an attempt to emulate the solutions obtained from the finer grids. The goal of this study is to investigate the feasibility of using various mesh adaptation criteria to significantly decrease computational efforts for high-speed reacting flows.
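A standard way to quantify grid sensitivity from coarse, medium, and fine solutions is Richardson extrapolation together with a grid convergence index. The sketch below applies that textbook procedure to hypothetical values of a representative scalar output; it is not drawn from the study's results:

```python
import math

# Grid convergence check via Richardson extrapolation (illustrative values, constant
# refinement ratio r between coarse/medium/fine grids).
def grid_convergence(f_fine, f_medium, f_coarse, r=2.0, safety_factor=1.25):
    """Return (observed order p, extrapolated value, fine-grid GCI in percent)."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    gci_fine = safety_factor * abs((f_medium - f_fine) / f_fine) / (r ** p - 1.0) * 100.0
    return p, f_exact, gci_fine

# Hypothetical scalar output (e.g., a pressure recovery) from coarse/medium/fine grids.
p, f_exact, gci = grid_convergence(f_fine=0.462, f_medium=0.455, f_coarse=0.441)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_exact:.4f}, GCI ~ {gci:.2f}%")
```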
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakhai, B.
A new method for solving radiation transport problems is presented. The heart of the technique is a new cross section processing procedure for the calculation of group-to-point and point-to-group cross section sets. The method is ideally suited for problems which involve media with highly fluctuating cross sections, where the results of the traditional multigroup calculations are beclouded by the group averaging procedures employed. Extensive computational efforts, which would be required to evaluate double integrals in the multigroup treatment numerically, prohibit iteration to optimize the energy boundaries. On the other hand, use of point-to-point techniques (as in the stochastic technique) is often prohibitively expensive due to the large computer storage requirement. The pseudo-point code is a hybrid of the two aforementioned methods (group-to-group and point-to-point) - hence the name pseudo-point - that reduces the computational efforts of the former and the large core requirements of the latter. The pseudo-point code generates the group-to-point or the point-to-group transfer matrices, and can be coupled with the existing transport codes to calculate pointwise energy-dependent fluxes. This approach yields much more detail than is available from the conventional energy-group treatments. Due to the speed of this code, several iterations could be performed (in affordable computing efforts) to optimize the energy boundaries and the weighting functions. The pseudo-point technique is demonstrated by solving six problems, each depicting a certain aspect of the technique. The results are presented as flux vs energy at various spatial intervals. The sensitivity of the technique to the energy grid and the savings in computational effort are clearly demonstrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
NASA Astrophysics Data System (ADS)
Florio, Gina; Stiso, Kimberly; Campanelli, Joseph; Dessources, Kimberly; Folkes, Trudi
2012-02-01
Scanning tunneling microscopy (STM) was used to investigate the molecular self-assembly of four different benzene carboxylic acid derivatives at the liquid/graphite interface: pyromellitic acid (1,2,4,5-benzenetetracarboxylic acid), trimellitic acid (1,2,4-benzenetricarboxylic acid), trimesic acid (1,3,5-benzenetricarboxylic acid), and 1,3,5-benzenetriacetic acid. A range of two-dimensional networks are observed that depend sensitively on the number of carboxylic acids present, the nature of the solvent, and the solution concentration. We will describe our recent efforts to determine (a) the preferential two-dimensional structure(s) for each benzene carboxylic acid at the liquid/graphite interface, (b) the thermodynamic and kinetic factors influencing self-assembly (or lack thereof), (c) the role solvent plays in the assembly, (d) the effect of in situ versus ex situ dilution on surface packing density, and (e) the temporal evolution of the self-assembled monolayer. Results of computational analysis of analog molecules and model monolayer films will also be presented to aid assignment of network structures and to provide a qualitative picture of surface adsorption and network formation.
Implementation of a best management practice (BMP) system for a clay mining facility in Taiwan.
Lin, Jen-Yang; Chen, Yen-Chang; Chen, Walter; Lee, Tsu-Chuan; Yu, Shaw L
2006-01-01
The present paper describes the planning and implementation of a best management practice (BMP) system for a clay mining facility in Northern Taiwan. It is a challenge to plan and design BMPs for mitigating the impact of clay mining operations because clay mining drainage typically contains very high concentrations of suspended solids (SS), Fe ions, and [H+]. In the present study, a field monitoring effort was conducted to collect data for runoff quality and quantity from a clay mining area in Northern Taiwan. A BMP system including holding ponds connected in series was designed and implemented, and its pollutant removal performance was assessed. The assessment was based on mass balance computations and an analysis of the relationship between BMP design parameters, such as pond depth, detention time, and surface loading rate, and the pollutant removal efficiency. Field sampling results showed that the surface loading rate is exponentially related to the removal rate. The results provide the basis for a more comprehensive and efficient BMP implementation plan for clay mining operations.
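The mass-balance assessment mentioned above reduces to comparing inflow and outflow pollutant loads for each pond, with the surface loading rate (flow divided by pond surface area) as a key design parameter. A minimal sketch with hypothetical monitoring values, not the Taiwan field data:

```python
# Mass-balance sketch for a holding pond (hypothetical monitoring values, not the
# field data from this study): removal efficiency from inflow/outflow loads, plus
# the surface loading rate (flow / pond surface area) used as a design parameter.
def removal_efficiency(q_in_m3d, c_in_mgl, q_out_m3d, c_out_mgl):
    load_in = q_in_m3d * c_in_mgl     # m^3/day * mg/L = g/day
    load_out = q_out_m3d * c_out_mgl
    return (load_in - load_out) / load_in

def surface_loading_rate(q_m3d, area_m2):
    return q_m3d / area_m2            # m/day

q_in, c_in = 500.0, 3500.0            # m^3/day, mg/L suspended solids
q_out, c_out = 480.0, 600.0
print(f"SS removal: {removal_efficiency(q_in, c_in, q_out, c_out):.1%}")
print(f"surface loading rate: {surface_loading_rate(q_in, area_m2=800.0):.2f} m/day")
```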
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEDESCHI AR; CORBETT JE; WILSON RA
2012-01-26
Simulant testing of a full-scale thin-film evaporator system was conducted in 2011 for technology development at the Hanford tank farms. Test results met objectives of water removal rate, effluent quality, and operational evaluation. Dilute tank waste simulant, representing a typical double-shell tank supernatant liquid layer, was concentrated from a 1.1 specific gravity to approximately 1.5 using a 4.6 m² (50 ft²) heated-transfer-area Rototherm® evaporator from Artisan Industries. The condensed evaporator vapor stream was collected and sampled, validating efficient separation of the water. An overall decontamination factor of 1.2E+06 was achieved, demonstrating excellent retention of key radioactive species within the concentrated liquid stream. The evaporator system was supported by modular steam supply, chiller, and control computer systems which would typically be implemented at the tank farms. Operation of these support systems demonstrated successful integration while identifying areas for efficiency improvement. The overall testing effort increased the maturation of this technology to support final deployment design and continued project implementation.
User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge
Koltun, G.F.; Gray, John R.; McElhone, T.J.
1994-01-01
Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from 1. digitized suspended-sediment-concentration traces, 2. linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and 3. nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
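The underlying discharge computation is the standard USGS relation in which suspended-sediment discharge in tons per day equals water discharge in cubic feet per second times concentration in milligrams per liter times 0.0027, applied to a concentration time series that may be interpolated between samples. A sketch of that calculation with hypothetical values (this is not SEDCALC itself):

```python
import math

# Sketch of the standard suspended-sediment discharge computation (not SEDCALC itself):
#   Qs [tons/day] = 0.0027 * Q [ft^3/s] * C [mg/L]
# with log-linear interpolation between instantaneous concentration samples, echoing
# one of the interpolation options described above. Values below are hypothetical.
COEF = 0.0027

def interp_conc_log(t, t0, c0, t1, c1):
    """Log-linear interpolation of concentration between two sample times."""
    frac = (t - t0) / (t1 - t0)
    return math.exp(math.log(c0) + frac * (math.log(c1) - math.log(c0)))

def sediment_discharge_tons_per_day(q_cfs, conc_mg_l):
    return COEF * q_cfs * conc_mg_l

# Concentration samples at hours 0 and 6; estimate hourly loads in between.
t0, c0, t1, c1 = 0.0, 120.0, 6.0, 480.0                         # mg/L
flows = [850.0, 900.0, 980.0, 1100.0, 1250.0, 1400.0, 1500.0]   # ft^3/s, hourly
for hour, q in enumerate(flows):
    c = interp_conc_log(hour, t0, c0, t1, c1)
    print(f"hour {hour}: C = {c:6.1f} mg/L, Qs = {sediment_discharge_tons_per_day(q, c):7.1f} tons/day")
```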
Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, P.; Frankel, S. H.; Adumitroaie, V.; Sabini, G.; Madnia, C. K.
1993-01-01
The primary objective of this research is to extend current capabilities of Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) for the computational analyses of high speed reacting flows. Our efforts in the first two years of this research have been concentrated on a priori investigations of single-point Probability Density Function (PDF) methods for providing subgrid closures in reacting turbulent flows. In the efforts initiated in the third year, our primary focus has been on performing actual LES by means of PDF methods. The approach is based on assumed PDF methods and we have performed extensive analysis of turbulent reacting flows by means of LES. This includes simulations of both three-dimensional (3D) isotropic compressible flows and two-dimensional reacting planar mixing layers. In addition to these LES analyses, some work is in progress to assess the extent of validity of our assumed PDF methods. This assessment is done by making detailed comparisons with recent laboratory data in predicting the rate of reactant conversion in parallel reacting shear flows. This report provides a summary of our achievements for the first six months of the third year of this program.
A high-level 3D visualization API for Java and ImageJ.
Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin
2010-05-21
Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.
Cooperation partners in information sharing within the context of an Asian cancer network.
Kawahara, Norie
2007-01-01
It would be a great mistake to analyze the health situation in Asia relying on the focus on individualism inherent in the sense of values of Europeans and Americans. Cooperation across fields is indispensable for effective control of the epidemic of disease we are facing in the 21st century. We need to concentrate efforts on bringing together specialists, not only within the various areas of medical practice, but also across such fields as economics, politics and information technology (IT). Asia differs from Europe and America in that it does not have any group political structure and therefore we must rely on voluntary integration of our efforts if we are to achieve the most effective application of our combined resources. Non-intervention in internal affairs is naturally a very important condition for success. Sharing of information while abiding by national regulations regarding medical data confidentiality does pose difficulties, but gentle persuasion to standardize processes with a shared commitment to overcoming problems should reduce opposition. Our common purpose in maintaining healthy societies, whether we be scientists, medical staff, economists, computer specialists or politicians, provides the bond. Ways and means by which this bond can be strengthened deserve our attention.
Application Portable Parallel Library
NASA Technical Reports Server (NTRS)
Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott
1995-01-01
Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" also includes heterogeneous collection of networked computers). Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.
The Mental Effort-Reward Imbalances Model and Its Implications for Behaviour Management
ERIC Educational Resources Information Center
Poulton, Alison; Whale, Samina; Robinson, Joanne
2016-01-01
Attention deficit hyperactivity disorder (ADHD) is frequently associated with oppositional defiant disorder (ODD). The Mental Effort Reward Imbalances Model (MERIM) explains this observational association as follows: in ADHD a disproportionate level of mental effort is required for sustaining concentration for achievement; in ODD the subjective…
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS project-related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
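Infrared SST retrievals of this kind are commonly cast as split-window regressions on the 11- and 12-micron brightness temperatures. The sketch below shows a generic nonlinear split-window form; the coefficients are placeholders, not the operational MODIS coefficients:

```python
import math

# Generic nonlinear split-window SST form (coefficients below are placeholders,
# not the operational MODIS coefficients):
#   SST = a0 + a1*T11 + a2*(T11 - T12)*SST_ref + a3*(T11 - T12)*(sec(theta) - 1)
# where T11, T12 are 11/12-micron brightness temperatures (deg C), SST_ref is a
# first-guess SST (deg C), and theta is the satellite zenith angle.
def nlsst(t11_c, t12_c, sst_ref_c, zenith_deg, a=(1.0, 0.95, 0.08, 1.2)):
    a0, a1, a2, a3 = a
    dt = t11_c - t12_c
    sec_term = 1.0 / math.cos(math.radians(zenith_deg)) - 1.0
    return a0 + a1 * t11_c + a2 * dt * sst_ref_c + a3 * dt * sec_term

print(f"SST ~ {nlsst(t11_c=24.5, t12_c=23.8, sst_ref_c=25.0, zenith_deg=30.0):.2f} C")
```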
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
Staff | Computational Science | NREL
A staff directory for Computational Science at NREL, listing personnel who develop and lead laboratory-wide efforts in high-performance computing, energy-efficient data centers, and high-performance algorithms and modeling.
ERIC Educational Resources Information Center
Dasuki, Salihu Ibrahim; Ogedebe, Peter; Kanya, Rislana Abdulazeez; Ndume, Hauwa; Makinde, Julius
2015-01-01
Efforts are being made by universities in developing countries to ensure that their graduates are not left behind in the competitive global information society; thus, they have adopted international computing curricula for their computing degree programs. However, adopting these international curricula seems to be very challenging for developing countries…
Automated computer grading of hardwood lumber
P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber
1988-01-01
This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...
Distance Learning and Cloud Computing: "Just Another Buzzword or a Major E-Learning Breakthrough?"
ERIC Educational Resources Information Center
Romiszowski, Alexander J.
2012-01-01
"Cloud computing is a model for the enabling of ubiquitous, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and other services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." This…
The Use of Computers in the Math Classroom.
ERIC Educational Resources Information Center
Blass, Barbara; And Others
In an effort to increase faculty use and knowledge of computers, Oakland Community College (OCC), in Michigan, developed a Summer Technology Institute (STI), and a Computer Technology Grants (CTG) project beginning in 1989. The STI involved 3-day forums during summers 1989, 1990, and 1991 to expose faculty to hardware and software applications.…
Commentary: It Is Not Only about the Computers--An Argument for Broadening the Conversation
ERIC Educational Resources Information Center
DeWitt, Scott W.
2006-01-01
In 2002 the members of the National Technology Leadership Initiative (NTLI) framed seven conclusions relating to handheld computers and ubiquitous computing in schools. While several of the conclusions are laudable efforts to increase research and professional development, the factual and conceptual bases for this document are seriously flawed.…
The Relationship between Computational Fluency and Student Success in General Studies Mathematics
ERIC Educational Resources Information Center
Hegeman, Jennifer; Waters, Gavin
2012-01-01
Many developmental mathematics programs emphasize computational fluency with the assumption that this is a necessary contributor to student success in general studies mathematics. In an effort to determine which skills are most essential, scores on a computational fluency test were correlated with student success in general studies mathematics at…
NASA Technical Reports Server (NTRS)
Iida, H. T.
1966-01-01
Computational procedure reduces the numerical effort whenever the method of finite differences is used to solve ablation problems for which the surface recession is large relative to the initial slab thickness. The number of numerical operations required for a given maximum space mesh size is reduced.
Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.
ERIC Educational Resources Information Center
Knerr, Bruce W.; And Others
Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…
Computational protein design-the next generation tool to expand synthetic biology applications.
Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel
2018-05-02
One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near future, computational protein design will vastly expand the functional capabilities of synthetic cells.
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) capabilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's with respect to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study of a CIM application capable of completing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes an assessment of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers to be used, the software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
Many Masses on One Stroke: Economic Computation of Quark Propagators
NASA Astrophysics Data System (ADS)
Frommer, Andreas; Nöckel, Bertold; Güsken, Stephan; Lippert, Thomas; Schilling, Klaus
The computational effort in the calculation of Wilson fermion quark propagators in Lattice Quantum Chromodynamics can be considerably reduced by exploiting the Wilson fermion matrix structure in inversion algorithms based on the non-symmetric Lanczos process. We consider two such methods: QMR (quasi minimal residual) and BCG (biconjugate gradients). Based on the decomposition M/κ = 1/κ-D of the Wilson mass matrix, using QMR, one can carry out inversions on a whole trajectory of masses simultaneously, merely at the computational expense of a single propagator computation. In other words, one has to compute the propagator corresponding to the lightest mass only, while all the heavier masses are given for free, at the price of extra storage. Moreover, the symmetry γ5M = M†γ5 can be used to cut the computational effort in QMR and BCG by a factor of two. We show that both methods then become — in the critical regime of small quark masses — competitive to BiCGStab and significantly better than the standard MR method, with optimal relaxation factor, and CG as applied to the normal equations.
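The "many masses for one" saving rests on a standard property of Krylov-subspace methods: the subspaces are invariant under shifts by a multiple of the identity. A schematic statement under the decomposition quoted in the abstract (the notation here is illustrative, not the authors'):

```latex
\frac{M(\kappa)}{\kappa} = \frac{1}{\kappa}\,\mathbb{1} - D ,
\qquad
\mathcal{K}_m\!\Big(\tfrac{1}{\kappa}\mathbb{1} - D,\; b\Big) = \mathcal{K}_m(D,\, b)
\quad \text{for every } \kappa \neq 0 .
```

Hence a single Lanczos/QMR recursion built from $D$ and the source $b$ serves every mass on the trajectory; only short per-mass scalar recurrences and the additional solution vectors need to be kept, which is the extra storage mentioned above.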
Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design
NASA Technical Reports Server (NTRS)
Tweedt, Daniel L.
2010-01-01
Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.
F‐GHG Emissions Reduction Efforts: FY2015 Supplier Profiles
The Supplier Profiles outlined in this document detail the efforts of large‐area flat panel suppliers to reduce their F‐GHG emissions in manufacturing facilities that make today’s large‐area panels used for products such as TVs and computer monitors.
F‐GHG Emissions Reduction Efforts: FY2016 Supplier Profiles
The Supplier Profiles outlined in this document detail the efforts of large‐area flat panel suppliers to reduce their F‐GHG emissions in manufacturing facilities that make today’s large‐area panels used for products such as TVs and computer monitors.
Operating manual for coaxial injection combustion model. [for the space shuttle main engine
NASA Technical Reports Server (NTRS)
Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.
1974-01-01
An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the space shuttle main engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.
Aviation security : vulnerabilities still exist in the aviation security system
DOT National Transportation Integrated Search
2000-04-06
The testimony today discusses the Federal Aviation Administration's (FAA) efforts to implement and improve security in two key areas: air traffic control computer systems and airport passenger screening checkpoints. Computer systems-and the informati...
Psychological Issues in Online Adaptive Task Allocation
NASA Technical Reports Server (NTRS)
Morris, N. M.; Rouse, W. B.; Ward, S. L.; Frey, P. R.
1984-01-01
Adaptive aiding is an idea that offers potential for improvement over many current approaches to aiding in human-computer systems. The expected return of tailoring the system to fit the user could be in the form of improved system performance and/or increased user satisfaction. Issues such as the manner in which information is shared between human and computer, the appropriate division of labor between them, and the level of autonomy of the aid are explored. A simulated visual search task was developed. Subjects are required to identify targets in a moving display while performing a compensatory sub-critical tracking task. By manipulating characteristics of the situation such as imposed task-related workload and effort required to communicate with the computer, it is possible to create conditions in which interaction with the computer would be more or less desirable. The results of preliminary research using this experimental scenario are presented, and future directions for this research effort are discussed.
Automated Boundary Conditions for Wind Tunnel Simulations
NASA Technical Reports Server (NTRS)
Carlson, Jan-Renee
2018-01-01
Computational fluid dynamic (CFD) simulations of models tested in wind tunnels require a high level of fidelity and accuracy, particularly for the purposes of CFD validation efforts. Considerable effort is required to properly characterize the physical geometry of the wind tunnel and to recreate the correct flow conditions inside it. The typical trial-and-error effort used to determine the boundary condition values for a particular tunnel configuration is time and computer-resource intensive. This paper describes a method for calculating and updating the back pressure boundary condition in wind tunnel simulations by using a proportional-integral-derivative controller. The controller methodology and equations are discussed, and simulations using the controller to set a tunnel Mach number in the NASA Langley 14- by 22-Foot Subsonic Tunnel are demonstrated.
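As a schematic illustration of the controller idea (a minimal sketch only; the gains, update interval, and sign convention below are illustrative assumptions, not the paper's implementation):

```python
# Hypothetical discrete PID loop that nudges the outflow (back) pressure of a
# wind tunnel simulation toward a target test-section Mach number.
# `run_solver_iterations` and `measure_mach` stand in for the CFD interface.

def pid_back_pressure(target_mach, measure_mach, run_solver_iterations,
                      p_back, kp=5.0e3, ki=5.0e2, kd=0.0, dt=1.0, n_updates=50):
    integral, prev_err = 0.0, 0.0
    for _ in range(n_updates):
        err = target_mach - measure_mach()        # positive if the tunnel is too slow
        integral += err * dt
        derivative = (err - prev_err) / dt
        # Raising back pressure slows the tunnel, so subtract the PID correction.
        p_back -= kp * err + ki * integral + kd * derivative
        prev_err = err
        run_solver_iterations(p_back)             # advance the flow with the new BC
    return p_back
```

The same structure applies whether the update happens every solver iteration or every few hundred iterations; only the gains change.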
A Noise-Assisted Reprogrammable Nanomechanical Logic Gate
2009-01-01
...effort toward scalable mechanical computation [1-4]. This effort can be traced back to 1822 (at least), when Charles Babbage presented a mechanical...
On the evaluation of derivatives of Gaussian integrals
NASA Technical Reports Server (NTRS)
Helgaker, Trygve; Taylor, Peter R.
1992-01-01
We show that by a suitable change of variables, the derivatives of molecular integrals over Gaussian-type functions required for analytic energy derivatives can be evaluated with significantly less computational effort than current formulations. The reduction in effort increases with the order of differentiation.
Meyniel, Florent; Safra, Lou; Pessiglione, Mathias
2014-01-01
A pervasive cost-benefit problem is how to allocate effort over time, i.e. deciding when to work and when to rest. An economic decision perspective would suggest that duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty. PMID:24743711
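A minimal simulation of the accumulate-to-bound mechanism summarized above can make the predicted behavior concrete (parameter names and values are illustrative, not the fitted values from the study):

```python
import random

def simulate_effort_rest(accum_rate, dissip_rate, upper_bound,
                         lower_bound=0.0, dt=0.1, t_max=600.0, noise=0.05):
    """Cost evidence accumulates during effort and dissipates during rest;
    reaching the upper bound stops effort, returning to the lower bound resumes it."""
    cost, working, t, bout_start = lower_bound, True, 0.0, 0.0
    effort_durations, rest_durations = [], []
    while t < t_max:
        drift = accum_rate if working else -dissip_rate
        cost += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
        if working and cost >= upper_bound:
            effort_durations.append(t - bout_start)
            working, bout_start = False, t
        elif not working and cost <= lower_bound:
            rest_durations.append(t - bout_start)
            working, bout_start = True, t
        t += dt
    return effort_durations, rest_durations

# A higher accumulation rate (harder task) yields shorter effort bouts,
# mirroring finding (3) that actual difficulty impacts effort duration.
easy = simulate_effort_rest(accum_rate=0.5, dissip_rate=1.0, upper_bound=10.0)
hard = simulate_effort_rest(accum_rate=1.5, dissip_rate=1.0, upper_bound=10.0)
```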
Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, Wes
2016-07-24
The primary challenge motivating this team's work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time this effort started in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by DOE science projects. In large part, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, and engaged in software technology R&D, as well as in close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters
Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika
2015-01-01
Background Adolescent depression and suicide are pressing public health concerns, and identifying key differences among suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Then, participants completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety as well as a computerized effort-cost computation task. Results Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters, however, suicide ideators exhibit a win-stay approach when receiving rewards on previous trials. Limitations Findings should be considered in light of limitations including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323
Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US-EPA's computational toxicology program.
Globus Quick Start Guide. Globus Software Version 1.1
NASA Technical Reports Server (NTRS)
1999-01-01
The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
NASA Technical Reports Server (NTRS)
Lane, R. L.
1981-01-01
Six growth runs used the Kayex-Hameo Automatic Games Logic (AGILE) computer based system for growth from larger melts in the Mod CG2000. The implementation of the melt pyrometer sensor allowed for dip temperature monitoring and usage by the operator/AGILE system. Use of AGILE during recharge operations was successfully evaluated. The tendency of crystals to lose cylindrical shape (spiraling) continued to be a problem. The hygrometer was added to the Furnace Gas Analysis System and used on several growth runs. The gas chromatograph, including the integrator, was also used for more accurate carbon monoxide concentration measurements. Efforts continued for completing the automation of the total Gas Analysis System. An economic analysis, based on revised achievable straight growth rate, is presented.
A Comparison of Approaches for Solving Hard Graph-Theoretic Problems
2015-05-01
...collaborative effort “Adiabatic Quantum Computing Applications Research” (14-RI-CRADA-02) between the Information Directorate and Lock-... The methods explored consist of a parallel computing approach using Matlab, a quantum annealing approach using the D-Wave computer, and lastly a satisfiability modulo theory (SMT) approach using a corresponding SMT solver...
Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research
The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.
Computers on Wheels: An Alternative to Each One Has One
ERIC Educational Resources Information Center
Grant, Michael M.; Ross, Steven M.; Wang, Weiping; Potter, Allison
2005-01-01
Four fifth-grade classrooms embarked on a modified ubiquitous computing initiative in the fall of 2003. Two 15-computer wireless laptop carts were shared among the four classrooms in an effort to integrate technology across the curriculum and affect change in student learning and teacher pedagogy. This initiative--in contrast to other one-to-one…
Apple Seeks To Regain Its Stature in World of Academic Computing.
ERIC Educational Resources Information Center
Young, Jeffrey R.; Blumenstyk, Goldie
1998-01-01
Managers of Apple Computer, the company that pioneered campus personal computing and later lost most of its share of the market, are again focusing energies on academic buyers. Campus technology officials, even those fond of Apples, are greeting the company's efforts with caution. Some feel it may be too late for Apple to regain a significant…
Crossbar Nanocomputer Development
2012-04-01
their utilization. Areas such as neuromorphic computing, signal processing, arithmetic processing, and crossbar computing are only some of the...due to its intrinsic, network-on-chip flexibility to re-route around defects. Preliminary efforts in crossbar computing have been demonstrated by...they approach their scaling limits [2]. Other applications that memristive devices are suited for include FPGA [3], encryption [4], and neuromorphic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, R.E.
1983-11-01
The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care, and retailing are outlined. An analysis is given of research efforts in the US, Japan, the U.K., and Europe. Fifth-generation programming languages are detailed.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2011 CFR
2011-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Reading Teachers' Beliefs and Utilization of Computer and Technology: A Case Study
ERIC Educational Resources Information Center
Remetio, Jessica Espinas
2014-01-01
Many researchers believe that computers have the ability to help improve the reading skills of students. In an effort to improve the poor reading scores of students on state tests, as well as improve students' overall academic performance, computers and other technologies have been installed in Frozen Bay School classrooms. As the success of these…
Attitudes of Design Students toward Computer Usage in Design
ERIC Educational Resources Information Center
Pektas, Sule Tasli; Erkip, Feyzan
2006-01-01
The success of efforts to integrate technology with design education is largely affected by the attitudes of students toward technology. This paper presents the findings of a research on the attitudes of design students toward the use of computers in design and its correlates. Computer Aided Design (CAD) tools are the most widely used computer…
ERIC Educational Resources Information Center
Peng, Jacob C.
2009-01-01
The author investigated whether students' effort in working on homework problems was affected by their need for cognition, their perception of the system, and their computer efficacy when instructors used an online system to collect accounting homework. Results showed that individual intrinsic motivation and computer efficacy are important factors…
ERIC Educational Resources Information Center
Good, Jonathon; Keenan, Sarah; Mishra, Punya
2016-01-01
The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…
Human-computer interaction in multitask situations
NASA Technical Reports Server (NTRS)
Rouse, W. B.
1977-01-01
Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
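A toy Monte Carlo sketch can illustrate how the system variables listed above interact (the overflow rule, rates, and error probability below are invented for illustration and are not the paper's queueing model):

```python
import random

def simulate_allocation(arrival_rate=0.5, human_service=1.0,
                        p_computer_error=0.05, max_human_wait=3.0, t_end=1000.0):
    """Action-evoking events arrive as a Poisson stream; each goes to the human
    unless the human's backlog exceeds a threshold, in which case the computer
    takes it and may commit an error."""
    t, human_free_at = 0.0, 0.0
    stats = {"human": 0, "computer": 0, "computer_errors": 0}
    while t < t_end:
        t += random.expovariate(arrival_rate)          # time of next event
        human_wait = max(0.0, human_free_at - t)       # current human backlog
        if human_wait <= max_human_wait:
            stats["human"] += 1
            human_free_at = t + human_wait + random.expovariate(1.0 / human_service)
        else:
            stats["computer"] += 1
            if random.random() < p_computer_error:
                stats["computer_errors"] += 1
    return stats

# Increasing arrival_rate (more tasks) shifts work toward the computer,
# which is the kind of trade-off the simulation experiments explore.
print(simulate_allocation(arrival_rate=1.5))
```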
Robotics-Centered Outreach Activities: An Integrated Approach
ERIC Educational Resources Information Center
Ruiz-del-Solar, Javier
2010-01-01
Nowadays, universities are making extensive efforts to attract prospective students to the fields of electrical, electronic, and computer engineering. Thus, outreach is becoming increasingly important, and activities with schoolchildren are being extensively carried out as part of this effort. In this context, robotics is a very attractive and…
Powsiri Klinkhachorn; J. Moody; Philip A. Araman
1995-01-01
For the past few decades, researchers have devoted time and effort to apply automation and modern computer technologies towards improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs, while maintaining an acceptable margin of profit. This paper describes the effort of one such endeavor...
Positioning Continuing Education Computer Programs for the Corporate Market.
ERIC Educational Resources Information Center
Tilney, Ceil
1993-01-01
Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)
Researching and Reducing the Health Burden of Stroke
... the result of continuing research to map the brain and interface it with a computer to enable stroke patients to regain function. How important is the new effort to map the human brain? The brain is more complex than any computer ...
Brain transcriptome atlases: a computational perspective.
Mahfouz, Ahmed; Huisman, Sjoerd M H; Lelieveldt, Boudewijn P F; Reinders, Marcel J T
2017-05-01
The immense complexity of the mammalian brain is largely reflected in the underlying molecular signatures of its billions of cells. Brain transcriptome atlases provide valuable insights into gene expression patterns across different brain areas throughout the course of development. Such atlases allow researchers to probe the molecular mechanisms which define neuronal identities, neuroanatomy, and patterns of connectivity. Despite the immense effort put into generating such atlases, to answer fundamental questions in neuroscience, an even greater effort is needed to develop methods to probe the resulting high-dimensional multivariate data. We provide a comprehensive overview of the various computational methods used to analyze brain transcriptome atlases.
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate complex codes, on several differently configured servers, could run and compute trivial small scale problems in a commercial cloud infrastructure. Phase 2 focused on proving non-trivial large scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
Xue, Ying; Rusli, Jannov; Chang, Hou-Min; Phillips, Richard; Jameel, Hasan
2012-02-01
Process simulation and lab trials were carried out to demonstrate and confirm the efficiency of the concept that recycling hydrolysate in low-total-solid enzymatic hydrolysis is one option for increasing the sugar concentration without mixing problems. A higher sugar concentration can reduce the capital cost for fermentation and distillation because of the smaller retention volume. Meanwhile, operating cost will also decrease because of the smaller operating volume and the lower energy required for distillation. With computer simulation, time and effort can be saved in reaching the steady state of the recycling process, which is the relevant scenario for industrial production. This paper, to the best of our knowledge, is the first to discuss steady-state saccharification with recycling of the filtrate from enzymatic hydrolysis to increase sugar concentration. Recycled enzymes in the filtrate (15-30% of the original enzyme loading) resulted in 5-10% higher carbohydrate conversion compared to the case in which the recycled enzymes were denatured. The recycled hydrolysate yielded 10% higher carbohydrate conversion compared to pure-sugar-simulated hydrolysate at the same enzyme loading, which indicated that hydrolysis by-products could boost enzymatic hydrolysis. The high sugar concentration (pure sugar simulated) showed an inhibition effect, since about a 15% decrease in carbohydrate conversion was observed compared with the case with no sugar added. The overall effect of hydrolysate recycling at WinGEMS-simulated steady-state conditions with 5% total solids was an increase in sugar concentration from 35 to 141 g/l, while the carbohydrate conversion was 2% higher for recycling at steady state (87%) compared with no recycling (85%). Ten percent and 15% total-solid processes were also evaluated in this study.
Space shuttle low cost/risk avionics study
NASA Technical Reports Server (NTRS)
1971-01-01
All work breakdown structure elements containing any avionics-related effort were examined to price the life cycle costs. The analytical, testing, and integration efforts are included for the basic onboard avionics and electrical power systems. The design and procurement of special test equipment and maintenance and repair equipment are considered. Program management associated with these efforts is described. Flight test spares and labor and materials associated with the operations and maintenance of the avionics systems throughout the horizontal flight test are examined. It was determined that cost savings can be achieved by using existing hardware, maximizing orbiter-booster commonality, specifying new equipment to MIL quality standards, basing redundancy on cost-effectiveness analysis, minimizing software complexity, reducing cross strapping and computer-managed functions, utilizing compilers and floating-point computers, and evolving the design as dictated by the horizontal flight test schedules.
Distributed Accounting on the Grid
NASA Technical Reports Server (NTRS)
Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.
2001-01-01
By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
NASA Technical Reports Server (NTRS)
1993-01-01
This video documents efforts at NASA Langley Research Center to improve safety and economy in aircraft. Featured are the cockpit weather information needs computer system, which relays real time weather information to the pilot, and efforts to improve techniques to detect structural flaws and corrosion, such as the thermal bond inspection system.
MUMPS Based Integration of Disparate Computer-Assisted Medical Diagnosis Modules
1989-12-12
The Abdominal and Chest Pain modules use a Bayesian approach, while the Ophthalmology module uses a Rule Based approach. In the current effort, MUMPS is used to develop an...
Interactive Electronic Storybooks for Kindergartners to Promote Vocabulary Growth
ERIC Educational Resources Information Center
Smeets, Daisy J. H.; Bus, Adriana G.
2012-01-01
The goals of this study were to examine (a) whether extratextual vocabulary instructions embedded in electronic storybooks facilitated word learning over reading alone and (b) whether instructional formats that required children to invest more effort were more effective than formats that required less effort. A computer-based "assistant" was added…
Evaluation of the UnTRIM model for 3-D tidal circulation
Cheng, R.T.; Casulli, V.; ,
2001-01-01
A family of numerical models, known as the TRIM models, shares the same modeling philosophy for solving the shallow water equations. A characteristic analysis of the shallow water equations shows that numerical stability is controlled by the gravity wave terms in the momentum equations and by the transport terms in the continuity equation. A semi-implicit finite-difference scheme has been formulated so that these terms and the vertical diffusion terms are treated implicitly and the remaining terms explicitly, to control numerical stability, and the computations are carried out over a uniform finite-difference mesh without invoking horizontal or vertical coordinate transformations. An unstructured-grid version of the TRIM model, UnTRIM (pronounced "you trim"), is introduced; it preserves these basic numerical properties and the same modeling philosophy, except that the computations are carried out over an unstructured orthogonal grid. The unstructured grid offers flexibility in representing complex study areas: fine grid resolution can be placed in regions of interest, while coarse grids cover the remaining domain. Thus, the computational effort is concentrated in areas of importance, and an overall computational saving can be achieved because the total number of grid points is dramatically reduced. To use this modeling approach, an unstructured mesh must be generated that properly reflects the properties of the domain under investigation. The new flexibility in grid structure is accompanied by new challenges associated with grid generation. To take full advantage of this flexibility, grid generation should be guided by insights into the physics of the problem, and the insights needed may require a higher degree of modeling skill.
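A schematic one-dimensional version of the semi-implicit treatment described above may help fix ideas (theta-weighted gravity-wave and continuity terms, explicit advection; the notation is illustrative, not the model's actual discretization):

```latex
u^{\,n+1}_{j+1/2} = F u^{\,n}_{j+1/2}
  - g\,\frac{\Delta t}{\Delta x}\Big[\theta\,\big(\eta^{\,n+1}_{j+1}-\eta^{\,n+1}_{j}\big)
  + (1-\theta)\,\big(\eta^{\,n}_{j+1}-\eta^{\,n}_{j}\big)\Big],

\eta^{\,n+1}_{j} = \eta^{\,n}_{j}
  - \frac{\Delta t}{\Delta x}\Big[\theta\,\big(H_{j+1/2}\,u^{\,n+1}_{j+1/2}-H_{j-1/2}\,u^{\,n+1}_{j-1/2}\big)
  + (1-\theta)\,\big(H_{j+1/2}\,u^{\,n}_{j+1/2}-H_{j-1/2}\,u^{\,n}_{j-1/2}\big)\Big].
```

Here $F$ denotes an explicit advection operator, $\eta$ the free surface, $H$ the total water depth, and $1/2 \le \theta \le 1$ removes the gravity-wave stability restriction; substituting the first relation into the second yields a single linear system for $\eta^{\,n+1}$ each time step, an approach that carries over to the unstructured orthogonal grid.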
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-01-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-09-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.
Understanding human visual systems and its impact on our intelligent instruments
NASA Astrophysics Data System (ADS)
Strojnik Scholl, Marija; Páez, Gonzalo; Scholl, Michelle K.
2013-09-01
We review the evolution of machine vision and comment on the cross-fertilization from the neural sciences onto the flourishing fields of neural processing, parallel processing, and associative memory in optical sciences and computing. Then we examine how the intensive efforts in mapping the human brain have been influenced by concepts in computer sciences, control theory, and electronic circuits. We discuss two neural paths that use input from the visual sense for navigation and object recognition: the ventral temporal pathway for object recognition (what?) and the dorsal parietal pathway for navigation (where?), respectively. We describe the reflexive and conscious decision centers in the cerebral cortex involved in visual attention and gaze control. Interestingly, these require a return path through the midbrain for ocular muscle control. We find that cognitive psychologists currently study the human brain using low-spatial-resolution fMRI with a temporal response on the order of a second. In recent years, life scientists have concentrated on insect brains to study neural processes. We discuss how reflexive and conscious gaze-control decisions are made in the frontal eye field and inferior parietal lobe, constituting the fronto-parietal attention network. We note that ethical and experiential learning impacts our conscious decisions.
NASA Technical Reports Server (NTRS)
2005-01-01
Topics include: Hidden Identification on Parts: Magnetic Machine-Readable Matrix Symbols; System for Processing Coded OFDM Under Doppler and Fading; Multipurpose Hyperspectral Imaging System; Magnetic-Flux-Compensated Voltage Divider; High-Performance Satellite/Terrestrial-Network Gateway; Internet-Based System for Voice Communication With the ISS; Stripline/Microstrip Transition in Multilayer Circuit Board; Dual-Band Feed for a Microwave Reflector Antenna; Quadratic Programming for Allocating Control Effort; Range Process Simulation Tool; Simulator of Space Communication Networks; Computing Q-D Relationships for Storage of Rocket Fuels; Contour Error Map Algorithm; Portfolio Analysis Tool; Glass Frit Filters for Collecting Metal Oxide Nanoparticles; Anhydrous Proton-Conducting Membranes for Fuel Cells; Portable Electron-Beam Free-Form Fabrication System; Miniature Laboratory for Detecting Sparse Biomolecules; Multicompartment Liquid-Cooling/Warming Protective Garments; Laser Metrology for an Optical-Path-Length Modulator; PCM Passive Cooling System Containing Active Subsystems; Automated Electrostatics Environmental Chamber; Estimating Aeroheating of a 3D Body Using a 2D Flow Solver; Artificial Immune System for Recognizing Patterns; Computing the Thermodynamic State of a Cryogenic Fluid; Safety and Mission Assurance Performance Metric; Magnetic Control of Concentration Gradient in Microgravity; Avionics for a Small Robotic Inspection Spacecraft; and Simulation of Dynamics of a Flexible Miniature Airplane.
Challenging the Myth of Disability.
ERIC Educational Resources Information Center
Brightman, Alan
1989-01-01
Discussion of the rhetoric of disability, including physical, hearing, and visual impairments, highlights possible benefits that computer technology can provide. Designing for disabled individuals is discussed, and product development efforts by Apple Computer to increase microcomputer access to disabled children and adults are described. (LRW)
ERIC Educational Resources Information Center
Stanford Univ., CA. Inst. for Mathematical Studies in Social Science.
In 1963, the Institute began a program of research and development in computer-assisted instruction (CAI). Their efforts have been funded at various times by the Carnegie Corporation of New York, The National Science Foundation and the United States Office of Education. Starting with a medium-sized computer and six student stations, the Institute…
Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems
2002-08-01
AFRL-IF-RS-TR-2002-190, Final Technical Report, August 2002. Title: Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems. Authors: Narayan R. Aluru, Jacob White... Computer-Aided Design (CAD) tools for microfluidic components and systems were developed in this effort. Innovative numerical methods and algorithms for mixed
ERIC Educational Resources Information Center
Ke, Fengfeng; Im, Tami
2014-01-01
This case study examined team-based computer-game design efforts by children with diverse abilities to explore the nature of their collective design actions and cognitive processes. Ten teams of middle-school children, with a high percentage of minority students, participated in a 6-week, computer-assisted math-game-design program. Essential…
Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers
ERIC Educational Resources Information Center
Newman, Thomas R.
2017-01-01
The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…
Using Computer Games to Train Information Warfare Teams
2004-01-01
Presented at the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2004, Paper No. 1729. ...responses they will experience on real missions is crucial. 3D computer games have proved themselves to be highly effective in engaging players...motivationally and emotionally. This effort, therefore, uses gaming technology to provide realistic simulations. These games are augmented with
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Information Management and Technology Div.
This report was prepared in response to a request from the Senate Committee on Commerce, Science, and Transportation, and from the House Committee on Science, Space, and Technology, for information on efforts to develop high-speed computer networks in the United States, Europe (limited to France, Germany, Italy, the Netherlands, and the United…
Terrestrial implications of mathematical modeling developed for space biomedical research
NASA Technical Reports Server (NTRS)
Lujan, Barbara F.; White, Ronald J.; Leonard, Joel I.; Srinivasan, R. Srini
1988-01-01
This paper summarizes several related research projects supported by NASA which seek to apply computer models to space medicine and physiology. These efforts span a wide range of activities, including mathematical models used for computer simulations of physiological control systems; power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and computer-aided diagnosis programs.
ERIC Educational Resources Information Center
Hiebert, Elfrieda H.; And Others
This report summarizes the curriculum development and research effort that took place at the Cupertino Apple Classrooms of Tomorrow (ACOT) site from January through June 1987. Based on the premise that computers make revising and editing much easier, the four major objectives emphasized by the computer-intensive writing program are fluency,…
Small Computer Applications for Base Supply.
1984-03-01
...research on small computer utilization at base-level organizations. This research effort studies whether small computers and commercial software can assist...
ERIC Educational Resources Information Center
Bozzone, Meg A.
1997-01-01
Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)
Twenty Years of Girls into Computing Days: Has It Been Worth the Effort?
ERIC Educational Resources Information Center
Craig, Annemieke; Lang, Catherine; Fisher, Julie
2008-01-01
The first documented day-long program to encourage girls to consider computing as a career was held in 1987 in the U.K. Over the last 20 years these one-day events, labeled "Girls into Computing" days, have been conducted by academics and professionals to foster female-student interest in information technology (IT) degrees and careers.…
Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities. AI Memo 417.
ERIC Educational Resources Information Center
Carr, Brian
Wusor II is the second intelligent computer aided instruction (ICAI) program that has been developed to monitor the progress of, and offer suggestions to, students playing Wumpus, a computer game designed to teach logical thinking and problem solving. From the earlier efforts with Wusor I, it was possible to produce a rule-based expert which…
NASA Astrophysics Data System (ADS)
Hurtado, Daniel E.; Rojas, Guillermo
2018-04-01
Computer simulations constitute a powerful tool for studying the electrical activity of the human heart, but computational effort remains prohibitively high. In order to recover accurate conduction velocities and wavefront shapes, the mesh size in linear element (Q1) formulations cannot exceed 0.1 mm. Here we propose a novel non-conforming finite-element formulation for the non-linear cardiac electrophysiology problem that results in accurate wavefront shapes and lower mesh dependence of the conduction velocity, while retaining the same number of global degrees of freedom as Q1 formulations. As a result, coarser discretizations of cardiac domains can be employed in simulations without significant loss of accuracy, thus reducing the overall computational effort. We demonstrate the applicability of our formulation in biventricular simulations using a coarse mesh size of ~1 mm, and show that the activation wave pattern closely follows that obtained in fine-mesh simulations at a fraction of the computation time, thus improving the accuracy-efficiency trade-off of cardiac simulations.
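For context, simulations of this kind typically solve a stiff reaction-diffusion system; the abstract does not state the governing equations, so the monodomain model below is given only as a common reference form, not as the authors' specific formulation:

```latex
\chi\Big(C_m\,\frac{\partial V}{\partial t} + I_{\mathrm{ion}}(V,\mathbf{w})\Big)
  = \nabla\cdot\big(\mathbf{D}\,\nabla V\big) + I_{\mathrm{stim}},
\qquad
\frac{\partial \mathbf{w}}{\partial t} = \mathbf{g}(V,\mathbf{w}),
```

with $V$ the transmembrane potential, $\mathbf{w}$ the gating and state variables, $\mathbf{D}$ the conductivity tensor, $C_m$ the membrane capacitance, and $\chi$ the surface-to-volume ratio. The steep wavefronts produced by the nonlinear ionic current are what force sub-millimeter meshes in standard Q1 formulations, which is the cost the proposed non-conforming elements aim to reduce.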
Computational Lipidomics and Lipid Bioinformatics: Filling In the Blanks.
Pauling, Josch; Klipp, Edda
2016-12-22
Lipids are highly diverse metabolites of pronounced importance in health and disease. While metabolomics is a broad field under the omics umbrella that may also relate to lipids, lipidomics is an emerging field which specializes in the identification, quantification and functional interpretation of complex lipidomes. Today, it is possible to identify and distinguish lipids in a high-resolution, high-throughput manner and simultaneously with a lot of structural detail. However, doing so may produce thousands of mass spectra in a single experiment which has created a high demand for specialized computational support to analyze these spectral libraries. The computational biology and bioinformatics community has so far established methodology in genomics, transcriptomics and proteomics but there are many (combinatorial) challenges when it comes to structural diversity of lipids and their identification, quantification and interpretation. This review gives an overview and outlook on lipidomics research and illustrates ongoing computational and bioinformatics efforts. These efforts are important and necessary steps to advance the lipidomics field alongside analytic, biochemistry, biomedical and biology communities and to close the gap in available computational methodology between lipidomics and other omics sub-branches.
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.
Aircraft integrated design and analysis: A classroom experience
NASA Technical Reports Server (NTRS)
1988-01-01
AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes result from the availability of advanced computational capabilities and sophisticated electronic media at Purdue. This presentation will describe both the long-range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PCs were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.
40 CFR 721.91 - Computation of estimated surface water concentrations: Instructions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Computation of estimated surface water concentrations: Instructions. 721.91 Section 721.91 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT SIGNIFICANT NEW USES OF CHEMICAL SUBSTANCES Certain Significant New Uses § 721.91 Computation of...
40 CFR 721.91 - Computation of estimated surface water concentrations: Instructions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Computation of estimated surface water concentrations: Instructions. 721.91 Section 721.91 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT SIGNIFICANT NEW USES OF CHEMICAL SUBSTANCES Certain Significant New Uses § 721.91 Computation of...
Telecommunications: Working To Enhance Global Understanding and Peace Education.
ERIC Educational Resources Information Center
Schrum, Lynne M.
This paper describes educational activities that make use of microcomputers and information networks to link elementary and secondary students electronically using telecommunications, i.e., communication across distances using personal computers, modems, telephone lines, and computer networks. Efforts to promote global understanding and awareness…
Measuring Impact of EPAs Computational Toxicology Research (BOSC)
Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...
Aerodynamic Characterization of a Modern Launch Vehicle
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Holland, Scott D.; Blevins, John A.
2011-01-01
A modern launch vehicle is by necessity an extremely integrated design. The accurate characterization of its aerodynamic characteristics is essential to determine design loads, to design flight control laws, and to establish performance. The NASA Ares Aerodynamics Panel has been responsible for technical planning, execution, and vetting of the aerodynamic characterization of the Ares I vehicle. An aerodynamics team supporting the Panel consists of wind tunnel engineers, computational engineers, database engineers, and other analysts that address topics such as uncertainty quantification. The team resides at three NASA centers: Langley Research Center, Marshall Space Flight Center, and Ames Research Center. The Panel has developed strategies to synergistically combine both the wind tunnel efforts and the computational efforts with the goal of validating the computations. Selected examples highlight key flow physics and, where possible, the fidelity of the comparisons between wind tunnel results and the computations. Lessons learned summarize what has been gleaned during the project and can be useful for other vehicle development projects.
Research in the design of high-performance reconfigurable systems
NASA Technical Reports Server (NTRS)
Mcewan, S. D.; Spry, A. J.
1985-01-01
Computer aided design and computer aided manufacturing have the potential for greatly reducing the cost and lead time in the development of VLSI components. This potential paves the way for the design and fabrication of a wide variety of economically feasible high level functional units. It was observed that current computer systems have only a limited capacity to absorb new VLSI component types other than memory, microprocessors, and a relatively small number of other parts. The first purpose is to explore a system design which is capable of effectively incorporating a considerable number of VLSI part types and will both increase the speed of computation and reduce the attendant programming effort. A second purpose is to explore design techniques for VLSI parts which when incorporated by such a system will result in speeds and costs which are optimal. The proposed work may lay the groundwork for future efforts in the extensive simulation and measurements of the system's cost effectiveness and lead to prototype development.
Clinical nursing informatics. Developing tools for knowledge workers.
Ozbolt, J G; Graves, J R
1993-06-01
Current research in clinical nursing informatics is proceeding along three important dimensions: (1) identifying and defining nursing's language and structuring its data; (2) understanding clinical judgment and how computer-based systems can facilitate and not replace it; and (3) discovering how well-designed systems can transform nursing practice. A number of efforts are underway to find and use language that accurately represents nursing and that can be incorporated into computer-based information systems. These efforts add to understanding nursing problems, interventions, and outcomes, and provide the elements for databases from which nursing's costs and effectiveness can be studied. Research on clinical judgment focuses on how nurses (perhaps with different levels of expertise) assess patient needs, set goals, and plan and deliver care, as well as how computer-based systems can be developed to aid these cognitive processes. Finally, investigators are studying not only how computers can help nurses with the mechanics and logistics of processing information but also and more importantly how access to informatics tools changes nursing care.
38 CFR 4.76a - Computation of average concentric contraction of visual fields.
Code of Federal Regulations, 2010 CFR
2010-07-01
... concentric contraction of visual fields. 4.76a Section 4.76a Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS SCHEDULE FOR RATING DISABILITIES Disability Ratings The Organs of Special Sense § 4.76a Computation of average concentric contraction of visual fields. Table III—Normal Visual...
Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors
NASA Technical Reports Server (NTRS)
Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.
2011-01-01
This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.
An application of interactive computer graphics technology to the design of dispersal mechanisms
NASA Technical Reports Server (NTRS)
Richter, B. J.; Welch, B. H.
1977-01-01
Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.
Structural behavior of composites with progressive fracture
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Murthy, P. L. N.; Chamis, C. C.
1989-01-01
The objective of the study is to unify several computational tools developed for the prediction of progressive damage and fracture with efforts for the prediction of the overall response of damaged composite structures. In particular, a computational finite element model for the damaged structure is developed using a computer program as a byproduct of the analysis of progressive damage and fracture. Thus, a single computational investigation can predict progressive fracture and the resulting variation in structural properties of angleplied composites.
Multicore: Fallout from a Computing Evolution
Yelick, Kathy [Director, NERSC]
2017-12-09
July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
Advanced computational tools for 3-D seismic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhen, J.; Glover, C.W.; Protopopescu, V.A.
1996-06-01
The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.
2011-01-01
mile reach from Lewiston Lake to the North Fork of the Trinity, which includes the sites above. As of this writing, all data has been analyzed and...collection effort, probably a bathymetric LiDAR effort on the Kootenai River near Bonner's Ferry, Idaho. Detailed multibeam acoustic surveys already
Neural Network Research: A Personal Perspective,
1988-03-01
problems in computer science and technology today. Still others do both. Whatever the focus, here is a field ready to challenge and reward the sustained efforts of a wide variety of gifted people.
Research efforts by the US Environmental Protection Agency have set out to develop alternative testing programs to prioritize limited testing resources toward chemicals that likely represent the greatest hazard to human health and the environment. Efforts such as EPA’s ToxCast r...
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1988-01-01
Research focused on two major areas. The first effort addressed the design and implementation of a technique that allows for the visualization of the real time variation of physical properties. The second effort focused on the design and implementation of an on-line help system with components designed for both authors and users of help information.
Assignment Choice: Do Students Choose Briefer Assignments or Finishing What They Started?
ERIC Educational Resources Information Center
Hawthorn-Embree, Meredith L.; Skinner, Christopher H.; Parkhurst, John; O'Neil, Michael; Conley, Elisha
2010-01-01
Academic skill development requires engagement in effortful academic behaviors. Although students may be more likely to choose to engage in behaviors that require less effort, they also may be motivated to complete assignments that they have already begun. Seventh-grade students (N = 88) began a mathematics computation worksheet, but were stopped…
A Meta-Analysis of Writing Instruction for Students in the Elementary Grades
ERIC Educational Resources Information Center
Graham, Steve; McKeown, Debra; Kiuhara, Sharlene; Harris, Karen R.
2012-01-01
In an effort to identify effective instructional practices for teaching writing to elementary grade students, we conducted a meta-analysis of the writing intervention literature, focusing our efforts on true and quasi-experiments. We located 115 documents that included the statistics for computing an effect size (ES). We calculated an average…
2015-05-01
According to the article, "the hackers targeted big-name makers of nuclear and solar technology, stealing confidential business information...As JTF-GNO synchronized efforts to disinfect and protect over 2.5 million computers in 3,500 DoD organizations spanning 99 countries, Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Judith Alice; Long, Kevin Nicholas
2018-05-01
Sylgard® 184/Glass Microballoon (GMB) potting material is currently used in many NW systems. Analysts need a macroscale constitutive model that can predict material behavior under complex loading and damage evolution. To address this need, ongoing modeling and experimental efforts have focused on study of damage evolution in these materials. Micromechanical finite element simulations that resolve individual GMB and matrix components promote discovery and better understanding of the material behavior. With these simulations, we can study the role of the GMB volume fraction, time-dependent damage, behavior under confined vs. unconfined compression, and the effects of partial damage. These simulations are challenging and push the boundaries of capability even with the high performance computing tools available at Sandia. We summarize the major challenges and the current state of this modeling effort, as an exemplar of micromechanical modeling needs that can motivate advances in future computing efforts.
Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Doug; Ziegler, Andrew C.
2010-01-01
Over the last decade, use of a method for computing suspended-sediment concentration and loads using turbidity sensors—primarily nephelometry, but also optical backscatter—has proliferated. Because an in-situ turbidity sensor is capable of measuring turbidity instantaneously, a turbidity time series can be recorded and related directly to time-varying suspended-sediment concentrations. Depending on the suspended-sediment characteristics of the measurement site, this method can be more reliable and, in many cases, a more accurate means for computing suspended-sediment concentrations and loads than traditional U.S. Geological Survey computational methods. Guidelines and procedures for estimating time series of suspended-sediment concentration and loading as a function of turbidity and streamflow data have been published in a U.S. Geological Survey Techniques and Methods Report, Book 3, Chapter C4. This paper is a summary of these guidelines and discusses some of the concepts, statistical procedures, and techniques used to maintain a multiyear suspended-sediment time series.
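To make the rating-curve idea above concrete, the following is a minimal sketch of one common form of turbidity-to-concentration relation (an ordinary least-squares fit in log-log space with a back-transformation). The specific model forms, bias-correction factors, and streamflow terms recommended in the USGS Techniques and Methods report are not reproduced here, and the paired samples are hypothetical.

```python
import numpy as np

def fit_ssc_turbidity(turbidity_ntu, ssc_mg_per_l):
    """Fit log10(SSC) = b0 + b1 * log10(turbidity) by ordinary least squares."""
    x = np.log10(turbidity_ntu)
    y = np.log10(ssc_mg_per_l)
    b1, b0 = np.polyfit(x, y, 1)          # slope, intercept
    return b0, b1

def predict_ssc(turbidity_ntu, b0, b1, bias_correction=1.0):
    """Back-transform the log-log regression; a retransformation bias correction is often applied."""
    return bias_correction * 10.0 ** (b0 + b1 * np.log10(turbidity_ntu))

# Hypothetical paired samples (turbidity in NTU, SSC in mg/L)
turb = np.array([12.0, 25.0, 60.0, 140.0, 300.0])
ssc = np.array([18.0, 40.0, 95.0, 230.0, 520.0])
b0, b1 = fit_ssc_turbidity(turb, ssc)
print(predict_ssc(np.array([50.0, 200.0]), b0, b1))
```

The report cited in the abstract also treats streamflow as an additional explanatory variable and gives guidance on record computation; those steps are omitted from this sketch.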
ERIC Educational Resources Information Center
Smetana, Frederick O.; Phillips, Dennis M.
In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…
ERIC Educational Resources Information Center
Casey, James B.
1998-01-01
Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
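As a minimal illustration of the kind of direct-plus-indirect cost roll-up the article describes (the staff levels, rates, and overhead categories below are hypothetical, not figures from the article):

```python
def total_service_cost(staff_hours, hourly_rates, overhead_items):
    """Direct labor (hours x loaded hourly rate per staff level) plus indirect costs."""
    direct = sum(staff_hours[level] * hourly_rates[level] for level in staff_hours)
    indirect = sum(overhead_items.values())
    return direct + indirect

# Hypothetical annual figures for a tax-form distribution service
staff_hours = {"clerk": 120, "librarian": 30, "manager": 8}
hourly_rates = {"clerk": 18.0, "librarian": 32.0, "manager": 45.0}
overhead = {"space": 400.0, "utilities": 150.0, "public_relations": 250.0}
print(total_service_cost(staff_hours, hourly_rates, overhead))
```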
ERIC Educational Resources Information Center
Kell, Diane; And Others
This paper presents findings from a recently completed study of the use of computers in primary classrooms as one source of evidence concerning the role technology can play in school restructuring efforts. The sites for the study were selected by Apple Computer, Inc. in the spring of 1988 and included 43 classrooms in 10 schools in 6 large, mostly…
Active Computer Network Defense: An Assessment
2001-04-01
sufficient base of knowledge in information technology can be assumed to be working on some form of computer network warfare, even if only defensive in...the Defense Information Infrastructure (DII) to attack. Transmission Control Protocol/Internet Protocol (TCP/IP) networks are inherently resistant to...aims to create this part of information superiority, and computer network defense is one of its fundamental components. Most of these efforts center
ERIC Educational Resources Information Center
Decker, Adrienne; Phelps, Andrew; Egert, Christopher A.
2017-01-01
This article explores the critical need to articulate computing as a creative discipline and the potential for gender and ethnic diversity that such efforts enable. By embracing a culture shift within the discipline and using games as a medium of discourse, we can engage students and faculty in a broader definition of computing. The transformative…
MIPS: analysis and annotation of proteins from whole genomes in 2005
Mewes, H. W.; Frishman, D.; Mayer, K. F. X.; Münsterkötter, M.; Noubibou, O.; Pagel, P.; Rattei, T.; Oesterheld, M.; Ruepp, A.; Stümpflen, V.
2006-01-01
The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system is maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines: (i) the development of suitable comprehensive data structures and database technology, communication and query tools to include a wide range of different types of information enabling the representation of complex information such as functional modules or networks (Genome Research Environment System), (ii) the development of databases covering computable information such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix, and the CABiNet network analysis framework and (iii) the compilation and manual annotation of information related to interactions such as protein–protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described and the detailed descriptions of our projects can be accessed through the MIPS WWW server (). PMID:16381839
MIPS: analysis and annotation of proteins from whole genomes in 2005.
Mewes, H W; Frishman, D; Mayer, K F X; Münsterkötter, M; Noubibou, O; Pagel, P; Rattei, T; Oesterheld, M; Ruepp, A; Stümpflen, V
2006-01-01
The Munich Information Center for Protein Sequences (MIPS at the GSF), Neuherberg, Germany, provides resources related to genome information. Manually curated databases for several reference organisms are maintained. Several of these databases are described elsewhere in this and other recent NAR database issues. In a complementary effort, a comprehensive set of >400 genomes automatically annotated with the PEDANT system is maintained. The main goal of our current work on creating and maintaining genome databases is to extend gene centered information to information on interactions within a generic comprehensive framework. We have concentrated our efforts along three lines: (i) the development of suitable comprehensive data structures and database technology, communication and query tools to include a wide range of different types of information enabling the representation of complex information such as functional modules or networks (Genome Research Environment System), (ii) the development of databases covering computable information such as the basic evolutionary relations among all genes, namely SIMAP, the sequence similarity matrix, and the CABiNet network analysis framework and (iii) the compilation and manual annotation of information related to interactions such as protein-protein interactions or other types of relations (e.g. MPCDB, MPPI, CYGD). All databases described and the detailed descriptions of our projects can be accessed through the MIPS WWW server (http://mips.gsf.de).
A collaborative institutional model for integrating computer applications in the medical curriculum.
Friedman, C. P.; Oxford, G. S.; Juliano, E. L.
1991-01-01
The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM) to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first and second year students do much of their academic work, (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, E.M.; Masso, J.D.
This project involved the manufacturing of curved-faceted, injection-molded, four-element Fresnel lens parquets for concentrating photovoltaic arrays. Previous efforts showed that high-efficiency (greater than 82%) Fresnel concentrators could be injection molded. This report encompasses the mold design, molding, and physical testing of a four-lens parquet for a solar photovoltaic concentrator system.
Spectral analysis of sinus arrhythmia - A measure of mental effort
NASA Technical Reports Server (NTRS)
Vicente, Kim J.; Craig Thornton, D.; Moray, Neville
1987-01-01
The validity of the spectral analysis of sinus arrhythmia as a measure of mental effort was investigated using a computer simulation of a hovercraft piloted along a river as the experimental task. Strong correlation was observed between the subjective effort-ratings and the heart-rate variability (HRV) power spectrum between 0.06 and 0.14 Hz. Significant correlations were observed not only between subjects but, more importantly, within subjects as well, indicating that the spectral analysis of HRV is an accurate measure of the amount of effort being invested by a subject. Results also indicate that the intensity of effort invested by subjects cannot be inferred from the objective ratings of task difficulty or from performance.
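A minimal sketch of how band-limited heart-rate-variability power in the 0.06-0.14 Hz range might be computed from an R-R interval series; the resampling step, sampling rate, and Welch parameters below are illustrative assumptions, not the procedure used in the study.

```python
import numpy as np
from scipy.signal import welch

def hrv_band_power(rr_intervals_s, fs=4.0, band=(0.06, 0.14)):
    """Integrate the HRV power spectral density over a frequency band (R-R intervals in seconds)."""
    t_beats = np.cumsum(rr_intervals_s)                      # beat occurrence times
    t_grid = np.arange(t_beats[0], t_beats[-1], 1.0 / fs)    # uniform time grid
    rr_uniform = np.interp(t_grid, t_beats, rr_intervals_s)  # evenly spaced R-R series
    freqs, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs,
                       nperseg=min(256, len(rr_uniform)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

# Synthetic example: ~1 s beats with a 0.1 Hz modulation, which falls inside the band of interest
beat_idx = np.arange(300)
rr = 1.0 + 0.05 * np.sin(2 * np.pi * 0.1 * beat_idx)
print(hrv_band_power(rr))
```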
Parallel aeroelastic computations for wing and wing-body configurations
NASA Technical Reports Server (NTRS)
Byun, Chansup
1994-01-01
The objective of this research is to develop computationally efficient methods for solving fluid-structural interaction problems by directly coupling finite difference Euler/Navier-Stokes equations for fluids and finite element dynamics equations for structures on parallel computers. This capability will significantly impact many aerospace projects of national importance such as Advanced Subsonic Civil Transport (ASCT), where the structural stability margin becomes very critical at the transonic region. This research effort will have direct impact on the High Performance Computing and Communication (HPCC) Program of NASA in the area of parallel computing.
Using a cloud to replenish parched groundwater modeling efforts.
Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B
2010-01-01
Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
Using a cloud to replenish parched groundwater modeling efforts
Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.
2010-01-01
Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
Preliminary development of digital signal processing in microwave radiometers
NASA Technical Reports Server (NTRS)
Stanley, W. D.
1980-01-01
Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing of noiselike microwave radiometer signals.
Experimental and Analytical Studies for a Computational Materials Program
NASA Technical Reports Server (NTRS)
Knauss, W. G.
1999-01-01
The studies supported by Grant NAG1-1780 were directed at providing physical data on polymer behavior that would form the basis for computationally modeling these types of materials. Because of ongoing work in polymer characterization, this grant supported part of a larger picture in this regard. Efforts addressed two combined areas of time-dependent mechanical response: creep properties under different volumetric changes (nonlinearly viscoelastic behavior), and the time or frequency dependence of dilatational material behavior. The details of these endeavors are outlined sufficiently in the two appended publications, so that no further description of the effort is necessary.
A structure adapted multipole method for electrostatic interactions in protein dynamics
NASA Astrophysics Data System (ADS)
Niedermeier, Christoph; Tavan, Paul
1994-07-01
We present an algorithm for rapid approximate evaluation of electrostatic interactions in molecular dynamics simulations of proteins. Traditional algorithms require computational work of the order O(N^2) for a system of N particles. Truncation methods which try to avoid that effort entail intolerably large errors in forces, energies and other observables. Hierarchical multipole expansion algorithms, which can account for the electrostatics to numerical accuracy, scale with O(N log N) or even with O(N) if they are augmented by a sophisticated scheme for summing up forces. To further reduce the computational effort we propose an algorithm that also uses a hierarchical multipole scheme but considers only the first two multipole moments (i.e., charges and dipoles). Our strategy is based on the consideration that numerical accuracy may not be necessary to reproduce protein dynamics with sufficient correctness. As opposed to previous methods, our scheme for hierarchical decomposition is adjusted to structural and dynamical features of the particular protein considered rather than chosen rigidly as a cubic grid. As compared to truncation methods we manage to reduce errors in the computation of electrostatic forces by a factor of 10 with only marginal additional effort.
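A minimal sketch of the low-order (monopole plus dipole) far-field approximation that such a scheme rests on, compared against a direct sum at a single distant target point; the hierarchical, structure-adapted decomposition described in the paper is not reproduced here.

```python
import numpy as np

def direct_potential(r_target, positions, charges):
    """Direct Coulomb sum at one target point (units with the Coulomb constant set to 1)."""
    d = np.linalg.norm(r_target - positions, axis=1)
    return np.sum(charges / d)

def monopole_dipole_potential(r_target, positions, charges):
    """Approximate a particle cluster by its total charge and dipole moment about its center."""
    center = positions.mean(axis=0)
    q_total = charges.sum()
    dipole = np.sum(charges[:, None] * (positions - center), axis=0)
    r = r_target - center
    dist = np.linalg.norm(r)
    return q_total / dist + np.dot(dipole, r) / dist**3

# Small random cluster evaluated at a distant point: the two results should agree closely
rng = np.random.default_rng(0)
pos = rng.normal(scale=0.5, size=(20, 3))
q = rng.uniform(-1.0, 1.0, size=20)
target = np.array([10.0, 0.0, 0.0])
print(direct_potential(target, pos, q), monopole_dipole_potential(target, pos, q))
```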
Soviet Cybernetics Review, Volume 3, Number 11.
ERIC Educational Resources Information Center
Holland, Wade B.
Soviet efforts in designing third-generation computers are discussed in two featured articles which describe (1) the development and production of integrated circuits, and their role in computers; and (2) the use of amorphous chalcogenide glass in lasers, infrared devices, and semiconductors. Other articles discuss production-oriented branch…
76 FR 34965 - Cybersecurity, Innovation, and the Internet Economy
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-15
... disrupt computing systems. These threats are exacerbated by the interconnected and interdependent architecture of today's computing environment. Theoretically, security deficiencies in one area may provide... does the move to cloud-based services have on education and research efforts in the I3S? 45. What is...
NASA Aerodynamics Program Annual Report 1991
1992-04-01
Results from a correlation effort on an AH-1G Cobra helicopter model have been compared with wind-tunnel data at different ... and the effort has been completed. The computational studies have shown the trapped vortex to be a viable ..., and preliminary water-channel and wind-tunnel tests have shown the
ERIC Educational Resources Information Center
Uston, Ken
1983-01-01
Discusses Apple Computer Inc.'s plan to donate an Apple IIe to eligible elementary/secondary schools in California, dealer incentives for conducting orientation sessions for school personnel, and school uses of the computer (including peer tutoring and teacher education). Also discusses similar efforts of other microcomputer manufacturers. (JN)
Computational Systems for Multidisciplinary Applications
NASA Technical Reports Server (NTRS)
Soni, Bharat; Haupt, Tomasz; Koomullil, Roy; Luke, Edward; Thompson, David
2002-01-01
In this paper, we briefly describe our efforts to develop complex simulation systems. We focus first on four key infrastructure items: enterprise computational services, simulation synthesis, geometry modeling and mesh generation, and a fluid flow solver for arbitrary meshes. We conclude by presenting three diverse applications developed using these technologies.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-02
... Standards and Technology's (NIST) Computer Security Division maintains a Computer Security Resource Center... Regarding Driver History Record Information Security, Continuity of Operation Planning, and Disaster... (SDLAs) to support their efforts at maintaining the security of information contained in the driver...
Adaptive mesh refinement and front-tracking for shear bands in an antiplane shear model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garaizar, F.X.; Trangenstein, J.
1998-09-01
In this paper the authors describe a numerical algorithm for the study of shear-band formation and growth in a two-dimensional antiplane shear of granular materials. The algorithm combines front-tracking techniques and adaptive mesh refinement. Tracking provides a more careful evolution of the band when coupled with special techniques to advance the ends of the shear band in the presence of a loss of hyperbolicity. The adaptive mesh refinement allows the computational effort to be concentrated in important areas of the deformation, such as the shear band and the elastic relief wave. The main challenges are the problems related to shear bands that extend across several grid patches and the effects that a nonhyperbolic growth rate of the shear bands has on the refinement process. They give examples of the success of the algorithm for various levels of refinement.
Global/local stress analysis of composite panels
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Knight, Norman F., Jr.
1989-01-01
A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
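Schematically (symbols and sign conventions are chosen here for illustration and are not taken from the report), the interpolating spline surface w(x, y) is required to satisfy the homogeneous linear plate bending equation, with the rotations obtained from its derivatives:

```latex
D\,\nabla^{4} w
  = D\!\left(\frac{\partial^{4} w}{\partial x^{4}}
      + 2\,\frac{\partial^{4} w}{\partial x^{2}\,\partial y^{2}}
      + \frac{\partial^{4} w}{\partial y^{4}}\right) = 0,
\qquad
\theta_{x} = \frac{\partial w}{\partial y},
\qquad
\theta_{y} = -\,\frac{\partial w}{\partial x}.
```

The surface is fitted to displacements from the less-refined global model, and the interpolated displacements and rotations are then imposed as boundary conditions on the independent, refined local model.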
Wierl, Judy A.; Giddings, Elise M.P.; Bannerman, Roger T.
1998-01-01
Control of phosphorus from rural nonpoint sources is a major focus of current efforts to improve and protect water resources in Wisconsin and is recommended in almost every priority watershed plan prepared for the State's Nonpoint Source (NPS) Program. Barnyards and croplands usually are identified as the primary rural sources of phosphorus. Numerous questions have arisen about which of these two sources to control and about the method currently being used by the NPS program to compare phosphorus loads from barnyards and croplands. To evaluate the method, the U.S. Geological Survey (USGS), in cooperation with the Wisconsin Department of Natural Resources, used phosphorus-load and sediment-load data from streams and phosphorus concentrations in soils from the Otter Creek Watershed (located in the Sheboygan River Basin; fig. 1) in conjunction with two computer-based models.
Global/local stress analysis of composite structures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
1989-01-01
A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
Application Modernization at LLNL and the Sierra Center of Excellence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neely, J. Robert; de Supinski, Bronis R.
We report that in 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL's mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel that represent the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory's codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL's overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.
Relationship between fluid bed aerosol generator operation and the aerosol produced
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, R.L.; Yerkes, K.
1980-12-01
The relationships between bed operation in a fluid bed aerosol generator and aerosol output were studied. A two-inch diameter fluid bed aerosol generator (FBG) was constructed using stainless steel powder as a fluidizing medium. Fly ash from coal combustion was aerosolized and the influence of FBG operating parameters on aerosol mass median aerodynamic diameter (MMAD), geometric standard deviation (sigma/sub g/) and concentration was examined. In an effort to extend observations on large fluid beds to small beds using fine bed particles, minimum fluidizing velocities and elutriation constants were computed. Although FBG minimum fluidizing velocity agreed well with calculations, FBG elutriation constant did not. The results of this study show that the properties of aerosols produced by a FBG depend on fluid bed height and air flow through the bed after the minimum fluidizing velocity is exceeded.
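As an illustration of one standard way to estimate a minimum fluidizing velocity (the Wen and Yu correlation; the report does not state which correlation was actually used, and the particle properties below are hypothetical):

```python
import math

def minimum_fluidization_velocity(d_p, rho_p, rho_g=1.2, mu_g=1.8e-5, g=9.81):
    """Superficial minimum fluidization velocity (m/s) from the Wen and Yu correlation."""
    ar = rho_g * (rho_p - rho_g) * g * d_p**3 / mu_g**2    # Archimedes number
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7        # particle Reynolds number at U_mf
    return re_mf * mu_g / (rho_g * d_p)

# Hypothetical 100-micron stainless steel bed particles fluidized by air
print(minimum_fluidization_velocity(d_p=100e-6, rho_p=7900.0))
```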
Investigation of the structural behavior of the blades of a darrieus wind turbine†
NASA Astrophysics Data System (ADS)
Rosen, A.; Abramovich, H.
1985-06-01
A theoretical model in which account is taken of the non-linear, non-planar structural behavior of the curved blades of a Darrieus wind turbine is described. This model is simpler and needs less computational effort than some other models, but is still accurate enough for most engineering purposes. By using the present method, it is possible to treat any blade geometry, any structural, mass and aerodynamic blade properties distribution and any combination of boundary conditions. The model is used in order to calculate the blade behavior under the influence of concentrated loads, gravity loads and centrifugal loads. In order to verify the theoretical model, predictions are compared with experimental results which are obtained from tests with small models of curved blades. Usually the agreement between the theoretical and experimental results is very good. The influence of different parameters on blade behavior is presented and discussed.
Assessment of solar-assisted gas-fired heat pump systems
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1981-01-01
As a possible application for the Goldstone Energy Project, the performance of a 10 ton heat pump unit using a hybrid solar gas energy source was evaluated in an effort to optimize the solar collector size. The heat pump system is designed to provide all the cooling and/or heating requirements of a selected office building. The system performance is to be augmented in the heating mode by utilizing the waste heat from the power cycle. A simplified system analysis is described to assess and compute interrelationships of the engine, heat pump, solar, and building performance parameters, and to optimize the solar concentrator/building area ratio for a minimum total system cost. In addition, four alternative heating/cooling systems, commonly used for building comfort, are described; their costs are compared, and are found to be less competitive with the gas solar heat pump system at the projected solar equipment costs.
A coarse-grained DNA model for the prediction of current signals in DNA translocation experiments
NASA Astrophysics Data System (ADS)
Weik, Florian; Kesselheim, Stefan; Holm, Christian
2016-11-01
We present an implicit solvent coarse-grained double-stranded DNA (dsDNA) model confined to an infinite cylindrical pore that reproduces the experimentally observed current modulations of a KCl solution at various concentrations. Our model extends previous coarse-grained and mean-field approaches by incorporating a position dependent friction term on the ions, which Kesselheim et al. [Phys. Rev. Lett. 112, 018101 (2014)] identified as an essential ingredient to correctly reproduce the experimental data of Smeets et al. [Nano Lett. 6, 89 (2006)]. Our approach reduces the computational effort by orders of magnitude compared with all-atom simulations and serves as a promising starting point for modeling the entire translocation process of dsDNA. We achieve a consistent description of the system's electrokinetics by using explicitly parameterized ions, a friction term between the DNA beads and the ions, and a lattice-Boltzmann model for the solvent.
BUILDING TRIBAL CAPABILITIES IN ENERGY AND ENVIRONMENTAL MANAGEMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mary Lopez
2003-07-01
During this reporting period efforts were concentrated on finding a suitable candidate to replace the vacated internship position at the National Transportation Program in Albuquerque, New Mexico after the departure of Jacqueline Agnew. After completing an extensive search and interviews, Byron Yepa, a member of Jemez Pueblo, was selected to fill the internship vacancy. Intern Byron Yepa began his internship on June 12, 2003. Initially, Mr. Yepa was familiarized with the National Transportation Program facility, introduced to staff and was set up on the computer system. He began educating himself by reading a book which focused on the Nevada Test Site and its impact on Indian Tribes. He is helping in the development of a geographic information system (GIS) project and will assist other departments with their projects. At the time of this report he was waiting for new software to aid in the development of that project.
Application Modernization at LLNL and the Sierra Center of Excellence
Neely, J. Robert; de Supinski, Bronis R.
2017-09-01
We report that in 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL's mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel that represent the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory's codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL's overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.
Aerobiology and Its Role in the Transmission of Infectious Diseases
Fernstrom, Aaron; Goldblatt, Michael
2013-01-01
Aerobiology plays a fundamental role in the transmission of infectious diseases. As infectious disease and infection control practitioners continue employing contemporary techniques (e.g., computational fluid dynamics to study particle flow, polymerase chain reaction methodologies to quantify particle concentrations in various settings, and epidemiology to track the spread of disease), the central variables affecting the airborne transmission of pathogens are becoming better known. This paper reviews many of these aerobiological variables (e.g., particle size, particle type, the duration that particles can remain airborne, the distance that particles can travel, and meteorological and environmental factors), as well as the common origins of these infectious particles. We then review several real-world settings with known difficulties controlling the airborne transmission of infectious particles (e.g., office buildings, healthcare facilities, and commercial airplanes), while detailing the respective measures each of these industries is undertaking in its effort to ameliorate the transmission of airborne infectious diseases. PMID:23365758
Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression
Wiedenhoeft, John; Brugel, Eric; Schliep, Alexander
2016-01-01
By integrating Haar wavelets with Hidden Markov Models, we achieve drastically reduced running times for Bayesian inference using Forward-Backward Gibbs sampling. We show that this improves detection of genomic copy number variants (CNV) in array CGH experiments compared to the state-of-the-art, including standard Gibbs sampling. The method concentrates computational effort on chromosomal segments which are difficult to call, by dynamically and adaptively recomputing consecutive blocks of observations likely to share a copy number. This makes routine diagnostic use and re-analysis of legacy data collections feasible; to this end, we also propose an effective automatic prior. An open source software implementation of our method is available at http://schlieplab.org/Software/HaMMLET/ (DOI: 10.5281/zenodo.46262). This paper was selected for oral presentation at RECOMB 2016, and an abstract is published in the conference proceedings. PMID:27177143
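A minimal sketch of the Haar wavelet ingredient only (not of HaMMLET's Forward-Backward Gibbs sampler): on a noisy piecewise-constant signal, large detail coefficients cluster at block boundaries, which is what allows computation to be concentrated on segments that are hard to call. The signal below is synthetic.

```python
import numpy as np

def haar_transform(x):
    """Full Haar decomposition of a signal whose length is a power of two."""
    x = np.asarray(x, dtype=float)
    levels = []
    while len(x) > 1:
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # smooth (approximation) coefficients
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
        levels.append(detail)
        x = approx
    levels.append(x)                                   # final approximation coefficient
    return levels[::-1]

# Two constant blocks plus noise: the largest detail coefficients mark the breakpoint
rng = np.random.default_rng(1)
signal = np.concatenate([np.full(64, 0.0), np.full(64, 3.0)]) + rng.normal(0.0, 0.3, 128)
levels = haar_transform(signal)
print([round(float(np.max(np.abs(lvl))), 2) for lvl in levels[1:]])
```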
Modelling collective cell migration of neural crest
Szabó, András; Mayor, Roberto
2016-01-01
Collective cell migration has emerged in the recent decade as an important phenomenon in cell and developmental biology and can be defined as the coordinated and cooperative movement of groups of cells. Most studies concentrate on tightly connected epithelial tissues, even though collective migration does not require a constant physical contact. Movement of mesenchymal cells is more independent, making their emergent collective behaviour less intuitive and therefore lending importance to computational modelling. Here we focus on such modelling efforts that aim to understand the collective migration of neural crest cells, a mesenchymal embryonic population that migrates large distances as a group during early vertebrate development. By comparing different models of neural crest migration, we emphasize the similarity and complementary nature of these approaches and suggest a future direction for the field. The principles derived from neural crest modelling could aid understanding the collective migration of other mesenchymal cell types. PMID:27085004
Modelling collective cell migration of neural crest.
Szabó, András; Mayor, Roberto
2016-10-01
Collective cell migration has emerged in the recent decade as an important phenomenon in cell and developmental biology and can be defined as the coordinated and cooperative movement of groups of cells. Most studies concentrate on tightly connected epithelial tissues, even though collective migration does not require a constant physical contact. Movement of mesenchymal cells is more independent, making their emergent collective behaviour less intuitive and therefore lending importance to computational modelling. Here we focus on such modelling efforts that aim to understand the collective migration of neural crest cells, a mesenchymal embryonic population that migrates large distances as a group during early vertebrate development. By comparing different models of neural crest migration, we emphasize the similarity and complementary nature of these approaches and suggest a future direction for the field. The principles derived from neural crest modelling could aid understanding the collective migration of other mesenchymal cell types. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Seamon, E.; Gessler, P. E.; Flathers, E.
2015-12-01
The creation and use of large amounts of data in scientific investigations has become common practice. Data collection and analysis for large scientific computing efforts are increasing not only in volume but also in number, and the methods and analysis procedures are evolving toward greater complexity (Bell, 2009; Clarke, 2009; Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011; Turner, 2014; Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge this gap between the above perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) an increasing growth in the volume and amount of data, 2) a growing data-intensive science base that has challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap. Namely, systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive science-based communications. Our research efforts attempt to address this gap by developing a modular technology framework for data science integration efforts - with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, interactive Python, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bio-climatic variation as they relate to Pacific Northwest ecosystem regions. Our preliminary results, using historical NetCDF climate data for calibration purposes across the inland Pacific Northwest region (Abatzoglou, Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods for bioclimatic indicators.
NASA Astrophysics Data System (ADS)
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
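Schematically (the symbols are introduced here for illustration), the posterior over the source parameters θ, e.g. location, emission rate and release time, given the sensor concentration data d follows Bayes' theorem:

```latex
p(\boldsymbol{\theta} \mid \mathbf{d})
  \;=\; \frac{p(\mathbf{d} \mid \boldsymbol{\theta})\, p(\boldsymbol{\theta})}
             {\int p(\mathbf{d} \mid \boldsymbol{\theta}')\, p(\boldsymbol{\theta}')\, \mathrm{d}\boldsymbol{\theta}'}
  \;\propto\; p(\mathbf{d} \mid \boldsymbol{\theta})\, p(\boldsymbol{\theta}),
```

with the likelihood p(d | θ) evaluated through the adjoint source-receptor relationship and the posterior explored by the MCMC sampling described in the abstract.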
NASA Astrophysics Data System (ADS)
Marrapu, Pallavi
Deteriorating air quality is one of the major problems faced worldwide and in particular in Asia. The world's most polluted megacities are located in Asia, highlighting the urgent need for efforts to improve the air quality. New Delhi (India), one of the world's most polluted cities, was the host of the Commonwealth Games during the period of 4-14 October 2010. This high profile event provided a good opportunity to accelerate efforts to improve air quality. Computational advances now allow air quality forecast models to fully couple the meteorology with chemical constituents within a unified modeling system that allows two-way interactions. The WRF-Chem model is used to simulate air quality in New Delhi. The thesis focuses on evaluating air quality and meteorology feedbacks. Four nested domains ranging from South Asia, Northern India, NCR Delhi and Delhi city at 45 km, 15 km, 5 km and 1.67 km resolution for a period of 20 days (26 Sep-15 Oct 2010) are used in the study. The predicted mean surface concentrations of various pollutants show similar spatial distributions with peak values in the middle of the domain, reflecting the traffic and population patterns in the city. Along with these activities, construction dust and industrial emissions contribute to high levels of criteria pollutants. The study evaluates the WRF-Chem capabilities using a new emission inventory developed over Delhi at a fine resolution of 1.67 km and evaluating the results with observational data from 11 monitoring sites placed at various Games venues. The contributions of emission sectors including transportation, power, industry, and domestic to pollutant concentrations at targeted regions are studied, and the results show that the transportation and domestic sectors are the major contributors to the pollution levels in Delhi, followed by industry. Apart from these sectors, emissions outside of Delhi contribute 20-50% to surface concentrations depending on the species. This indicates that pollution control efforts should take a regional perspective. Air quality projections in Delhi for 2030 are investigated. The Greenhouse Gas and Air Pollution Interactions and Synergies (GAINS) model is used to generate a 2030 future emission scenario for Delhi using projections of air quality control measures and energy demands. Net reductions in CO concentrations by 50%, and increases of 140% and 40% in BC and NOx concentrations, respectively, are predicted. The net changes in concentration are associated with increases in the transport and industry sectors. The domestic sector still has a significant contribution to air pollutant levels. The air quality levels show a profound effect under this scenario on the environment and human health. The increase in pollution from 2010 to 2030 is predicted to cause an increase in surface temperature by ~0.65 K. These increasing pollution levels also show effects on the radiative forcing. The high aerosol loading, i.e. BC, PM2.5 and PM10 levels, shows strong influence on the short- and longwave fluxes, causing strong surface dimming and strong atmospheric heating due to BC. These results indicate that the transport and domestic sectors should be targeted for air quality and climate mitigations.
NASA Technical Reports Server (NTRS)
Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.
2015-01-01
Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex, multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, thereby mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was the comprehensive validation effort conducted by Tucker et al. [1], in which a number of different groups modeled the GO2/GH2 single-element configuration of Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, including steady and unsteady Reynolds-averaged Navier-Stokes (RANS/URANS) calculations, large-eddy simulations (LES), detached-eddy simulations (DES), and various combinations thereof. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancement of these validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time with a three-dimensional grid. Comparisons to the Pal et al. heat flux data will be made for both RANS and hybrid RANS-LES (detached-eddy) simulations. Computational costs will be reported, along with comparisons of accuracy and cost against much less expensive two-dimensional RANS simulations of the same geometry.
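The cost advantage of a flamelet parameterization comes from replacing run-time finite-rate chemistry with interpolation into a precomputed table. The sketch below is a generic illustration of that idea with a made-up temperature table indexed by mixture fraction and a progress variable; it is not the Loci/STREAM formulation, and all grid sizes and table values are assumptions.

```python
import numpy as np

Z = np.linspace(0.0, 1.0, 51)                  # mixture-fraction grid
C = np.linspace(0.0, 1.0, 21)                  # progress-variable grid
Zg, Cg = np.meshgrid(Z, C, indexing="ij")
# Placeholder "precomputed" temperature table peaking near a stoichiometric Z of 0.2.
T_table = 300.0 + 3200.0 * Cg * np.exp(-((Zg - 0.2) / 0.12) ** 2)

def lookup_T(z, c):
    """Bilinear interpolation into the flamelet table -- the only run-time cost."""
    i = np.clip(np.searchsorted(Z, z) - 1, 0, len(Z) - 2)
    j = np.clip(np.searchsorted(C, c) - 1, 0, len(C) - 2)
    tz = (z - Z[i]) / (Z[i + 1] - Z[i])
    tc = (c - C[j]) / (C[j + 1] - C[j])
    return ((1 - tz) * (1 - tc) * T_table[i, j] + tz * (1 - tc) * T_table[i + 1, j]
            + (1 - tz) * tc * T_table[i, j + 1] + tz * tc * T_table[i + 1, j + 1])

print("T(Z=0.2, C=1.0) =", lookup_T(0.2, 1.0), "K")
```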
NASA Technical Reports Server (NTRS)
1973-01-01
Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.
Fault Tolerant Software Technology for Distributed Computer Systems
1989-03-01
Final technical report on "Fault Tolerant Software Technology for Distributed Computing Systems," a two-year effort performed at Georgia Institute of Technology as part of the Clouds Project. The Clouds ...
Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)
Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)
2018-05-07
Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in the past two years that has all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.A. Bamberger; L.M. Liljegren; P.S. Lowery
This document presents an analysis of the mechanisms influencing mixing within double-shell slurry tanks. A research program to characterize mixing of slurries within tanks has been proposed. The research program presents a combined experimental and computational approach to produce correlations describing the tank slurry concentration profile (and therefore uniformity) as a function of mixer pump operating conditions. The TEMPEST computer code was used to simulate both a full-scale (prototype) and scaled (model) double-shell waste tank to predict flow patterns resulting from a stationary jet centered in the tank. The simulation results were used to evaluate flow patterns in the tank and to determine whether flow patterns are similar between the full-scale prototype and an existing 1/12-scale model tank. The flow patterns were sufficiently similar to recommend conducting scoping experiments at 1/12 scale. Also, TEMPEST-modeled velocity profiles of the near-floor jet were compared to experimental measurements, with good agreement. Reported values of physical properties of double-shell tank slurries were analyzed to evaluate the range of properties appropriate for conducting scaled experiments. One-twelfth-scale scoping experiments are recommended to confirm the prioritization of the dimensionless groups (gravitational settling, Froude, and Reynolds numbers) that affect slurry suspension in the tank. Two of the proposed 1/12-scale test conditions were modeled using the TEMPEST computer code to observe the anticipated flow fields. This information will be used to guide selection of sampling probe locations. Additional computer modeling is being conducted to model a particulate-laden, rotating jet centered in the tank. The results of this modeling effort will be compared to the scaled experimental data to quantify the agreement between the code and the 1/12-scale experiment. The scoping experiment results will guide selection of parameters to be varied in the follow-on experiments. Data from the follow-on experiments will be used to develop correlations to describe the slurry concentration profile as a function of mixing pump operating conditions. These data will also be used to further evaluate the computer model applications. If the agreement between the experimental data and the code predictions is good, the computer code will be recommended for use in predicting slurry uniformity in the tanks under various operating conditions. If the agreement between the code predictions and experimental results is not good, the experimental data correlations will be used to predict slurry uniformity in the tanks within the range of correlation applicability.
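The role of the dimensionless groups mentioned above can be illustrated with a short calculation comparing a full-scale jet and a 1/12-scale model scaled for Froude similarity. The velocities, diameters, and particle properties below are purely illustrative and are not the tank conditions from this program; the point of the sketch is that matching the Froude number leaves the Reynolds number mismatched, which is why the groups must be prioritized experimentally.

```python
import math

def jet_groups(U, D, nu, rho_s, rho_l, d_p, g=9.81):
    """Return (Reynolds number, Froude number, settling-to-jet velocity ratio)."""
    Re = U * D / nu                                          # jet Reynolds number
    Fr = U / math.sqrt(g * D)                                # Froude number
    w_s = (rho_s - rho_l) * g * d_p**2 / (18 * rho_l * nu)   # Stokes settling velocity
    return Re, Fr, w_s / U                                   # gravitational-settling ratio

# Hypothetical prototype and a 1/12-scale model with velocity scaled for Froude similarity.
proto = jet_groups(U=18.0, D=0.15, nu=1.0e-6, rho_s=2500, rho_l=1200, d_p=50e-6)
model = jet_groups(U=18.0 / math.sqrt(12), D=0.15 / 12, nu=1.0e-6,
                   rho_s=2500, rho_l=1200, d_p=50e-6)

for name, (Re, Fr, Gs) in (("prototype", proto), ("1/12 scale", model)):
    print(f"{name:10s}  Re={Re:9.2e}  Fr={Fr:6.2f}  settling ratio={Gs:9.2e}")
```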
Swept-Wing Ice Accretion Characterization and Aerodynamics
NASA Technical Reports Server (NTRS)
Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.
2013-01-01
NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and the resulting aerodynamic effects. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65% scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20%, 64% and 83% semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will yield an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.
The Impact of a Library Flood on Computer Operations.
ERIC Educational Resources Information Center
Myles, Barbara
2000-01-01
Describes the efforts at Boston Public Library to recover from serious flooding that damaged computer equipment. Discusses vendor help in assessing the damage; the loss of installation disks; hiring consultants to help with financial matters; effects on staff; repairing and replacing damaged equipment; insurance issues; and disaster recovery…
A Man-Machine System for Contemporary Counseling Practice: Diagnosis and Prediction.
ERIC Educational Resources Information Center
Roach, Arthur J.
This paper looks at present and future capabilities for diagnosis and prediction in computer-based guidance efforts and reviews the problems and potentials which will accompany the implementation of such capabilities. In addition to necessary procedural refinement in prediction, future developments in computer-based educational and career…
curriculum for its course Physics In and Through Cosmology. The Distributed Observatory aims to become the world's largest cosmic ray telescope, using the distributed sensing and computing power of the world's cell phones. Modeled after the distributed computing efforts of SETI@Home and Folding@Home, the
Evaluation of Complex Human Performance: The Promise of Computer-Based Simulation
ERIC Educational Resources Information Center
Newsom, Robert S.; And Others
1978-01-01
For the training and placement of professional workers, multiple-choice instruments are the norm for wide-scale measurement and evaluation efforts. These instruments contain fundamental problems. Computer-based management simulations may provide solutions to these problems, appear scoreable and reliable, offer increased validity, and are better…
Modifications Of Hydrostatic-Bearing Computer Program
NASA Technical Reports Server (NTRS)
Hibbs, Robert I., Jr.; Beatty, Robert F.
1991-01-01
Several modifications made to enhance utility of HBEAR, computer program for analysis and design of hydrostatic bearings. Modifications make program applicable to more realistic cases and reduce time and effort necessary to arrive at a suitable design. Uses search technique to iterate on size of orifice to obtain required pressure ratio.
Recruiting Women into Computer Science and Information Systems
ERIC Educational Resources Information Center
Broad, Steven; McGee, Meredith
2014-01-01
While many technical disciplines have reached or are moving toward gender parity in the number of bachelors degrees in those fields, the percentage of women graduating in computer science remains stubbornly low. Many recent efforts to address this situation have focused on retention of undergraduate majors or graduate students, recruiting…
The Classroom, Board Room, Chat Room, and Court Room: School Computers at the Crossroads.
ERIC Educational Resources Information Center
Stewart, Michael
2000-01-01
In schools' efforts to maximize technology's benefits, ethical considerations have often taken a back seat. Computer misuse is growing exponentially and assuming many forms: unauthorized data access, hacking, piracy, information theft, fraud, virus creation, harassment, defamation, and discrimination. Integrated-learning activities will help…
UNIX Micros for Students Majoring in Computer Science and Personal Information Retrieval.
ERIC Educational Resources Information Center
Fox, Edward A.; Birch, Sandra
1986-01-01
Traces the history of Virginia Tech's requirement that incoming freshmen majoring in computer science each acquire a microcomputer running the UNIX operating system; explores rationale for the decision; explains system's key features; and describes program implementation and research and development efforts to provide personal information…
Verification and Validation of Monte Carlo N-Particle 6 for Computing Gamma Protection Factors
2015-03-26
methods for evaluating RPFs, which it used for the subsequent 30 years. These approaches included computational modeling, radioisotopes, and a high ...
New Directions in Statewide Computer Planning and Cooperation.
ERIC Educational Resources Information Center
Norris, Donald M.; St. John, Edward P.
1981-01-01
In the 1960s and early 1970s, statewide planning efforts usually resulted in plans for centralized hardware networks. The focus of statewide planning has shifted to the issues of improved computer financing, information sharing, and enhanced utilization in instruction and administration. A "facilitating network" concept and Missouri efforts…
A Model for Conducting and Assessing Interdisciplinary Undergraduate Dissertations
ERIC Educational Resources Information Center
Engström, Henrik
2015-01-01
This paper presents an effort to create a unified model for conducting and assessing undergraduate dissertations, shared by all disciplines involved in computer game development at a Swedish university. Computer game development includes technology-oriented disciplines as well as disciplines with aesthetical traditions. The challenge has been to…
ERIC Educational Resources Information Center
Tennyson, Robert
1984-01-01
Reviews educational applications of artificial intelligence and presents empirically-based design variables for developing a computer-based instruction management system. Taken from a programmatic research effort based on the Minnesota Adaptive Instructional System, variables include amount and sequence of instruction, display time, advisement,…
The Stabilization, Exploration, and Expression of Computer Game History
ERIC Educational Resources Information Center
Kaltman, Eric
2017-01-01
Computer games are now a significant cultural phenomenon, and a significant artistic output of humanity. However, little effort and attention have been paid to how the medium of games and interactive software developed, and even less to the historical storage of software development documentation. This thesis borrows methodologies and practices…
Photodynamic therapy: computer modeling of diffusion and reaction phenomena
NASA Astrophysics Data System (ADS)
Hampton, James A.; Mahama, Patricia A.; Fournier, Ronald L.; Henning, Jeffery P.
1996-04-01
We have developed a transient, one-dimensional mathematical model for the reaction and diffusion phenomena that occur during photodynamic therapy (PDT). This model is referred to as the PDTmodem program. The model is solved by the Crank-Nicolson finite difference technique and can be used to predict the fates of important molecular species within the intercapillary tissue undergoing PDT. The following factors govern molecular oxygen consumption and singlet oxygen generation within a tumor: (1) photosensitizer concentration; (2) fluence rate; and (3) intercapillary spacing. In an effort to maximize direct tumor cell killing, the model allows educated decisions to be made to ensure the uniform generation and exposure of singlet oxygen to tumor cells across the intercapillary space. Based on predictions made by the model, we have determined that the singlet oxygen concentration profile within the intercapillary space is controlled by the product of the drug concentration and light fluence rate. The model predicts that at high levels of this product, within seconds singlet oxygen generation is limited to a small core of cells immediately surrounding the capillary. The remainder of the tumor tissue in the intercapillary space is anoxic and protected from the generation and toxic effects of singlet oxygen. However, at lower values of this product, the PDT-induced anoxic regions are not observed. An important finding is that an optimal value of this product can be defined that maintains the singlet oxygen concentration throughout the intercapillary space at a near-constant level. Direct tumor cell killing is therefore postulated to depend on the singlet oxygen exposure, defined as the product of the uniform singlet oxygen concentration and the time of exposure, and not on the total light dose.
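As a concrete illustration of the numerical scheme named above, the sketch below advances a one-dimensional diffusion equation with a first-order oxygen consumption term using Crank-Nicolson time stepping. The grid, rate constants, and boundary conditions are assumptions chosen for illustration only; this is not the PDTmodem model itself.

```python
import numpy as np

nx = 101
L = 200e-4                     # intercapillary distance [cm] (illustrative)
dx = L / (nx - 1)
D = 2.0e-5                     # oxygen diffusivity [cm^2/s] (assumed)
k = 5.0                        # first-order consumption rate [1/s] (assumed)
dt = 1.0e-3                    # time step [s]
r = D * dt / (2.0 * dx**2)

# Crank-Nicolson: (I - r*Lap + k*dt/2) c_new = (I + r*Lap - k*dt/2) c_old
lap = (np.diag(np.full(nx, -2.0)) + np.diag(np.ones(nx - 1), 1)
       + np.diag(np.ones(nx - 1), -1))
lhs = np.eye(nx) - r * lap + 0.5 * k * dt * np.eye(nx)
rhs = np.eye(nx) + r * lap - 0.5 * k * dt * np.eye(nx)

# Boundary conditions: fixed concentration at the capillary wall (x = 0),
# zero flux at the midpoint between capillaries (x = L).
lhs[0, :] = 0.0; lhs[0, 0] = 1.0
rhs[0, :] = 0.0; rhs[0, 0] = 1.0
lhs[-1, :] = 0.0; lhs[-1, -1] = 1.0; lhs[-1, -2] = -1.0
rhs[-1, :] = 0.0

c = np.zeros(nx)
c[0] = 1.0                     # normalized oxygen concentration at the capillary
for _ in range(5000):          # march 5 s of simulated time
    c = np.linalg.solve(lhs, rhs @ c)

print("O2 at 25%, 50%, 100% of the intercapillary distance:",
      c[nx // 4], c[nx // 2], c[-1])
```

The steep drop in the printed profile mirrors the oxygen-depletion behavior the model is used to study.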
Teachable Agents and the Protege Effect: Increasing the Effort towards Learning
ERIC Educational Resources Information Center
Chase, Catherine C.; Chin, Doris B.; Oppezzo, Marily A.; Schwartz, Daniel L.
2009-01-01
Betty's Brain is a computer-based learning environment that capitalizes on the social aspects of learning. In Betty's Brain, students instruct a character called a Teachable Agent (TA) which can reason based on how it is taught. Two studies demonstrate the "protege effect": students make greater effort to learn for their TAs than they do…
ERIC Educational Resources Information Center
Kahai, Surinder; Jestire, Rebecca; Huang, Rui
2013-01-01
Computer-supported collaborative learning is a common e-learning activity. Instructors have to create appropriate social and instructional interventions in order to promote effective learning. We performed a study that examined the effects of two popular leadership interventions, transformational and transactional, on cognitive effort and outcomes…
The Efficacy of Air Pollution Control Efforts: Evidence from AURA
NASA Technical Reports Server (NTRS)
Dickerson, Russell R.; Canty, Tim; Duncan, Bryan N.; Hao, He; Krotkov, Nickolay A.; Salawitch, Ross J.; Stehr, Jeffrey W.; Vinnikov, Konstatin
2014-01-01
Observations of NO2, SO2, and H2CO from OMI on AURA provide an excellent record of pollutant concentrations for the past decade. Abatement strategies to control criteria pollutants including ozone and fine particulate matter (PM2.5) have met with varying degrees of success. Sulfur controls had a profound impact on local SO2 concentrations and a measurable impact on PM2.5. Although substantial effort has gone into VOC control, ozone in the eastern US has responded dramatically to NOx emissions controls.
ERIC Educational Resources Information Center
Ware, Ronnie J.
In an effort to increase curriculum opportunities in a rural school district, a computer project was implemented involving grade 9-12 students chosen on the basis of national percentile scores, IQ, and desire to attend college. The project offered, through programmed computer instruction, physics, French I and II, and German I. One proctor was…
Computer architecture evaluation for structural dynamics computations: Project summary
NASA Technical Reports Server (NTRS)
Standley, Hilda M.
1989-01-01
The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.
NASA Astrophysics Data System (ADS)
Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.
1989-03-01
The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long-term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real time.
Issues in undergraduate education in computational science and high performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchioro, T.L. II; Martin, D.
1994-12-31
The ever increasing need for mathematical and computational literacy within their society and among members of the work force has generated enormous pressure to revise and improve the teaching of related subjects throughout the curriculum, particularly at the undergraduate level. The Calculus Reform movement is perhaps the best known example of an organized initiative in this regard. The UCES (Undergraduate Computational Engineering and Science) project, an effort funded by the Department of Energy and administered through the Ames Laboratory, is sponsoring an informal and open discussion of the salient issues confronting efforts to improve and expand the teaching of computational science as a problem oriented, interdisciplinary approach to scientific investigation. Although the format is open, the authors hope to consider pertinent questions such as: (1) How can faculty and research scientists obtain the recognition necessary to further excellence in teaching the mathematical and computational sciences? (2) What sort of educational resources--both hardware and software--are needed to teach computational science at the undergraduate level? Are traditional procedural languages sufficient? Are PCs enough? Are massively parallel platforms needed? (3) How can electronic educational materials be distributed in an efficient way? Can they be made interactive in nature? How should such materials be tied to the World Wide Web and the growing "Information Superhighway"?
Optimal Computing Budget Allocation for Particle Swarm Optimization in Stochastic Optimization.
Zhang, Si; Xu, Jie; Lee, Loo Hay; Chew, Ek Peng; Wong, Wai Peng; Chen, Chun-Hung
2017-04-01
Particle Swarm Optimization (PSO) is a popular metaheuristic for deterministic optimization. Originating in interpretations of the movement of individuals in a bird flock or fish school, PSO introduces the concepts of personal best and global best to simulate the pattern of searching for food by flocking, and successfully translates these natural phenomena to the optimization of complex functions. Many real-life applications of PSO cope with stochastic problems. To solve a stochastic problem using PSO, a straightforward approach is to equally allocate computational effort among all particles and obtain the same number of samples of fitness values. This is not an efficient use of the computational budget and leaves considerable room for improvement. This paper proposes a seamless integration of the concept of optimal computing budget allocation (OCBA) into PSO to improve the computational efficiency of PSO for stochastic optimization problems. We derive an asymptotically optimal allocation rule to intelligently determine the number of samples for all particles such that the PSO algorithm can efficiently select the personal best and global best when there is stochastic estimation noise in fitness values. We also propose an easy-to-implement sequential procedure. Numerical tests show that our new approach can obtain much better results using the same amount of computational effort.
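The classic OCBA allocation rule at the heart of this idea can be sketched as follows. This is a generic single-stage version operating on estimated means and standard deviations of particle fitness values, not the authors' full sequential PSO integration; the particle statistics are hypothetical.

```python
import numpy as np

def ocba_allocation(means, stds, total_budget, maximize=True):
    """Classic OCBA split of a sampling budget across k noisy alternatives."""
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    b = int(np.argmax(means)) if maximize else int(np.argmin(means))
    delta = np.abs(means[b] - means)
    delta[b] = np.inf                      # the current best is handled separately
    ratio = (stds / delta) ** 2            # N_i proportional to (sigma_i / delta_i)^2
    # N_b = sigma_b * sqrt(sum_{i != b} (N_i / sigma_i)^2)
    ratio[b] = stds[b] * np.sqrt(np.sum((ratio / stds) ** 2))
    alloc = total_budget * ratio / ratio.sum()
    return np.maximum(1, np.round(alloc).astype(int))

# Hypothetical noisy fitness estimates for six particles at one PSO iteration.
means = [10.2, 9.8, 10.1, 7.5, 8.9, 10.15]
stds = [1.0, 1.2, 0.8, 0.5, 1.1, 0.9]
print(ocba_allocation(means, stds, total_budget=200))
```

An allocation of this kind concentrates replications on particles that are close competitors for the personal or global best, rather than spreading samples equally.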
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.
2013-12-15
Purpose: Four-dimensional computed tomography (4DCT) can be used to make measurements of pulmonary function longitudinally. The sensitivity of such measurements to identify change depends on measurement uncertainty. Previously, intrasubject reproducibility of Jacobian-based measures of lung tissue expansion was studied in two repeat prior-RT 4DCT human acquisitions. Differences in respiratory effort, such as breathing amplitude and frequency, may affect longitudinal function assessment. In this study, the authors present normalization schemes that correct ventilation images for variations in respiratory effort and assess the reproducibility improvement after effort correction. Methods: Repeat 4DCT image data acquired within a short time interval from 24 patients prior to radiation therapy (RT) were used for this analysis. Using a tissue volume preserving deformable image registration algorithm, Jacobian ventilation maps in two scanning sessions were computed and compared on the same coordinate for reproducibility analysis. In addition to computing the ventilation maps from end expiration to end inspiration, the authors investigated effort normalization strategies using other intermediate inspiration phases upon the principles of equivalent tidal volume (ETV) and equivalent lung volume (ELV). Scatter plots and mean square error of the repeat ventilation maps and the Jacobian ratio map were generated for four conditions: no effort correction, global normalization, ETV, and ELV. In addition, gamma pass rate was calculated from a modified gamma index evaluation between two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. Results: The pattern of regional pulmonary ventilation changes as lung volume changes. All effort correction strategies improved reproducibility when changes in respiratory effort were greater than 150 cc (p < 0.005 with regard to the gamma pass rate). Improvement of reproducibility was correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general for all subjects, global normalization, ETV, and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, 0.005, respectively). When tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, 0.46, respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improves from 57.3% before correction to 66.3% after global normalization, and 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as the effort difference increases, followed by ETV, then global normalization. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate for correcting the ventilation map when changes in respiratory effort are large.
User's manual for the Graphical Constituent Loading Analysis System (GCLAS)
Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.
2006-01-01
This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable number of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias. GCLAS can export load and concentration data in formats suitable for entry into the U.S. Geological Survey's National Water Information System. GCLAS can also import and export data in formats that are compatible with various commonly used spreadsheet and statistics programs.
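The core computation such a program performs, integrating the product of streamflow and concentration over time, can be sketched in a few lines. The hourly values and unit conversion below are illustrative and do not reproduce GCLAS's interpolation, bias-coefficient, or export features.

```python
import numpy as np

# Hypothetical equal-interval series: hourly streamflow (ft^3/s) and the
# corresponding measured-plus-estimated constituent concentrations (mg/L).
q_cfs = np.array([120., 150., 300., 800., 650., 400., 250., 180.])
c_mgl = np.array([ 12.,  14.,  30.,  55.,  48.,  33.,  22.,  16.])
dt_s = 3600.0                                   # seconds per interval

# ft^3/s * mg/L * (28.3168 L/ft^3) = mg/s; integrate over time and convert to kg.
load_kg = np.sum(q_cfs * c_mgl) * 28.3168 * dt_s * 1.0e-6
flow_weighted_mean = np.sum(q_cfs * c_mgl) / np.sum(q_cfs)   # mg/L

print(f"load over the period: {load_kg:.1f} kg")
print(f"flow-weighted mean concentration: {flow_weighted_mean:.1f} mg/L")
```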
Unsteady Full Annulus Simulations of a Transonic Axial Compressor Stage
NASA Technical Reports Server (NTRS)
Herrick, Gregory P.; Hathaway, Michael D.; Chen, Jen-Ping
2009-01-01
Two recent research endeavors in turbomachinery at NASA Glenn Research Center have focused on compression system stall inception and compression system aerothermodynamic performance. Physical experiments and computational research are ongoing in support of these research objectives. TURBO, an unsteady, three-dimensional, Navier-Stokes computational fluid dynamics code commissioned and developed by NASA, has been utilized, enhanced, and validated in support of these endeavors. In the research which follows, TURBO is shown to accurately capture compression system flow range, from choke to stall inception, and also to accurately calculate fundamental aerothermodynamic performance parameters. Rigorous full-annulus calculations are performed to validate TURBO's ability to simulate the unstable, unsteady, chaotic stall inception process; as part of these efforts, full-annulus calculations are also performed at a condition approaching choke to further document TURBO's capabilities to compute aerothermodynamic performance data and support a NASA code assessment effort.
Generic approach to access barriers in dehydrogenation reactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Liang; Vilella, Laia; Abild-Pedersen, Frank
The introduction of linear energy correlations, which explicitly relate adsorption energies of reaction intermediates and activation energies in heterogeneous catalysis, has proven to be a key component in the computational search for new and promising catalysts. A simple linear approach to estimate activation energies still requires a significant computational effort. To simplify this process and at the same time incorporate the need for enhanced complexity of reaction intermediates, we generalize a recently proposed approach that evaluates transition state energies based entirely on bond-order conservation arguments. Here, we show that similar variation of the local electronic structure along the reaction coordinate introduces a set of general functions that accurately define the transition state energy and are transferable to other reactions with similar bonding nature. With such an approach, more complex reaction intermediates can be targeted with an insignificant increase in computational effort and without loss of accuracy.
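As a generic illustration of the linear energy correlations mentioned above (not the bond-order-conservation functions developed in this work), the sketch below fits a BEP-type relation E_TS = a*dE + b to a few hypothetical calculated points and then predicts a transition-state energy for a new intermediate at negligible extra cost.

```python
import numpy as np

dE = np.array([-0.8, -0.3, 0.1, 0.6, 1.0])    # reaction energies [eV] (made up)
Ets = np.array([0.4, 0.7, 1.0, 1.3, 1.6])     # transition-state energies [eV] (made up)

a, b = np.polyfit(dE, Ets, 1)                 # linear scaling fit
print(f"E_TS ~ {a:.2f} * dE + {b:.2f} eV")
print("predicted E_TS for dE = 0.25 eV:", round(a * 0.25 + b, 2), "eV")
```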
Soft computing techniques toward modeling the water supplies of Cyprus.
Iliadis, L; Maris, F; Tachos, S
2011-10-01
This research effort aims at applying soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVMR models have been developed that accept five input parameters. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
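A minimal sketch of the ε-SVR plus 5-fold cross-validation workflow described above, using synthetic inputs in place of the Germasogeia watershed data and omitting the fuzzy-weighting scheme; it assumes scikit-learn is available and all parameter values are illustrative.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((200, 5))                     # five hypothetical inputs (e.g., rainfall, temperature)
y = 3 * X[:, 0] + 2 * np.sin(4 * X[:, 1]) + 0.5 * X[:, 2] + 0.1 * rng.normal(size=200)

# Standardize inputs, then fit an epsilon-SVR with an RBF kernel.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
print("5-fold R^2 scores:", np.round(scores, 3), "mean:", round(scores.mean(), 3))
```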
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C. C.
The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities for developing computer-based tools for design, analysis, and theory. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components; photonics and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.
Tug-of-war lacunarity—A novel approach for estimating lacunarity
NASA Astrophysics Data System (ADS)
Reiss, Martin A.; Lemmerer, Birgit; Hanslmeier, Arnold; Ahammer, Helmut
2016-11-01
Modern instrumentation provides us with massive repositories of digital images that will likely only increase in the future. Therefore, it has become increasingly important to automate the analysis of digital images, e.g., with methods from pattern recognition. These methods aim to characterize the visual appearance of captured textures with quantitative measures. As such, lacunarity is a useful multi-scale measure of a texture's heterogeneity but demands high computational effort. Here we investigate a novel approach based on the tug-of-war algorithm, which estimates lacunarity in a single pass over the image. We computed lacunarity for theoretical and real-world sample images, and found that the investigated approach is able to estimate lacunarity with low uncertainties. We conclude that the proposed method combines low computational effort with high accuracy, and that its application may have utility in the analysis of high-resolution images.
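A minimal sketch of the idea, under assumed conventions: grid-box lacunarity Lambda(r) = N * sum(m_b^2) / (sum m_b)^2 is estimated in a single pass over the pixels using tug-of-war (AMS-style) random-sign counters and compared against the exact box-count value. This illustrates the principle only and is not the authors' estimator; box sizes, counter count, and the synthetic texture are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
img = (rng.random((256, 256)) < 0.2).astype(float)       # synthetic binary texture

def lacunarity_tug_of_war(image, box, n_counters=256):
    ny, nx = image.shape
    nby, nbx = ny // box, nx // box
    n_boxes = nby * nbx
    # Random +/-1 signs per (box, counter): the "tug-of-war" hash. More counters
    # means lower estimation variance.
    signs = rng.choice([-1.0, 1.0], size=(n_boxes, n_counters))
    z = np.zeros(n_counters)
    total = 0.0
    # Single pass over pixels: add each pixel's mass to its box's signed counters.
    for y in range(nby * box):
        for x in range(nbx * box):
            m = image[y, x]
            if m:
                b = (y // box) * nbx + (x // box)
                z += m * signs[b]
                total += m
    f2 = np.mean(z ** 2)                                  # estimates sum over boxes of m_b^2
    return n_boxes * f2 / total ** 2

def lacunarity_exact(image, box):
    ny, nx = image.shape
    nby, nbx = ny // box, nx // box
    masses = image[:nby * box, :nbx * box].reshape(nby, box, nbx, box).sum(axis=(1, 3))
    return masses.size * np.sum(masses ** 2) / masses.sum() ** 2

for box in (4, 8, 16):
    print(box, lacunarity_exact(img, box), lacunarity_tug_of_war(img, box))
```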
Formal Representations of Eligibility Criteria: A Literature Review
Weng, Chunhua; Tu, Samson W.; Sim, Ida; Richesson, Rachel
2010-01-01
Standards-based, computable knowledge representations for eligibility criteria are increasingly needed to provide computer-based decision support for automated research participant screening, clinical evidence application, and clinical research knowledge management. We surveyed the literature and identified five aspects of eligibility criteria knowledge representations that contribute to the various research and clinical applications: the intended use of computable eligibility criteria, the classification of eligibility criteria, the expression language for representing eligibility rules, the encoding of eligibility concepts, and the modeling of patient data. We consider three of them (expression language, codification of eligibility concepts, and patient data modeling) to be essential constructs of a formal knowledge representation for eligibility criteria. The requirements for each of the three knowledge constructs vary for different use cases, which therefore should inform the development and choice of the constructs toward cost-effective knowledge representation efforts. We discuss the implications of our findings for standardization efforts toward sharable knowledge representation of eligibility criteria. PMID:20034594
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
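For a small network the same idea can be shown by brute force: enumerate link subsets whose removal disconnects a source from a terminal, keep only the minimal ones, and approximate unavailability from the cut-set probabilities. The sketch below uses a hypothetical five-link bridge network and is not the patented search algorithm, which avoids this combinatorial enumeration.

```python
import math
from itertools import combinations

# Hypothetical five-link "bridge" network between source s and terminal t,
# with per-link failure probabilities.
edges = {"s-a": ("s", "a"), "s-b": ("s", "b"),
         "a-t": ("a", "t"), "b-t": ("b", "t"), "a-b": ("a", "b")}
p_fail = {"s-a": 0.01, "s-b": 0.01, "a-t": 0.02, "b-t": 0.02, "a-b": 0.005}

def connected(failed, src="s", dst="t"):
    """Depth-first search over the surviving links."""
    adj = {}
    for name, (u, v) in edges.items():
        if name not in failed:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
    stack, seen = [src], {src}
    while stack:
        n = stack.pop()
        if n == dst:
            return True
        for m in adj.get(n, ()):
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return False

# Enumerate minimal cut sets in order of increasing size.
min_cuts = []
for k in range(1, len(edges) + 1):
    for combo in combinations(edges, k):
        cut = set(combo)
        if not connected(cut) and not any(c < cut for c in min_cuts):
            min_cuts.append(cut)

print("minimal cut sets:", min_cuts)
# Rare-event approximation: unavailability ~ sum of the cut-set probabilities.
q = sum(math.prod(p_fail[e] for e in c) for c in min_cuts)
print(f"approximate s-t unavailability: {q:.2e}")
```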
NASA Technical Reports Server (NTRS)
1973-01-01
This user's manual describes the FORTRAN IV computer program developed to compute the total vertical load, normal concentrated pressure loads, and the center of pressure of typical SRB water impact slapdown pressure distributions specified in the baseline configuration. The program prepares the concentrated pressure load information in punched card format suitable for input to the STAGS computer program. In addition, the program prepares for STAGS input the inertia reacting loads to the slapdown pressure distributions.
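The basic reduction such a program performs can be sketched as follows: integrate a pressure distribution along the body to obtain the total vertical load, locate the center of pressure, and lump the distribution into concentrated station loads. The station spacing and pressure values below are illustrative only, and the punched-card/STAGS formatting is omitted.

```python
import numpy as np

# Hypothetical axial stations [ft] and running load (pressure times width) [lb/ft].
x = np.linspace(0.0, 10.0, 11)
w = np.array([0., 2., 5., 9., 12., 10., 7., 4., 2., 1., 0.]) * 100.0

dx = np.diff(x)
total_load = np.sum(0.5 * (w[:-1] + w[1:]) * dx)                # trapezoidal rule [lb]
moment = np.sum(0.5 * (w[:-1] * x[:-1] + w[1:] * x[1:]) * dx)   # first moment [lb-ft]
center_of_pressure = moment / total_load

# Lump the distributed load into concentrated station loads by tributary areas.
conc = np.zeros_like(w)
conc[:-1] += 0.5 * dx * w[:-1]
conc[1:] += 0.5 * dx * w[1:]

print(f"total vertical load: {total_load:.0f} lb")
print(f"center of pressure:  {center_of_pressure:.2f} ft")
print(f"sum of concentrated station loads: {conc.sum():.0f} lb")
```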
Computational Exposure Science: An Emerging Discipline to ...
Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory’s (NERL’s) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA’s mission to protect human health and the environment. HEASD’s research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA’s strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source
Dynamics of soil microbial communities in agroecosystems managed for biofuel production
USDA-ARS?s Scientific Manuscript database
Elevated atmospheric CO2 concentrations and their link to global climate change are stimulating efforts to reduce dependence on fossil fuels and increase use of alternative energy sources. Initial efforts to incorporate significant amounts of cellulosic ethanol into transportation fuels are focused ...
Schoonheere, N; Dotreppe, O; Pincemail, J; Istasse, L; Hornick, J L
2009-06-01
Selenium is a trace element of importance for animal health. It is essential for the adequate functioning of many enzymes, such as the antioxidant enzyme glutathione peroxidase, which protects the cell against free radicals. A muscular effort induces a rise in reactive oxygen species production which, in turn, can generate an oxidative stress. Two groups of eight racing pigeons were fed diets containing, respectively, 30.3 (control group) and 195.3 (selenium group) microg selenium/kg diet. The pigeons were submitted to a standardised simulation of a flying effort for 2 h. Blood was taken before and after the effort to measure antioxidant markers and blood parameters related to muscle metabolism. Plasma selenium concentration and glutathione peroxidase activity were significantly higher in the selenium group. There were no significant differences for the other measured parameters. As a consequence of the effort, the pigeons of the selenium group showed a higher increase in glutathione peroxidase activity and a smaller increase in plasma lactate concentration. Variations due to the effort in the other markers were not significantly different between the two groups. It is concluded that the selenium status was improved by the feeding of feedstuffs high in selenium.
Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary
2013-11-01
Motivational impairments are a core feature of schizophrenia and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials where they may choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.
External ocular hyperemia: a quantifiable indicator of spacecraft air quality.
Ogle, J W; Cohen, K L
1996-05-01
Eye irritation consistently ranks as a top astronaut complaint but is difficult to measure. Exposure to internal air pollution hypothetically disrupts the eye's tear film, thereby exposing the crewmembers' conjunctivae to the irritating effects of the recirculated, contaminant-laden atmosphere of the space vehicle. Causes elude engineers and toxicologists, who report that measured irritants remain below established Spacecraft Maximum Allowable Concentrations. Lack of objective ocular endpoints stymies efforts to identify etiologies. Computers offer a practical means of analyzing ocular hyperemia in space. We use computer analysis to quantify redness and blood vessels of digitized images of bulbar conjunctivae in near real time. Custom software masks artifacts, lids, and lashes for each photographic or telemedicine ocular image. Algorithms then generate semi-independent measurements of hyperemia. Computed difference scores between 34 pairs of images were compared with subjective difference scores as voted on by a panel of ophthalmology residents. Objective data were reliably extracted from ocular images and significantly correlated (r = 0.583, p < 0.05) with subjective scores. This ground-based methodology generates accurate and reliable ocular endpoint data without mass, volume, or power penalty. To assist in identifying and eliminating onboard ocular irritants, these objective data can be regressed against independent variables such as mission elapsed time, subjective astronaut complaints, levels of chemical and electromagnetic contaminants, and nephelometric and barothermal data. As missions lengthen, sensitive tools such as hyperemia quantification will become increasingly important for assessing and optimizing spacecraft environments.
Multiprocessor architecture: Synthesis and evaluation
NASA Technical Reports Server (NTRS)
Standley, Hilda M.
1990-01-01
Multiprocessor computer architecture evaluation for structural computations is the focus of the research effort described. Results obtained are expected to lead to more efficient use of existing architectures and to suggest designs for new, application-specific architectures. The brief descriptions given outline a number of related efforts directed toward this purpose. The difficulty in analyzing an existing architecture or in designing a new computer architecture lies in the fact that the performance of a particular architecture, within the context of a given application, is determined by a number of factors. These include, but are not limited to, the efficiency of the computation algorithm, the programming language and support environment, the quality of the program written in the programming language, the multiplicity of the processing elements, the characteristics of the individual processing elements, the interconnection network connecting processors and non-local memories, and the shared memory organization, covering the spectrum from no shared memory (all local memory) to one global access memory. These performance determiners may be loosely classified as being software or hardware related. This distinction is not clear or even appropriate in many cases. The effect of the choice of algorithm is ignored by assuming that the algorithm is specified as given. Effort directed toward the removal of the effect of the programming language and program resulted in the design of a high-level parallel programming language. Two characteristics of the fundamental structure of the architecture (memory organization and interconnection network) are examined.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.
2009-12-01
As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results, and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost, and expertise. We propose a modeling, data, and knowledge center that houses NASA satellite data, climate data, and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge, and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis, and compute environments that are customizable, “archivable,” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by relieving users of the need to redundantly retrieve and integrate data sets and build modeling and analysis codes. The EMC platform also offers the possibility of users receiving indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp-up” times.
NASA Astrophysics Data System (ADS)
Heller, Johann; Flisgen, Thomas; van Rienen, Ursula
The computation of electromagnetic fields and parameters derived thereof for lossless radio frequency (RF) structures filled with isotropic media is an important task for the design and operation of particle accelerators. Unfortunately, these computations are often highly demanding with regard to computational effort. The entire computational demand of the problem can be reduced using decomposition schemes in order to solve the field problems on standard workstations. This paper presents one of the first detailed comparisons between the recently proposed state-space concatenation approach (SSC) and a direct computation for an accelerator cavity with coupler-elements that break the rotational symmetry.
Coquina Elementary students enjoy gift of computers
NASA Technical Reports Server (NTRS)
1999-01-01
Children at Coquina Elementary School, Titusville, Fla., 'practice' using a computer keyboard, part of equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.
Coquina Elementary students enjoy gift of computers
NASA Technical Reports Server (NTRS)
1999-01-01
Children at Coquina Elementary School, Titusville, Fla., look with curiosity at the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.
Audubon Elementary students enjoy gift of computers
NASA Technical Reports Server (NTRS)
1999-01-01
Children at Audubon Elementary School, Merritt Island, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Audubon is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.
Coquina Elementary students enjoy gift of computers
NASA Technical Reports Server (NTRS)
1999-01-01
Children at Coquina Elementary School, Titusville, Fla., eagerly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.
Coquina Elementary students enjoy gift of computers
NASA Technical Reports Server (NTRS)
1999-01-01
Children at Coquina Elementary School, Titusville, Fla., excitedly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.
Concentrator optical characterization using computer mathematical modelling and point source testing
NASA Technical Reports Server (NTRS)
Dennison, E. W.; John, S. L.; Trentelman, G. F.
1984-01-01
The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels or to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray tracing computer programs to calculate the mathematical model is described. These ray tracing programs can include any type of optical configuration from simple paraboloids to arrays of spherical facets and can be adapted to microcomputers or larger computers, which can graphically display real-time comparisons of calculated and measured data.
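As a rough illustration of the kind of calculation such ray-tracing programs perform, the sketch below estimates the intercept factor of an ideal paraboloid by Monte Carlo ray tracing, with surface slope error as the only imperfection. It is a minimal example under stated assumptions, not the programs described above; the focal length, rim radius, slope error, and receiver radii are illustrative values.

```python
import numpy as np

def intercept_factor(focal_len=1.0, rim_radius=0.5, slope_err_mrad=2.0,
                     receiver_radii=np.linspace(0.001, 0.02, 5),
                     n_rays=100_000, seed=0):
    """Monte Carlo estimate of the intercept factor of a paraboloidal mirror.

    Rays arrive parallel to the optical axis, reflect off the paraboloid
    z = r^2 / (4 f), and are collected on a flat receiver in the focal plane.
    Surface slope error is modeled as a small Gaussian tilt of the local normal.
    Returns the fraction of rays landing within each receiver radius.
    """
    rng = np.random.default_rng(seed)

    # Sample ray hit points uniformly over the circular aperture.
    r = rim_radius * np.sqrt(rng.random(n_rays))
    phi = 2.0 * np.pi * rng.random(n_rays)
    x, y = r * np.cos(phi), r * np.sin(phi)
    z = (x**2 + y**2) / (4.0 * focal_len)

    # Unit surface normal of z - r^2/(4f) = 0, perturbed by random slope error.
    n = np.stack([-x / (2 * focal_len), -y / (2 * focal_len), np.ones(n_rays)], axis=1)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    n += rng.normal(0.0, slope_err_mrad * 1e-3, size=(n_rays, 3))  # small-angle tilt
    n /= np.linalg.norm(n, axis=1, keepdims=True)

    # Reflect the incoming direction d = (0, 0, -1).
    d = np.array([0.0, 0.0, -1.0])
    refl = d - 2.0 * (n @ d)[:, None] * n

    # Propagate to the focal plane z = f and measure the landing radius.
    t = (focal_len - z) / refl[:, 2]
    r_land = np.hypot(x + t * refl[:, 0], y + t * refl[:, 1])

    return np.array([(r_land <= R).mean() for R in receiver_radii])

if __name__ == "__main__":
    radii = np.linspace(0.001, 0.02, 5)
    for R, gamma in zip(radii, intercept_factor(receiver_radii=radii)):
        print(f"receiver radius {R:.3f} m -> intercept factor {gamma:.3f}")
```

For a perfect mirror every ray reaches the focus exactly, so the spread of landing radii, and hence the shape of the intercept factor curve, is set entirely by the assumed slope error.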
QIVIVE Approaches to Evaluate Inter-individual Toxicokinetic Variability
Manifestation of inter-individual variability in toxicokinetics (TK) will result in identical external exposure concentrations yielding differing blood or tissue concentrations. As efforts to incorporate in vitro testing strategies into human health assessment continue to grow, a...
Computational nuclear quantum many-body problem: The UNEDF project
NASA Astrophysics Data System (ADS)
Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.
2013-10-01
The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.
1999-10-06
Children at Audubon Elementary School, Merritt Island, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Audubon is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.
NASA Technical Reports Server (NTRS)
Gillian, Ronnie E.; Lotts, Christine G.
1988-01-01
The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high-end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.
Evolution of a standard microprocessor-based space computer
NASA Technical Reports Server (NTRS)
Fernandez, M.
1980-01-01
An existing in-inventory computer hardware/software package (B-1 RFS/ECM) was repackaged and applied to multiple missile/space programs. Concurrent with the application efforts, low-risk modifications were made to the computer from program to program to take advantage of newer, advanced technology and to meet increasingly demanding requirements (computational and memory capabilities, longer life, and fault-tolerant autonomy). It is concluded that microprocessors hold promise in a number of critical areas for future space computer applications. However, the benefits of the DoD VHSIC Program are required, and the old proliferation problem must be revisited.
Reducing the Time and Cost of Testing Engines
NASA Technical Reports Server (NTRS)
2004-01-01
Producing a new aircraft engine currently costs approximately $1 billion, with 3 years of development time for a commercial engine and 10 years for a military engine. The high development time and cost make it extremely difficult to transition advanced technologies for cleaner, quieter, and more efficient new engines. To reduce this time and cost, NASA created a vision for the future where designers would use high-fidelity computer simulations early in the design process in order to resolve critical design issues before building the expensive engine hardware. To accomplish this vision, NASA's Glenn Research Center initiated a collaborative effort with the aerospace industry and academia to develop its Numerical Propulsion System Simulation (NPSS), an advanced engineering environment for the analysis and design of aerospace propulsion systems and components. Partners estimate that using NPSS has the potential to dramatically reduce the time, effort, and expense necessary to design and test jet engines by generating sophisticated computer simulations of an aerospace object or system. These simulations will permit an engineer to test various design options without having to conduct costly and time-consuming real-life tests. By accelerating and streamlining the engine system design analysis and test phases, NPSS facilitates bringing the final product to market faster. NASA's NPSS Version (V)1.X effort was a task within the Agency's Computational Aerospace Sciences project of the High Performance Computing and Communication program, which had a mission to accelerate the availability of high-performance computing hardware and software to the U.S. aerospace community for its use in design processes. The technology brings value back to NASA by improving methods of analyzing and testing space transportation components.
Software Development Processes Applied to Computational Icing Simulation
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.
1999-01-01
The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.
Artificial Intelligence In Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Vogel, Alison Andrews
1991-01-01
Paper compares four first-generation artificial-intelligence (AI) software systems for computational fluid dynamics. Includes: Expert Cooling Fan Design System (EXFAN), PAN AIR Knowledge System (PAKS), grid-adaptation program MITOSIS, and Expert Zonal Grid Generation (EZGrid). Focuses on knowledge-based ("expert") software systems. Analyzes intended tasks, kinds of knowledge possessed, magnitude of effort required to codify knowledge, how quickly constructed, performances, and return on investment. On basis of comparison, concludes AI most successful when applied to well-formulated problems solved by classifying or selecting preenumerated solutions. In contrast, application of AI to poorly understood or poorly formulated problems generally results in long development time and large investment of effort, with no guarantee of success.
Greenberger-Horne-Zeilinger states-based blind quantum computation with entanglement concentration.
Zhang, Xiaoqian; Weng, Jian; Lu, Wei; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing
2017-09-11
In blind quantum computation (BQC) protocols, the servers' quantum computing capabilities are complicated and powerful, while the clients' are not. It is still a challenge for clients to delegate quantum computation to servers while keeping the clients' inputs, outputs and algorithms private. Unfortunately, quantum channel noise is unavoidable in practical transmission. In this paper, a novel BQC protocol based on maximally entangled Greenberger-Horne-Zeilinger (GHZ) states is proposed that does not need a trusted center. The protocol includes a client and two servers, where the client only needs to own quantum channels with two servers who have full-advantage quantum computers. The two servers perform entanglement concentration to remove the noise, where the success probability can almost reach 100% in theory. But they learn nothing in the process of concentration because of the no-signaling principle, so this BQC protocol is secure and feasible.
Investing in Student Achievement.
ERIC Educational Resources Information Center
Education Commission of the States, Denver, CO.
During 1996-97 the Education Commission of the States explored three policy areas in which state efforts to improve student achievement have been increasingly concentrated: early childhood education, teacher quality and stronger connections between the K-12 and postsecondary systems. The study surveyed the scope and intensity of state efforts in…
Refractive Secondary Concentrators for Solar Thermal Applications
NASA Technical Reports Server (NTRS)
Wong, Wayne A.; Macosko, Robert P.
1999-01-01
The NASA Glenn Research Center is developing technologies that utilize solar energy for various space applications including electrical power conversion, thermal propulsion, and furnaces. Common to all of these applications is the need for highly efficient solar concentration systems. An effort is underway to develop the innovative single-crystal refractive secondary concentrator, which uses refraction and total internal reflection to efficiently concentrate and direct solar energy. The refractive secondary offers very high throughput efficiencies (greater than 90%) and, when used in combination with advanced primary concentrators, enables very high concentration ratios (10,000 to 1) and very high temperatures (greater than 2000 K). Presented is an overview of the refractive secondary concentrator development effort at the NASA Glenn Research Center, including optical design and analysis techniques, thermal modeling capabilities, crystal materials characterization testing, optical coatings evaluation, and component testing. Also presented is a discussion of potential future activity and technical issues yet to be resolved. Much of the work performed to date has been in support of the NASA Marshall Space Flight Center's Solar Thermal Propulsion Program. The many benefits of a refractive secondary concentrator that enable efficient, high-temperature thermal propulsion system designs apply equally well to other solar applications including furnaces and power generation systems such as solar dynamics, concentrated thermal photovoltaics, and thermionics.
Texas Education Computer Cooperative (TECC) Products and Services.
ERIC Educational Resources Information Center
Texas Education Agency, Austin.
Designed to broaden awareness of the Texas Education Computer Cooperative (TECC) and its efforts to develop products and services which can be most useful to schools, this publication provides an outline of the need, purpose, organization, and funding of TECC and descriptions of 19 projects. The project descriptions are organized by fiscal year in…
DOT National Transportation Integrated Search
1988-11-01
During the past decade a great deal of effort has been focused on the advantages computerization can bring to engineering design and production activities. This is seen in such developments as Group Technology (GT), Manufacturing Resource Planning (M...
ERIC Educational Resources Information Center
Rendiero, Jane; Linder, William W.
This report summarizes the results of a survey of 29 southern land-grant institutions which elicited information on microcomputer capabilities, programming efforts, and computer awareness education for farmers, homemakers, community organizations, planning agencies, and other end users. Five topics were covered by the survey: (1) degree of…
DOT National Transportation Integrated Search
1981-09-01
Volume II is the second volume of a three volume document describing the computer program HEVSIM for use with buses and heavy duty trucks. This volume is a user's manual describing how to prepare data input and execute the program. A strong effort ha...
A Puzzle-Based Seminar for Computer Engineering Freshmen
ERIC Educational Resources Information Center
Parhami, Behrooz
2008-01-01
We observe that recruitment efforts aimed at alleviating the shortage of skilled workforce in computer engineering must be augmented with strategies for retaining and motivating the students after they have enrolled in our educational programmes. At the University of California, Santa Barbara, we have taken a first step in this direction by…
Incorporating Flexibility in the Design of Repairable Systems - Design of Microgrids
2014-01-01
Pandey, Vijitashwa; Skowronska, Annette
...optimization of complex systems such as a microgrid is, however, computationally intensive. The problem is exacerbated if we must incorporate...flexibility in terms of allowing the microgrid architecture and its running protocol to change with time. To reduce the computational effort, this paper
Integrating Computational Thinking into Technology and Engineering Education
ERIC Educational Resources Information Center
Hacker, Michael
2018-01-01
Computational Thinking (CT) is being promoted as "a fundamental skill used by everyone in the world by the middle of the 21st Century" (Wing, 2006). CT has been effectively integrated into history, ELA, mathematics, art, and science courses (Settle, et al., 2012). However, there has been no analogous effort to integrate CT into…
An Approach to Effortless Construction of Program Animations
ERIC Educational Resources Information Center
Velazquez-Iturbide, J. Angel; Pareja-Flores, Cristobal; Urquiza-Fuentes, Jaime
2008-01-01
Program animation systems have not been as widely adopted by computer science educators as we might expect from the firm belief that they can help in enhancing computer science education. One of the most notable obstacles to their adoption is the considerable effort that the production of program animations represents for the instructor. We…
A Study of Cooperative, Networking, and Computer Activities in Southwestern Libraries.
ERIC Educational Resources Information Center
Corbin, John
The Southwestern Library Association (SWLA) conducted an inventory and study of the SWLA libraries in cooperative, network, and computer activities to collect data for use in planning future activities and in minimizing duplication of efforts. Questionnaires were mailed to 2,060 academic, public, and special libraries in the six SWLA states.…
DOT National Transportation Integrated Search
1976-05-01
As part of its activity under the Rail Equipment Safety Project, computer programs for track/train dynamics analysis are being developed and modified. As part of this effort, derailment behavior of trains negotiating curves under buff or draft has be...
Online Secondary Research in the Advertising Research Class: A Friendly Introduction to Computing.
ERIC Educational Resources Information Center
Adler, Keith
In an effort to promote computer literacy among advertising students, an assignment was devised that required the use of online database search techniques to find secondary research materials. The search program, chosen for economical reasons, was "Classroom Instruction Program" offered by Dialog Information Services. Available for a…
A Framework and Implementation of User Interface and Human-Computer Interaction Instruction
ERIC Educational Resources Information Center
Peslak, Alan
2005-01-01
Researchers have suggested that up to 50 % of the effort in development of information systems is devoted to user interface development (Douglas, Tremaine, Leventhal, Wills, & Manaris, 2002; Myers & Rosson, 1992). Yet little study has been performed on the inclusion of important interface and human-computer interaction topics into a current…
Response Time Differences between Computers and Tablets
ERIC Educational Resources Information Center
Kong, Xiaojing; Davis, Laurie Laughlin; McBride, Yuanyuan; Morrison, Kristin
2018-01-01
Item response time data were used in investigating the differences in student test-taking behavior between two device conditions: computer and tablet. Analyses were conducted to address the questions of whether or not the device condition had a differential impact on rapid guessing and solution behaviors (with response time effort used as an…
Refocusing the Vision: The Future of Instructional Technology
ERIC Educational Resources Information Center
Pence, Harry E.; McIntosh, Steven
2011-01-01
Two decades ago, many campuses mobilized a major effort to deal with a clear problem: faculty and students needed access to desktop computing technologies. Now the situation is much more complex. Responding to the current challenges, like mobile computing and social networking, will be more difficult but equally important. There is a clear need for…
Pennsylvania's Transition to Enterprise Computing as a Study in Strategic Alignment
ERIC Educational Resources Information Center
Sawyer, Steve; Hinnant, Charles C.; Rizzuto, Tracey
2008-01-01
We theorize about the strategic alignment of computing with organizational mission, using the Commonwealth of Pennsylvania's efforts to pursue digital government initiatives as evidence. To do this we draw on a decade (1995-2004) of changes in Pennsylvania to characterize how a state government shifts from an organizational to an enterprise…
Computer Simulation of Reading.
ERIC Educational Resources Information Center
Leton, Donald A.
In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…
Six Years of Parallel Computing at NAS (1987 - 1993): What Have we Learned?
NASA Technical Reports Server (NTRS)
Simon, Horst D.; Cooper, D. M. (Technical Monitor)
1994-01-01
In the fall of 1987 the age of parallelism at NAS began with the installation of a 32K processor CM-2 from Thinking Machines. In 1987 this was described as an "experiment" in parallel processing. In the six years since, NAS acquired a series of parallel machines, and conducted an active research and development effort focused on the use of highly parallel machines for applications in the computational aerosciences. In this time period parallel processing for scientific applications evolved from a fringe research topic into one of the main activities at NAS. In this presentation I will review the history of parallel computing at NAS in the context of the major progress that has been made in the field in general. I will attempt to summarize the lessons we have learned so far, and the contributions NAS has made to the state of the art. Based on these insights I will comment on the current state of parallel computing (including the HPCC effort) and try to predict some trends for the next six years.
Computational design of RNAs with complex energy landscapes.
Höner zu Siederdissen, Christian; Hammer, Stefan; Abfalter, Ingrid; Hofacker, Ivo L; Flamm, Christoph; Stadler, Peter F
2013-12-01
RNA has become an integral building material in synthetic biology. Dominated by their secondary structures, which can be computed efficiently, RNA molecules are amenable not only to in vitro and in vivo selection, but also to rational, computation-based design. While the inverse folding problem of constructing an RNA sequence with a prescribed ground-state structure has received considerable attention for nearly two decades, there have been few efforts to design RNAs that can switch between distinct prescribed conformations. We introduce a user-friendly tool for designing RNA sequences that fold into multiple target structures. The underlying algorithm makes use of a combination of graph coloring and heuristic local optimization to find sequences whose energy landscapes are dominated by the prescribed conformations. A flexible interface allows the specification of a wide range of design goals. We demonstrate that bi- and tri-stable "switches" can be designed easily with moderate computational effort for the vast majority of compatible combinations of desired target structures. RNAdesign is freely available under the GPL-v3 license. Copyright © 2013 Wiley Periodicals, Inc.
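As a toy illustration of the compatibility step that multi-target design builds on, the sketch below collects the base pairs of all target structures into one graph and assigns nucleotides by propagating Watson-Crick complements. This is a simplified stand-in for the graph-coloring step only (it ignores GU wobble pairs and does no energy optimization) and is not the RNAdesign algorithm itself.

```python
import random
from collections import deque

WC = {"A": "U", "U": "A", "G": "C", "C": "G"}  # Watson-Crick complements only

def pairs_from_dotbracket(structure):
    """Return the (i, j) base pairs encoded by a dot-bracket string."""
    stack, pairs = [], []
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            pairs.append((stack.pop(), i))
    return pairs

def compatible_sequence(structures, seed=0):
    """Generate one sequence compatible with every target structure, or None.

    The union of base pairs over all targets is treated as a graph; under the
    Watson-Crick-only simplification a compatible sequence exists iff this
    graph has no odd cycle, and bases can be assigned by propagating
    complements outward from one free choice per connected component.
    """
    rng = random.Random(seed)
    n = len(structures[0])
    adj = [set() for _ in range(n)]
    for s in structures:
        for i, j in pairs_from_dotbracket(s):
            adj[i].add(j)
            adj[j].add(i)

    seq = [None] * n
    for start in range(n):
        if seq[start] is not None:
            continue
        seq[start] = rng.choice("AUGC")
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                want = WC[seq[u]]
                if seq[v] is None:
                    seq[v] = want
                    queue.append(v)
                elif seq[v] != want:
                    return None  # conflicting pair constraints (odd cycle)
    return "".join(seq)

if __name__ == "__main__":
    targets = ["((((....))))....", "....((((....))))"]
    print(compatible_sequence(targets))
```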
Practical Use of Computationally Frugal Model Analysis Methods
Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...
2015-03-21
Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
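A minimal sketch of the frugal end of this spectrum, assuming a placeholder model function: one-at-a-time, dimensionless local sensitivities computed from len(params) + 1 forward runs. The toy model and step size are illustrative only.

```python
import numpy as np

def scaled_local_sensitivities(model, params, rel_step=0.01):
    """One-at-a-time, dimensionless local sensitivities of a model's outputs.

    Uses forward finite differences, so the cost is len(params) + 1 model
    runs -- the "tens of parallelizable runs" regime of frugal analysis.
    Returns an (n_outputs, n_params) array of d(log y)/d(log p) estimates.
    """
    params = np.asarray(params, dtype=float)
    base = np.asarray(model(params), dtype=float)
    sens = np.empty((base.size, params.size))
    for j, p in enumerate(params):
        step = rel_step * p if p != 0 else rel_step
        perturbed = params.copy()
        perturbed[j] += step
        y = np.asarray(model(perturbed), dtype=float)
        # Scaled sensitivity: relative output change per relative parameter change.
        sens[:, j] = (y - base) / step * (p / base)
    return sens

# Placeholder "groundwater-like" model: three outputs depending nonlinearly on
# two parameters (e.g., a conductivity and a recharge rate). Purely illustrative.
def toy_model(p):
    k, r = p
    return np.array([r / k, r / (k + 1.0), np.sqrt(r) * k])

if __name__ == "__main__":
    S = scaled_local_sensitivities(toy_model, [2.0, 0.5])
    print(np.round(S, 3))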
Middleware for big data processing: test results
NASA Astrophysics Data System (ADS)
Gankevich, I.; Gaiduchok, V.; Korkhov, V.; Degtyarev, A.; Bogdanov, A.
2017-12-01
Dealing with large volumes of data is resource-consuming work which is more and more often delegated not only to a single computer but also to a whole distributed computing system at once. As the number of computers in a distributed system increases, the amount of effort put into effective management of the system grows. When the system reaches some critical size, much effort should be put into improving its fault tolerance. It is difficult to estimate when some particular distributed system needs such facilities for a given workload, so instead they should be implemented in a middleware which works efficiently with a distributed system of any size. It is also difficult to estimate whether a volume of data is large or not, so the middleware should also work with data of any volume. In other words, the purpose of the middleware is to provide facilities that adapt a distributed computing system for a given workload. In this paper we introduce such a middleware appliance. Tests show that this middleware is well-suited for typical HPC and big data workloads and its performance is comparable with well-known alternatives.
Energy star. (Latest citations from the Computer database). Published Search
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The bibliography contains citations concerning a collaborative effort between the Environmental Protection Agency (EPA) and private industry to reduce electrical power consumed by personal computers and related peripherals. Manufacturers complying with EPA guidelines are officially recognized by award of a special Energy Star logo, and are referred to in official documents as a vendor of green computers. (Contains a minimum of 81 citations and includes a subject term index and title list.)
Energy star. (Latest citations from the Computer database). Published Search
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The bibliography contains citations concerning a collaborative effort between the Environmental Protection Agency (EPA) and private industry to reduce electrical power consumed by personal computers and related peripherals. Manufacturers complying with EPA guidelines are officially recognized by award of a special Energy Star logo, and are referred to in official documents as a vendor of green computers. (Contains a minimum of 234 citations and includes a subject term index and title list.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nitao, J J
The goal of the Event Reconstruction Project is to find the location and strength of atmospheric release points, both stationary and moving. Source inversion relies on observational data as input. The methodology is sufficiently general to allow various forms of data. In this report, the authors focus primarily on concentration measurements obtained at point monitoring locations at various times. The algorithms being investigated in the Project are the MCMC (Markov Chain Monte Carlo) and SMC (Sequential Monte Carlo) methods, classical inversion methods, and hybrids of these. They refer the reader to the report by Johannesson et al. (2004) for explanations of these methods. These methods require computing the concentrations at all monitoring locations for a given "proposed" source characteristic (locations and strength history). It is anticipated that the largest portion of the CPU time will be spent performing this computation. MCMC and SMC will require this computation to be done at least tens of thousands of times. Therefore, an efficient means of computing forward model predictions is important to making the inversion practical. In this report they show how Green's functions and reciprocal Green's functions can significantly accelerate forward model computations. First, instead of computing a plume for each possible source strength history, they can compute plumes from unit impulse sources only. By using linear superposition, they can obtain the response for any strength history. This response is given by the forward Green's function. Second, they may use the law of reciprocity. Suppose that they require the concentration at a single monitoring point x_m due to a potential (unit impulse) source located at x_s. Instead of computing a plume with source location x_s, they compute a "reciprocal plume" whose (unit impulse) source is at the monitoring location x_m. The reciprocal plume is computed using a reversed-direction wind field. The wind field and transport coefficients must also be appropriately time-reversed. Reciprocity says that the concentration of the reciprocal plume at x_s is related to the desired concentration at x_m. Since there are many fewer monitoring points than potential source locations, the number of forward model computations is drastically reduced.
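A minimal sketch of the superposition step described above, assuming the unit-impulse responses (whether tabulated from forward plumes or from one reciprocal plume per monitor) are already available on a common time grid: the predicted concentration at each monitor for any proposed strength history is then a discrete convolution, so every MCMC or SMC proposal reuses the same precomputed Green's functions. The array shapes and values below are illustrative.

```python
import numpy as np

def predict_monitor_concentrations(green, strength_history, dt):
    """Predicted concentrations at monitors for a proposed source history.

    green[m, k] is the concentration at monitor m at lag k*dt produced by a
    unit impulse released at the candidate source location at time 0.
    strength_history[t] is the proposed emission rate at time t*dt.
    Linear superposition makes the prediction a discrete convolution.
    """
    n_monitors, _ = green.shape
    n_times = strength_history.size
    out = np.zeros((n_monitors, n_times))
    for m in range(n_monitors):
        # Full convolution, truncated to the observation window.
        out[m] = np.convolve(green[m], strength_history)[:n_times] * dt
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Fake unit-impulse plumes for 3 monitors over 20 lags (illustrative only).
    green = rng.random((3, 20)) * np.exp(-np.arange(20) / 5.0)
    q = np.zeros(50)
    q[5:15] = 2.0                       # proposed release-rate history
    c = predict_monitor_concentrations(green, q, dt=60.0)
    print(c.shape, np.round(c[:, 20], 2))
```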
ERIC Educational Resources Information Center
Parkhurst, John T.; Fleisher, Matthew S.; Skinner, Christopher H.; Woehr, David J.; Hawthorn-Embree, Meredith L.
2011-01-01
After completing the Multidimensional Work-Ethic Profile (MWEP), 98 college students were given a 20-problem math computation assignment and instructed to stop working on the assignment after completing 10 problems. Next, they were allowed to choose to finish either the partially completed assignment that had 10 problems remaining or a new…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bache, T.C.; Swanger, H.J.; Shkoller, B.
1981-07-01
This report summarizes three efforts performed during the past fiscal year. The first of these efforts is a study of the theoretical behavior of the regional seismic phase Lg in various tectonic provinces. Synthetic seismograms are used to determine the sensitivity of Lg to source and medium properties. The primary issues addressed concern the relationship of regional Lg characteristics to the crustal attenuation properties, the comparison of the Lg in many crustal structures, and the source depth dependence of Lg. The second effort described is an expansion of the capabilities of the three-dimensional finite difference code TRES. The present capabilities are outlined with comparisons of the performance of the code on three computer systems. The last effort described is the development of an algorithm for simulation of the near-field ground motions from the 1971 San Fernando, California, earthquake. A computer code implementing this algorithm has been provided to the Mission Research Corporation for simulation of the acoustic disturbances from such an earthquake.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinda, Peter August
2015-03-17
This report describes the activities, findings, and products of the Northwestern University component of the "Enabling Exascale Hardware and Software Design through Scalable System Virtualization" project. The purpose of this project has been to extend the state of the art of systems software for high-end computing (HEC) platforms, and to use systems software to better enable the evaluation of potential future HEC platforms, for example exascale platforms. Such platforms, and their systems software, have the goal of providing scientific computation at new scales, thus enabling new research in the physical sciences and engineering. Over time, the innovations in systems software for such platforms also become applicable to more widely used computing clusters, data centers, and clouds. This was a five-institution project, centered on the Palacios virtual machine monitor (VMM) systems software, a project begun at Northwestern, and originally developed in a previous collaboration between Northwestern University and the University of New Mexico. In this project, Northwestern (including via our subcontract to the University of Pittsburgh) contributed to the continued development of Palacios, along with other team members. We took the leadership role in (1) continued extension of support for emerging Intel and AMD hardware, (2) integration and performance enhancement of overlay networking, (3) connectivity with architectural simulation, (4) binary translation, and (5) support for modern Non-Uniform Memory Access (NUMA) hosts and guests. We also took a supporting role in support for specialized hardware for I/O virtualization, profiling, configurability, and integration with configuration tools. The efforts we led (1-5) were largely successful and executed as expected, with code and papers resulting from them. The project demonstrated the feasibility of a virtualization layer for HEC computing, similar to such layers for cloud or datacenter computing. For effort (3), although a prototype connecting Palacios with the GEM5 architectural simulator was demonstrated, our conclusion was that such a platform was less useful for design space exploration than anticipated due to the inherent complexity of the connection between the instruction set architecture level and the microarchitectural level. For effort (4), we found that a code injection approach proved to be more fruitful. The results of our efforts are publicly available in the open source Palacios codebase and published papers, all of which are available from the project web site, v3vee.org. Palacios is currently one of the two codebases (the other being Sandia's Kitten lightweight kernel) that underlie the node operating system for the DOE Hobbes Project, one of two projects tasked with building a systems software prototype for the national exascale computing effort.
NASA Technical Reports Server (NTRS)
Harding, Lawrence W., Jr.
1989-01-01
Two projects using remote sensing of phytoplankton chlorophyll concentrations in the Chesapeake Bay estuary were proposed. The first project used aircraft remote sensing with a compact radiometer system developed at NASA's Goddard Space Flight Center (GSFC), the Ocean Data Acquisition System (ODAS). ODAS includes three radiometers at 460, 490, and 520 nm, an infrared temperature sensor (PRT-5), Loran-C for navigation, and a data acquisition system using a PC and mass storage device. This instrument package can be flown in light aircraft at relatively low expense, permitting regular and frequent flights. Sixteen flights with ODAS were completed using the Virginia Institute of Marine Science's De Havilland 'Beaver'. The goal was to increase spatial and temporal resolution in assaying phytoplankton pigment concentrations in the Chesapeake. At present, analysis is underway of flight data collected between March and July 1989. The second project focused on satellite data gathered with the Nimbus-7 Coastal Zone Color Scanner (CZCS) between late 1978 and mid 1986. The problem in using CZCS data for the Chesapeake Bay is that the optical characteristics of this (and many) coastal and estuarine waters are distinct from those of the open ocean for which algorithms for computing pigment concentrations were developed. The successful use of CZCS data for the estuary requires development of site-specific algorithms and analytical approaches. Of principal importance in developing site-specific procedures is the availability of in-situ data on pigment concentrations. A significant data set was acquired from EPA's Chesapeake Bay Program in Annapolis, Maryland, and clear satellite scenes are being analyzed for which same-day sea truth measurements of pigment were obtained. Both the University of Miami and GSFC Seapak systems are being used in this effort. The main finding to date is an expected one, i.e., the algorithms developed for oceanic waters are inadequate to compute pigment concentrations for the Case 2 waters of the Chesapeake Bay. One reason is the overestimation of aerosol radiances by assuming that water-leaving radiance in Band 4 of CZCS (670 nm) is zero, an assumption that is invalid for the Bay. This prompted attempts to use iterative procedures for estimating the proportion of the Band 4 radiance that is actually attributable to aerosols by estimating the water-leaving component using optical data. A cruise on the Chesapeake the week of 7 August 1989 was conducted to collect additional optical data necessary to this task.
Automatic Data Filter Customization Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Mandrake, Lukas
2013-01-01
This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as the Fisher discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can be simply filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire data and judge a "best fit," giving equal weight to complex and problematic regions as well as simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds outside of which all data are rejected was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the tossed-out data's yield (proper vs. improper run removal) and the number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values to save needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
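A minimal sketch of the filter representation described above, on synthetic stand-in data: each candidate filter is a pair of left/right thresholds per dimension, a sounding is rejected if any feature falls outside its window, and a small genetic loop trades correctly rejected bad runs against wrongly rejected good runs. The population size, mutation scale, and fitness weights are illustrative, not the values used in the JPL work.

```python
import numpy as np

rng = np.random.default_rng(42)

def apply_filter(X, lo, hi):
    """Keep a sounding only if every feature lies inside its [lo, hi] window."""
    return np.all((X >= lo) & (X <= hi), axis=1)

def fitness(lo, hi, X, good):
    """Reward tossing bad runs, penalize tossing good runs (weights illustrative)."""
    kept = apply_filter(X, lo, hi)
    return np.sum(~kept & ~good) - 5.0 * np.sum(~kept & good)

def evolve(X, good, pop_size=60, n_gen=200, mut_scale=0.05):
    """Tiny genetic loop over per-dimension [left, right] thresholds."""
    n_dim = X.shape[1]
    span = X.max(axis=0) - X.min(axis=0)
    # Each individual is a (lo, hi) pair of length-n_dim arrays, started wide open.
    pop = [(X.min(axis=0) - rng.random(n_dim) * span,
            X.max(axis=0) + rng.random(n_dim) * span) for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=lambda ind: fitness(ind[0], ind[1], X, good), reverse=True)
        elite = pop[: pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            ia, ib = rng.integers(0, len(elite), size=2)
            mix = rng.random(n_dim) < 0.5                       # uniform crossover
            lo = np.where(mix, elite[ia][0], elite[ib][0]) + rng.normal(0, mut_scale, n_dim) * span
            hi = np.where(mix, elite[ia][1], elite[ib][1]) + rng.normal(0, mut_scale, n_dim) * span
            children.append((lo, np.maximum(hi, lo)))           # keep hi >= lo
        pop = elite + children
    return max(pop, key=lambda ind: fitness(ind[0], ind[1], X, good))

if __name__ == "__main__":
    # Synthetic stand-in: 2 features; "good" soundings cluster near the origin.
    X = rng.normal(size=(500, 2))
    good = np.hypot(X[:, 0], X[:, 1]) < 1.2
    lo, hi = evolve(X, good)
    kept = apply_filter(X, lo, hi)
    print(f"kept {kept.sum()} of {len(X)} | bad removed: {np.sum(~kept & ~good)}"
          f" | good removed: {np.sum(~kept & good)}")
```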
Huang, Wenjun; Mandal, Taraknath; Larson, Ronald G
2017-10-02
We recently developed coarse-grained (CG) force fields for hydroxypropyl methylcellulose acetate succinate (HPMCAS) polymers and the model drug molecule phenytoin, and a continuum transport model to study the polymer-drug nanostructures presented during a dissolution test after solvation of solid dispersion particles. We model the polymer-drug interactions that contribute to suppression of drug aggregation, release, and crystal growth during the dissolution process, and we take these as indicators of polymer effectiveness. We find that the size and the intermolecular interaction strength of the functional group and the drug loading concentration are the major factors that impact the effectiveness of the polymeric excipient. The hydroxypropyl acetyl group is the most effective functional group, followed by the acetyl group, while the deprotonated succinyl group is the least effective functional group, except that the deprotonated succinyl group at the 6-position is very effective in slowing down the phenytoin crystal growth. Our simulation results thus suggest HPMCAS with higher acetyl and lower succinyl content is more effective in promoting phenytoin solubility in dissolution media, and polymers become less effective when drug loading becomes high (i.e., 50% of the mass of the polymer/drug solid dispersion), agreeing with previous experimental studies. In addition, our transport model indicates that the drug release time from a solid dispersion particle of 2 μm diameter is less than 10 min, correlating well with the experimental time scale for a typical dissolution profile to reach maximum peak concentration. Our modeling effort, therefore, provides new avenues to understand the dissolution behavior of complex HPMCAS-phenytoin solid dispersions and offers a new design tool to optimize the formulation. Moreover, the systematic and robust approach used in our computational models can be extended to other polymeric excipients and drug candidates.
Monte-Carlo computation of turbulent premixed methane/air ignition
NASA Astrophysics Data System (ADS)
Carmen, Christina Lieselotte
The present work describes the results obtained by a time dependent numerical technique that simulates the early flame development of a spark-ignited premixed, lean, gaseous methane/air mixture with the unsteady spherical flame propagating in homogeneous and isotropic turbulence. The algorithm described is based upon a sub-model developed by an international automobile research and manufacturing corporation in order to analyze turbulence conditions within internal combustion engines. Several developments and modifications to the original algorithm have been implemented including a revised chemical reaction scheme and the evaluation and calculation of various turbulent flame properties. Solution of the complete set of Navier-Stokes governing equations for a turbulent reactive flow is avoided by reducing the equations to a single transport equation. The transport equation is derived from the Navier-Stokes equations for a joint probability density function, thus requiring no closure assumptions for the Reynolds stresses. A Monte-Carlo method is also utilized to simulate phenomena represented by the probability density function transport equation by use of the method of fractional steps. Gaussian distributions of fluctuating velocity and fuel concentration are prescribed. Attention is focused on the evaluation of the three primary parameters that influence the initial flame kernel growth: the ignition system characteristics, the mixture composition, and the nature of the flow field. Efforts are concentrated on the effects of moderate to intense turbulence on flames within the distributed reaction zone. Results are presented for lean conditions with the fuel equivalence ratio varying from 0.6 to 0.9. The present computational results, including flame regime analysis and the calculation of various flame speeds, provide excellent agreement with results obtained by other experimental and numerical researchers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watanabe, Karen H.; Andersen, Melvin E.; Basu, Nil
2011-01-01
An adverse outcome pathway (AOP) is a sequence of key events from a molecular-level initiating event and an ensuing cascade of steps to an adverse outcome with population level significance. To implement a predictive strategy for ecotoxicology, the multiscale nature of an AOP requires computational models to link salient processes (e.g., in chemical uptake, toxicokinetics, toxicodynamics, and population dynamics). A case study with domoic acid was used to demonstrate strategies and enable generic recommendations for developing computational models in an effort to move toward a toxicity testing paradigm focused on toxicity pathway perturbations applicable to ecological risk assessment. Domoic acid, an algal toxin with adverse effects on both wildlife and humans, is a potent agonist for kainate receptors (ionotropic glutamate receptors whose activation leads to the influx of Na+ and Ca2+). Increased Ca2+ concentrations result in neuronal excitotoxicity and cell death primarily in the hippocampus, which produces seizures, impairs learning and memory, and alters behavior in some species. Altered neuronal Ca2+ is a key process in domoic acid toxicity which can be evaluated in vitro. Further, results of these assays would be amenable to mechanistic modeling for identifying domoic acid concentrations and Ca2+ perturbations that are normal, adaptive, or clearly toxic. In vitro assays with outputs amenable to measurement in exposed populations can link in vitro to in vivo conditions, and toxicokinetic information will aid in linking in vitro results to the individual organism. Development of an AOP required an iterative process with three important outcomes: (1) a critically reviewed, stressor-specific AOP; (2) identification of key processes suitable for evaluation with in vitro assays; and (3) strategies for model development.
Long-term simulations of dissolved oxygen concentrations in Lake Trout lakes
NASA Astrophysics Data System (ADS)
Jabbari, A.; Boegman, L.; MacKay, M.; Hadley, K.; Paterson, A.; Jeziorski, A.; Nelligan, C.; Smol, J. P.
2016-02-01
Lake Trout are a rare and valuable natural resource that are threatened by multiple environmental stressors. With the added threat of climate warming, there is growing concern among resource managers that increased thermal stratification will reduce the habitat quality of deep-water Lake Trout lakes through enhanced oxygen depletion. To address this issue, a three-part study is underway, which aims to: analyze sediment cores to understand the past, develop empirical formulae to model the present and apply computational models to forecast the future. This presentation reports on the computational modeling efforts. To this end, a simple dissolved oxygen sub-model has been embedded in the one-dimensional bulk mixed-layer thermodynamic Canadian Small Lake Model (CSLM). This model is currently being incorporated into the Canadian Land Surface Scheme (CLASS), the primary land surface component of Environment Canada's global and regional climate modelling systems. The oxygen model was calibrated and validated by hind-casting temperature and dissolved oxygen profiles from two Lake Trout lakes on the Canadian Shield. These data sets include 5 years of high-frequency (10 s to 10 min) data from Eagle Lake and 30 years of bi-weekly data from Harp Lake. Initial results show temperature and dissolved oxygen was predicted with root mean square error <1.5 °C and <3 mg L-1, respectively. Ongoing work is validating the model, over climate-change relevant timescales, against dissolved oxygen reconstructions from the sediment cores and predicting future deep-water temperature and dissolved oxygen concentrations in Canadian Lake Trout lakes under future climate change scenarios. This model will provide a useful tool for managers to ensure sustainable fishery resources for future generations.
NASA Technical Reports Server (NTRS)
Nesbitt, James A.
2001-01-01
A finite-difference computer program (COSIM) has been written which models the one-dimensional, diffusional transport associated with high-temperature oxidation and interdiffusion of overlay-coated substrates. The program predicts concentration profiles for up to three elements in the coating and substrate after various oxidation exposures. Surface recession due to solute loss is also predicted. Ternary cross terms and concentration-dependent diffusion coefficients are taken into account. The program also incorporates a previously-developed oxide growth and spalling model to simulate either isothermal or cyclic oxidation exposures. In addition to predicting concentration profiles after various oxidation exposures, the program can also be used to predict coating life based on a concentration dependent failure criterion (e.g., surface solute content drops to 2%). The computer code is written in FORTRAN and employs numerous subroutines to make the program flexible and easily modifiable to other coating oxidation problems.
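A minimal sketch of the kind of calculation a COSIM-type model performs, reduced to a single solute with a constant diffusion coefficient and a constant surface consumption rate: explicit 1-D finite differences march the concentration profile forward in time while oxidation depletes the surface node. The real code additionally handles ternary cross terms, concentration-dependent diffusivities, surface recession, and cyclic oxide spalling; all values below are illustrative.

```python
import numpy as np

def deplete_solute(c0=0.2, D=1e-14, flux=1e-12, L=100e-6, nx=200,
                   t_end=100 * 3600.0, safety=0.4):
    """Explicit 1-D diffusion of a coating solute with a constant surface sink.

    c0     initial solute fraction (uniform)
    D      diffusion coefficient, m^2/s (constant here)
    flux   solute consumption rate at the oxidizing surface, (fraction)*m/s
    L      domain depth, m; the back face is a zero-flux boundary
    Returns the depth grid and the final concentration profile.
    """
    dx = L / nx
    dt = safety * dx**2 / (2.0 * D)            # explicit stability limit
    c = np.full(nx, c0)
    for _ in range(int(t_end / dt)):
        lap = np.empty_like(c)
        lap[1:-1] = c[:-2] - 2.0 * c[1:-1] + c[2:]
        lap[0] = 2.0 * (c[1] - c[0])           # surface node (sink applied separately)
        lap[-1] = 2.0 * (c[-2] - c[-1])        # zero-flux back face
        c += D * dt / dx**2 * lap
        c[0] = max(c[0] - flux * dt / dx, 0.0)  # oxidation consumes solute at x = 0
    return np.linspace(0.0, L, nx), c

if __name__ == "__main__":
    x, c = deplete_solute()
    print(f"surface solute fraction after 100 h: {c[0]:.4f}")
```

A coating-life criterion of the kind mentioned above (e.g., surface solute content dropping to 2%) would simply be the time at which c[0] in such a profile first falls below the threshold.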
Segregation effects during solidification in weightless melts
NASA Technical Reports Server (NTRS)
Li, C.; Gershinsky, M.
1974-01-01
The generalized problem of determining the temperature and solute concentration profiles during directional solidification of binary alloys with surface evaporation was mathematically formulated. Realistic initial and boundary conditions were defined, and a computer program was developed and checked out. The program computes the positions of two moving boundaries, evaporation and solidification, and their velocities. Temperature and solute concentration profiles in the semi-infinite material body at selected instances of time are also computed.
Viscous Design of TCA Configuration
NASA Technical Reports Server (NTRS)
Krist, Steven E.; Bauer, Steven X. S.; Campbell, Richard L.
1999-01-01
The goal in this effort is to redesign the baseline TCA configuration for improved performance at both supersonic and transonic cruise. Viscous analyses are conducted with OVERFLOW, a Navier-Stokes code for overset grids, using PEGSUS to compute the interpolations between overset grids. Viscous designs are conducted with OVERDISC, a script which couples OVERFLOW with the Constrained Direct Iterative Surface Curvature (CDISC) inverse design method. The successful execution of any computational fluid dynamics (CFD) based aerodynamic design method for complex configurations requires an efficient method for regenerating the computational grids to account for modifications to the configuration shape. The first section of this presentation deals with the automated regridding procedure used to generate overset grids for the fuselage/wing/diverter/nacelle configurations analysed in this effort. The second section outlines the procedures utilized to conduct OVERDISC inverse designs. The third section briefly covers the work conducted by Dick Campbell, in which a dual-point design at Mach 2.4 and 0.9 was attempted using OVERDISC; the initial configuration from which this design effort was started is an early version of the optimized shape for the TCA configuration developed by the Boeing Commercial Airplane Group (BCAG), which eventually evolved into the NCV design. The final section presents results from application of the Natural Flow Wing design philosophy to the TCA configuration.
Multimedia architectures: from desktop systems to portable appliances
NASA Astrophysics Data System (ADS)
Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.
1997-01-01
Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.
Support Expressed in Congress for U.S. High-Performance Computing
NASA Astrophysics Data System (ADS)
Showstack, Randy
2004-06-01
Advocates for a stronger U.S. position in high-performance computing, which could help with a number of grand challenges in the Earth sciences and other disciplines, hope that legislation recently introduced in the House of Representatives will help to revitalize U.S. efforts. The High-Performance Computing Revitalization Act of 2004 would amend the earlier High-Performance Computing Act of 1991 (Public Law 102-194), which is partially credited with helping to strengthen U.S. capabilities in this area. The bill has the support of the Bush administration.
Computational structural mechanics for engine structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1988-01-01
The computational structural mechanics (CSM) program at Lewis encompasses the formulation and solution of structural mechanics problems and the development of integrated software systems to computationally simulate the performance, durability, and life of engine structures. It is structured to supplement, complement, and, whenever possible, replace costly experimental efforts. Specific objectives are to investigate unique advantages of parallel and multiprocessing for reformulating and solving structural mechanics and formulating and solving multidisciplinary mechanics and to develop integrated structural system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods.
Computational structural mechanics for engine structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1989-01-01
The computational structural mechanics (CSM) program at Lewis encompasses the formulation and solution of structural mechanics problems and the development of integrated software systems to computationally simulate the performance, durability, and life of engine structures. It is structured to supplement, complement, and, whenever possible, replace costly experimental efforts. Specific objectives are to investigate unique advantages of parallel and multiprocessing for reformulating and solving structural mechanics and formulating and solving multidisciplinary mechanics and to develop integrated structural system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods.
Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings
NASA Technical Reports Server (NTRS)
1992-01-01
The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.
Data Movement Dominates: Advanced Memory Technology to Address the Real Exascale Power Problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergman, Keren
Energy is the fundamental barrier to Exascale supercomputing and is dominated by the cost of moving data from one point to another, not computation. Similarly, performance is dominated by data movement, not computation. The solution to this problem requires three critical technologies: 3D integration, optical chip-to-chip communication, and a new communication model. The central goal of the Sandia-led "Data Movement Dominates" project was to develop memory systems and new architectures based on these technologies that have the potential to lower the cost of local memory accesses by orders of magnitude and provide substantially more bandwidth. Only through these transformational advances can future systems reach the goals of Exascale computing within manageable power budgets. The Sandia-led team included co-PIs from Columbia University, Lawrence Berkeley Lab, and the University of Maryland. The Columbia effort of Data Movement Dominates focused on developing a physically accurate simulation environment and experimental verification for optically-connected memory (OCM) systems that can enable continued performance scaling through high-bandwidth capacity, energy-efficient bit-rate transparency, and time-of-flight latency. With OCM, memory device parallelism and total capacity can scale to match future high-performance computing requirements without sacrificing data-movement efficiency. When we consider systems with integrated photonics, links to memory can be seamlessly integrated with the interconnection network; in a sense, memory becomes a primary aspect of the interconnection network. At the core of the Columbia effort, toward expanding our understanding of OCM-enabled computing, we have created an integrated modeling and simulation environment that uniquely integrates the physical behavior of the optical layer. The PhoenixSim suite of design and software tools developed under this effort has enabled the co-design and performance evaluation of photonics-enabled OCM architectures on Exascale computing systems.
Indoor air polychlorinated biphenyl (PCB) concentrations in some U.S. schools are one or more orders of magnitude higher than background levels. In response to this, efforts have been made to assess the potential health risk posed by inhaled PCBs. These efforts are hindered by un...
Discounting the value of safety: effects of perceived risk and effort.
Sigurdsson, Sigurdur O; Taylor, Matthew A; Wirth, Oliver
2013-09-01
Although falls from heights remain the most prevalent cause of fatalities in the construction industry, factors impacting safety-related choices associated with work at heights are not completely understood. Better tools are needed to identify and study the factors influencing safety-related choices and decision making. Using a computer-based task within a behavioral economics paradigm, college students were presented a choice between two hypothetical scenarios that differed in working height and effort associated with retrieving and donning a safety harness. Participants were instructed to choose the scenario in which they were more likely to wear the safety harness. Based on choice patterns, switch points were identified, indicating when the perceived risk in both scenarios was equivalent. Switch points were a systematic function of working height and effort, and the quantified relation between perceived risk and effort was described well by a hyperbolic equation. Choice patterns revealed that the perceived risk of working at heights decreased as the effort to retrieve and don a safety harness increased. Results contribute to the development of computer-based procedure for assessing risk discounting within a behavioral economics framework. Such a procedure can be used as a research tool to study factors that influence safety-related decision making with a goal of informing more effective prevention and intervention strategies. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
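The abstract does not reproduce the fitted equation, but hyperbolic discounting is conventionally written in the form below; here R(E) is the working height (risk proxy) judged equivalent at the switch point, R_0 the undiscounted value, E the effort of retrieving and donning the harness, and k a fitted discounting rate. The symbols and parameterization are assumptions for illustration, not the authors' published model:

    R(E) = \frac{R_0}{1 + kE}

Under this assumed form, the perceived risk falls hyperbolically as effort increases, which is consistent with the switch-point pattern reported above.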
NASA Technical Reports Server (NTRS)
Marshall, Jochen; Milos, Frank; Fredrich, Joanne; Rasky, Daniel J. (Technical Monitor)
1997-01-01
Laser Scanning Confocal Microscopy (LSCM) has been used to obtain digital images of the complicated 3-D (three-dimensional) microstructures of rigid, fibrous thermal protection system (TPS) materials. These orthotropic materials are comprised of refractory ceramic fibers with diameters in the range of 1 to 10 microns and have open porosities of 0.8 or more. Algorithms are being constructed to extract quantitative microstructural information from the digital data so that it may be applied to specific heat and mass transport modeling efforts; such information includes, for example, the solid and pore volume fractions, the internal surface area per volume, fiber diameter distributions, and fiber orientation distributions. This type of information is difficult to obtain in general, yet it is directly relevant to many computational efforts which seek to model macroscopic thermophysical phenomena in terms of microscopic mechanisms or interactions. Two such computational efforts for fibrous TPS materials are: i) the calculation of radiative transport properties; ii) the modeling of gas permeabilities.
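As a hedged illustration of the kind of algorithms being constructed (not the authors' code), the sketch below estimates two of the quantities named above, solid volume fraction and internal surface area per unit volume, from a binarized 3-D image stack. The array name, voxel size, and synthetic test data are assumptions.

    # Hedged sketch: microstructural quantities from a binarized 3-D LSCM stack.
    # Assumes `stack` is a boolean numpy array (True = solid fiber voxel) and
    # `voxel` is the edge length of a cubic voxel (e.g., in microns).
    import numpy as np

    def solid_volume_fraction(stack):
        # Fraction of voxels occupied by solid; porosity is 1 minus this value.
        return stack.mean()

    def surface_area_per_volume(stack, voxel):
        # Count solid/void voxel-face transitions along each axis; each transition
        # contributes one voxel face (voxel**2) of internal surface area.
        faces = 0
        for axis in range(3):
            faces += np.count_nonzero(np.diff(stack.astype(np.int8), axis=axis))
        total_volume = stack.size * voxel**3
        return faces * voxel**2 / total_volume

    # Synthetic example: a random field thresholded to ~20% solid, roughly
    # consistent with the >= 0.8 open porosity quoted in the abstract.
    rng = np.random.default_rng(0)
    stack = rng.random((64, 64, 64)) < 0.2
    print(solid_volume_fraction(stack), surface_area_per_volume(stack, voxel=1.0))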
Toward using alpha and theta brain waves to quantify programmer expertise.
Crk, Igor; Kluthe, Timothy
2014-01-01
Empirical studies of programming language learnability and usability have thus far depended on indirect measures of human cognitive performance, attempting to capture what is at its essence a purely cognitive exercise through various indicators of comprehension, such as the correctness of coding tasks or the time spent working out the meaning of code and producing acceptable solutions. Understanding program comprehension is essential to understanding the inherent complexity of programming languages, and ultimately, having a measure of mental effort based on direct observation of the brain at work will illuminate the nature of the work of programming. We provide evidence of direct observation of the cognitive effort associated with programming tasks, through a carefully constructed empirical study using a cross-section of undergraduate computer science students and an inexpensive, off-the-shelf brain-computer interface device. This study presents a link between expertise and programming language comprehension, draws conclusions about the observed indicators of cognitive effort using recent cognitive theories, and proposes directions for future work that is now possible.
Development of an adaptive hp-version finite element method for computational optimal control
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Warner, Michael S.
1994-01-01
In this research effort, the usefulness of hp-version finite elements and adaptive solution-refinement techniques in generating numerical solutions to optimal control problems has been investigated. Under NAG-939, a general FORTRAN code was developed which approximated solutions to optimal control problems with control constraints and state constraints. Within that methodology, to get high-order accuracy in solutions, the finite element mesh would have to be refined repeatedly through bisection of the entire mesh in a given phase. In the current research effort, the order of the shape functions in each element has been made a variable, giving more flexibility in error reduction and smoothing. Similarly, individual elements can each be subdivided into many pieces, depending on the local error indicator, while other parts of the mesh remain coarsely discretized. The problem remains to reduce and smooth the error while still keeping computational effort reasonable enough to calculate time histories in a short enough time for on-board applications.
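A minimal sketch of the hp-adaptive decision loop described above follows; it is not the NAG-939 FORTRAN code, and the element fields (eta for the local error indicator, p for shape-function order, smooth for a smoothness flag) and the bisection-in-time convention are hypothetical stand-ins for the report's actual refinement logic.

    # Hedged sketch of an hp-adaptive refinement pass over finite elements in time.
    from dataclasses import dataclass, replace
    from typing import List

    @dataclass
    class Element:
        t0: float      # element start time
        t1: float      # element end time
        p: int         # polynomial order of the shape functions
        eta: float     # local error indicator (re-estimated after each solve)
        smooth: bool   # is the solution locally smooth?

    def refine(elements: List[Element], tol: float, p_max: int = 8) -> List[Element]:
        out = []
        for e in elements:
            if e.eta <= tol:
                out.append(e)                          # error small enough: keep as is
            elif e.smooth and e.p < p_max:
                out.append(replace(e, p=e.p + 1))      # p-refinement in smooth regions
            else:
                tm = 0.5 * (e.t0 + e.t1)               # h-refinement: bisect the element
                out.append(replace(e, t1=tm))
                out.append(replace(e, t0=tm))
        return out

Only elements flagged by the local error indicator are touched, which is the flexibility the abstract contrasts with bisecting the entire mesh in a phase.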
A parallel approach of COFFEE objective function to multiple sequence alignment
NASA Astrophysics Data System (ADS)
Zafalon, G. F. D.; Visotaky, J. M. V.; Amorim, A. R.; Valêncio, C. R.; Neves, L. A.; de Souza, R. C. G.; Machado, J. M.
2015-09-01
Computational tools to assist genomic analyses have become increasingly necessary as the amount of available data grows rapidly. Because deterministic algorithms for sequence alignment carry high computational costs, many works concentrate their efforts on developing heuristic approaches to multiple sequence alignment. However, selecting an approach that offers solutions with good biological significance and feasible execution time is a great challenge. Thus, this work presents the parallelization of the processing steps of the MSA-GA tool, using a multithreaded approach to execute the COFFEE objective function. The standard objective function implemented in the tool is the Weighted Sum of Pairs (WSP), which produces some distortions in the final alignments when sets of sequences with low similarity are aligned. In previous studies we implemented the COFFEE objective function in the tool to smooth these distortions. Although the COFFEE objective function increases execution time, it contains steps that can be executed in parallel. With the improvements implemented in this work, the new approach runs 24% faster than the sequential approach with COFFEE. Moreover, the multithreaded COFFEE approach is more efficient than WSP: besides being slightly faster, its biological results are better.
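As a hedged sketch of the parallelization idea only (the MSA-GA implementation itself is not reproduced here), the COFFEE score is a weighted sum of pairwise consistency terms, and those terms are independent, so they can be dispatched to a thread pool. The functions pair_consistency and weight are hypothetical stand-ins for the tool's comparison against its pairwise-alignment library.

    # Hedged sketch: evaluating a COFFEE-style objective with a thread pool.
    from concurrent.futures import ThreadPoolExecutor
    from itertools import combinations

    def coffee_score(alignment, pair_consistency, weight, workers=8):
        # alignment: list of aligned (gapped) sequences of equal length.
        pairs = list(combinations(range(len(alignment)), 2))

        def term(pair):
            i, j = pair
            return weight(i, j) * pair_consistency(alignment[i], alignment[j])

        with ThreadPoolExecutor(max_workers=workers) as pool:
            scores = list(pool.map(term, pairs))       # independent per-pair terms
        total_weight = sum(weight(i, j) for i, j in pairs)
        return sum(scores) / total_weight

In a compiled implementation the per-pair terms would run on true hardware threads; the structure of the computation is the same.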
Computational Fluid Dynamics Modeling of the Operation of a Flame Ionization Sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huckaby, E.D.; Chorpening, B.T.; Thornton, J.D.
The sensors and controls research group at the United States Department of Energy (DOE) National Energy Technology Laboratory (NETL) is continuing to develop the Combustion Control and Diagnostics Sensor (CCADS) for gas turbine applications. CCADS uses the electrical conduction of the charged species generated during the combustion process to detect combustion instabilities and monitor equivalence ratio. As part of this effort, combustion models are being developed that include the interaction between the electric field and the transport of charged species. The primary combustion process is computed using a flame wrinkling model (Weller et al. 1998), which is a component of the OpenFOAM toolkit (Jasak et al. 2004). A sub-model for the transport of charged species is attached to this model. The formulation of the charged-species model is similar to that applied by Penderson and Brown (1993) for the simulation of laminar flames. The sub-model consists of an additional flux due to the electric field (drift flux) added to the equations for the charged-species concentrations, together with the solution of the electric potential from the resolved charge density. The subgrid interactions between the electric field and charged-species transport have been neglected. Using the above procedure, numerical simulations are performed and the results compared with several recent CCADS experiments.
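Under the usual drift-diffusion assumptions, the sub-model described above can be summarized as follows (a generic statement, not necessarily the paper's exact formulation), with c_k the concentration of charged species k, u the resolved velocity, D_k the diffusivity, mu_k the mobility, phi the electric potential, omega_k the chemical source term, epsilon the permittivity, and rho_c the net charge density:

    \frac{\partial c_k}{\partial t} + \nabla\cdot(c_k\,\mathbf{u})
      = \nabla\cdot(D_k\,\nabla c_k) \pm \nabla\cdot(\mu_k\,c_k\,\nabla\phi) + \omega_k,
    \qquad
    \nabla\cdot(\varepsilon\,\nabla\phi) = -\rho_c

The sign of the drift term depends on the sign of the species charge, and the second equation is the Poisson solve for the potential from the resolved charge density.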
Balasubramanian, Saravana K; Coger, Robin N
2005-01-01
Bioartificial liver devices (BALs) have proven to be an effective bridge to transplantation for cases of acute liver failure. Enabling the long-term storage of these devices using a method such as cryopreservation will ensure their easy off-the-shelf availability. To date, cryopreservation of liver cells has been attempted for both single cells and sandwich cultures. This study presents the potential of using computational modeling to help develop a cryopreservation protocol for storing the three-dimensional BAL Hepatassist. The focus is on determining the thermal and concentration profiles as the BAL is cooled from 37 degrees C to -100 degrees C; the process is completed in two steps, a cryoprotectant loading step and a phase-change step. The results indicate that, for the loading step, mass transfer controls the duration of the protocol, whereas for the phase-change step, when mass transfer is assumed negligible, the latent heat released during freezing is the controlling factor. The cryoprotocol that is ultimately proposed considers time, cooling rate, and the temperature gradients that the cellular space is exposed to during cooling. To our knowledge, this study is the first reported effort toward designing an effective protocol for the cryopreservation of a three-dimensional BAL device.
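As a hedged summary of the two governing balances implied above (standard textbook forms, not necessarily the exact equations used in the study), the cryoprotectant loading step is a diffusion problem in the concentration C, while the phase-change step is a heat-conduction problem in which the latent heat L released over the freezing range is folded into an apparent heat capacity through the frozen fraction f_s:

    \frac{\partial C}{\partial t} = \nabla\cdot\!\left(D\,\nabla C\right),
    \qquad
    \rho\,c_{p,\mathrm{app}}(T)\,\frac{\partial T}{\partial t} = \nabla\cdot\!\left(k\,\nabla T\right),
    \quad
    c_{p,\mathrm{app}} = c_p - L\,\frac{\partial f_s}{\partial T}

This separation mirrors the abstract's observation that mass transfer sets the duration of the loading step while latent-heat release controls the phase-change step.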
Displaying CFD Solution Parameters on Arbitrary Cut Planes
NASA Technical Reports Server (NTRS)
Pao, S. Paul
2008-01-01
USMC6 is a Fortran 90 computer program for post-processing in support of visualization of flows simulated by computational fluid dynamics (CFD). The name "USMC6" is partly an abbreviation of "TetrUSS - USM3D Solution Cutter," reflecting its origin as a post-processor for use with USM3D - a CFD program that is a component of the Tetrahedral Unstructured Software System and that solves the Navier-Stokes equations on tetrahedral unstructured grids. "Cutter" here refers to a capability to acquire and process solution data on (1) arbitrary planes that cut through grid volumes, or (2) user-selected spheroidal, conical, cylindrical, and/or prismatic domains cut from within grids. Cutting saves time by enabling concentration of post-processing and visualization efforts on smaller solution domains of interest. The user can select from among more than 40 flow functions. The cut planes can be trimmed to circular or rectangular shape. The user specifies cuts and functions in a free-format input file using simple and easy-to-remember keywords. The USMC6 command line is simple enough that the slicing process can readily be embedded in a shell script for assembly-line post-processing. The output of USMC6 is a data file ready for plotting.
Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview
NASA Technical Reports Server (NTRS)
Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard
1996-01-01
The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing and anti-icing) and ANTICE (hot-gas and electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.
Sedentary behavior, physical activity, and concentrations of insulin among US adults.
Ford, Earl S; Li, Chaoyang; Zhao, Guixiang; Pearson, William S; Tsai, James; Churilla, James R
2010-09-01
Time spent watching television has been linked to obesity, metabolic syndrome, and diabetes, all conditions characterized to some degree by hyperinsulinemia and insulin resistance. However, limited evidence relates screen time (watching television or using a computer) directly to concentrations of insulin. We examined the cross-sectional associations between time spent watching television or using a computer, physical activity, and serum concentrations of insulin using data from 2800 participants aged at least 20 years from the 2003-2006 National Health and Nutrition Examination Survey. The amount of time spent watching television and using a computer, as well as physical activity, was self-reported. The unadjusted geometric mean concentration of insulin increased from 6.2 microU/mL among participants who did not watch television to 10.0 microU/mL among those who watched television for 5 or more hours per day (P = .001). After adjustment for age, sex, race or ethnicity, educational status, concentration of cotinine, alcohol intake, physical activity, waist circumference, and body mass index using multiple linear regression analysis, the log-transformed concentrations of insulin were significantly and positively associated with time spent watching television (P < .001). Reported time spent using a computer was significantly associated with log-transformed concentrations of insulin before but not after accounting for waist circumference and body mass index. Leisure-time physical activity, but not transportation or household physical activity, was significantly and inversely associated with log-transformed concentrations of insulin. Sedentary behavior, particularly the amount of time spent watching television, may be an important modifiable determinant of concentrations of insulin. Published by Elsevier Inc.
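A minimal sketch of the type of adjusted model described above is given below; it is not the authors' analysis (it ignores the NHANES complex survey design and sample weights), and the file and column names are hypothetical.

    # Hedged sketch: log-transformed insulin regressed on screen time with covariate adjustment.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("nhanes_subset.csv")              # hypothetical analysis file
    df["log_insulin"] = np.log(df["insulin_uU_mL"])    # log-transform the outcome
    model = smf.ols(
        "log_insulin ~ tv_hours + computer_hours + age + C(sex) + C(race_eth)"
        " + C(education) + cotinine + alcohol + activity + waist_cm + bmi",
        data=df,
    ).fit()
    print(model.summary())                             # inspect the tv_hours coefficient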
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Brian Phillip
The purpose of this document is to describe the statistical modeling effort for gas concentrations in WIPP storage containers. The concentration (in ppm) of CO2 in the headspace volume of standard waste box (SWB) 68685 is shown. A Bayesian approach and an adaptive Metropolis-Hastings algorithm were used.
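As a hedged illustration of the algorithm named above (not the WIPP model itself), the sketch below implements a one-parameter random-walk Metropolis-Hastings sampler with a simple adaptive proposal scale; the log-posterior, data values, and tuning constants are hypothetical.

    # Hedged sketch: adaptive random-walk Metropolis-Hastings for one parameter.
    import numpy as np

    def adaptive_metropolis(log_post, x0, n_iter=5000, target=0.44, seed=0):
        rng = np.random.default_rng(seed)
        x, lp = x0, log_post(x0)
        scale, accepted, chain = 1.0, 0, []
        for i in range(1, n_iter + 1):
            prop = x + scale * rng.normal()            # random-walk proposal
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
                x, lp = prop, lp_prop
                accepted += 1
            chain.append(x)
            if i % 100 == 0:                           # adapt proposal scale toward target rate
                scale *= np.exp(0.1 * (accepted / i - target))
        return np.array(chain)

    # Example: posterior mean of log CO2 concentration with a flat prior and
    # unit-variance Gaussian likelihood, given hypothetical headspace data (ppm).
    data = np.log(np.array([420.0, 510.0, 465.0]))
    chain = adaptive_metropolis(lambda m: -0.5 * np.sum((data - m) ** 2), x0=6.0)
    print(chain[1000:].mean())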
DOE Office of Scientific and Technical Information (OSTI.GOV)
McInerney, J.D.; Micikas, L.B.
Efforts are described to prepare educational materials, including computer-based as well as conventional teaching materials, for training interested high school and elementary school students in aspects of the Human Genome Project.
Energy star. (Latest citations from the Computer database). Published Search
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The bibliography contains citations concerning a collaborative effort between the Environmental Protection Agency (EPA) and private industry to reduce electrical power consumed by personal computers and related peripherals. Manufacturers complying with EPA guidelines are officially recognized by award of a special Energy Star logo, and are referred to in official documents as a vendor of green computers. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)
Singular Perturbations and Time-Scale Methods in Control Theory: Survey 1976-1982.
1982-12-01
Singular perturbation and time-scale methods were established in the 1960s, when they first became a means for simplified computation of optimal trajectories. It was soon recognized that singular ... null-space of P(ao). The asymptotic values of the invariant zeros and associated invariant-zero directions as ε → 0 are the values computed from the ... The need for model simplification with a reduction (or distribution) of computational effort is ...
ERIC Educational Resources Information Center
White, Kerry-Ann
2012-01-01
Given the lack of computer diffusion studies in the Caribbean, and coupled with the necessity to understand Jamaica in efforts to get a clearer global representation of the digital divide, this study takes an exploratory approach and examines the differences of the computer technology adoption and diffusion attitudes and viewpoints between the…
Radio Frequency Mass Gauging of Propellants
NASA Technical Reports Server (NTRS)
Zimmerli, Gregory A.; Vaden, Karl R.; Herlacher, Michael D.; Buchanan, David A.; VanDresar, Neil T.
2007-01-01
A combined experimental and computer simulation effort was conducted to measure radio frequency (RF) tank resonance modes in a dewar partially filled with liquid oxygen, and compare the measurements with numerical simulations. The goal of the effort was to demonstrate that computer simulations of a tank's electromagnetic eigenmodes can be used to accurately predict ground-based measurements, thereby providing a computational tool for predicting tank modes in a low-gravity environment. Matching the measured resonant frequencies of several tank modes with computer simulations can be used to gauge the amount of liquid in a tank, thus providing a possible method to gauge cryogenic propellant tanks in low-gravity. Using a handheld RF spectrum analyzer and a small antenna in a 46 liter capacity dewar for experimental measurements, we have verified that the four lowest transverse magnetic eigenmodes can be accurately predicted as a function of liquid oxygen fill level using computer simulations. The input to the computer simulations consisted of tank dimensions, and the dielectric constant of the fluid. Without using any adjustable parameters, the calculated and measured frequencies agree such that the liquid oxygen fill level was gauged to within 2 percent full scale uncertainty. These results demonstrate the utility of using electromagnetic simulations to form the basis of an RF mass gauging technology with the power to simulate tank resonance frequencies from arbitrary fluid configurations.
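As a hedged companion to the simulations described above, the sketch below evaluates the textbook TM-mode frequencies of an ideal dielectric-filled cylindrical cavity, which illustrates why the eigenmodes shift downward as the higher-permittivity liquid fill level rises. The tank dimensions are hypothetical, and a partially filled dewar with internal hardware requires the full electromagnetic simulation the authors used.

    # Hedged sketch: TM_mnp resonances of an ideal dielectric-filled cylindrical cavity.
    import numpy as np
    from scipy.special import jn_zeros

    C0 = 299792458.0  # speed of light in vacuum, m/s

    def tm_mode_freq(m, n, p, radius, length, eps_r):
        # Closed cylindrical cavity uniformly filled with a lossless dielectric.
        x_mn = jn_zeros(m, n)[-1]                      # n-th zero of the Bessel function J_m
        k = np.hypot(x_mn / radius, p * np.pi / length)
        return C0 * k / (2.0 * np.pi * np.sqrt(eps_r))

    # Empty cavity vs. one filled with liquid oxygen (eps_r roughly 1.5); the
    # downward frequency shift with fill is what RF mass gauging exploits.
    for eps_r in (1.0, 1.5):
        print(eps_r, tm_mode_freq(0, 1, 0, radius=0.18, length=0.45, eps_r=eps_r))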
Virtual reality neurosurgery: a simulator blueprint.
Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J
2004-04-01
This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model and allows realization of the required threshold rates for the accurate and realistic representation of real-time visual animations. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.
Strategic directions of computing at Fermilab
NASA Astrophysics Data System (ADS)
Wolbers, Stephen
1998-05-01
Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.
NASA HPCC Technology for Aerospace Analysis and Design
NASA Technical Reports Server (NTRS)
Schulbach, Catherine H.
1999-01-01
The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community, thus providing it with key tools necessary to reduce design cycle times and increase fidelity in order to improve the safety, efficiency, and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community for the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to do full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1, respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.
2010-01-01
Background: The finite volume solver Fluent (Lebanon, NH, USA) is a computational fluid dynamics software package employed to analyse biological mass transport in the vasculature. A principal consideration for computational modelling of blood-side mass transport is the selection of the convection-diffusion discretisation scheme. Because numerous discretisation schemes are available when developing a mass-transport numerical model, the results obtained should be validated against either benchmark theoretical solutions or experimentally obtained results. Methods: An idealised aneurysm model was selected for the experimental and computational mass-transport analysis of species concentration because its well-defined recirculation region within the aneurysmal sac allows species concentration to vary slowly with time. The experimental results were obtained from fluid samples extracted from a glass aneurysm model, using the direct spectrophotometric concentration measurement technique. The computational analysis was conducted using the four convection-diffusion discretisation schemes available to the Fluent user: the First-Order Upwind, the Power Law, the Second-Order Upwind, and the Quadratic Upstream Interpolation for Convective Kinetics (QUICK) schemes. The fluid has a diffusivity of 3.125 × 10^-10 m^2/s in water, resulting in a Peclet number of 2,560,000 and indicating strongly convection-dominated flow. Results: The discretisation scheme applied to the solution of the convection-diffusion equation for blood-side mass transport within the vasculature has a significant influence on the resultant species concentration field. The First-Order Upwind and the Power Law schemes produce similar results. The Second-Order Upwind and QUICK schemes also correlate well with each other but differ considerably from the concentration contour plots of the First-Order Upwind and Power Law schemes. The computational results were then compared with the experimental findings. Average errors of 140% and 116% were found between the experimental results and those obtained from the First-Order Upwind and Power Law schemes, respectively. However, both the Second-Order Upwind and QUICK schemes accurately predict species concentration under high-Peclet-number, convection-dominated flow conditions. Conclusion: Convection-diffusion discretisation scheme selection has a strong influence on resultant species concentration fields, as determined by CFD. Either the Second-Order Upwind or the QUICK discretisation scheme should be implemented when numerically modelling convection-dominated mass-transport conditions. Finally, care should be taken not to use computationally inexpensive discretisation schemes at the cost of accuracy in the resultant species concentration. PMID:20642816
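For reference, the quoted Peclet number follows from Pe = UL/D. The characteristic velocity and length are not given in the excerpt above; as an assumed example, U = 0.08 m/s and L = 0.01 m reproduce the stated value:

    \mathrm{Pe} = \frac{U\,L}{D}
    = \frac{(0.08\ \mathrm{m/s})(0.01\ \mathrm{m})}{3.125\times 10^{-10}\ \mathrm{m^2/s}}
    = 2.56\times 10^{6}

A Peclet number this large means convection overwhelms diffusion, which is why the choice of convection-diffusion discretisation scheme dominates the resulting concentration field.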
CFD Based Computations of Flexible Helicopter Blades for Stability Analysis
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2011-01-01
As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations over the linear lifting-line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter. Deviations of computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.
Thomas, Michael; Corry, Ben
2016-01-01
Membranes made from nanomaterials such as nanotubes and graphene have been suggested to have a range of applications in water filtration and desalination, but determining their suitability for these purposes requires an accurate assessment of the properties of these novel materials. In this study, we use molecular dynamics simulations to determine the permeability and salt rejection capabilities for membranes incorporating carbon nanotubes (CNTs) at a range of pore sizes, pressures and concentrations. We include the influence of osmotic gradients and concentration build up and simulate at realistic pressures to improve the reliability of estimated membrane transport properties. We find that salt rejection is highly dependent on the applied hydrostatic pressure, meaning high rejection can be achieved with wider tubes than previously thought; while membrane permeability depends on salt concentration. The ideal size of the CNTs for desalination applications yielding high permeability and high salt rejection is found to be around 1.1 nm diameter. While there are limited energy gains to be achieved in using ultra-permeable CNT membranes in desalination by reverse osmosis, such membranes may allow for smaller plants to be built as is required when size or weight must be minimized. There are diminishing returns in further increasing membrane permeability, so efforts should focus on the fabrication of membranes containing narrow or functionalized CNTs that yield the desired rejection or selection properties rather than trying to optimize pore densities. PMID:26712639
Sources of avoidance motivation: Valence effects from physical effort and mental rotation.
Morsella, Ezequiel; Feinberg, Giles H; Cigarchi, Sepeedeh; Newton, James W; Williams, Lawrence E
2011-09-01
When reaching goals, organisms must simultaneously meet the overarching goal of conserving energy. According to the law of least effort, organisms will select the means associated with the least effort. The mechanisms underlying this bias remain unknown. One hypothesis is that organisms come to avoid situations associated with unnecessary effort by generating a negative valence toward the stimuli associated with such situations. Accordingly, merely using a dysfunctional, 'slow' computer mouse causes participants to dislike ambient neutral images (Study 1). In Study 2, nonsense shapes were liked less when associated with effortful processing (135° of mental rotation) versus easier processing (45° of rotation). Complementing 'fluency' effects found in perceptuo-semantic research, valence emerged from action-related processing in a principled fashion. The findings imply that negative valence associations may underlie avoidance motivations, and have practical implications for educational/workplace contexts in which effort and positive affect are conducive to success.
Computer simulation modeling of recreation use: Current status, case studies, and future directions
David N. Cole
2005-01-01
This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...
Troubleshooting Microcomputers. A Technical Guide for Polk County Schools.
ERIC Educational Resources Information Center
Black, B. R.; And Others
This guide was started in 1986 as an effort to pull together a collection of several computer guides that had been written over the previous several years to assist schools in making simple computer repairs. The first of six sections contains general tips and hints, including sections on tool requirements, strobe disk speed adjustment, static…
ERIC Educational Resources Information Center
Brown, Abbie; Sugar, William
2004-01-01
A report on the efforts made to describe the range of human-computer interaction skills necessary to complete a program of study in Instructional Design Technology. Educators responsible for instructional media production courses have not yet articulated which among the wide range of possible interactions students must master for instructional…
Web Solutions Inspire Cloud Computing Software
NASA Technical Reports Server (NTRS)
2013-01-01
An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.
Project SUN (Students Understanding Nature)
NASA Technical Reports Server (NTRS)
Curley, T.; Yanow, G.
1995-01-01
Project SUN is part of NASA's 'Mission to Planet Earth' education outreach effort. It is based on development of low cost, scientifically accurate instrumentation and computer interfacing, coupled with Apple II computers as dedicated data loggers. The project is comprised of: instruments, interfacing, software, curriculum, a detailed operating manual, and a system of training at the school sites.
ERIC Educational Resources Information Center
Mouza, Chrystalla; Marzocchi, Alison; Pan, Yi-Cheng; Pollock, Lori
2016-01-01
Current policy efforts that seek to improve learning in science, technology, engineering, and mathematics (STEM) emphasize the importance of helping all students acquire concepts and tools from computer science that help them analyze and develop solutions to everyday problems. These goals have been generally described in the literature under the…
ERIC Educational Resources Information Center
Baker, Eva L.
Some special problems associated with evaluating intelligent computer-assisted instruction (ICAI) programs are addressed. This paper intends to describe alternative approaches to the assessment and improvement of such applications and to provide examples of efforts undertaken and shortfalls. Issues discussed stem chiefly from the technical demands…
Riding the Crest of the E-Commerce Wave: Transforming MIT's Campus Computer Resale Operation.
ERIC Educational Resources Information Center
Hallisey, Joanne
1998-01-01
Reengineering efforts, vendor consolidation, and rising costs prompted the Massachusetts Institute of Technology to convert its computer resale store to an online catalog that allows students, faculty, and staff to purchase equipment and software through a World Wide Web interface. The transition has been greeted with a mixed reaction. The next…
Computer Science Education in North-Rhine Westphalia, Germany--A Case Study
ERIC Educational Resources Information Center
Knobelsdorf, Maria; Magenheim, Johannes; Brinda, Torsten; Engbring, Dieter; Humbert, Ludger; Pasternak, Arno; Schroeder, Ulrik; Thomas, Marco; Vahrenhold, Jan
2015-01-01
In North-Rhine Westphalia, the most populated state in Germany, Computer Science (CS) has been taught in secondary schools since the early 1970s. This article provides an overview of the past and current situation of CS education in North-Rhine Westphalia, including lessons learned through efforts to introduce and to maintain CS in secondary…
A Modified Laptop Program: Putting the Carts in the Classrooms
ERIC Educational Resources Information Center
Grant, Michael M.; Ross, Steven M.; Wan, Weiping; Potter, Allison; Wilson, Yola
2004-01-01
Four fifth grade classrooms embarked on a modified ubiquitous computing initiative in Fall 2003. Two 15-computer wireless laptop carts were shared among the four classrooms in an effort to integrate technology across the curriculum and affect change in student learning and teacher pedagogy. This initiative?in contrast to other 1:1 programs and…
Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges
ERIC Educational Resources Information Center
Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil
2017-01-01
The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…
Gas flow parameters in laser cutting of wood- nozzle design
Kali Mukherjee; Tom Grendzwell; Parwaiz A.A. Khan; Charles McMillin
1990-01-01
The Automated Lumber Processing System (ALPS) is an ongoing team research effort to optimize the yield of parts in a furniture rough mill. The process is designed to couple aspects of computer vision, computer optimization of yield, and laser cutting. This research is focused on optimizing laser wood cutting. Laser machining of lumber has the advantage over...
Addressing Computational Estimation in the Kuwaiti Curriculum: Teachers' Views
ERIC Educational Resources Information Center
Alajmi, Amal Hussain
2009-01-01
Computational estimation has not yet established a place in the Kuwaiti national curriculum. An attempt was made to include it during the early 1990s, but it was dropped by the Kuwaiti Ministry of Education because of the difficulties teachers had teaching it. In an effort to provide guidance for reintroducing the concept into the curriculum, this…
The Concept of Energy in Psychological Theory. Cognitive Science Program, Technical Report No. 86-2.
ERIC Educational Resources Information Center
Posner, Michael I.; Rothbart, Mary Klevjord
This paper describes a basic framework for integration of computational and energetic concepts in psychological theory. The framework is adapted from a general effort to understand the neural systems underlying cognition. The element of the cognitive system that provides the best basis for attempting to relate energetic and computational ideas is…
ERIC Educational Resources Information Center
Protopapas, Athanassios; Skaloumbakas, Christos; Bali, Persefoni
2008-01-01
After reviewing past efforts related to computer-based reading disability (RD) assessment, we present a fully automated screening battery that evaluates critical skills relevant for RD diagnosis designed for unsupervised application in the Greek educational system. Psychometric validation in 301 children, 8-10 years old (grades 3 and 4; including…
ERIC Educational Resources Information Center
European Commission, 2014
2014-01-01
The 2013 European Commission Communication on Opening up Education underlined the importance of solid evidence to assess developments and take full advantage of the impact of technology on education, and called for sustained effort and international cooperation to improve our knowledge-base in this area. The International Computer and Information…
ERIC Educational Resources Information Center
Udoh, Emmanuel E.
2010-01-01
Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…