Sample records for significant computational effort

  1. Technology and Sexuality--What's the Connection? Addressing Youth Sexualities in Efforts to Increase Girls' Participation in Computing

    ERIC Educational Resources Information Center

    Ashcraft, Catherine

    2015-01-01

    To date, girls and women are significantly underrepresented in computer science and technology. Concerns about this underrepresentation have sparked a wealth of educational efforts to promote girls' participation in computing, but these programs have demonstrated limited impact on reversing current trends. This paper argues that this is, in part,…

  2. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is an automated software-development cost-estimation tool. It yields a significant reduction in the risk of cost overruns and failed projects. It accepts a description of the software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of a defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
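
As a rough illustration of this class of estimation tool (COSTMODL's actual calibration is not described in the record), the classic COCOMO "organic mode" equations relate product size to effort and schedule. The constants and the 32-KLOC example below are assumptions, not COSTMODL's model:

```python
# Sketch of a COCOMO-style estimator; the a, b, c, d constants are the
# classic "organic mode" values, assumed here for illustration only.

def cocomo_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Estimated effort in person-months for a product of `kloc` KLOC."""
    return a * kloc ** b

def cocomo_schedule(effort_pm: float, c: float = 2.5, d: float = 0.38) -> float:
    """Estimated calendar schedule in months from effort in person-months."""
    return c * effort_pm ** d

effort = cocomo_effort(32)        # hypothetical 32 KLOC product
months = cocomo_schedule(effort)
print(f"effort ~ {effort:.1f} person-months, schedule ~ {months:.1f} months")
```

A real tool like COSTMODL layers cost drivers and life-cycle phase distributions on top of a core relation of this shape.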

  3. Computational Methods for Stability and Control (COMSAC): The Time Has Come

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.

    2005-01-01

    Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable. This paper summarizes the efforts of four organizations to apply high-end CFD tools to the challenges of the stability and control arena. General motivation and the backdrop for these efforts are summarized, along with examples of current applications.

  4. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  5. The Stabilization, Exploration, and Expression of Computer Game History

    ERIC Educational Resources Information Center

    Kaltman, Eric

    2017-01-01

    Computer games are now a significant cultural phenomenon, and a significant artistic output of humanity. However, little effort and attention have been paid to how the medium of games and interactive software developed, and even less to the historical storage of software development documentation. This thesis borrows methodologies and practices…

  6. An efficient method for hybrid density functional calculation with spin-orbit coupling

    NASA Astrophysics Data System (ADS)

    Wang, Maoyuan; Liu, Gui-Bin; Guo, Hong; Yao, Yugui

    2018-03-01

    In first-principles calculations, hybrid functional is often used to improve accuracy from local exchange correlation functionals. A drawback is that evaluating the hybrid functional needs significantly more computing effort. When spin-orbit coupling (SOC) is taken into account, the non-collinear spin structure increases computing effort by at least eight times. As a result, hybrid functional calculations with SOC are intractable in most cases. In this paper, we present an approximate solution to this problem by developing an efficient method based on a mixed linear combination of atomic orbital (LCAO) scheme. We demonstrate the power of this method using several examples and we show that the results compare very well with those of direct hybrid functional calculations with SOC, yet the method only requires a computing effort similar to that without SOC. The presented technique provides a good balance between computing efficiency and accuracy, and it can be extended to magnetic materials.

  7. On the evaluation of derivatives of Gaussian integrals

    NASA Technical Reports Server (NTRS)

    Helgaker, Trygve; Taylor, Peter R.

    1992-01-01

    We show that by a suitable change of variables, the derivatives of molecular integrals over Gaussian-type functions required for analytic energy derivatives can be evaluated with significantly less computational effort than current formulations. The reduction in effort increases with the order of differentiation.
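
A one-dimensional toy version of the underlying idea: the overlap of two Gaussians centered at A and B depends on the centers only through A - B, so a suitable change of variables lets one derivative serve both centers. This sketch (not the authors' molecular-integral formulation) checks the analytic derivative against a finite difference:

```python
import math

# Overlap of exp(-a(x-A)^2) and exp(-b(x-B)^2) by the Gaussian product rule;
# it depends on A and B only through d = A - B, so dS/dA = -dS/dB.

def overlap(a, b, A, B):
    p, mu = a + b, a * b / (a + b)
    return math.sqrt(math.pi / p) * math.exp(-mu * (A - B) ** 2)

def d_overlap_dA(a, b, A, B):
    mu = a * b / (a + b)
    return -2.0 * mu * (A - B) * overlap(a, b, A, B)

# Finite-difference check of the analytic derivative (illustrative values).
h, a, b, A, B = 1e-6, 0.7, 1.3, 0.4, -0.2
fd = (overlap(a, b, A + h, B) - overlap(a, b, A - h, B)) / (2 * h)
print(abs(fd - d_overlap_dA(a, b, A, B)))
```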

  8. How Emerging Technologies are Changing the Rules of Spacecraft Ground Support

    NASA Technical Reports Server (NTRS)

    Boland, Dillard; Steger, Warren; Weidow, David; Yakstis, Lou

    1996-01-01

    As part of its effort to develop the flight dynamics distributed system (FDDS), NASA established a program for the continual monitoring of developments in computer and software technologies and for assessing their significance for constructing and operating spacecraft ground data systems. In this context, technology trends in the computing industry are reviewed and their significance for the spacecraft ground support industry is explored. The technologies considered are: hardware; object computing; the Internet; automation; and software development. The ways in which these technologies have affected the industry are considered.

  9. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    The objective of this effort is to establish a strategy and process for generating suitable computational meshes for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate, and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterizing uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current-generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
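
The Grid Convergence Index evaluation mentioned above follows a standard formulation (Roache's, assumed here): given a quantity of interest on three systematically refined meshes, estimate the observed order of convergence and a banded relative error on the fine grid. The numbers below are illustrative, not data from the STAR-CCM+ study:

```python
import math

# Standard three-grid GCI: f_fine, f_med, f_coarse are one quantity of
# interest on fine/medium/coarse meshes with refinement ratio r.

def gci(f_fine, f_med, f_coarse, r=2.0, safety=1.25):
    """Return (observed order p, fine-grid GCI as a fraction)."""
    p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)
    rel_err = abs((f_med - f_fine) / f_fine)
    return p, safety * rel_err / (r ** p - 1)

# Illustrative outlet-velocity-like values on three meshes.
p, gci_fine = gci(1.005, 1.02, 1.08)
print(f"observed order p = {p:.2f}, GCI = {100 * gci_fine:.2f}%")
```

Note the GCI presumes the quantity is tied to a fixed location, which is exactly why it struggles with moving global extrema such as peak surface temperature.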

  10. Spectroscopic Data for Characterizing the Atmospheres of Exoplanets and Assigning Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Lee, Timothy J.

    2018-06-01

    In this talk I will discuss laboratory and computational efforts to provide detailed line list data for use in characterizing the atmospheres of planets, exoplanets, and other astrophysical objects such as dwarf stars. The discussion will cover significant efforts on stable molecules routinely found in atmospheres, such as CO2, NH3, H2O, and SO2. In addition, there will be some discussion of efforts to provide more limited line lists or simulated spectra for molecules that might be present in trace amounts but would be very significant if identified, such as possible biosignatures. How these efforts may provide insight into astronomical observations, especially with the upcoming James Webb Space Telescope, will also be discussed.

  11. Investigation of Grid Adaptation to Reduce Computational Efforts for a 2-D Hydrogen-Fueled Dual-Mode Scramjet

    NASA Astrophysics Data System (ADS)

    Foo, Kam Keong

    A two-dimensional dual-mode scramjet flowpath is developed and evaluated using the ANSYS Fluent density-based flow solver with various computational grids. Results are obtained for fuel-off, fuel-on non-reacting, and fuel-on reacting cases at different equivalence ratios. A one-step global chemical kinetics hydrogen-air model is used in conjunction with the eddy-dissipation model. Coarse, medium and fine computational grids are used to evaluate grid sensitivity and to investigate a lack of grid independence. Different grid adaptation strategies are performed on the coarse grid in an attempt to emulate the solutions obtained from the finer grids. The goal of this study is to investigate the feasibility of using various mesh adaptation criteria to significantly decrease computational efforts for high-speed reacting flows.

  12. Computing at the speed limit (supercomputers)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernhard, R.

    1982-07-01

    The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration (VLSI) and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers, about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.

  13. Parallel aeroelastic computations for wing and wing-body configurations

    NASA Technical Reports Server (NTRS)

    Byun, Chansup

    1994-01-01

    The objective of this research is to develop computationally efficient methods for solving fluid-structural interaction problems by directly coupling finite difference Euler/Navier-Stokes equations for fluids and finite element dynamics equations for structures on parallel computers. This capability will significantly impact many aerospace projects of national importance such as Advanced Subsonic Civil Transport (ASCT), where the structural stability margin becomes very critical at the transonic region. This research effort will have direct impact on the High Performance Computing and Communication (HPCC) Program of NASA in the area of parallel computing.

  14. Elimination sequence optimization for SPAR

    NASA Technical Reports Server (NTRS)

    Hogan, Harry A.

    1986-01-01

    SPAR is a large-scale computer program for finite element structural analysis. The program allows user specification of the order in which the joints of a structure are to be eliminated, since this order can have significant influence on solution performance, in terms of both storage requirements and computer time. An efficient elimination sequence can improve performance by over 50% for some problems. Obtaining such sequences, however, requires the expertise of an experienced user and can take hours of tedious effort to effect. Thus, an automatic elimination sequence optimizer would enhance productivity by reducing the analysts' problem definition time and by lowering computer costs. Two possible methods for automating the elimination sequence specifications were examined. Several algorithms based on graph theory representations of sparse matrices were studied, with mixed results. Significant improvement in program performance was achieved, but sequencing by an experienced user still yields substantially better results. The initial results provide encouraging evidence that the potential benefits of such an automatic sequencer would be well worth the effort.
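
One family of graph-theory heuristics of the kind studied here can be sketched as greedy minimum-degree ordering: joints are graph nodes, edges connect interacting joints, and eliminating low-degree joints first tends to limit fill-in. This is an illustrative stand-in, not SPAR's actual sequencer:

```python
# Greedy minimum-degree elimination ordering on an adjacency dict.
# Eliminating a node connects its remaining neighbors (simulated fill-in).

def min_degree_order(adj):
    """Return an elimination order for {node: set(neighbors)}."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    order = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))       # lowest current degree
        nbrs = adj.pop(v)
        for u in nbrs:                                # eliminate v and
            adj[u].discard(v)                         # clique its neighbors
            adj[u] |= (nbrs - {u})
        order.append(v)
    return order

# A small structure with a hub joint (5): the hub is deferred to the end.
graph = {1: {2}, 2: {1, 5}, 3: {5}, 4: {5}, 5: {2, 3, 4}}
print(min_degree_order(graph))
```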

  15. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  16. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results, and codes from individual efforts with others who are not in direct funded collaboration can also be a challenge with respect to time, cost, and expertise. We propose a modeling, data, and knowledge center, named the Ecosystem Modeling Center (EMC), that houses NASA satellite data, climate data, and ancillary data, and where a focused community may come together to share modeling and analysis codes, scientific results, knowledge, and expertise on a centralized platform. With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis, and compute environments that are customizable, archivable, and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by relieving users from redundantly retrieving and integrating data sets and building modeling and analysis codes. The EMC platform also opens the possibility for users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study "ramp up" times.

  17. The Effects of Hearing Aid Directional Microphone and Noise Reduction Processing on Listening Effort in Older Adults with Hearing Loss.

    PubMed

    Desjardins, Jamie L

    2016-01-01

    Older listeners with hearing loss may exert more cognitive resources to maintain a level of listening performance similar to that of younger listeners with normal hearing. Unfortunately, this increase in cognitive load, which is often conceptualized as increased listening effort, may come at the cost of cognitive processing resources that might otherwise be available for other tasks. The purpose of this study was to evaluate the independent and combined effects of a hearing aid directional microphone and a noise reduction (NR) algorithm on reducing the listening effort older listeners with hearing loss expend on a speech-in-noise task. Participants were fitted with study-worn, commercially available behind-the-ear hearing aids. Listening effort on a sentence-recognition-in-noise task was measured using an objective auditory-visual dual-task paradigm. The primary task required participants to repeat sentences presented in quiet and in a four-talker babble. The secondary task was a digital visual pursuit rotor-tracking test, for which participants were instructed to use a computer mouse to track a moving target around an ellipse that was displayed on a computer screen. Each of the two tasks was presented separately and concurrently at a fixed overall speech recognition performance level of 50% correct with and without the directional microphone and/or the NR algorithm activated in the hearing aids. In addition, participants reported how effortful it was to listen to the sentences in quiet and in background noise in the different hearing aid listening conditions. Fifteen older listeners with mild sloping to severe sensorineural hearing loss participated in this study. Listening effort in background noise was significantly reduced with the directional microphones activated in the hearing aids. However, there was no significant change in listening effort with the hearing aid NR algorithm compared to no noise processing.
Correlation analysis between objective and self-reported ratings of listening effort showed no significant relation. Directional microphone processing effectively reduced the cognitive load of listening to speech in background noise. This is significant because it is likely that listeners with hearing impairment will frequently encounter noisy speech in their everyday communications. American Academy of Audiology.

  18. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  19. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  20. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  21. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  22. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  23. Neurocomputational mechanisms underlying subjective valuation of effort costs

    PubMed Central

    Giehl, Kathrin; Sillence, Annie

    2017-01-01

    In everyday life, we have to decide whether it is worth exerting effort to obtain rewards. Effort can be experienced in different domains, with some tasks requiring significant cognitive demand and others being more physically effortful. The motivation to exert effort for reward is highly subjective and varies considerably across the different domains of behaviour. However, very little is known about the computational or neural basis of how different effort costs are subjectively weighed against rewards. Is there a common, domain-general system of brain areas that evaluates all costs and benefits? Here, we used computational modelling and functional magnetic resonance imaging (fMRI) to examine the mechanisms underlying value processing in both the cognitive and physical domains. Participants were trained on two novel tasks that parametrically varied either cognitive or physical effort. During fMRI, participants indicated their preferences between a fixed low-effort/low-reward option and a variable higher-effort/higher-reward offer for each effort domain. Critically, reward devaluation by both cognitive and physical effort was subserved by a common network of areas, including the dorsomedial and dorsolateral prefrontal cortex, the intraparietal sulcus, and the anterior insula. Activity within these domain-general areas also covaried negatively with reward and positively with effort, suggesting an integration of these parameters within these areas. Additionally, the amygdala appeared to play a unique, domain-specific role in processing the value of rewards associated with cognitive effort. These results are the first to reveal the neurocomputational mechanisms underlying subjective cost–benefit valuation across different domains of effort and provide insight into the multidimensional nature of motivation. PMID:28234892
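
The abstract does not give the authors' exact model, but studies of this kind commonly fit a subjective-value function in which reward is devalued by effort. A minimal sketch, assuming a parabolic discounting form and a hypothetical discount parameter k:

```python
# Hedged sketch of effort discounting: SV = R - k * E^2 is one common
# parabolic form; the functional form and k = 1.5 are assumptions, not
# the authors' fitted model.

def subjective_value(reward: float, effort: float, k: float) -> float:
    """Subjective value of a reward R obtained at normalized effort E."""
    return reward - k * effort ** 2

# Choice between a fixed low-effort/low-reward baseline and a
# higher-effort/higher-reward offer, as in the task described above.
baseline = subjective_value(reward=1.0, effort=0.2, k=1.5)
offer = subjective_value(reward=3.0, effort=0.8, k=1.5)
print("accept offer" if offer > baseline else "reject offer")
```

Fitting k per participant and per domain (cognitive vs. physical) is what lets such studies compare devaluation across effort domains.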

  24. Spectral analysis of sinus arrhythmia - A measure of mental effort

    NASA Technical Reports Server (NTRS)

    Vicente, Kim J.; Craig Thornton, D.; Moray, Neville

    1987-01-01

    The validity of the spectral analysis of sinus arrhythmia as a measure of mental effort was investigated using a computer simulation of a hovercraft piloted along a river as the experimental task. Strong correlation was observed between the subjective effort-ratings and the heart-rate variability (HRV) power spectrum between 0.06 and 0.14 Hz. Significant correlations were observed not only between subjects but, more importantly, within subjects as well, indicating that the spectral analysis of HRV is an accurate measure of the amount of effort being invested by a subject. Results also indicate that the intensity of effort invested by subjects cannot be inferred from the objective ratings of task difficulty or from performance.
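
The band-power measure described above can be sketched with a plain discrete Fourier transform: integrate spectral power of a resampled heart-rate signal over 0.06-0.14 Hz. The signal below is synthetic; real HRV analysis first resamples R-R intervals onto an even time grid:

```python
import math

# Power of a signal in a frequency band via a plain DFT (no numpy).
# Synthetic heart-rate data; illustrative only.

def band_power(x, fs, f_lo=0.06, f_hi=0.14):
    """One-sided power of x (sampled at fs Hz) between f_lo and f_hi."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]                     # remove DC component
    power = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n ** 2
    return power

fs = 4.0                                          # 4 Hz resampled HR signal
t = [i / fs for i in range(512)]
hr = [70 + 3 * math.sin(2 * math.pi * 0.10 * ti) for ti in t]  # 0.10 Hz rhythm
print(band_power(hr, fs))    # most power lies inside the 0.06-0.14 Hz band
```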

  25. Many Masses on One Stroke: Economic Computation of Quark Propagators

    NASA Astrophysics Data System (ADS)

    Frommer, Andreas; Nöckel, Bertold; Güsken, Stephan; Lippert, Thomas; Schilling, Klaus

    The computational effort in the calculation of Wilson fermion quark propagators in Lattice Quantum Chromodynamics can be considerably reduced by exploiting the Wilson fermion matrix structure in inversion algorithms based on the non-symmetric Lanczos process. We consider two such methods: QMR (quasi minimal residual) and BCG (biconjugate gradients). Based on the decomposition M/κ = 1/κ-D of the Wilson mass matrix, using QMR, one can carry out inversions on a whole trajectory of masses simultaneously, merely at the computational expense of a single propagator computation. In other words, one has to compute the propagator corresponding to the lightest mass only, while all the heavier masses are given for free, at the price of extra storage. Moreover, the symmetry γ5M = M†γ5 can be used to cut the computational effort in QMR and BCG by a factor of two. We show that both methods then become — in the critical regime of small quark masses — competitive to BiCGStab and significantly better than the standard MR method, with optimal relaxation factor, and CG as applied to the normal equations.

  26. Parallel computing for automated model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.

    2002-07-29

    Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, the magnitude and timing of the stream flow peak). An automated calibration process that allows real-time updating of data and models, freeing scientists to focus their effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null-cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data, and output only a small amount of statistical information for each calibration run. A typical auto-calibration run might involve running a model 10,000 times with a variety of input parameters and summarizing the statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two auto-calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed, cross-platform computing environment. They allow incorporation of "smart" calibration parameter generation (using artificial intelligence processing techniques). Null-cycle computing similar to SETI@home has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
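
The embarrassingly parallel structure described above (no inter-process communication, small inputs, one summary statistic per run) can be sketched with a process pool sweeping a parameter grid. The toy model and observations are stand-ins, not the authors' code:

```python
from multiprocessing import Pool

# Each task runs the model once for one parameter set and returns a single
# sum-of-squared-errors statistic; no communication between workers.

OBSERVED = [1.0, 2.1, 2.9, 4.2]          # made-up "observed stream flows"

def model(params):
    a, b = params
    return [a * x + b for x in range(1, 5)]

def objective(params):
    sim = model(params)
    sse = sum((s - o) ** 2 for s, o in zip(sim, OBSERVED))
    return params, sse

if __name__ == "__main__":
    # 121 candidate (a, b) pairs; real runs might use 10,000+ samples.
    candidates = [(a / 10, b / 10) for a in range(5, 16) for b in range(-5, 6)]
    with Pool() as pool:
        results = pool.map(objective, candidates)   # one model run per task
    best_params, best_sse = min(results, key=lambda r: r[1])
    print(best_params, round(best_sse, 3))
```

Swapping the grid for an AI-driven proposal step ("smart" parameter generation) changes only how `candidates` is produced, not the parallel evaluation loop.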

  27. Louisiana: a model for advancing regional e-Research through cyberinfrastructure.

    PubMed

    Katz, Daniel S; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott

    2009-06-28

    Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date.

  28. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.

    2013-12-15

    Purpose: Four-dimensional computed tomography (4DCT) can be used to make measurements of pulmonary function longitudinally. The sensitivity of such measurements to identify change depends on measurement uncertainty. Previously, intrasubject reproducibility of Jacobian-based measures of lung tissue expansion was studied in two repeat prior-RT 4DCT human acquisitions. Differences in respiratory effort, such as breathing amplitude and frequency, may affect longitudinal function assessment. In this study, the authors present normalization schemes that correct ventilation images for variations in respiratory effort and assess the reproducibility improvement after effort correction. Methods: Repeat 4DCT image data acquired within a short time interval from 24 patients prior to radiation therapy (RT) were used for this analysis. Using a tissue-volume-preserving deformable image registration algorithm, Jacobian ventilation maps from the two scanning sessions were computed and compared on the same coordinate system for reproducibility analysis. In addition to computing the ventilation maps from end expiration to end inspiration, the authors investigated effort normalization strategies using other intermediate inspiration phases, based on the principles of equivalent tidal volume (ETV) and equivalent lung volume (ELV). Scatter plots and the mean square error of the repeat ventilation maps and the Jacobian ratio map were generated for four conditions: no effort correction, global normalization, ETV, and ELV. In addition, a gamma pass rate was calculated from a modified gamma index evaluation between the two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. Results: The pattern of regional pulmonary ventilation changes as lung volume changes. All effort correction strategies improved reproducibility when changes in respiratory effort were greater than 150 cc (p < 0.005 with regard to the gamma pass rate). Improvement of reproducibility was correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general, for all subjects, global normalization, ETV, and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, and 0.005, respectively). When the tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, and 0.46, respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improved from 57.3% before correction to 66.3% after global normalization, and to 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as effort difference increases, followed by ETV, then global normalization. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate for correcting the ventilation map when changes in respiratory effort are large.
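
The simplest of the schemes above, global normalization, can be sketched as scaling voxelwise expansion (Jacobian minus 1) by the ratio of the two scans' tidal volumes so both maps reflect the same overall effort. The arrays and volumes below are illustrative only; ETV and ELV instead select intermediate breathing phases:

```python
# Global effort normalization of a Jacobian ventilation map: a single
# scaling factor applied to the expansion (J - 1) at every voxel.
# Illustrative values, not study data.

def global_normalize(vent_map, tidal_vol_own, tidal_vol_ref):
    """Rescale voxelwise expansion (Jacobian - 1) to a reference effort."""
    scale = tidal_vol_ref / tidal_vol_own
    return [1.0 + (j - 1.0) * scale for j in vent_map]

scan1 = [1.10, 1.25, 1.05]      # repeat-scan Jacobians from a deeper breath
corrected = global_normalize(scan1, tidal_vol_own=600.0, tidal_vol_ref=480.0)
print([round(j, 3) for j in corrected])
```

The abstract's caveat applies directly: one global scale ignores the spatial and temporal heterogeneity of lung expansion, which is why ELV outperforms it when effort differences are large.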

  9. An Assessment of Computer Science Degree Programs in Virginia. A Report to the Council of Higher Education and Virginia's State-Supported Institutions of Higher Education.

    ERIC Educational Resources Information Center

    Virginia State Council of Higher Education, Richmond.

    This report presents the results of a review of all significant instructional efforts in the computer science discipline in Virginia institutions of higher education, with emphasis on those whose instructional activities constitute complete degree programs. The report is based largely on information provided by the institutions in self-studies. A…

  10. ERTS-B imagery interpretation techniques in the Tennessee Valley

    NASA Technical Reports Server (NTRS)

    Gonzalez, R. C. (Principal Investigator)

    1973-01-01

    There are no author-identified significant results in this report. The proposed investigation is a continuation of an ERTS-1 project. Its principal missions are to provide the primary computer and image-processing support for the multidisciplinary ERTS effort of the University of Tennessee, and to carry out research on improved methods for the computer processing, enhancement, and recognition of ERTS imagery.

  11. Topical perspective on massive threading and parallelism.

    PubMed

    Farber, Robert M

    2011-09-01

    Unquestionably, computer architectures have undergone a recent and noteworthy paradigm shift that now delivers multi- and many-core systems with tens to many thousands of concurrent hardware processing elements per workstation or supercomputer node. GPGPU (General Purpose Graphics Processor Unit) technology in particular has attracted significant attention as new software development capabilities, namely CUDA (Compute Unified Device Architecture) and OpenCL™, have made it possible for students as well as small and large research organizations to achieve excellent speedup for many applications over more conventional computing architectures. The current scientific literature reflects this shift with numerous examples of GPGPU applications that have achieved one, two, and in some special cases, three orders of magnitude increased computational performance through the use of massive threading to exploit parallelism. Multi-core architectures are also evolving quickly to exploit both massive threading and massive parallelism, as in the 1.3-million-thread Blue Waters supercomputer. The challenge confronting scientists in planning future experimental and theoretical research efforts--be they individual efforts with one computer or collaborative efforts proposing to use the largest supercomputers in the world--is how to capitalize on these new massively threaded computational architectures, especially as not all computational problems will scale to massive parallelism. In particular, the costs associated with restructuring software (and potentially redesigning algorithms) to exploit the parallelism of these multi- and many-threaded machines must be considered along with application scalability and lifespan. This perspective is an overview of the current state of threading and parallelism with some insight into the future. Published by Elsevier Inc.
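    The caveat that not all problems scale to massive parallelism is captured by Amdahl's law: the serial fraction of a workload bounds the achievable speedup no matter how many threads are available. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Ideal speedup for a workload where only `parallel_fraction`
    of the work parallelizes; the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)
```

For example, with 95% of the work parallelizable, even a million threads cannot push the speedup past 20x, which is why restructuring software to raise the parallel fraction matters as much as the thread count.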

  12. Materials characterization efforts for ablative materials

    NASA Technical Reports Server (NTRS)

    Tytula, Thomas P.; Schad, Kristin C.; Swann, Myles H.

    1992-01-01

    Experimental efforts to develop a new procedure to measure char depth in carbon phenolic nozzle material are described. Using a Shore Type D durometer, hardness profiles were mapped across post-fired sample blocks and specimens from a fired rocket nozzle. Linear regression was used to estimate the char depth. Results were compared to those obtained from computed tomography; there was no significant difference in the depth estimates obtained by the two methods.

  13. Non-conforming finite-element formulation for cardiac electrophysiology: an effective approach to reduce the computation time of heart simulations without compromising accuracy

    NASA Astrophysics Data System (ADS)

    Hurtado, Daniel E.; Rojas, Guillermo

    2018-04-01

    Computer simulations constitute a powerful tool for studying the electrical activity of the human heart, but their computational effort remains prohibitively high. In order to recover accurate conduction velocities and wavefront shapes, the mesh size in linear element (Q1) formulations cannot exceed 0.1 mm. Here we propose a novel non-conforming finite-element formulation for the non-linear cardiac electrophysiology problem that results in accurate wavefront shapes and lower mesh dependence in the conduction velocity, while retaining the same number of global degrees of freedom as Q1 formulations. As a result, coarser discretizations of cardiac domains can be employed in simulations without significant loss of accuracy, thus reducing the overall computational effort. We demonstrate the applicability of our formulation in biventricular simulations using a coarse mesh size of ~1 mm, and show that the activation wave pattern closely follows that obtained in fine-mesh simulations at a fraction of the computation time, thus improving the accuracy-efficiency trade-off of cardiac simulations.
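    The mesh-size sensitivity of conduction velocity can be illustrated with a toy 1-D analogue. The sketch below solves a simple bistable reaction-diffusion cable by explicit finite differences (not the authors' non-conforming FEM) with illustrative, non-physiological parameters; the measured front speed depends on the mesh size dx, which is the effect the formulation is designed to mitigate.

```python
import numpy as np

def conduction_velocity(dx, L=30.0, D=1.0, a=0.1, t_end=80.0):
    """Travelling-front speed in a 1-D bistable cable, u_t = D u_xx + u(1-u)(u-a),
    measured between two probe sites. Explicit scheme on mesh size dx."""
    n = int(round(L / dx)) + 1
    x = np.linspace(0.0, L, n)
    u = np.where(x < 2.0, 1.0, 0.0)        # stimulate the left end
    dt = 0.2 * dx * dx / D                 # respect explicit stability limit
    x1, x2 = 10.0, 20.0                    # probe sites
    i1, i2 = int(round(x1 / dx)), int(round(x2 / dx))
    t1 = t2 = None
    t = 0.0
    while t < t_end and t2 is None:
        lap = np.empty_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / (dx * dx)
        lap[0] = 2 * (u[1] - u[0]) / (dx * dx)       # no-flux boundaries
        lap[-1] = 2 * (u[-2] - u[-1]) / (dx * dx)
        u = u + dt * (D * lap + u * (1 - u) * (u - a))
        t += dt
        if t1 is None and u[i1] > 0.5:
            t1 = t                          # activation time at x1
        if t1 is not None and t2 is None and u[i2] > 0.5:
            t2 = t                          # activation time at x2
    return (x2 - x1) / (t2 - t1)
```

Running this with a fine and a coarse dx gives two different speeds for the same continuous problem, the discretization artifact the paper targets (for this cubic nonlinearity the continuum speed is sqrt(D/2)(1-2a)).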

  14. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  15. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  16. Louisiana: a model for advancing regional e-Research through cyberinfrastructure

    PubMed Central

    Katz, Daniel S.; Allen, Gabrielle; Cortez, Ricardo; Cruz-Neira, Carolina; Gottumukkala, Raju; Greenwood, Zeno D.; Guice, Les; Jha, Shantenu; Kolluru, Ramesh; Kosar, Tevfik; Leger, Lonnie; Liu, Honggao; McMahon, Charlie; Nabrzyski, Jarek; Rodriguez-Milla, Bety; Seidel, Ed; Speyrer, Greg; Stubblefield, Michael; Voss, Brian; Whittenburg, Scott

    2009-01-01

    Louisiana researchers and universities are leading a concentrated, collaborative effort to advance statewide e-Research through a new cyberinfrastructure: computing systems, data storage systems, advanced instruments and data repositories, visualization environments and people, all linked together by software programs and high-performance networks. This effort has led to a set of interlinked projects that have started making a significant difference in the state, and has created an environment that encourages increased collaboration, leading to new e-Research. This paper describes the overall effort, the new projects and environment and the results to date. PMID:19451102

  17. Generic approach to access barriers in dehydrogenation reactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Liang; Vilella, Laia; Abild-Pedersen, Frank

    The introduction of linear energy correlations, which explicitly relate adsorption energies of reaction intermediates and activation energies in heterogeneous catalysis, has proven to be a key component in the computational search for new and promising catalysts. A simple linear approach to estimate activation energies still requires a significant computational effort. To simplify this process and at the same time accommodate the need for enhanced complexity of reaction intermediates, we generalize a recently proposed approach that evaluates transition state energies based entirely on bond-order conservation arguments. Here, we show that similar variation of the local electronic structure along the reaction coordinate introduces a set of general functions that accurately define the transition state energy and are transferable to other reactions of similar bonding nature. With such an approach, more complex reaction intermediates can be targeted with an insignificant increase in computational effort and without loss of accuracy.
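    The linear energy correlations referred to here are of the Brønsted-Evans-Polanyi type, Ea ≈ α·ΔE + β, fitted within a family of similar reactions so that an expensive transition-state search is replaced by a cheap adsorption-energy calculation. A minimal sketch with synthetic data (the coefficients are illustrative, not values from the paper):

```python
import numpy as np

def fit_bep(dE, Ea):
    """Least-squares fit of activation energies Ea to reaction energies dE."""
    alpha, beta = np.polyfit(dE, Ea, 1)
    return alpha, beta

def predict_barrier(dE, alpha, beta):
    """Estimate an activation energy from the fitted linear correlation."""
    return alpha * dE + beta

# Synthetic data lying exactly on the line Ea = 0.7*dE + 1.1 (eV); in
# practice dE and Ea would come from DFT calculations for one reaction family.
dE = np.array([-0.5, 0.0, 0.4, 1.0])
Ea = 0.7 * dE + 1.1
alpha, beta = fit_bep(dE, Ea)
```

Once fitted, barriers for new intermediates in the same family follow from their reaction energies alone, which is the computational saving the abstract describes.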


  19. Reducing obesity will require involvement of all sectors of society.

    PubMed

    Hill, James O; Peters, John C; Blair, Steven N

    2015-02-01

    We need all sectors of society involved in reducing obesity. The food industry's effort to reduce energy intake as part of the Healthy Weight Commitment Foundation is a significant step in the right direction and should be recognized as such by the public health community. We also need to get organizations that promote physical inactivity, such as computer, automobile, and entertainment industries, to become engaged in efforts to reduce obesity. © 2014 The Obesity Society.

  20. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus

    PubMed Central

    Karp, Peter D.; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard

    2015-01-01

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology (ISMB) 2016, Orlando, Florida). PMID:26097686


  2. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  3. Application of Interface Technology in Progressive Failure Analysis of Composite Panels

    NASA Technical Reports Server (NTRS)

    Sleight, D. W.; Lotts, C. G.

    2002-01-01

    A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.

  4. Translating Computational Toxicology Data Through Stakeholder Outreach & Engagement (SOT)

    EPA Science Inventory

    US EPA has been using in vitro testing methods in an effort to accelerate the pace of chemical evaluations and address the significant lack of health and environmental data on the thousands of chemicals found in commonly used products. Since 2005, EPA’s researchers have generated...

  5. Cloud-Based Virtual Laboratory for Network Security Education

    ERIC Educational Resources Information Center

    Xu, Le; Huang, Dijiang; Tsai, Wei-Tek

    2014-01-01

    Hands-on experiments are essential for computer network security education. Existing laboratory solutions usually require significant effort to build, configure, and maintain and often do not support reconfigurability, flexibility, and scalability. This paper presents a cloud-based virtual laboratory education platform called V-Lab that provides a…

  6. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  7. U.S. EPA computational toxicology programs: Central role of chemical-annotation efforts and molecular databases

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is engaged in high-profile research efforts to improve the ability to more efficiently and effectively prioritize and screen thousands of environmental chemicals for potential toxicity. A central component of these efforts invol...

  8. Swept-Wing Ice Accretion Characterization and Aerodynamics

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.

    2013-01-01

    NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and the resulting aerodynamic effects. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65% scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20%, 64% and 83% semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will yield an improved understanding of ice formation and its aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.


  10. A community effort to protect genomic data sharing, collaboration and outsourcing.

    PubMed

    Wang, Shuang; Jiang, Xiaoqian; Tang, Haixu; Wang, Xiaofeng; Bu, Diyue; Carey, Knox; Dyke, Stephanie Om; Fox, Dov; Jiang, Chao; Lauter, Kristin; Malin, Bradley; Sofia, Heidi; Telenti, Amalio; Wang, Lei; Wang, Wenhao; Ohno-Machado, Lucila

    2017-01-01

    The human genome can reveal sensitive information and is potentially re-identifiable, which raises privacy and security concerns about sharing such data on wide scales. In 2016, we organized the third Critical Assessment of Data Privacy and Protection competition as a community effort to bring together biomedical informaticists, computer privacy and security researchers, and scholars in ethical, legal, and social implications (ELSI) to assess the latest advances in privacy-preserving techniques for protecting human genomic data. Teams were asked to develop novel protection methods for emerging genome privacy challenges in three scenarios: Track 1, data sharing through the Beacon service of the Global Alliance for Genomics and Health; Track 2, collaborative discovery of similar genomes between two institutions; and Track 3, data outsourcing to public cloud services. The latter two tracks represent continuing themes from our 2015 competition, while the former was new and a response to a recently established vulnerability. The winning strategy for Track 1 mitigated the privacy risk by hiding approximately 11% of the variation in the database while permitting around 160,000 queries, a significant improvement over the baseline. The winning strategies in Tracks 2 and 3 showed significant progress over the previous competition by achieving multiple orders of magnitude performance improvement in terms of computational runtime and memory requirements. The outcomes suggest that applying highly optimized privacy-preserving and secure computation techniques to safeguard genomic data sharing and analysis is useful. However, the results also indicate that further efforts are needed to refine these techniques into practical solutions.
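    The Track 1 setting can be pictured with a toy sketch: a Beacon answers only presence/absence queries about variants, and the mitigation suppresses a chosen subset of variants from those answers (the winning strategy hid roughly 11% of the variation). The variant strings and interface below are hypothetical illustrations, not the actual Beacon protocol:

```python
def make_beacon(database_variants, hidden_variants):
    """Toy Beacon: answers yes/no presence queries, but a chosen subset
    of variants is suppressed to reduce re-identification risk."""
    visible = set(database_variants) - set(hidden_variants)
    def query(variant):
        # True only if the variant is present AND not suppressed
        return variant in visible
    return query
```

Suppressed variants simply look absent to any querier, trading a little utility for a bound on what repeated queries can reveal.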

  11. The National Shipbuilding Research Program. 1989 Ship Production Symposium. Paper No. 13: NIDDESC: Meeting the Data Exchange Challenge Through a Cooperative Effort

    DTIC Science & Technology

    1989-09-01

    The use of Computer Aided Design (CAD) and Manufacturing (CAM) techniques in the marine industry has increased significantly in recent years. With more ... somewhat from ship to ship. All of the activities and companies involved have improved this process by utilizing computer tools. For example, many ...

  12. NASA's Technology Transfer Program for the Early Detection of Breast Cancer

    NASA Technical Reports Server (NTRS)

    Schmidt, Gregory; Frey, Mary Anne; Vernikos, Joan; Winfield, Daniel; Dalton, Bonnie P. (Technical Monitor)

    1996-01-01

    The National Aeronautics and Space Administration (NASA) has led the development of advanced imaging sensors and image processing technologies for space science and Earth science missions. NASA considers the transfer and commercialization of such technologies a fundamental mission of the agency. Over the last two years, efforts have been focused on the application of aerospace imaging and computing to the field of diagnostic imaging, specifically to breast cancer imaging. These technology transfer efforts offer significant promise in helping in the national public health priority of the early detection of breast cancer.

  13. Administrative Information Systems Plan for FY89

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-11-01

    The Administrative Information Systems (AIS) Plan was developed to prioritize, track, and control the cost of AIS activities. This annually published plan, in conjunction with quarterly status reports, measures projected AIS activities and progress. The AIS Plan and quarterly reporting are administered jointly by the Director of Computing and an Organization 30 director. Priority development projects are clearly defined and closely managed efforts that consume significant resources. Directorate supplementals describe other AIS activity within each directorate, which may include: production support; technical support; development activity; and other AIS effort.

  14. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    PubMed

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia and, although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials in which they could choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less often than controls for trials of high but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.
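    The cost-benefit trade-off probed by such tasks can be sketched with a toy expected-value rule: choose the high-effort option when the probability-weighted extra reward exceeds the subjective effort cost. This illustrates the decision variable only; it is not the EEfRT's actual model or the authors' analysis, and the cost parameter is an assumption:

```python
def choose_high_effort(p_win, reward_hard, reward_easy, effort_cost):
    """Return True if a simple expected-value rule favors the
    high-effort option on a given trial."""
    ev_hard = p_win * reward_hard - effort_cost  # payoff discounted by win probability,
    ev_easy = p_win * reward_easy                # minus the subjective cost of effort
    return ev_hard > ev_easy
```

Under this rule a high but uncertain payoff (e.g. 88% probability) typically justifies the extra effort while a 12% probability does not, which is the regime where the patient-control differences emerged.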

  15. Improvements in approaches to forecasting and evaluation techniques

    NASA Astrophysics Data System (ADS)

    Weatherhead, Elizabeth

    2014-05-01

    The US is embarking on an experiment to make significant and sustained improvements in weather forecasting. The effort stems from a series of community conversations that recognized the rapid advancements in observations, modeling and computing techniques in the academic, governmental and private sectors. The new directions and initial efforts will be summarized, including information on possibilities for international collaboration. Most new projects are scheduled to start in the last half of 2014. Several advancements include ensemble forecasting with global models, and new sharing of computing resources. Newly developed techniques for evaluating weather forecast models will be presented in detail. The approaches use statistical techniques that incorporate pair-wise comparisons of forecasts with observations and account for daily auto-correlation to assess appropriate uncertainty in forecast changes. Some of the new projects allow for international collaboration, particularly on the research components of the projects.
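    The evaluation idea described, pair-wise comparison of forecast errors against observations with an allowance for daily autocorrelation, can be sketched as follows. The AR(1)-based effective-sample-size adjustment is a common textbook device and an assumption here, not necessarily the authors' exact formulation:

```python
import numpy as np

def paired_forecast_test(err_a, err_b):
    """Mean difference between two models' daily forecast errors, with a
    standard error inflated for lag-1 autocorrelation.

    Returns (mean difference, autocorrelation-adjusted standard error).
    """
    d = np.asarray(err_a, dtype=float) - np.asarray(err_b, dtype=float)
    n = len(d)
    dc = d - d.mean()
    denom = np.sum(dc * dc)
    # lag-1 autocorrelation of the daily differences
    r1 = np.sum(dc[1:] * dc[:-1]) / denom if denom > 0 else 0.0
    r1 = max(min(r1, 0.99), 0.0)      # only inflate for positive autocorrelation
    n_eff = n * (1 - r1) / (1 + r1)   # effective sample size
    se = d.std(ddof=1) / np.sqrt(n_eff)
    return d.mean(), se
```

Positively autocorrelated errors shrink the effective sample size, widening the uncertainty on a claimed forecast improvement, which is the point of the adjustment.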

  16. Decoding-Accuracy-Based Sequential Dimensionality Reduction of Spatio-Temporal Neural Activities

    NASA Astrophysics Data System (ADS)

    Funamizu, Akihiro; Kanzaki, Ryohei; Takahashi, Hirokazu

    Performance of a brain machine interface (BMI) critically depends on the selection of input data because information embedded in neural activities is highly redundant. In addition, properly selected input data with a reduced dimension lead to improved decoding generalization ability and decreased computational effort, both of which are significant advantages for clinical applications. In the present paper, we propose an algorithm of sequential dimensionality reduction (SDR) that effectively extracts motor/sensory-related spatio-temporal neural activities. The algorithm gradually reduces the input data dimension by dropping neural data spatio-temporally so as to undermine the decoding accuracy as little as possible. A support vector machine (SVM) was used as the decoder, and tone-induced neural activities in rat auditory cortices were decoded into the test tone frequencies. SDR reduced the input data dimension to a quarter and significantly improved the accuracy of decoding of novel data. Moreover, spatio-temporal neural activity patterns selected by SDR resulted in significantly higher accuracy than high-spike-rate patterns or conventionally used spatial patterns. These results suggest that the proposed algorithm can improve the generalization ability and decrease the computational effort of decoding.
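    The greedy strategy described above, dropping one input dimension at a time so that decoding accuracy suffers as little as possible, can be sketched as follows. This is illustrative only, not the authors' implementation, and it substitutes a simple nearest-centroid decoder for the paper's SVM.

```python
import numpy as np

def decode_accuracy(X_tr, y_tr, X_te, y_te, keep):
    """Accuracy of a nearest-centroid decoder restricted to the columns in `keep`."""
    Xtr, Xte = X_tr[:, keep], X_te[:, keep]
    classes = np.unique(y_tr)
    cents = np.array([Xtr[y_tr == c].mean(axis=0) for c in classes])
    d2 = ((Xte[:, None, :] - cents[None, :, :]) ** 2).sum(axis=-1)
    return (classes[d2.argmin(axis=1)] == y_te).mean()

def sequential_dimensionality_reduction(X_tr, y_tr, X_va, y_va, target_dim):
    """Greedily drop, one at a time, the input dimension whose removal
    costs the least validation accuracy, until `target_dim` remain."""
    keep = list(range(X_tr.shape[1]))
    while len(keep) > target_dim:
        best = max((decode_accuracy(X_tr, y_tr, X_va, y_va,
                                    [k for k in keep if k != f]), f)
                   for f in keep)
        keep.remove(best[1])          # feature whose removal hurt least
    return keep
```

    On synthetic data in which only a couple of dimensions carry class information, the procedure discards the uninformative dimensions first, mirroring the paper's finding that pruned inputs can decode better than the full redundant set.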

  17. A Single LiDAR-Based Feature Fusion Indoor Localization Algorithm.

    PubMed

    Wang, Yun-Ting; Peng, Chao-Chung; Ravankar, Ankit A; Ravankar, Abhijeet

    2018-04-23

    In past years, there has been significant progress in the field of indoor robot localization. To precisely recover the position, robots usually rely on multiple on-board sensors. Nevertheless, this affects the overall system cost and increases computation. In this research work, we considered a light detection and ranging (LiDAR) device as the only sensor for detecting surroundings and propose an efficient indoor localization algorithm. To reduce the computational effort and preserve localization robustness, a weighted parallel iterative closest point (WP-ICP) with interpolation is presented. As compared to traditional ICP, the point cloud is first processed to extract corners and line features before applying point registration. Later, points labeled as corners are matched only with the corner candidates. Similarly, points labeled as lines are matched only with the line candidates. Moreover, their ICP confidence levels are also fused in the algorithm, which makes the pose estimation less sensitive to environment uncertainties. The proposed WP-ICP architecture reduces the probability of mismatch and thereby reduces the ICP iterations. Finally, based on given well-constructed indoor layouts, experiment comparisons are carried out under both clean and perturbed environments. It is shown that the proposed method is effective in significantly reducing computational effort and is simultaneously able to preserve localization precision.
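    The label-constrained matching at the heart of this idea, corner-labeled scan points matched only to map corners and line-labeled points only to map line features, can be sketched in 2-D as below. This is an illustrative reconstruction, not the authors' code; the weighting and interpolation steps of WP-ICP are omitted.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares 2-D rotation R and translation t with dst ≈ src @ R.T + t
    (Kabsch algorithm via SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def feature_icp(scan_pts, scan_lbl, map_pts, map_lbl, iters=20):
    """ICP in which scan points labeled 'corner' are matched only to map
    corners and 'line' points only to map line features."""
    R, t = np.eye(2), np.zeros(2)
    for _ in range(iters):
        moved = scan_pts @ R.T + t
        src, dst = [], []
        for lbl in ("corner", "line"):
            s_mask = scan_lbl == lbl
            m = map_pts[map_lbl == lbl]
            if not s_mask.any() or len(m) == 0:
                continue
            # nearest same-label map feature for every moved scan point
            nn = ((moved[s_mask][:, None] - m[None]) ** 2).sum(-1).argmin(1)
            src.append(scan_pts[s_mask])
            dst.append(m[nn])
        R, t = rigid_fit(np.vstack(src), np.vstack(dst))
    return R, t
```

    In the paper the per-feature ICP confidence levels additionally weight these correspondences; here all pairs count equally.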

  19. Shor's factoring algorithm and modern cryptography. An illustration of the capabilities inherent in quantum computers

    NASA Astrophysics Data System (ADS)

    Gerjuoy, Edward

    2005-06-01

    The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
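    The number-theoretic skeleton of Shor's algorithm can be shown classically: the only step a quantum computer accelerates is finding the order r of a modulo N. The sketch below brute-forces that step, so it is feasible only for tiny N, but it makes visible why a single run can fail (an odd order, or a trivial square root of 1) and hence why several successive runs are generally required.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r ≡ 1 (mod n): the step a quantum computer
    accelerates, found here by brute force (feasible only for tiny n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical skeleton of one run of Shor's algorithm for odd composite n.
    Returns a nontrivial factor of n, or None if this run 'fails', in which
    case the algorithm is rerun with a different a."""
    g = gcd(a, n)
    if g > 1:
        return g                   # lucky draw: a already shares a factor
    r = order(a, n)
    if r % 2:
        return None                # odd order: run fails
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                # trivial square root of 1: run fails
    return gcd(y - 1, n)           # guaranteed nontrivial factor
```

    For example, n = 15 with a = 7 gives r = 4 and the factor 3, while a = 14 gives r = 2 with a trivial root, a failed run.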

  20. SABrE User's Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, S.A.

    In a computing landscape that offers a plethora of different hardware architectures and supporting software systems, ranging from compilers to operating systems, there is an obvious and strong need for a philosophy of software development that lends itself to the design and construction of portable code systems. The current efforts to standardize software bear witness to this need. SABrE is an effort to implement a software development environment which is itself portable and promotes the design and construction of portable applications. SABrE does not include such important tools as editors and compilers; well-built tools of that kind are readily available across virtually all computer platforms. The areas that SABrE addresses are at a higher level, involving issues such as data portability, portable inter-process communication, and graphics. These blocks of functionality have particular significance to the kind of code development done at LLNL, which is partly why the general computing community has not supplied us with these tools already. This is another key feature of the software development environments which we must recognize: the general computing community cannot and should not be expected to produce all of the tools which we require.

  2. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

    Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.

  3. Proposed Directions for Research in Computer-Based Education.

    ERIC Educational Resources Information Center

    Waugh, Michael L.

    Several directions for potential research efforts in the field of computer-based education (CBE) are discussed. (For the purposes of this paper, CBE is defined as any use of computers to promote learning with no intended inference as to the specific nature or organization of the educational application under discussion.) Efforts should be directed…

  4. The 2005 Minnesota Internet Study: An Examination of Metro/Rural Differences in Digital Technology Adoption

    ERIC Educational Resources Information Center

    Center for Rural Policy and Development, 2006

    2006-01-01

    Since 2001 the Center for Rural Policy & Development (CRPD) has annually conducted surveys of rural Minnesota households to discern the level of computer ownership, Internet connectivity and broadband adoption throughout rural Minnesota. Since the beginning of this longitudinal effort, significant increases in technology adoption have been…

  5. Reprocessing Multiyear GPS Data from Continuously Operating Reference Stations on Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Yoon, S.

    2016-12-01

    To define a geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data need to be reprocessed regularly. Reprocessing the GPS data collected by up to 2,000 CORS sites over the last two decades requires substantial computational resources. At the National Geodetic Survey (NGS), one reprocessing was completed in 2011, and a second reprocessing is currently under way. For the first reprocessing effort, in-house computing resources were utilized. In the current second reprocessing effort, an outsourced cloud computing platform is being utilized. In this presentation, the outline of the data processing strategy at NGS is described, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of cloud computing. The time and cost savings realized by the cloud computing approach will also be discussed.
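    Because daily station solutions are independent of one another, this kind of reprocessing parallelizes naturally. The sketch below is hypothetical (the task list and the `solve_one` worker are placeholders, not NGS software); it simply fans (station, day) tasks out to a worker pool, the same pattern a cloud deployment scales across nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def reprocess_network(station_days, solve_one, max_workers=8):
    """Run independent (station, day) solutions in parallel.

    `station_days` is a list of (station, day) tasks; `solve_one` is the
    per-task processing function (a placeholder here; a real campaign would
    wrap the GPS processing engine). Results preserve task order.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(lambda task: solve_one(*task), station_days)
        return dict(zip(station_days, results))
```

    The same fan-out works with process pools or separate cloud instances; the design point is that per-task independence makes the cost scale down almost linearly with the number of workers rented.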

  6. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.

  7. Message from the ISCB: ISCB Ebola award for important future research on the computational biology of Ebola virus.

    PubMed

    Karp, Peter D; Berger, Bonnie; Kovats, Diane; Lengauer, Thomas; Linial, Michal; Sabeti, Pardis; Hide, Winston; Rost, Burkhard

    2015-02-15

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains and three-dimensional protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology 2016, Orlando, FL). dkovats@iscb.org or rost@in.tum.de. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, Reinhold C.

    This is the first formal progress report issued by the ORNL Life Sciences Division. It covers the period from February 1997 through December 1998, which has been critical in the formation of our new division. The legacy of 50 years of excellence in biological research at ORNL has been an important driver for everyone in the division to do their part so that this new research division can realize the potential it has to make seminal contributions to the life sciences for years to come. This reporting period is characterized by intense assessment and planning efforts. They included thorough scrutiny of our strengths and weaknesses, analyses of our situation with respect to comparative research organizations, and identification of major thrust areas leading to core research efforts that take advantage of our special facilities and expertise. Our goal is to develop significant research and development (R&D) programs in selected important areas to which we can make significant contributions by combining our distinctive expertise and resources in the biological sciences with those in the physical, engineering, and computational sciences. Significant facilities in mouse genomics, mass spectrometry, neutron science, bioanalytical technologies, and high performance computing are critical to the success of our programs. Research and development efforts in the division are organized in six sections. These cluster into two broad areas of R&D: systems biology and technology applications. The systems biology part of the division encompasses our core biological research programs. It includes the Mammalian Genetics and Development Section, the Biochemistry and Biophysics Section, and the Computational Biosciences Section. The technology applications part of the division encompasses the Assessment Technology Section, the Environmental Technology Section, and the Toxicology and Risk Analysis Section. These sections are the stewards of the division's core competencies. The common mission of the division is to advance science and technology to understand complex biological systems and their relationship with human health and the environment.

  9. Computer simulation and performance assessment of the packet-data service of the Aeronautical Mobile Satellite Service (AMSS)

    NASA Technical Reports Server (NTRS)

    Ferzali, Wassim; Zacharakis, Vassilis; Upadhyay, Triveni; Weed, Dennis; Burke, Gregory

    1995-01-01

    The ICAO Aeronautical Mobile Communications Panel (AMCP) completed the drafting of the Aeronautical Mobile Satellite Service (AMSS) Standards and Recommended Practices (SARP's) and the associated Guidance Material and submitted these documents to the ICAO Air Navigation Commission (ANC) for ratification in May 1994. This effort encompassed an extensive, multi-national SARP's validation. As part of this activity, the US Federal Aviation Administration (FAA) sponsored an effort to validate the SARP's via computer simulation. This paper provides a description of this effort. Specifically, it describes: (1) the approach selected for the creation of a high-fidelity AMSS computer model; (2) the test traffic generation scenarios; and (3) the resultant AMSS performance assessment. More recently, the AMSS computer model was also used to provide AMSS performance statistics in support of the RTCA standardization activities. This paper describes this effort as well.

  10. Using artificial intelligence to control fluid flow computations

    NASA Technical Reports Server (NTRS)

    Gelsey, Andrew

    1992-01-01

    Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.

  11. Communicating quality improvement through a hospital newsletter.

    PubMed

    Tietz, A; Tabor, R

    1995-01-01

    Healthcare organizations across the United States are embracing the tenets of continuous quality improvement. The challenge is to disseminate information about this quality activity throughout the organization. A monthly newsletter serves two vital purposes: to share the improvements and to generate more enthusiasm and participation by staff members. This article gives practical suggestions for promoting a monthly newsletter. Preparation of an informative newsletter requires a significant investment of time and effort. However, the positive results of providing facilitywide communications can make it worth the effort. The current availability of relatively inexpensive desktop publishing computer software programs has made the process much easier.

  12. Health literacy and task environment influence parents' burden for data entry on child-specific health information: randomized controlled trial.

    PubMed

    Porter, Stephen C; Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia

    2011-01-26

    Health care systems increasingly rely on patients' data entry efforts to organize and assist in care delivery through health information exchange. We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents' health literacy on the task burden. We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F(1,178) = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents' TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden.
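    The study's primary outcome, the NASA-TLX total weighted score, combines ratings on six subscales (each 0-100) with weights derived from 15 pairwise comparisons of the subscales. The function below is the standard TLX computation, included here for clarity rather than taken from the study itself.

```python
def tlx_weighted_score(ratings, weights):
    """NASA-TLX total weighted workload score.

    ratings: six subscale ratings, each 0-100 (mental demand, physical
    demand, temporal demand, performance, effort, frustration).
    weights: how many times each subscale was chosen in the 15 pairwise
    comparisons, so the weights sum to 15.
    """
    assert len(ratings) == 6 and len(weights) == 6
    assert sum(weights) == 15   # every pairwise comparison picks one winner
    return sum(r * w for r, w in zip(ratings, weights)) / 15.0
```

    A rating pattern of all 50s yields an overall score of 50 regardless of the weights, while weighting shifts the total toward the subscales a respondent judged most important.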

  13. Methods for Prediction of High-Speed Reacting Flows in Aerospace Propulsion

    NASA Technical Reports Server (NTRS)

    Drummond, J. Philip

    2014-01-01

    Research to develop high-speed airbreathing aerospace propulsion systems was underway in the late 1950s. A major part of the effort involved the supersonic combustion ramjet, or scramjet, engine. Work had also begun to develop computational techniques for solving the equations governing the flow through a scramjet engine. However, scramjet technology and the computational methods to assist in its evolution would remain apart for another decade. The principal barrier was that the computational methods needed for engine evolution lacked the computer technology required for solving the discrete equations resulting from the numerical methods. Even today, computer resources remain a major pacing item in overcoming this barrier. Significant advances have been made over the past 35 years, however, in modeling the supersonic chemically reacting flow in a scramjet combustor. To see how scramjet development and the required computational tools finally merged, we briefly trace the evolution of the technology in both areas.

  14. Trust and Reciprocity: Are Effort and Money Equivalent?

    PubMed Central

    Vilares, Iris; Dam, Gregory; Kording, Konrad

    2011-01-01

    Trust and reciprocity facilitate cooperation and are relevant to virtually all human interactions. They are typically studied using trust games: one subject gives (entrusts) money to another subject, which may return some of the proceeds (reciprocate). Currently, however, it is unclear whether trust and reciprocity in monetary transactions are similar in other settings, such as physical effort. Trust and reciprocity of physical effort are important as many everyday decisions imply an exchange of physical effort, and such exchange is central to labor relations. Here we studied a trust game based on physical effort and compared the results with those of a computationally equivalent monetary trust game. We found no significant difference between effort and money conditions in both the amount trusted and the quantity reciprocated. Moreover, there is a high positive correlation in subjects' behavior across conditions. This suggests that trust and reciprocity may be character traits: subjects that are trustful/trustworthy in monetary settings behave similarly during exchanges of physical effort. Our results validate the use of trust games to study exchanges in physical effort and to characterize inter-subject differences in trust and reciprocity, and also suggest a new behavioral paradigm to study these differences. PMID:21364931

  15. [Computational chemistry in structure-based drug design].

    PubMed

    Cao, Ran; Li, Wei; Sun, Han-Zi; Zhou, Yu; Huang, Niu

    2013-07-01

    Today, the understanding of the sequence and structure of biologically relevant targets is growing rapidly and researchers from many disciplines, physics and computational science in particular, are making significant contributions to modern biology and drug discovery. However, it remains challenging to rationally design small molecular ligands with desired biological characteristics based on the structural information of the drug targets, which demands more accurate calculation of ligand binding free-energy. With the rapid advances in computer power and extensive efforts in algorithm development, physics-based computational chemistry approaches have played more important roles in structure-based drug design. Here we reviewed the newly developed computational chemistry methods in structure-based drug design as well as the elegant applications, including binding-site druggability assessment, large scale virtual screening of chemical database, and lead compound optimization. Importantly, here we address the current bottlenecks and propose practical solutions.

  16. Aeroelasticity of wing and wing-body configurations on parallel computers

    NASA Technical Reports Server (NTRS)

    Byun, Chansup

    1995-01-01

    The objective of this research is to develop computationally efficient methods for solving aeroelasticity problems on parallel computers. Both uncoupled and coupled methods are studied in this research. For the uncoupled approach, the conventional U-g method is used to determine the flutter boundary. The generalized aerodynamic forces required are obtained by the pulse transfer-function analysis method. For the coupled approach, the fluid-structure interaction is obtained by directly coupling finite difference Euler/Navier-Stokes equations for fluids and finite element dynamics equations for structures. This capability will significantly impact many aerospace projects of national importance such as Advanced Subsonic Civil Transport (ASCT), where the structural stability margin becomes very critical at the transonic region. This research effort will have direct impact on the High Performance Computing and Communication (HPCC) Program of NASA in the area of parallel computing.

  17. Program Helps Generate Boundary-Element Mathematical Models

    NASA Technical Reports Server (NTRS)

    Goldberg, R. K.

    1995-01-01

    Composite Model Generation-Boundary Element Method (COM-GEN-BEM) computer program significantly reduces the time and effort needed to construct boundary-element mathematical models of continuous-fiber composite materials at the micro-mechanical (constituent) scale. Generates boundary-element models compatible with the BEST-CMS boundary-element code for analysis of the micromechanics of composite materials. Written in PATRAN Command Language (PCL).

  18. Recapturing Technology for Education: Keeping Tomorrow in Today's Classrooms

    ERIC Educational Resources Information Center

    Gura, Mark; Percy, Bernard

    2005-01-01

    Despite significant investment of funds, time, and effort in bringing computers, the Internet, and related technologies into classrooms, educators have turned their backs on these new power tools of the intellect. School is the last remaining institution to keep 21st-century technology at arm's length. How can technology be used to enrich and…

  19. Molecular Dynamics of Hot Dense Plasmas: New Horizons

    NASA Astrophysics Data System (ADS)

    Graziani, Frank

    2011-10-01

    We describe the status of a new time-dependent simulation capability for hot dense plasmas. The backbone of this multi-institutional computational and experimental effort--the Cimarron Project--is the massively parallel molecular dynamics (MD) code ``ddcMD''. The project's focus is material conditions such as exist in inertial confinement fusion experiments, and in many stellar interiors: high temperatures, high densities, significant electromagnetic fields, mixtures of high- and low-Z elements, and non-Maxwellian particle distributions. Of particular importance is our ability to incorporate into this classical MD code key atomic, radiative, and nuclear processes, so that their interacting effects under non-ideal plasma conditions can be investigated. This talk summarizes progress in computational methodology, discusses strengths and weaknesses of quantum statistical potentials as effective interactions for MD, explains the model used for quantum events possibly occurring in a collision, and highlights some significant results obtained to date. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  20. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor applications. 47 refs., 94 figs., 6 tabs.

  1. 42 CFR 441.182 - Maintenance of effort: Computation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES Inpatient Psychiatric Services for Individuals Under Age 21 in Psychiatric Facilities or Programs § 441.182 Maintenance of effort: Computation. (a) For expenditures for inpatient psychiatric services... total State Medicaid expenditures in the current quarter for inpatient psychiatric services and...

  2. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration.

  3. Massive parallelization of serial inference algorithms for a complex generalized linear model

    PubMed Central

    Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David

    2014-01-01

    Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units (GPUs), relatively inexpensive and highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
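
    The cyclic coordinate descent approach at the heart of this work updates one regression coefficient at a time while holding the others fixed. The sketch below is a minimal serial illustration of that idea on unpenalized least squares; it is not the authors' GPU implementation or their Bayesian model, and the toy data are invented for the example:

```python
def cyclic_coordinate_descent(X, y, n_sweeps=200):
    """Fit least squares by cycling through coordinates: each step
    updates one coefficient b[j] to its exact one-dimensional minimizer
    while holding the other coefficients fixed."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_sweeps):
        for j in range(p):
            # residual with feature j's current contribution removed
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            num = sum(X[i][j] * r[i] for i in range(n))
            den = sum(X[i][j] ** 2 for i in range(n))
            if den > 0.0:
                b[j] = num / den
    return b

# Tiny exactly-solvable system: y = 2*x0 - 1*x1 on four observations.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]]
y = [2.0, -1.0, 1.0, 3.0]
b = cyclic_coordinate_descent(X, y)
```

    In the paper's setting, the per-coordinate sums run over tens of millions of observations; those sums are the data-parallel work that the authors offload to the GPU.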

  4. Turbulence modeling of free shear layers for high performance aircraft

    NASA Technical Reports Server (NTRS)

    Sondak, Douglas

    1993-01-01

    In many flowfield computations, accuracy of the turbulence model employed is frequently a limiting factor in the overall accuracy of the computation. This is particularly true for complex flowfields such as those around full aircraft configurations. Free shear layers such as wakes, impinging jets (in V/STOL applications), and mixing layers over cavities are often part of these flowfields. Although flowfields have been computed for full aircraft, the memory and CPU requirements for these computations are often excessive. Additional computer power is required for multidisciplinary computations such as coupled fluid dynamics and conduction heat transfer analysis. Massively parallel computers show promise in alleviating this situation, and the purpose of this effort was to adapt and optimize CFD codes to these new machines. The objective of this research effort was to compute the flowfield and heat transfer for a two-dimensional jet impinging normally on a cool plate. The results of this research effort were summarized in an AIAA paper titled 'Parallel Implementation of the k-epsilon Turbulence Model'. Appendix A contains the full paper.

  5. Materials Frontiers to Empower Quantum Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Antoinette Jane; Sarrao, John Louis; Richardson, Christopher

    This is an exciting time at the nexus of quantum computing and materials research. The materials frontiers described in this report represent a significant advance in electronic materials and our understanding of the interactions between the local material and a manufactured quantum state. Simultaneously, directed efforts to solve materials issues related to quantum computing provide an opportunity to control and probe the fundamental arrangement of matter that will impact all electronic materials. An opportunity exists to extend our understanding of materials functionality from electronic-grade to quantum-grade by achieving a predictive understanding of noise and decoherence in qubits and their origins in materials defects and environmental coupling. Realizing this vision systematically and predictively will be transformative for quantum computing and will represent a qualitative step forward in materials prediction and control.

  6. Computational strategies for tire monitoring and analysis

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.; Noor, Ahmed K.; Green, James S.

    1995-01-01

    Computational strategies are presented for the modeling and analysis of tires in contact with pavement. A procedure is introduced for simple and accurate determination of tire cross-sectional geometric characteristics from a digitally scanned image. Three new strategies for reducing the computational effort in the finite element solution of tire-pavement contact are also presented. These strategies take advantage of the observation that footprint loads do not usually stimulate a significant tire response away from the pavement contact region. The finite element strategies differ in their level of approximation and required amount of computer resources. The effectiveness of the strategies is demonstrated by numerical examples of frictionless and frictional contact of the space shuttle Orbiter nose-gear tire. Both an in-house research code and a commercial finite element code are used in the numerical studies.

  7. Image-guided tissue engineering

    PubMed Central

    Ballyns, Jeffrey J; Bonassar, Lawrence J

    2009-01-01

    Replication of anatomic shape is a significant challenge in developing implants for regenerative medicine. This has led to significant interest in using medical imaging techniques such as magnetic resonance imaging and computed tomography to design tissue engineered constructs. Implementation of medical imaging and computer aided design in combination with technologies for rapid prototyping of living implants enables the generation of highly reproducible constructs with spatial resolution up to 25 μm. In this paper, we review the medical imaging modalities available and a paradigm for choosing a particular imaging technique. We also present fabrication techniques and methodologies for producing cellular engineered constructs. Finally, we comment on future challenges involved with image guided tissue engineering and efforts to generate engineered constructs ready for implantation. PMID:19583811

  8. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032

  9. ``But it doesn't come naturally'': how effort expenditure shapes the benefit of growth mindset on women's sense of intellectual belonging in computing

    NASA Astrophysics Data System (ADS)

    Stout, Jane G.; Blaney, Jennifer M.

    2017-10-01

    Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.

  10. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    PubMed

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. 
Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO model based on hierarchical error prediction, developed to explain MPFC-DLPFC interactions. We derive behavioral predictions that describe how effort and reward information is coded in PFC and how changing the configuration of such environmental information might affect decision-making and task performance involving motivation.
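
    The prediction-and-prediction-error principle underlying the PRO model can be conveyed with a plain delta-rule update, in which a prediction is nudged toward each observed outcome by a fraction of the discrepancy. The sketch below is a deliberately simplified stand-in, not the published PRO or HER model, and the reward and effort-cost values are hypothetical:

```python
def delta_rule(v, outcome, alpha=0.1):
    """Single prediction-error update: move the prediction v toward the
    observed outcome by a fraction alpha of the discrepancy."""
    return v + alpha * (outcome - v)

# Learn separate expectations for reward and effort cost over repeated
# trials (trial values reward=1.0, effort cost=0.4 are hypothetical).
v_reward, v_effort = 0.0, 0.0
for _ in range(100):
    v_reward = delta_rule(v_reward, 1.0)
    v_effort = delta_rule(v_effort, 0.4)

# A crude net-value signal of the kind an effort-based decision could use.
net_value = v_reward - v_effort
```

    In this simplified view, MPFC-like monitoring corresponds to tracking the discrepancy terms, while the learned reward and effort expectations jointly determine whether the effortful action is worth taking.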

  11. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  12. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  13. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods needed to work with them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
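
    The data parallelism described above, the same computation applied independently to every element, can be illustrated with an elementwise threshold, a common image-processing kernel. This pure-Python sketch only mimics what each GPU thread would compute in isolation; the tiny "thermogram" values are invented for the example:

```python
def threshold(frame, level):
    """Elementwise threshold: each output pixel depends only on its own
    input pixel, so every pixel could be handled by an independent GPU
    thread with no synchronization between threads."""
    return [[1 if px >= level else 0 for px in row] for row in frame]

# A hypothetical 2x2 frame of normalized intensities.
frame = [[0.1, 0.9],
         [0.5, 0.2]]
mask = threshold(frame, 0.5)
```

    Because no output element reads any other element, the loop body maps directly onto a GPU kernel launched with one thread per pixel.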

  14. Deep learning with coherent nanophotonic circuits

    NASA Astrophysics Data System (ADS)

    Shen, Yichen; Harris, Nicholas C.; Skirlo, Scott; Prabhu, Mihika; Baehr-Jones, Tom; Hochberg, Michael; Sun, Xin; Zhao, Shijie; Larochelle, Hugo; Englund, Dirk; Soljačić, Marin

    2017-07-01

    Artificial neural networks are computational network models inspired by signal processing in the brain. These models have dramatically improved performance for many machine-learning tasks, including speech and image recognition. However, today's computing hardware is inefficient at implementing neural networks, in large part because much of it was designed for von Neumann computing schemes. Significant effort has been made towards developing electronic architectures tuned to implement artificial neural networks that exhibit improved computational speed and accuracy. Here, we propose a new architecture for a fully optical neural network that, in principle, could offer an enhancement in computational speed and power efficiency over state-of-the-art electronics for conventional inference tasks. We experimentally demonstrate the essential part of the concept using a programmable nanophotonic processor featuring a cascaded array of 56 programmable Mach-Zehnder interferometers in a silicon photonic integrated circuit and show its utility for vowel recognition.

  15. An Integrated Constraint Programming Approach to Scheduling Sports Leagues with Divisional and Round-robin Tournaments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey

    Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that allows the scheduling to be performed in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that reduce the computational complexity significantly. Experimental evaluation shows that the integrated approach takes considerably less computational effort than the previous approach.

  16. Cost analysis for computer supported multiple-choice paper examinations

    PubMed Central

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

    Introduction: Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness, and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been used and continuously improved at Wuerzburg University. Eleven medical examinations were conducted with the program and automatically evaluated in the winter semester (WS) 2009/10, twelve in the summer semester (SS) 2010, and thirteen in WS 2010/11. For the last two semesters the remaining manual workload was recorded. Results: The effort for formatting and subsequent analysis, including adjustments of the analysis, of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers compared with manual correction of paper-based exams; compared to purely electronically conducted exams, it requires a much simpler technological infrastructure and fewer staff during the exam. PMID:22205913

  17. Cost analysis for computer supported multiple-choice paper examinations.

    PubMed

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

    Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness, and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been used and continuously improved at Wuerzburg University. Eleven medical examinations were conducted with the program and automatically evaluated in the winter semester (WS) 2009/10, twelve in the summer semester (SS) 2010, and thirteen in WS 2010/11. For the last two semesters the remaining manual workload was recorded. The effort for formatting and subsequent analysis, including adjustments of the analysis, of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers compared with manual correction of paper-based exams; compared to purely electronically conducted exams, it requires a much simpler technological infrastructure and fewer staff during the exam.

  18. Robotic tape library system level testing at NSA: Present and planned

    NASA Technical Reports Server (NTRS)

    Shields, Michael F.

    1994-01-01

    In an era of declining Defense budgets, increased pressure has been placed on the DOD to utilize Commercial Off the Shelf (COTS) solutions to incrementally solve a wide variety of our computer processing requirements. With the rapid growth in processing power, the significant expansion of high-performance networking, and the increased complexity of application data sets, the requirement for high-performance, large-capacity, reliable, secure, and most of all affordable robotic tape storage libraries has greatly increased. Additionally, the migration to a heterogeneous, distributed computing environment has further complicated the problem. With today's open-system compute servers approaching yesterday's supercomputer capabilities, the need for affordable, reliable, secure Mass Storage Systems (MSS) has taken on ever increasing importance to our processing centers' ability to satisfy operational mission requirements. To that end, NSA has established an in-house capability to acquire, test, and evaluate COTS products. Its goal is to qualify a set of COTS MSS libraries, thereby achieving a modicum of standardization for robotic tape libraries that can satisfy our low-, medium-, and high-performance file and volume serving requirements. In addition, NSA has established relations with other Government Agencies to complement this in-house effort and to maximize our research, testing, and evaluation work. While the preponderance of the effort is focused at the high end of the storage ladder, considerable effort will be expended this year and next on server-class or mid-range storage systems.

  19. New Mexico district work-effort analysis computer program

    USGS Publications Warehouse

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. 
The program has been run only on a Control Data Corporation 6600 computer system. Central processing computer time has seldom exceeded 5 minutes on the longest year-to-date runs.
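
    The overhead apportionment the report describes, charging overhead hours to each project in proportion to its share of total direct work-effort hours, can be sketched in a few lines. The project names and hour figures below are hypothetical, and this is an illustrative reconstruction rather than the CAN 2 program's FORTRAN logic:

```python
def prorate_overhead(project_hours, overhead_hours):
    """Apportion overhead to each project in proportion to its share of
    the total direct hours, then return total charged hours per project."""
    total = sum(project_hours.values())
    return {name: hours + overhead_hours * hours / total
            for name, hours in project_hours.items()}

# Hypothetical direct hours for two projects plus 10 hours of overhead.
charged = prorate_overhead({"proj_a": 60.0, "proj_b": 40.0},
                           overhead_hours=10.0)
```

    With a 60/40 split of 100 direct hours, the 10 overhead hours divide 6 to 4, so the charged totals (66 and 44) still sum to the district's 110 recorded hours.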

  20. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review

    PubMed Central

    Ngoepe, Malebogo N.; Frangi, Alejandro F.; Byrne, James V.; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. 
Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities. PMID:29670533

  1. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review.

    PubMed

    Ngoepe, Malebogo N; Frangi, Alejandro F; Byrne, James V; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms, and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated, and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge in developing computational models of thrombosis in disease cases is that, while a wide body of literature addresses various aspects of the clotting process, it may not be obvious which information is of direct consequence for a given modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present this information, in a simplified yet appropriate manner, for the benefit of the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes. The paper begins by explaining the current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with a description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting, and is followed by a review of computational models of clotting in cerebral aneurysms specifically. Even though some progress has been made towards computational prediction of thrombosis following device placement in cerebral aneurysms, many gaps remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities.
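Computational clotting models of the kind reviewed here typically reduce the coagulation cascade to coupled reaction equations integrated in time. As a hedged illustration only (a two-species toy system invented for this sketch, not a model from the review), the snippet below integrates autocatalytic thrombin generation with first-order inhibition by forward Euler:

```python
# Two-species toy of clot kinetics (invented for illustration, not a
# model from this review): thrombin T is produced autocatalytically
# from prothrombin P and cleared by inhibitors at rate k_inh.

def simulate_thrombin(P0=1.0, T0=0.01, k_cat=5.0, k_inh=1.0,
                      dt=0.001, t_end=10.0):
    """Forward-Euler integration of dP/dt = -k_cat*P*T,
    dT/dt = k_cat*P*T - k_inh*T; returns [(t, T), ...]."""
    P, T = P0, T0
    history = [(0.0, T)]
    for i in range(1, int(t_end / dt) + 1):
        prod = k_cat * P * T              # autocatalytic activation
        P += -prod * dt
        T += (prod - k_inh * T) * dt      # generation minus inhibition
        history.append((i * dt, T))
    return history

history = simulate_thrombin()
peak_t, peak_T = max(history, key=lambda x: x[1])   # thrombin burst, then decay
```

Real models track dozens of factors and couple them to flow and device geometry; the sketch only shows the numerical pattern (rate laws plus explicit time stepping) shared by the physiological models discussed later.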

  2. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale-multiresolution modeling, are discussed.

  4. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  5. Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community, as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations, algorithmic requirements, and the capability of high-performance computers to satisfy the anticipated need.

  6. SU-F-J-219: Predicting Ventilation Change Due to Radiation Therapy: Dependency On Pre-RT Ventilation and Effort Correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, T; Du, K; Bayouth, J

    Purpose: Ventilation change caused by radiation therapy (RT) can be predicted using four-dimensional computed tomography (4DCT) and image registration. This study tested the dependency of predicted post-RT ventilation on effort correction and pre-RT lung function. Methods: Pre-RT and 3 month post-RT 4DCT images were obtained for 13 patients. The 4DCT images were used to create ventilation maps using a deformable image registration based Jacobian expansion calculation. The post-RT ventilation maps were predicted in four different ways using the dose delivered, pre-RT ventilation, and effort correction. The pre-RT ventilation and effort correction were toggled to determine dependency. The four different predicted ventilation maps were compared to the post-RT ventilation map calculated from image registration to establish the best prediction method. Gamma pass rates were used to compare the different maps with the criteria of 2mm distance-to-agreement and 6% ventilation difference. Paired t-tests of gamma pass rates were used to determine significant differences between the maps. Additional gamma pass rates were calculated using only voxels receiving over 20 Gy. Results: The predicted post-RT ventilation maps were in agreement with the actual post-RT maps in the following percentage of voxels averaged over all subjects: 71% with pre-RT ventilation and effort correction, 69% with no pre-RT ventilation and effort correction, 60% with pre-RT ventilation and no effort correction, and 58% with no pre-RT ventilation and no effort correction. When analyzing only voxels receiving over 20 Gy, the gamma pass rates were respectively 74%, 69%, 65%, and 55%. The prediction including both pre-RT ventilation and effort correction was the only prediction with significant improvement over using no prediction (p<0.02). Conclusion: Post-RT ventilation is best predicted using both pre-RT ventilation and effort correction; this was the only prediction that provided a significant improvement in agreement. Research support from NIH grants CA166119 and CA166703, a gift from Roger Koch, and a Pilot Grant from University of Iowa Carver College of Medicine.
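The 2 mm / 6% criterion above is a standard gamma analysis. A minimal 1-D sketch of the idea (my own brute-force simplification, not the study's implementation; ventilation values are assumed to be fractions, so differences are taken in percentage points):

```python
import math

def gamma_index(ref, evalv, spacing_mm, dta_mm=2.0, diff_pct=6.0):
    """Simplified 1-D gamma: for each reference point, find the minimum
    combined distance/difference metric over all evaluated points."""
    gammas = []
    for i, r in enumerate(ref):
        best = math.inf
        for j, e in enumerate(evalv):
            dist = abs(i - j) * spacing_mm        # distance-to-agreement term
            diff = (e - r) * 100.0                # difference in percentage points
            g = math.sqrt((dist / dta_mm) ** 2 + (diff / diff_pct) ** 2)
            best = min(best, g)
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Percentage of reference points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

rate = pass_rate(gamma_index([0.10, 0.12, 0.15], [0.11, 0.18, 0.15], 1.0))
```

`pass_rate` returns the quantity compared across the four prediction methods; production gamma tools work on 3-D grids with interpolation and restricted search radii.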

  7. Climate Science Performance, Data and Productivity on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, Benjamin W; Worley, Patrick H; Gaddis, Abigail L

    2015-01-01

    Climate Science models are flagship codes for the largest high performance computing (HPC) resources, both in visibility, with the newly launched Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) effort, and in terms of significant fractions of system usage. The performance of the DOE ACME model is captured with application level timers and examined through a sizeable run archive. Performance and variability of compute time, queue time and ancillary services are examined. As Climate Science advances in the use of HPC resources, there has been an increase in the human and data systems required to achieve program goals. A description of current workflow processes (hardware, software, human) and planned automation of the workflow, along with historical and projected usage of data in motion and data at rest, are detailed. The combination of these two topics motivates a description of future system requirements for DOE Climate Modeling efforts, focusing on the growth in data storage and in the network and disk bandwidth required to handle data at an acceptable rate.

  8. A compendium of computational fluid dynamics at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Through numerous summary examples, the scope and general nature of the computational fluid dynamics (CFD) effort at Langley is identified. These summaries will help inform researchers in CFD and line management at Langley of the overall effort. In addition to the in-house efforts, out-of-house CFD work supported by Langley through industrial contracts and university grants is included. Researchers were encouraged to include summaries of work in preliminary and tentative states of development as well as current research approaching definitive results.

  9. Power and Energy Considerations at Forward Operating Bases (FOBs)

    DTIC Science & Technology

    2010-06-16

    Excerpts: anticipated additional plug loads by users include personal computers and gaming devices, coffee pots, refrigerators, lights, and personal heaters. An effort was made to account for the significant amount of equipment that consumes power but is not on the unit's MTOE (printers, plotters, coffee pots, etc.). Capability/impact: a compact, lightweight system serving 50 warfighters, including billeting, kitchen, laundry, shower, latrines, and a new wastewater treatment system.

  10. 3D automatic Cartesian grid generation for Euler flows

    NASA Technical Reports Server (NTRS)

    Melton, John E.; Enomoto, Francis Y.; Berger, Marsha J.

    1993-01-01

    We describe a Cartesian grid strategy for the study of three dimensional inviscid flows about arbitrary geometries that uses both conventional and CAD/CAM surface geometry databases. Initial applications of the technique are presented. The elimination of the body-fitted constraint allows the grid generation process to be automated, significantly reducing the time and effort required to develop suitable computational grids for inviscid flowfield simulations.
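The appeal of Cartesian grids is that refinement can be driven purely by geometry queries, with no body-fitted mapping. A 2-D quadtree analogue (an illustrative sketch, not the paper's algorithm) that refines cells straddling a unit-circle "body" down to a size cutoff:

```python
# 2-D quadtree analogue of automatic Cartesian refinement: split any cell
# that straddles the body (a unit circle here) until a size cutoff.
# Illustrative sketch only, not the algorithm of the paper.

def straddles_circle(x0, y0, size, r=1.0):
    """True if the circle of radius r about the origin crosses the cell."""
    nx = max(x0, min(0.0, x0 + size))    # cell point nearest the origin
    ny = max(y0, min(0.0, y0 + size))
    fx = max(abs(x0), abs(x0 + size))    # corner farthest from the origin
    fy = max(abs(y0), abs(y0 + size))
    near = (nx * nx + ny * ny) ** 0.5
    far = (fx * fx + fy * fy) ** 0.5
    return near < r < far

def refine(x0, y0, size, min_size, cells):
    """Recursively subdivide; leaves are appended to `cells` as (x, y, size)."""
    if size <= min_size or not straddles_circle(x0, y0, size):
        cells.append((x0, y0, size))
        return
    half = size / 2.0
    for dx in (0.0, half):
        for dy in (0.0, half):
            refine(x0 + dx, y0 + dy, half, min_size, cells)

cells = []
refine(-2.0, -2.0, 4.0, 0.25, cells)   # cover [-2, 2]^2, refine to 0.25
```

The 3-D version replaces the circle test with surface-intersection queries against the CAD/CAM database, which is what makes the process automatable.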

  11. Motivational Beliefs, Student Effort, and Feedback Behaviour in Computer-Based Formative Assessment

    ERIC Educational Resources Information Center

    Timmers, Caroline F.; Braber-van den Broek, Jannie; van den Berg, Stephanie M.

    2013-01-01

    Feedback can only be effective when students seek feedback and process it. This study examines the relations between students' motivational beliefs, effort invested in a computer-based formative assessment, and feedback behaviour. Feedback behaviour is represented by whether a student seeks feedback and the time a student spends studying the…

  12. Establishing a K-12 Circuit Design Program

    ERIC Educational Resources Information Center

    Inceoglu, Mustafa M.

    2010-01-01

    Outreach, as defined by Wikipedia, is an effort by an organization or group to connect its ideas or practices to the efforts of other organizations, groups, specific audiences, or the general public. This paper describes a computer engineering outreach project of the Department of Computer Engineering at Ege University, Izmir, Turkey, to a local…

  13. Identifying Predictors of Achievement in the Newly Defined Information Literacy: A Neural Network Analysis

    ERIC Educational Resources Information Center

    Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.

    2009-01-01

    Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…

  14. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1974-01-01

    A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.

  15. An opportunity cost model of subjective effort and task performance

    PubMed Central

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
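The authors' argument is verbal, but its core trade-off is easy to formalize. A hedged toy version (my formalization, not the authors' model): benefit from the focal task has diminishing returns in effort, while occupying executive resources forgoes the next-best alternative's value linearly, so chosen effort falls as the alternative's value rises:

```python
# Toy formalization of the opportunity-cost account (mine, not the
# authors'): sqrt benefit vs. linear opportunity cost of occupying
# executive resources.

def optimal_effort(task_value, alt_value):
    """Grid-search the effort level in [0, 1] maximizing net utility."""
    efforts = [i / 100.0 for i in range(101)]
    def net(e):
        return task_value * e ** 0.5 - alt_value * e   # benefit - opportunity cost
    return max(efforts, key=net)

low_cost = optimal_effort(task_value=1.0, alt_value=0.5)   # weak alternative
high_cost = optimal_effort(task_value=1.0, alt_value=2.0)  # strong alternative
```

On this account, the felt intensity of effort tracks the `alt_value * e` term: the same task feels more effortful when attractive alternative uses of the same mechanisms exist.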

  16. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.

    PubMed

    Vassena, Eliana; Holroyd, Clay B; Alexander, William H

    2017-01-01

    In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
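Several of the early seminal models referenced here build on prediction-error learning. A minimal hedged sketch of that shared ingredient (a plain Rescorla-Wagner update, illustrative only, not any specific published ACC model):

```python
# Minimal prediction-error learner underlying several early ACC models
# (a plain Rescorla-Wagner update; illustrative, not any specific
# published ACC model).

def train(outcomes, alpha=0.2):
    """Track the expected outcome V and the surprise |delta| per trial."""
    V, surprises = 0.0, []
    for r in outcomes:
        delta = r - V              # prediction error
        surprises.append(abs(delta))
        V += alpha * delta         # move the expectation toward the outcome
    return V, surprises

V, surprises = train([1.0] * 20)   # repeated identical outcomes
```

The declining surprise signal is the kind of quantity these models relate to ACC responses; later effort-based accounts add cost terms to the same learning machinery.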

  17. A CFD Database for Airfoils and Wings at Post-Stall Angles of Attack

    NASA Technical Reports Server (NTRS)

    Petrilli, Justin; Paul, Ryan; Gopalarathnam, Ashok; Frink, Neal T.

    2013-01-01

    This paper presents selected results from an ongoing effort to develop an aerodynamic database from Reynolds-Averaged Navier-Stokes (RANS) computational analysis of airfoils and wings at stall and post-stall angles of attack. The data obtained from this effort will be used for validation and refinement of a low-order post-stall prediction method developed at NCSU, and to fill existing gaps in high angle of attack data in the literature. Such data could have potential applications in post-stall flight dynamics, helicopter aerodynamics and wind turbine aerodynamics. An overview of the NASA TetrUSS CFD package used for the RANS computational approach is presented. Detailed results for three airfoils are presented to compare their stall and post-stall behavior. The results for finite wings at stall and post-stall conditions focus on the effects of taper ratio and sweep angle, with particular attention to whether the sectional flows can be approximated using two-dimensional flow over a stalled airfoil. While this approximation seems reasonable for unswept wings even at post-stall conditions, significant spanwise flow on stalled swept wings precludes the use of two-dimensional data to model sectional flows on swept wings. Thus, further effort is needed in low-order aerodynamic modeling of swept wings at stalled conditions.

  18. Health Literacy and Task Environment Influence Parents' Burden for Data Entry on Child-Specific Health Information: Randomized Controlled Trial

    PubMed Central

    Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia

    2011-01-01

    Background Health care systems increasingly rely on patients’ data entry efforts to organize and assist in care delivery through health information exchange. Objectives We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents’ health literacy on the task burden. Methods We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. Results We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F1,178 = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents’ TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. Conclusions A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden. 
Trial registration Clinicaltrials.gov NCT00543257; http://www.clinicaltrials.gov/ct2/show/NCT00543257 (Archived by WebCite at http://www.webcitation.org/5vUVH2DYR) PMID:21269990

  19. Computers in medical education 2. Use of a computer package to supplement the clinical experience in a surgical clerkship: an objective evaluation.

    PubMed

    Devitt, P; Cehic, D; Palmer, E

    1998-06-01

    Student teaching of surgery has been devolved from the university in an effort to increase and broaden undergraduate clinical experience. In order to ensure uniformity of learning, we have defined learning objectives and provided a computer-based package to supplement clinical teaching. A study was undertaken to evaluate the place of computer-based learning in a clinical environment. Twelve modules were provided for study during a 6-week attachment. These covered clinical problems related to cardiology, neurosurgery and gastrointestinal haemorrhage. Eighty-four fourth-year students undertook a pre- and post-test assessment on these three topics as well as acute abdominal pain. No extra learning material on the latter topic was provided during the attachment. While all students showed significant improvement in performance in the post-test assessment, those who had access to the computer material performed significantly better than did the controls. Within the topics, students in both groups performed equally well on the post-test assessment of acute abdominal pain, but the control group's performance was significantly lacking on the topic of gastrointestinal haemorrhage, suggesting that the bulk of learning on this subject came from the computer material and little from the clinical attachment. This type of learning resource can be used to supplement students' clinical experience and, at the same time, to monitor what they learn during clinical clerkships and identify areas of weakness.

  20. An Investigation of the Flow Physics of Acoustic Liners by Direct Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Watson, Willie R. (Technical Monitor); Tam, Christopher

    2004-01-01

    This report describes the effort and status of work done on three-dimensional (3-D) simulation of a multi-hole resonator in an impedance tube. This work is coordinated with a parallel experimental effort to be carried out at the NASA Langley Research Center. The outline of this report is as follows: 1. Preliminary consideration. 2. Computation model. 3. Mesh design and parallel computing. 4. Visualization. 5. Status of computer code development.

  1. Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community, as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations, algorithmic requirements, and the capability of high-performance computers to satisfy the anticipated need.

  2. Use of declarative statements in creating and maintaining computer-interpretable knowledge bases for guideline-based care.

    PubMed

    Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A

    2006-01-01

    Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for sharing of such knowledge bases.
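The declarative pattern described here, guideline knowledge expressed as condition/intervention statements evaluated against patient data rather than as procedural code, can be sketched in a few lines. The rules below are invented for illustration and are not from EON/ATHENA or SAGE:

```python
# Sketch of the declarative pattern: guideline knowledge as
# condition -> intervention statements evaluated against patient data.
# Rule contents are invented for illustration, not from EON/ATHENA or SAGE.

RULES = [
    {"if": lambda p: p["systolic_bp"] >= 140,
     "then": "recommend antihypertensive"},
    {"if": lambda p: p["on_ace_inhibitor"] and p["potassium"] > 5.5,
     "then": "flag hyperkalemia risk"},
]

def evaluate(patient, rules=RULES):
    """Return the interventions whose declarative conditions hold."""
    return [r["then"] for r in rules if r["if"](patient)]

advice = evaluate({"systolic_bp": 152, "on_ace_inhibitor": True, "potassium": 4.2})
```

Because the rules are data rather than code, the same statements can drive decision-making in one context and targeted commentary in another, which is the maintenance advantage the abstract claims.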

  3. Research progress on quantum informatics and quantum computation

    NASA Astrophysics Data System (ADS)

    Zhao, Yusheng

    2018-03-01

    Quantum informatics is an emerging interdisciplinary subject that developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology. At present, applying quantum information technology has become a major direction of research effort. The preparation, storage, purification, regulation, transmission, and coding and decoding of quantum states have become a hotspot for scientists and engineers, with a profound impact on the national economy, people's livelihood, and defense technology. This paper first summarizes the background of quantum information science and quantum computing and the current state of domestic and foreign research, and then introduces the basic knowledge and concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor's algorithm, and quantum phase estimation.
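Of the algorithms listed, Deutsch-Jozsa is the easiest to verify numerically. A small state-vector sketch (plain Python, no quantum libraries; an illustration of the interference argument, not a full simulator): after Hadamards, a phase oracle, and Hadamards again, a constant f leaves all amplitude on |0...0> while a balanced f leaves none:

```python
# State-vector sketch of Deutsch-Jozsa (plain Python, illustrative):
# H^n |0..0>, a phase oracle (-1)^f(x), then H^n again. The amplitude
# left on |0..0> is 1 for a constant f and 0 for a balanced f.

def dj_zero_amplitude(f, n):
    """Amplitude of |0...0> after H^n, oracle, H^n."""
    N = 2 ** n
    amps = [(-1) ** f(x) / N for x in range(N)]   # oracle phases on the uniform state
    return sum(amps)   # final H^n projects |0...0> onto the uniform sum

a_const = dj_zero_amplitude(lambda x: 0, 3)                    # constant f
a_bal = dj_zero_amplitude(lambda x: bin(x).count("1") % 2, 3)  # parity is balanced
```

One measurement therefore distinguishes constant from balanced, whereas a classical algorithm needs up to 2^(n-1)+1 queries in the worst case.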

  4. Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.

    Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. Its study has been extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make the right decisions. There has been significant research growth in the computational aspects of decision-making systems, yet comparatively little effort has gone into identifying and articulating user needs and requirements in assessing system outputs and the extent to which human judgments could be utilized for making accurate and reliable decisions. Our research focus is decision-making through human-centered and computational intelligence methods in a collaborative environment, and the objectives of this position paper are to bring our research ideas to the workshop and to share and discuss them.

  5. Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus

    NASA Technical Reports Server (NTRS)

    Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle

    1999-01-01

    This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.

  6. Addendum report to atmospheric science facility pallet-only mode space transportation system payload feasibility study, volume 3, revision A

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The feasibility of accomplishing selected atmospheric science mission using a pallet-only mode was studied. Certain unresolved issues were identified. The first issue was that of assuring that the on-board computer facility was adequate to process scientific data, control subsystems such as instrument pointing, provide mission operational program capability, and accomplish display and control. The second issue evolved from an investigation of the availability of existing substitute instruments that could be used instead of the prime instrumentation where the development tests and schedules are incompatible with the realistic budgets and shuttle vehicle schedules. Some effort was expended on identifying candidate substitute instruments, and the performance, cost, and development schedule trade-offs found during that effort were significant enough to warrant a follow-on investigation. This addendum documents the results of that follow-on effort, as it applies to the Atmospheric Sciences Facility.

  7. Exploring Midwives' Need and Intention to Adopt Electronic Integrated Antenatal Care.

    PubMed

    Markam, Hosizah; Hochheiser, Harry; Kuntoro, Kuntoro; Notobroto, Hari Basuki

    2018-01-01

    Documentation requirements for the Indonesian integrated antenatal care (ANC) program suggest the need for electronic systems to address gaps in existing paper documentation practices. Our goals were to quantify midwives' documentation completeness in a primary healthcare center, understand documentation challenges, develop a tool, and assess intention to use the tool. We analyzed existing ANC records in a primary healthcare center in Bangkalan, East Java, and conducted interviews with stakeholders to understand needs for an electronic system in support of ANC. Development of the web-based Electronic Integrated ANC (e-iANC) system used the System Development Life Cycle method. Training on the use of the system was held in the computer laboratory for 100 midwives chosen from four primary healthcare centers in each of five regions. The Unified Theory of Acceptance and Use of Technology (UTAUT) questionnaire was used to assess their intention to adopt e-iANC. The midwives' intention to adopt e-iANC was significantly influenced by performance expectancy, effort expectancy and facilitating conditions. Age, education level, and computer literacy did not significantly moderate the effects of performance expectancy and effort expectancy on adoption intention. The UTAUT results indicated that the factors that might influence intention to adopt e-iANC are potentially addressable. Results suggest that e-iANC might well be accepted by midwives.

  8. Progress on the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha

    2015-12-01

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  9. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality against the effort expended; as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality; techniques for using AC in different processing units (e.g., CPU, GPU, and FPGA), processor components, and memory technologies; and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide researchers with insights into the working of AC techniques and to inspire further efforts in this area to make AC the mainstream computing approach in future systems.
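    As an illustration of the kind of technique the survey covers, the sketch below shows "loop perforation," a common software-level approximate-computing transformation, together with a simple output-quality monitor. The data and the 5% quality threshold are illustrative assumptions, not taken from the paper.

```python
# Loop perforation: skip a fraction of loop iterations to trade output
# quality for reduced computational effort, with a quality monitor that
# checks the relative error of the approximate result.

def exact_mean(xs):
    return sum(xs) / len(xs)

def perforated_mean(xs, skip=4):
    # Execute only every `skip`-th iteration; effort drops roughly 1/skip.
    sampled = xs[::skip]
    return sum(sampled) / len(sampled)

def relative_error(approx, exact):
    return abs(approx - exact) / abs(exact)

if __name__ == "__main__":
    data = [float(i % 97) for i in range(100_000)]
    exact = exact_mean(data)
    approx = perforated_mean(data, skip=4)
    # Quality monitor: accept the approximation only if the error is small.
    assert relative_error(approx, exact) < 0.05
```

In a real AC framework the monitor would run on a calibration set and choose the largest `skip` that still meets the quality bound.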

  10. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1997-01-01

    NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that helps engineers, scientists, and CFD practitioners analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface (CAGI) system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  11. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1996-01-01

    NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that helps engineers, scientists, and CFD practitioners analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface system (CAGI). The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  12. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    PubMed

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse the data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%), and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), a recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research, and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with modern theories of adult learning.
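    For readers unfamiliar with the statistics reported above, the sketch below shows how an unadjusted (uni-variable) odds ratio and its Woolf-method confidence interval are computed from a 2x2 table. The counts are hypothetical, not the study's data.

```python
# Unadjusted odds ratio from a 2x2 contingency table, which equals the
# uni-variable logistic-regression odds ratio reported in studies like this.
import math

def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d) = (a*d) / (b*c) for counts a,b (exposed yes/no)
    and c,d (unexposed yes/no)."""
    return (a * d) / (b * c)

def log_or_se(a, b, c, d):
    """Standard error of ln(OR) via the Woolf (delta-method) formula."""
    return math.sqrt(1/a + 1/b + 1/c + 1/d)

if __name__ == "__main__":
    # Hypothetical counts: computer owners vs non-owners who have / have not
    # done computer-based data analysis.
    a, b, c, d = 310, 142, 72, 76
    or_ = odds_ratio(a, b, c, d)
    se = log_or_se(a, b, c, d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

The adjusted ORs in the abstract would instead come from a multi-level logistic model fitted to the full data.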

  13. Quantum chemical approaches to [NiFe] hydrogenase.

    PubMed

    Vaissier, Valerie; Van Voorhis, Troy

    2017-05-09

    The mechanism by which [NiFe] hydrogenase catalyses the oxidation of molecular hydrogen is a significant yet challenging topic in bioinorganic chemistry. With far-reaching applications in renewable energy and carbon mitigation, significant effort has been invested in the study of these complexes. In particular, computational approaches offer a unique perspective on how this enzyme functions at an electronic and atomistic level. In this article, we discuss state-of-the art quantum chemical methods and how they have helped deepen our comprehension of [NiFe] hydrogenase. We outline the key strategies that can be used to compute the (i) geometry, (ii) electronic structure, (iii) thermodynamics and (iv) kinetic properties associated with the enzymatic activity of [NiFe] hydrogenase and other bioinorganic complexes. © 2017 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  14. Empirical Relationships Between Optical Properties and Equivalent Diameters of Fractal Soot Aggregates at 550 Nm Wavelength.

    NASA Technical Reports Server (NTRS)

    Pandey, Apoorva; Chakrabarty, Rajan K.; Liu, Li; Mishchenko, Michael I.

    2015-01-01

    Soot aggregates (SAs), fractal clusters of small, spherical carbonaceous monomers, modulate the incoming visible solar radiation and contribute significantly to climate forcing. Experimentalists and climate modelers typically assume a spherical morphology for SAs when computing their optical properties, causing significant errors. Here, we calculate the optical properties of freshly generated (fractal dimension Df = 1.8) and aged (Df = 2.6) SAs at 550 nm wavelength using the numerically exact superposition T-matrix method. These properties were expressed as functions of equivalent aerosol diameters as measured by contemporary aerosol instruments. This work improves upon previous efforts wherein SA optical properties were computed as a function of monomer number, rendering them unusable in practical applications. Future research will address the sensitivity of the reported empirical relationships to variation in the refractive index, fractal prefactor, and monomer overlap of SAs.
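    The fractal description above rests on the standard scaling law N = k0 * (Rg/a)**Df relating monomer number N to radius of gyration Rg, monomer radius a, fractal prefactor k0, and fractal dimension Df. A minimal sketch, with illustrative parameter values (the prefactor and monomer radius are assumptions, not the paper's):

```python
# Fractal scaling law for soot aggregates and its inverse.

def monomer_number(rg_nm, a_nm, df, k0=1.3):
    """N = k0 * (Rg / a)**Df"""
    return k0 * (rg_nm / a_nm) ** df

def gyration_radius(n, a_nm, df, k0=1.3):
    """Inverse relation: Rg = a * (N / k0)**(1/Df)"""
    return a_nm * (n / k0) ** (1.0 / df)

if __name__ == "__main__":
    # Freshly generated (Df = 1.8) vs aged, compacted (Df = 2.6) aggregate:
    # at fixed monomer count, the aged aggregate is geometrically more compact.
    n = 100
    print(gyration_radius(n, a_nm=15.0, df=1.8))   # larger Rg
    print(gyration_radius(n, a_nm=15.0, df=2.6))   # smaller Rg
```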

  15. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary compute maintenance of effort in...

  16. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary compute maintenance of effort in...

  17. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary compute maintenance of effort in...

  18. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary compute maintenance of effort in...

  19. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary compute maintenance of effort in...

  20. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    NASA Astrophysics Data System (ADS)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  1. Micro-video display with ocular tracking and interactive voice control

    NASA Technical Reports Server (NTRS)

    Miller, James E.

    1993-01-01

    In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated into a desktop computer and with a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.

  2. Measuring and Modeling Change in Examinee Effort on Low-Stakes Tests across Testing Occasions

    ERIC Educational Resources Information Center

    Sessoms, John; Finney, Sara J.

    2015-01-01

    Because schools worldwide use low-stakes tests to make important decisions, value-added indices computed from test scores must accurately reflect student learning, which requires equal test-taking effort across testing occasions. Evaluating change in effort assumes effort is measured equivalently across occasions. We evaluated the longitudinal…

  3. Accounting for sub-resolution pores in models of water and solute transport in soils based on computed tomography images: Are we there yet?

    NASA Astrophysics Data System (ADS)

    Baveye, Philippe C.; Pot, Valérie; Garnier, Patricia

    2017-12-01

    In the last decade, X-ray computed tomography (CT) has become widely used to characterize the geometry and topology of the pore space of soils and natural porous media. Regardless of the resolution of CT images, a fundamental problem associated with their use, for example as a starting point in simulation efforts, is that sub-resolution pores are not detected. Over the last few years, a particular type of modeling method, known as "Grey" or "Partial Bounce Back" Lattice-Boltzmann (LB), has been adopted by increasing numbers of researchers to try to account for sub-resolution pores in the modeling of water and solute transport in natural porous media. In this short paper, we assess the extent to which Grey LB methods indeed offer a workable solution to the problem at hand. We conclude that, in spite of significant computational advances, a major experimental hurdle related to evaluating the penetrability of sub-resolution pores is blocking the way ahead. This hurdle will need to be cleared before Grey LB can become a credible option in the microscale modeling of soils and sediments. A necessarily interdisciplinary effort, involving both modelers and experimentalists, is needed to clear the path forward.
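    The core of a "Partial Bounce Back" scheme can be sketched in a few lines. The toy rule below blends a streamed population with its reflected counterpart at a grey node; it illustrates the general idea only, not any specific published Grey LB variant.

```python
# Partial bounce-back at a grey lattice node: a fraction `theta` of each
# population is reflected back along its incoming direction, mimicking
# sub-resolution solid matter.
# theta = 0 -> fully open pore; theta = 1 -> standard solid bounce-back.

def partial_bounce_back(f_pos, f_neg, theta):
    """Blend streamed and bounced populations for an opposite-direction pair."""
    out_pos = (1.0 - theta) * f_pos + theta * f_neg
    out_neg = (1.0 - theta) * f_neg + theta * f_pos
    return out_pos, out_neg

if __name__ == "__main__":
    f_pos, f_neg = 0.7, 0.3
    for theta in (0.0, 0.5, 1.0):
        op, on = partial_bounce_back(f_pos, f_neg, theta)
        # Mass is conserved for every theta; momentum is damped as theta grows.
        assert abs((op + on) - (f_pos + f_neg)) < 1e-12
        print(theta, op, on)
```

The experimental hurdle the paper points to is precisely how to calibrate `theta` against the actual penetrability of sub-resolution pores.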

  4. Uncertainty in sample estimates and the implicit loss function for soil information.

    NASA Astrophysics Data System (ADS)

    Lark, Murray

    2015-04-01

    One significant challenge in the communication of uncertain information is how to enable the sponsors of sampling exercises to make a rational choice of sample size. One way to do this is to compute the value of additional information given the loss function for errors. The loss function expresses the costs that result from decisions made using erroneous information. In certain circumstances, such as remediation of contaminated land prior to development, loss functions can be computed and used to guide rational decision making on the amount of resource to spend on sampling to collect soil information. In many circumstances the loss function cannot be obtained prior to decision making. This may be the case when multiple decisions may be based on the soil information and the costs of errors are hard to predict. The implicit loss function is proposed as a tool to aid decision making in these circumstances. Conditional on a logistical model which expresses costs of soil sampling as a function of effort, and statistical information from which the error of estimates can be modelled as a function of effort, the implicit loss function is the loss function which makes a particular decision on effort rational. In this presentation the loss function is defined and computed for a number of arbitrary decisions on sampling effort for a hypothetical soil monitoring problem. This is based on a logistical model of sampling cost parameterized from a recent geochemical survey of soil in Donegal, Ireland and on statistical parameters estimated with the aid of a process model for change in soil organic carbon. It is shown how the implicit loss function might provide a basis for reflection on a particular choice of sample size by comparing it with the values attributed to soil properties and functions. Scope for further research to develop and apply the implicit loss function to help decision making by policy makers and regulators is then discussed.
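    A minimal numerical sketch of the idea, under assumed (hypothetical) cost and error models: linear sampling cost, standard error falling as 1/sqrt(n), and loss proportional to the standard error. Inverting the first-order optimality condition recovers the loss coefficient implied by a chosen sample size:

```python
# Implicit loss function sketch: assume sampling cost C(n) = c0 + c1*n and
# loss L(n) = lam * sigma / sqrt(n). A rational sample size n* minimizes
# C(n) + L(n); solving d/dn = 0 for lam gives the loss coefficient implied
# by the n* actually chosen. All parameter values are hypothetical.
import math

def implied_loss_coefficient(n_star, c1, sigma):
    # d/dn [c1*n + lam*sigma*n**-0.5] = 0  =>  lam = 2*c1*n*^(3/2) / sigma
    return 2.0 * c1 * n_star ** 1.5 / sigma

def total_cost(n, c0, c1, sigma, lam):
    return c0 + c1 * n + lam * sigma / math.sqrt(n)

if __name__ == "__main__":
    c0, c1, sigma = 500.0, 20.0, 8.0     # hypothetical survey logistics
    n_star = 100                          # the sample size actually chosen
    lam = implied_loss_coefficient(n_star, c1, sigma)
    # n* should minimize total cost under the implied loss coefficient.
    costs = {n: total_cost(n, c0, c1, sigma, lam) for n in (50, 100, 200)}
    assert min(costs, key=costs.get) == n_star
```

Comparing the implied coefficient with the value actually attributed to the soil property then shows whether the chosen effort was too large or too small.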

  5. The Continual Intercomparison of Radiation Codes: Results from Phase I

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri

    2011-01-01

    The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order not to impose undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not themselves validated for performance. This manuscript summarizes the main results of the first phase of an effort called "Continual Intercomparison of Radiation Codes" (CIRC), in which the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (i.e., http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society, only a brief overview of CIRC was provided with some sample results. In this paper, the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while the performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality, and will guide the development of future phases of CIRC.

  6. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior in order to accelerate process development and certification, to more efficiently integrate new materials into existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating the discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements, and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  7. High performance transcription factor-DNA docking with GPU computing

    PubMed Central

    2012-01-01

    Background Protein-DNA docking is a very challenging problem in structural bioinformatics and has important implications in a number of applications, such as structure-based prediction of transcription factor binding sites and rational drug design. Protein-DNA docking is computationally demanding due to the high cost of energy calculation and the statistical nature of conformational sampling algorithms. More importantly, experiments show that the docking quality depends on the coverage of the conformational sampling space. It is therefore desirable to accelerate the computation of the docking algorithm, not only to reduce computing time, but also to improve docking quality. Methods In an attempt to accelerate the sampling process and to improve the docking performance, we developed a graphics processing unit (GPU)-based protein-DNA docking algorithm. The algorithm employs a potential-based energy function to describe the binding affinity of a protein-DNA pair, and integrates Monte-Carlo simulation and a simulated annealing method to search through the conformational space. Algorithmic techniques were developed to improve the computational efficiency and scalability on GPU-based high performance computing systems. Results The effectiveness of our approach is tested on a non-redundant set of 75 TF-DNA complexes and a newly developed TF-DNA docking benchmark. We demonstrated that the GPU-based docking algorithm can significantly accelerate the simulation process and thereby improve the chance of finding near-native TF-DNA complex structures. This study also suggests that further improvement in protein-DNA docking research would require efforts on two integral aspects: improvement in computational efficiency and energy function design. Conclusions We present a high performance computing approach for improving the prediction accuracy of protein-DNA docking. The GPU-based docking algorithm accelerates the search of the conformational space and thus increases the chance of finding more near-native structures. To the best of our knowledge, this is the first effort applying GPUs or GPU clusters to the protein-DNA docking problem. PMID:22759575
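    The Monte-Carlo / simulated-annealing search described in the Methods section can be sketched on a toy one-dimensional energy landscape. The potential, proposal width, and cooling schedule below are illustrative assumptions, not the paper's scoring function or parameters.

```python
# Metropolis simulated annealing on a toy "energy landscape": always accept
# downhill moves, accept uphill moves with probability exp(-dE/T), and cool
# the temperature geometrically.
import math
import random

def energy(x):
    # Toy potential: a quadratic well at x = 2 with a rough sinusoidal term.
    return (x - 2.0) ** 2 + 0.5 * math.sin(5.0 * x)

def simulated_annealing(x0, t0=2.0, cooling=0.995, steps=5000, seed=42):
    rng = random.Random(seed)
    x, e, t = x0, energy(x0), t0
    best_x, best_e = x, e
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.3)           # propose a random move
        de = energy(cand) - e
        # Metropolis criterion.
        if de < 0 or rng.random() < math.exp(-de / t):
            x, e = cand, e + de
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling                             # annealing schedule
    return best_x, best_e

if __name__ == "__main__":
    x, e = simulated_annealing(x0=-5.0)
    print(f"best x = {x:.3f}, best energy = {e:.3f}")
```

In the real docking setting the "move" perturbs a candidate protein-DNA conformation and the energy is the potential-based binding score; the GPU parallelizes many such chains and the energy evaluations themselves.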

  8. Multiphysics Thrust Chamber Modeling for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation. A two-pronged approach is employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of heat transfer on thrust performance. Preliminary results on both aspects are presented.

  9. Comparison of Different Instructional Multimedia Designs for Improving Student Science-Process Skill Learning

    NASA Astrophysics Data System (ADS)

    Chien, Yu-Ta; Chang, Chun-Yen

    2012-02-01

    This study developed three forms of computer-based multimedia, including Static Graphics (SG), Simple Learner-Pacing Animation (SLPA), and Full Learner-Pacing Animation (FLPA), to assist students in learning topographic measuring. The interactive design of FLPA allowed students to physically manipulate the virtual measuring mechanism, rather than passively observe dynamic or static images. The students were randomly assigned to different multimedia groups. The results of a one-way ANOVA indicated that (1) there was a significant difference with a large effect size (f = .69) in mental effort ratings among the three groups, and the post-hoc test indicated that FLPA imposed less cognitive load on students than did SG (p = .007); (2) the differences in practical performance scores among groups reached statistical significance with a large effect size (f = .76), and the post-hoc test indicated that FLPA fostered better learning outcomes than both SLPA and SG (p = .004 and p = .05, respectively); (3) the difference in instructional efficiency, computed by the z-score combination of students' mental effort ratings and practical performance scores, reached statistical significance among the three groups with a large effect size (f = .79), and the post-hoc test indicated that FLPA yielded higher instructional efficiency than both SLPA and SG (p = .01 and .005, respectively); (4) no significant effect was found in instructional time-spans between groups (p = .637). Overall, FLPA was recommended as the best multimedia form to facilitate topographic measurement learning. The implications for instructional multimedia design were discussed from the perspective of cognitive load theory.
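    The z-score combination mentioned in point (3) is presumably the standard Paas-van Merrienboer instructional-efficiency measure, E = (z_performance - z_effort) / sqrt(2). A sketch with made-up scores (not the study's data):

```python
# Instructional efficiency: standardize performance and mental-effort
# scores across the sample, then combine them per learner.
import math
import statistics

def z_scores(xs):
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def instructional_efficiency(performance, effort):
    zp, ze = z_scores(performance), z_scores(effort)
    return [(p - e) / math.sqrt(2.0) for p, e in zip(zp, ze)]

if __name__ == "__main__":
    perf   = [55, 60, 80, 90]   # practical performance scores (hypothetical)
    effort = [ 8,  7,  4,  3]   # mental-effort ratings (hypothetical)
    for e in instructional_efficiency(perf, effort):
        print(round(e, 2))
```

High performance achieved with low effort yields positive efficiency; low performance at high effort yields negative efficiency.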

  10. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  11. Parallel computation with molecular-motor-propelled agents in nanofabricated networks.

    PubMed

    Nicolau, Dan V; Lard, Mercy; Korten, Till; van Delft, Falco C M J M; Persson, Malin; Bengtsson, Elina; Månsson, Alf; Diez, Stefan; Linke, Heiner; Nicolau, Dan V

    2016-03-08

    The combinatorial nature of many important mathematical problems, including nondeterministic-polynomial-time (NP)-complete problems, places a severe limitation on the problem size that can be solved with conventional, sequentially operating electronic computers. There have been significant efforts in conceiving parallel-computation approaches in the past, for example: DNA computation, quantum computation, and microfluidics-based computation. However, these approaches have not proven, so far, to be scalable and practical from a fabrication and operational perspective. Here, we report the foundations of an alternative parallel-computation system in which a given combinatorial problem is encoded into a graphical, modular network that is embedded in a nanofabricated planar device. Exploring the network in a parallel fashion using a large number of independent, molecular-motor-propelled agents then solves the mathematical problem. This approach uses orders of magnitude less energy than conventional computers, thus addressing issues related to power consumption and heat dissipation. We provide a proof-of-concept demonstration of such a device by solving, in a parallel fashion, the small instance {2, 5, 9} of the subset sum problem, which is a benchmark NP-complete problem. Finally, we discuss the technical advances necessary to make our system scalable with presently available technology.
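    The benchmark instance is small enough to check by exhaustive enumeration. The sketch below lists every sum reachable from subsets of {2, 5, 9}, the same combinatorial tree the motor-propelled agents explore in parallel, one agent per path:

```python
# Exhaustive subset-sum enumeration: on a conventional computer the 2^n
# subsets are explored sequentially; the nanofabricated network explores
# them in parallel with independent agents.
from itertools import combinations

def reachable_sums(values):
    sums = set()
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            sums.add(sum(combo))
    return sums

if __name__ == "__main__":
    print(sorted(reachable_sums([2, 5, 9])))
    # -> [0, 2, 5, 7, 9, 11, 14, 16]
```

The exponential growth of the subset count with n is exactly why a scalable parallel substrate is attractive for NP-complete problems.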

  12. Magnetic Skyrmion as a Nonlinear Resistive Element: A Potential Building Block for Reservoir Computing

    NASA Astrophysics Data System (ADS)

    Prychynenko, Diana; Sitte, Matthias; Litzius, Kai; Krüger, Benjamin; Bourianoff, George; Kläui, Mathias; Sinova, Jairo; Everschor-Sitte, Karin

    2018-01-01

    Inspired by the human brain, researchers are making a strong effort to find alternative models of information processing capable of imitating the high energy efficiency of neuromorphic information processing. One possible realization of cognitive computing involves reservoir computing networks, which are built out of recursively connected nonlinear resistive elements. We propose that a skyrmion network embedded in magnetic films may provide a suitable physical implementation for reservoir computing applications. The key ingredient of such a network is a two-terminal device with nonlinear voltage characteristics originating from magnetoresistive effects, such as the anisotropic magnetoresistance or the recently discovered noncollinear magnetoresistance. The most basic element for a reservoir computing network built from "skyrmion fabrics" is a single skyrmion embedded in a ferromagnetic ribbon. In order to pave the way towards reservoir computing systems based on skyrmion fabrics, we simulate and analyze (i) the current flow through a single magnetic skyrmion due to the anisotropic magnetoresistive effect and (ii) the combined physics of local pinning and the anisotropic magnetoresistive effect.

  13. 34 CFR 403.185 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary compute maintenance of effort in the event of a waiver? 403.185 Section 403.185 Education Regulations of the Offices of the Department...

  14. Development of the NASA/FLAGRO computer program for analysis of airframe structures

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Newman, J. C., Jr.

    1994-01-01

The NASA/FLAGRO (NASGRO) computer program was developed for fracture control analysis of space hardware and is currently the standard computer code in NASA, the U.S. Air Force, and the European Space Agency (ESA) for this purpose. The significant attributes of the NASGRO program are the numerous crack case solutions, the large materials file, the improved growth rate equation based on crack closure theory, and the user-friendly promptive input features. In support of the National Aging Aircraft Research Program (NAARP), NASGRO is being further developed to provide advanced state-of-the-art capability for damage tolerance and crack growth analysis of aircraft structural problems, including mechanical systems and engines. The project currently involves a cooperative development effort by NASA, the FAA, and ESA. The primary tasks underway are the incorporation of advanced methodology for crack growth rate retardation resulting from spectrum loading and improved analysis for determining crack instability. Also, the current weight function solutions in NASGRO for nonlinear stress gradient problems are being extended to more crack cases, and the 2-d boundary integral routine for stress analysis and stress-intensity factor solutions is being extended to 3-d problems. Lastly, effort is underway to enhance the program to operate on personal computers and workstations in a Windows environment. Because of the increasing and already wide usage of NASGRO, the code offers an excellent mechanism for technology transfer for new fatigue and fracture mechanics capabilities developed within NAARP.

  15. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  16. Equal Time for Women.

    ERIC Educational Resources Information Center

    Kolata, Gina

    1984-01-01

    Examines social influences which discourage women from pursuing studies in computer science, including monopoly of computer time by boys at the high school level, sexual harassment in college, movies, and computer games. Describes some initial efforts to encourage females of all ages to study computer science. (JM)

  17. Numerical simulation of long-duration blast wave evolution in confined facilities

    NASA Astrophysics Data System (ADS)

    Togashi, F.; Baum, J. D.; Mestreau, E.; Löhner, R.; Sunshine, D.

    2010-10-01

The objective of this research effort was to investigate the quasi-steady flow field produced by explosives in confined facilities. In this effort we modeled tests in which a high explosive (HE) cylindrical charge was hung in the center of a room and detonated. The HEs used for the tests were C-4 and AFX 757. While C-4 is just slightly under-oxidized and is typically modeled as an ideal explosive, AFX 757 includes a significant percentage of aluminum particles, so long-time afterburning and energy release must be considered. The Lawrence Livermore National Laboratory (LLNL)-produced thermo-chemical equilibrium algorithm, “Cheetah”, was used to estimate the remaining burnable detonation products. From these remaining species, the afterburning energy was computed and added to the flow field. Computations of the detonation and afterburn of two HEs in the confined multi-room facility were performed. The results demonstrate excellent agreement with available experimental data in terms of blast wave time of arrival, peak shock amplitude, reverberation, and total impulse (and hence total energy release, via either the detonation or afterburn processes).

  18. A Selective Role for Dopamine in Learning to Maximize Reward But Not to Minimize Effort: Evidence from Patients with Parkinson's Disease.

    PubMed

    Skvortsova, Vasilisa; Degos, Bertrand; Welter, Marie-Laure; Vidailhet, Marie; Pessiglione, Mathias

    2017-06-21

Instrumental learning is a fundamental process through which agents optimize their choices, taking into account various dimensions of available options such as the possible reward or punishment outcomes and the costs associated with potential actions. Although the implication of dopamine in learning from choice outcomes is well established, less is known about its role in learning the action costs such as effort. Here, we tested the ability of patients with Parkinson's disease (PD) to maximize monetary rewards and minimize physical efforts in a probabilistic instrumental learning task. The implication of dopamine was assessed by comparing performance ON and OFF prodopaminergic medication. In a first sample of PD patients (n = 15), we observed that reward learning, but not effort learning, was selectively impaired in the absence of treatment, with a significant interaction between learning condition (reward vs effort) and medication status (OFF vs ON). These results were replicated in a second, independent sample of PD patients (n = 20) using a simplified version of the task. According to Bayesian model selection, the best account for medication effects in both studies was a specific amplification of reward magnitude in a Q-learning algorithm. These results suggest that learning to avoid physical effort is independent from dopaminergic circuits and strengthen the general idea that dopaminergic signaling amplifies the effects of reward expectation or obtainment on instrumental behavior. SIGNIFICANCE STATEMENT Theoretically, maximizing reward and minimizing effort could involve the same computations and therefore rely on the same brain circuits. Here, we tested whether dopamine, a key component of reward-related circuitry, is also implicated in effort learning. We found that patients suffering from dopamine depletion due to Parkinson's disease were selectively impaired in reward learning, but not effort learning. 
Moreover, anti-parkinsonian medication restored the ability to maximize reward, but had no effect on effort minimization. This dissociation suggests that the brain has evolved separate, domain-specific systems for instrumental learning. These results help to disambiguate the motivational role of prodopaminergic medications: they amplify the impact of reward without affecting the integration of effort cost. Copyright © 2017 the authors.
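The winning model class, amplification of reward magnitude within a Q-learning algorithm, can be sketched with a toy simulation. Everything below (the payoff probabilities, learning rate, softmax choice rule, and gain values) is a hypothetical illustration of the mechanism, not the authors' fitted model.

```python
import math
import random

def run_task(trials=500, reward_gain=1.0, alpha=0.3, temp=1.0, seed=7):
    """Toy two-option instrumental learning task: option 0 pays 1 unit
    with probability 0.8, option 1 with probability 0.2.  `reward_gain`
    multiplies the experienced reward magnitude, mimicking the proposed
    medication effect.  Returns the fraction of better-option choices."""
    rng = random.Random(seed)
    Q = [0.0, 0.0]
    picks = 0
    for _ in range(trials):
        # Softmax (logistic) choice between the two options
        p0 = 1.0 / (1.0 + math.exp(-(Q[0] - Q[1]) / temp))
        choice = 0 if rng.random() < p0 else 1
        picks += (choice == 0)
        p_win = (0.8, 0.2)[choice]
        r = reward_gain * (1.0 if rng.random() < p_win else 0.0)
        Q[choice] += alpha * (r - Q[choice])  # prediction-error update
    return picks / trials

on = run_task(reward_gain=2.0)   # amplified reward magnitude ("ON")
off = run_task(reward_gain=0.5)  # blunted reward magnitude ("OFF")
```

Because a larger gain widens the learned value difference that feeds the softmax, choices become more reliably reward-maximizing, which is the qualitative signature the model selection favored.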

  19. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

Grid-like technologies and approaches have become an integral part of HEP experiments, and some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the greater the burden on operations, and vice versa. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. The reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation in which Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world in which the Grid ultimately reaches the status of a generic public computing and storage service provider, with permanent national and international Grid infrastructures established. How far we will be able to advance along this path depends on us. 
If no standardisation and convergence efforts take place, the Grid will remain limited to HEP; if, however, the current multitude of Grid-like systems converges to a generic, modular and extensible solution, the Grid will become true to its name.

  20. Report on the Human Genome Initiative for the Office of Health and Environmental Research

    DOE R&D Accomplishments Database

    Tinoco, I.; Cahill, G.; Cantor, C.; Caskey, T.; Dulbecco, R.; Engelhardt, D. L.; Hood, L.; Lerman, L. S.; Mendelsohn, M. L.; Sinsheimer, R. L.; Smith, T.; Soll, D.; Stormo, G.; White, R. L.

    1987-04-01

    The report urges DOE and the Nation to commit to a large, multi-year, multidisciplinary, technological undertaking to order and sequence the human genome. This effort will first require significant innovation in general capability to manipulate DNA, major new analytical methods for ordering and sequencing, theoretical developments in computer science and mathematical biology, and great expansions in our ability to store and manipulate the information and to interface it with other large and diverse genetic databases. The actual ordering and sequencing involves the coordinated processing of some 3 billion bases from a reference human genome. Science is poised on the rudimentary edge of being able to read and understand human genes. A concerted, broadly based, scientific effort to provide new methods of sufficient power and scale should transform this activity from an inefficient one-gene-at-a-time, single laboratory effort into a coordinated, worldwide, comprehensive reading of "the book of man". The effort will be extraordinary in scope and magnitude, but so will be the benefit to biological understanding, new technology and the diagnosis and treatment of human disease.

  1. Computational fluid dynamics modeling of laminar, transitional, and turbulent flows with sensitivity to streamline curvature and rotational effects

    NASA Astrophysics Data System (ADS)

    Chitta, Varun

Modeling of complex flows involving the combined effects of flow transition and streamline curvature using two advanced turbulence models, one in the Reynolds-averaged Navier-Stokes (RANS) category and the other in the hybrid RANS-Large eddy simulation (LES) category, is considered in this research effort. In the first part of the research, a new scalar eddy-viscosity model (EVM) is proposed, designed to exhibit physically correct responses to flow transition, streamline curvature, and system rotation effects. The four-equation model developed herein is a curvature-sensitized version of a commercially available three-equation transition-sensitive model. The physical effects of rotation and curvature (RC) enter the model through the added transport equation, analogous to a transverse turbulent velocity scale. The eddy-viscosity has been redefined such that the proposed model is constrained to reduce to the original transition-sensitive model definition in nonrotating flows or in regions with negligible RC effects. In the second part of the research, the developed four-equation model is combined with a LES technique using a new hybrid modeling framework, dynamic hybrid RANS-LES (DHRL). The new framework is highly generalized, allowing coupling of any desired LES model with any given RANS model, and addresses several deficiencies inherent in most current hybrid models. In the present research effort, the DHRL model comprises the proposed four-equation model as the RANS component and the MILES scheme as the LES component. Both models were implemented into a commercial computational fluid dynamics (CFD) solver and tested on a number of engineering and generic flow problems. Results from both the RANS and hybrid models show successful resolution of the combined effects of transition and curvature with reasonable engineering accuracy, and for only a small increase in computational cost. 
In addition, results from the hybrid model indicate significant levels of turbulent fluctuations in the flowfield and improved accuracy compared to RANS model predictions, obtained at a significantly reduced computational cost compared to full LES models. The results suggest that the advanced turbulence modeling techniques presented in this research effort have potential as practical tools for solving low/high Re flows over blunt/curved bodies for the prediction of transition and RC effects.

  2. A review of evaluative studies of computer-based learning in nursing education.

    PubMed

    Lewis, M J; Davies, R; Jenkins, D; Tait, M I

    2001-01-01

    Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.

  3. Initial dynamic load estimates during configuration design

    NASA Technical Reports Server (NTRS)

    Schiff, Daniel

    1987-01-01

    This analysis includes the structural response to shock and vibration and evaluates the maximum deflections and material stresses and the potential for the occurrence of elastic instability, fatigue and fracture. The required computations are often performed by means of finite element analysis (FEA) computer programs in which the structure is simulated by a finite element model which may contain thousands of elements. The formulation of a finite element model can be time consuming, and substantial additional modeling effort may be necessary if the structure requires significant changes after initial analysis. Rapid methods for obtaining rough estimates of the structural response to shock and vibration are presented for the purpose of providing guidance during the initial mechanical design configuration stage.

  4. The Man computer Interactive Data Access System: 25 Years of Interactive Processing.

    NASA Astrophysics Data System (ADS)

    Lazzara, Matthew A.; Benson, John M.; Fox, Robert J.; Laitsch, Denise J.; Rueden, Joseph P.; Santek, David A.; Wade, Delores M.; Whittaker, Thomas M.; Young, J. T.

    1999-02-01

12 October 1998 marked the 25th anniversary of the Man computer Interactive Data Access System (McIDAS). On that date in 1973, McIDAS was first used operationally by scientists as a tool for data analysis. Over the last 25 years, McIDAS has undergone numerous architectural changes in an effort to keep pace with changing technology. In its early years, significant technological breakthroughs were required to achieve the functionality needed by atmospheric scientists. Today McIDAS is challenged by new Internet-based approaches to data access and data display. The history and impact of McIDAS, along with some of the lessons learned, are presented here.

  5. Combining Computational and Social Effort for Collaborative Problem Solving

    PubMed Central

    Wagy, Mark D.; Bongard, Josh C.

    2015-01-01

    Rather than replacing human labor, there is growing evidence that networked computers create opportunities for collaborations of people and algorithms to solve problems beyond either of them. In this study, we demonstrate the conditions under which such synergy can arise. We show that, for a design task, three elements are sufficient: humans apply intuitions to the problem, algorithms automatically determine and report back on the quality of designs, and humans observe and innovate on others’ designs to focus creative and computational effort on good designs. This study suggests how such collaborations should be composed for other domains, as well as how social and computational dynamics mutually influence one another during collaborative problem solving. PMID:26544199

  6. Computing Cluster for Large Scale Turbulence Simulations and Applications in Computational Aeroacoustics

    NASA Astrophysics Data System (ADS)

    Lele, Sanjiva K.

    2002-08-01

    Funds were received in April 2001 under the Department of Defense DURIP program for construction of a 48 processor high performance computing cluster. This report details the hardware which was purchased and how it has been used to enable and enhance research activities directly supported by, and of interest to, the Air Force Office of Scientific Research and the Department of Defense. The report is divided into two major sections. The first section after this summary describes the computer cluster, its setup, and some cluster performance benchmark results. The second section explains ongoing research efforts which have benefited from the cluster hardware, and presents highlights of those efforts since installation of the cluster.

  7. Eric Bonnema | NREL

    Science.gov Websites

Contributes to research efforts for commercial buildings, including commercial-sector whole-building energy simulation, scientific computing, and software configuration.

  8. The Fox and the Grapes-How Physical Constraints Affect Value Based Decision Making.

    PubMed

    Gross, Jörg; Woelbert, Eva; Strobel, Martin

    2015-01-01

One fundamental question in decision making research is how humans compute the values that guide their decisions. Recent studies showed that people assign higher value to goods that are closer to them, even when physical proximity should be irrelevant for the decision from a normative perspective. This phenomenon, however, seems reasonable from an evolutionary perspective. Most foraging decisions of animals involve the trade-off between the value that can be obtained and the associated effort of obtaining. Anticipated effort for physically obtaining a good could therefore affect the subjective value of this good. In this experiment, we test this hypothesis by letting participants state their subjective value for snack food while the effort that would be incurred when reaching for it was manipulated. Even though reaching was not required in the experiment, we find that willingness to pay was significantly lower when subjects wore heavy wristbands on their arms. Thus, when reaching was more difficult, items were perceived as less valuable. Importantly, this was only the case when items were physically in front of the participants but not when items were presented as text on a computer screen. Our results suggest automatic interactions of motor and valuation processes which are unexplored to date and may account for irrational decisions that occur when reward is particularly easy to reach.

  9. High level cognitive information processing in neural networks

    NASA Technical Reports Server (NTRS)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  10. The Fox and the Grapes—How Physical Constraints Affect Value Based Decision Making

    PubMed Central

    Strobel, Martin

    2015-01-01

One fundamental question in decision making research is how humans compute the values that guide their decisions. Recent studies showed that people assign higher value to goods that are closer to them, even when physical proximity should be irrelevant for the decision from a normative perspective. This phenomenon, however, seems reasonable from an evolutionary perspective. Most foraging decisions of animals involve the trade-off between the value that can be obtained and the associated effort of obtaining. Anticipated effort for physically obtaining a good could therefore affect the subjective value of this good. In this experiment, we test this hypothesis by letting participants state their subjective value for snack food while the effort that would be incurred when reaching for it was manipulated. Even though reaching was not required in the experiment, we find that willingness to pay was significantly lower when subjects wore heavy wristbands on their arms. Thus, when reaching was more difficult, items were perceived as less valuable. Importantly, this was only the case when items were physically in front of the participants but not when items were presented as text on a computer screen. Our results suggest automatic interactions of motor and valuation processes which are unexplored to date and may account for irrational decisions that occur when reward is particularly easy to reach. PMID:26061087

  11. Complementary Reliability-Based Decodings of Binary Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1997-01-01

    This correspondence presents a hybrid reliability-based decoding algorithm which combines the reprocessing method based on the most reliable basis and a generalized Chase-type algebraic decoder based on the least reliable positions. It is shown that reprocessing with a simple additional algebraic decoding effort achieves significant coding gain. For long codes, the order of reprocessing required to achieve asymptotic optimum error performance is reduced by approximately 1/3. This significantly reduces the computational complexity, especially for long codes. Also, a more efficient criterion for stopping the decoding process is derived based on the knowledge of the algebraic decoding solution.
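The Chase-type component of the hybrid decoder, flipping test patterns on the least reliable received positions and keeping the candidate codeword with the best soft-decision metric, can be sketched on a toy code. The (4,3) even-parity code and parameters below are illustrative stand-ins; the paper's algorithm combines this idea with most-reliable-basis reprocessing, which is not shown.

```python
from itertools import combinations, product

def chase_decode(soft, codebook, t=2):
    """Chase-type decoding sketch: take the hard decision on the soft
    received values (negative -> bit 1), then flip every subset of the
    t least reliable positions and keep the valid codeword with the
    highest correlation metric."""
    hard = [1 if s < 0 else 0 for s in soft]
    # least reliable positions = smallest |soft value|
    lrp = sorted(range(len(soft)), key=lambda i: abs(soft[i]))[:t]
    best, best_metric = None, float("-inf")
    for k in range(t + 1):
        for flips in combinations(lrp, k):
            cand = hard[:]
            for i in flips:
                cand[i] ^= 1
            if tuple(cand) not in codebook:
                continue  # candidate is not a valid codeword
            # positions agreeing with the soft values add |s| to the metric
            metric = sum((1 - 2 * c) * s for c, s in zip(cand, soft))
            if metric > best_metric:
                best, best_metric = cand, metric
    return best

# Toy (4,3) even-parity code standing in for the algebraic decoder
codebook = {c for c in product((0, 1), repeat=4) if sum(c) % 2 == 0}
print(chase_decode([0.9, -0.1, 1.2, 0.8], codebook))  # [0, 0, 0, 0]
```

In the example, the hard decision fails the parity check, and the decoder recovers the all-zero codeword by flipping the single least reliable bit, which is exactly the extra coding gain the correspondence attributes to a small additional algebraic decoding effort.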

  12. Study of the Use of Time-Mean Vortices to Generate Lift for MAV Applications

    DTIC Science & Technology

    2011-05-31

A suspended microplate was fabricated via MEMS technology and driven to in-plane resonance via a Lorentz force. Computational effort centers around optimization of a range of parameters (geometry, frequency, amplitude of oscillation, etc.).

  13. Foreign Language Translation of Chemical Nomenclature by Computer

    PubMed Central

    2009-01-01

    Chemical compound names remain the primary method for conveying molecular structures between chemists and researchers. In research articles, patents, chemical catalogues, government legislation, and textbooks, the use of IUPAC and traditional compound names is universal, despite efforts to introduce more machine-friendly representations such as identifiers and line notations. Fortunately, advances in computing power now allow chemical names to be parsed and generated (read and written) with almost the same ease as conventional connection tables. A significant complication, however, is that although the vast majority of chemistry uses English nomenclature, a significant fraction is in other languages. This complicates the task of filing and analyzing chemical patents, purchasing from compound vendors, and text mining research articles or Web pages. We describe some issues with manipulating chemical names in various languages, including British, American, German, Japanese, Chinese, Spanish, Swedish, Polish, and Hungarian, and describe the current state-of-the-art in software tools to simplify the process. PMID:19239237

  14. Distributive, Non-destructive Real-time System and Method for Snowpack Monitoring

    NASA Technical Reports Server (NTRS)

    Frolik, Jeff (Inventor); Skalka, Christian (Inventor)

    2013-01-01

A ground-based system that provides quasi real-time measurement and collection of snow-water equivalent (SWE) data in remote settings is provided. The disclosed invention is significantly less expensive and easier to deploy than current methods and less susceptible to terrain and snow bridging effects. Embodiments of the invention include remote data recovery solutions. Compared to current infrastructure using existing SWE technology, the disclosed invention allows more SWE sites to be installed for similar cost and effort, in a greater variety of terrain, thus enabling data collection at improved spatial resolutions. The invention integrates a novel computational architecture with new sensor technologies. The invention's computational architecture is based on wireless sensor networks comprising programmable, low-cost, low-powered nodes capable of sophisticated sensor control and remote data communication. The invention also includes measuring attenuation of electromagnetic radiation, an approach that is immune to snow bridging and significantly reduces sensor footprints.
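The attenuation-based measurement can be illustrated with a standard exponential absorption model: if the received signal decays exponentially with the mass of water it traverses, SWE follows from the measured attenuation ratio. This is a generic sketch under an assumed Beer-Lambert-style model; the attenuation coefficient `k` is a hypothetical calibration constant, not a value from the patent.

```python
import math

def swe_from_attenuation(I0, I, k):
    """Invert exponential attenuation: assuming the received signal
    strength decays as I = I0 * exp(-k * SWE), solve for the
    snow-water equivalent.  k is an assumed, calibration-derived
    attenuation coefficient (here in units of 1/mm of SWE)."""
    return math.log(I0 / I) / k

# e.g. a signal attenuated to half strength with k = 0.02 per mm SWE
print(round(swe_from_attenuation(1.0, 0.5, 0.02), 1))  # 34.7
```

An inversion of this form is insensitive to how the snow is mechanically supported, which is consistent with the claimed immunity to snow bridging.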

  15. A General Approach to Measuring Test-Taking Effort on Computer-Based Tests

    ERIC Educational Resources Information Center

    Wise, Steven L.; Gao, Lingyun

    2017-01-01

    There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…
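Response Time Effort is conventionally computed as the proportion of a test taker's item responses whose response time meets or exceeds an item-specific threshold, i.e. responses that look like solution behavior rather than rapid guessing. A minimal sketch follows; the response times and thresholds are made-up illustrations, and in practice thresholds are derived per item from response-time distributions.

```python
def response_time_effort(response_times, thresholds):
    """Response Time Effort (RTE): the fraction of items on which the
    examinee's response time meets or exceeds that item's threshold.
    Responses faster than the threshold are treated as rapid guesses."""
    assert len(response_times) == len(thresholds)
    solution = sum(rt >= th for rt, th in zip(response_times, thresholds))
    return solution / len(response_times)

# Four items, times in seconds; two responses look like rapid guesses
rte = response_time_effort([12.4, 1.1, 8.0, 0.9], [3.0, 3.0, 5.0, 3.0])
print(rte)  # 0.5
```

Scores near 1.0 indicate fully motivated test taking; low RTE flags examinees whose scores may not be valid measures of proficiency.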

  16. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
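The manager-worker strategy tested in task (2) can be sketched, in miniature, with a shared task queue: the manager enqueues all work items and idle workers pull the next task, so faster workers naturally take on more work and load balances dynamically. This thread-based Python sketch illustrates only the coordination pattern, not the PVM-based flow-solver implementation used in the study; the task list and work function are placeholders.

```python
import queue
import threading

def manager_worker(tasks, n_workers=4, work=lambda t: t * t):
    """Run `work` over `tasks` using the manager-worker pattern: the
    manager fills a thread-safe queue, and each worker repeatedly pulls
    the next available task until the queue is empty."""
    todo = queue.Queue()
    done = queue.Queue()
    for t in tasks:          # manager: enqueue all work up front
        todo.put(t)

    def worker():
        while True:
            try:
                t = todo.get_nowait()
            except queue.Empty:
                return       # no tasks left; worker exits
            done.put(work(t))

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return sorted(done.get() for _ in range(done.qsize()))

print(manager_worker(range(5)))  # [0, 1, 4, 9, 16]
```

The contrast with static load balancing is that no worker is assigned a fixed share in advance, which is why a task-queue scheme tolerates heterogeneous node speeds like those in the workstation clusters described above.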

  17. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Astrophysics Data System (ADS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  18. Increasing the impact of medical image computing using community-based open-access hackathons: The NA-MIC and 3D Slicer experience.

    PubMed

    Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron

    2016-10-01

    The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort, which was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools (VTK, ITK, CMake, CDash, DCMTK) were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and internationally funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation, and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision.

  19. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  20. Progress on the FabrIc for Frontier Experiments project at Fermilab

    DOE PAGES

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...

    2015-12-23

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access, and collaboration within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  1. The FIFE Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Box, D.; Boyd, J.; Di Benedetto, V.

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  2. Improving a data-acquisition software system with abstract data type components

    NASA Technical Reports Server (NTRS)

    Howard, S. D.

    1990-01-01

    Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.

  3. Techniques A: continuous waves

    NASA Astrophysics Data System (ADS)

    Beuthan, J.

    1993-08-01

    In many medical diseases, the biochemical and physiological changes of soft tissues are hardly detectable by conventional techniques of diagnostic imaging (x-ray, ultrasound, computed tomography, and MRI). The detectivity is low and the technical effort is tremendous. On the other hand, these pathologic variations induce significant changes in the optical tissue parameters, which can be detected. The corresponding variations of the scattered light can most easily be detected and evaluated by infrared diaphanoscopy, even in optically thick tissue slices.

  4. Exploring Midwives' Need and Intention to Adopt Electronic Integrated Antenatal Care

    PubMed Central

    Markam, Hosizah; Hochheiser, Harry; Kuntoro, Kuntoro; Notobroto, Hari Basuki

    2018-01-01

    Documentation requirements for the Indonesian integrated antenatal care (ANC) program suggest the need for electronic systems to address gaps in existing paper documentation practices. Our goals were to quantify midwives' documentation completeness in a primary healthcare center, understand documentation challenges, develop a tool, and assess intention to use the tool. We analyzed existing ANC records in a primary healthcare center in Bangkalan, East Java, and conducted interviews with stakeholders to understand needs for an electronic system in support of ANC. Development of the web-based Electronic Integrated ANC (e-iANC) system used the System Development Life Cycle method. Training on the use of the system was held in the computer laboratory for 100 midwives chosen from four primary healthcare centers in each of five regions. The Unified Theory of Acceptance and Use of Technology (UTAUT) questionnaire was used to assess their intention to adopt e-iANC. The midwives' intention to adopt e-iANC was significantly influenced by performance expectancy, effort expectancy and facilitating conditions. Age, education level, and computer literacy did not significantly moderate the effects of performance expectancy and effort expectancy on adoption intention. The UTAUT results indicated that the factors that might influence intention to adopt e-iANC are potentially addressable. Results suggest that e-iANC might well be accepted by midwives. PMID:29618961

  5. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems

    PubMed Central

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-01-01

    Background We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. Results We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Conclusion Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. 
This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems. PMID:17081289
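As a rough illustration of the global-sampling-plus-local-refinement idea that such metaheuristics build on (this is not the authors' scatter search), one can calibrate a small nonlinear model with a crude multi-start search. The two-parameter exponential model and the bounds below are invented for the example.

```python
# Illustrative only: multi-start global search for calibrating a
# two-parameter model y(t) = a * exp(-b * t) against data. Random starts
# play the role of global sampling; greedy coordinate descent with a
# shrinking step plays the role of local refinement.
import math
import random

def sse(params, data):
    """Sum of squared errors of the model against (t, y) pairs."""
    a, b = params
    return sum((a * math.exp(-b * t) - y) ** 2 for t, y in data)

def local_refine(params, data, step=0.1, iters=200):
    best = list(params)
    best_cost = sse(best, data)
    for _ in range(iters):
        improved = False
        for i in range(len(best)):
            for delta in (step, -step):
                trial = list(best)
                trial[i] += delta
                cost = sse(trial, data)
                if cost < best_cost:
                    best, best_cost, improved = trial, cost, True
        if not improved:
            step *= 0.5  # shrink the search stencil when stuck
    return best, best_cost

def multistart_fit(data, n_starts=20, seed=0):
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(n_starts):
        start = (rng.uniform(0.1, 5.0), rng.uniform(0.01, 2.0))
        cand, cost = local_refine(start, data)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost
```

For data generated from a = 2.0, b = 0.5, the fit recovers the true parameters closely; real scatter search adds a reference set, solution combination, and a far more capable local solver on top of this skeleton.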

  6. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems.

    PubMed

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-11-02

    We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems.

  7. Multi-school collaboration to develop and test nutrition computer modules for pediatric residents.

    PubMed

    Roche, Patricia L; Ciccarelli, Mary R; Gupta, Sandeep K; Hayes, Barbara M; Molleston, Jean P

    2007-09-01

    The provision of essential nutrition-related content in US medical education has been deficient, despite efforts of the federal government and multiple professional organizations. Novel and efficient approaches are needed. A multi-department project was developed to create and pilot a computer-based compact disc instructional program covering the nutrition topics of oral rehydration therapy, calcium, and vitamins. Funded by an internal medical school grant, the content of the modules was written by Department of Pediatrics faculty. The modules were built by School of Informatics faculty and students, and were tested on a convenience sampling of 38 pediatric residents in a randomized controlled trial performed by a registered dietitian/School of Health and Rehabilitation Sciences Master's degree candidate. The modules were reviewed for content by the pediatric faculty principal investigator and the registered dietitian/School of Health and Rehabilitation Sciences graduate student. Residents completed a pretest of nutrition knowledge and attitude toward nutrition and Web-based instruction. Half the group was given three programs (oral rehydration therapy, calcium, and vitamins) on compact disc for study over 6 weeks. Both study and control groups completed a posttest. Pre- and postintervention objective test results in study vs control groups and attitudinal survey results before and after intervention in the study group were compared. The experimental group demonstrated significantly better posttrial objective test performance compared to the control group (P=0.0005). The study group tended toward improvement, whereas the control group performance declined substantially between pre- and posttests. Study group resident attitudes toward computer-based instruction improved. Use of these computer modules prompted almost half of the residents in the study group to independently pursue relevant nutrition-related information. 
This inexpensive, collaborative, multi-department effort to design a computer-based nutrition curriculum positively impacted both resident knowledge and attitudes.

  8. Accelerating the discovery of space-time patterns of infectious diseases using parallel computing.

    PubMed

    Hohl, Alexander; Delmelle, Eric; Tang, Wenwu; Casas, Irene

    2016-11-01

    Infectious diseases have complex transmission cycles, and effective public health responses require the ability to monitor outbreaks in a timely manner. Space-time statistics facilitate the discovery of disease dynamics, including rate of spread and seasonal cyclic patterns, but are computationally demanding, especially for datasets of increasing size, diversity, and availability. High-performance computing reduces the effort required to identify these patterns; however, heterogeneity in the data must be accounted for. We develop an adaptive space-time domain decomposition approach for parallel computation of the space-time kernel density. We apply our methodology to individual reported dengue cases from 2010 to 2011 in the city of Cali, Colombia. The parallel implementation reaches significant speedup compared to sequential counterparts. Density values are visualized in an interactive 3D environment, which facilitates the identification and communication of uneven space-time distribution of disease events. Our framework has the potential to enhance the timely monitoring of infectious diseases.
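The space-time kernel density combines a spatial and a temporal kernel around each event, and decomposing the evaluation grid is what makes it parallelizable. The sketch below uses an Epanechnikov kernel, fixed bandwidths, and a naive round-robin chunking over threads; these are illustrative assumptions, not the adaptive decomposition of the paper.

```python
# Sketch of a space-time kernel density estimate (STKDE) with a simple
# domain decomposition: the evaluation points are split into chunks that
# are processed in parallel.
from concurrent.futures import ThreadPoolExecutor

def epanechnikov(u):
    """Epanechnikov kernel, zero outside |u| < 1."""
    return 0.75 * (1.0 - u * u) if abs(u) < 1.0 else 0.0

def stkde_at(point, events, hs, ht):
    """Density at one (x, y, t) point given (x, y, t) event tuples."""
    x, y, t = point
    total = 0.0
    for ex, ey, et in events:
        ds = ((x - ex) ** 2 + (y - ey) ** 2) ** 0.5
        total += epanechnikov(ds / hs) * epanechnikov((t - et) / ht)
    return total / (len(events) * hs * hs * ht)

def stkde_grid(points, events, hs=1.0, ht=1.0, n_workers=4):
    # Decompose the evaluation domain: one round-robin chunk per worker.
    chunks = [points[i::n_workers] for i in range(n_workers)]
    def work(chunk):
        return [(p, stkde_at(p, events, hs, ht)) for p in chunk]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        pairs = [r for part in pool.map(work, chunks) for r in part]
    return dict(pairs)
```

An adaptive decomposition would instead size the chunks by the local event count, so that dense space-time regions do not dominate one worker's share of the grid.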

  9. Fast Katz and Commuters: Efficient Estimation of Social Relatedness in Large Networks

    NASA Astrophysics Data System (ADS)

    Esfandiar, Pooya; Bonchi, Francesco; Gleich, David F.; Greif, Chen; Lakshmanan, Laks V. S.; On, Byung-Won

    Motivated by social network data mining problems such as link prediction and collaborative filtering, significant research effort has been devoted to computing topological measures including the Katz score and the commute time. Existing approaches typically approximate all pairwise relationships simultaneously. In this paper, we are interested in computing two quantities: the score for a single pair of nodes, and the top-k nodes with the best scores from a given source node. For the pairwise problem, we apply an iterative algorithm that computes upper and lower bounds for the measures we seek. This algorithm exploits a relationship between the Lanczos process and a quadrature rule. For the top-k problem, we propose an algorithm that only accesses a small portion of the graph and is related to techniques used in personalized PageRank computing. To test the scalability and accuracy of our algorithms we experiment with three real-world networks and find that these algorithms run in milliseconds to seconds without any preprocessing.
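For intuition about the pairwise quantity, the Katz score K(i, j) = Σ_k α^k (A^k)_{ij} can be computed directly from its truncated power series using only matrix-vector products. This naive sketch is not the Lanczos/quadrature bounding algorithm of the paper; it just shows what is being approximated.

```python
# Pairwise Katz score via its truncated power series: each term counts
# walks of length k between i and j, damped by alpha**k.
def katz_pair(adj, i, j, alpha=0.1, n_terms=20):
    n = len(adj)
    v = [1.0 if k == j else 0.0 for k in range(n)]  # indicator vector e_j
    score, weight = 0.0, 1.0
    for _ in range(n_terms):
        # One matrix-vector product: v <- A @ v, so v = A^k e_j after k steps.
        v = [sum(adj[r][c] * v[c] for c in range(n)) for r in range(n)]
        weight *= alpha
        score += weight * v[i]  # adds alpha^k * (A^k)_{ij}
    return score
```

The series converges when alpha is below the reciprocal of the largest eigenvalue of A; the paper's contribution is bounding this sum from above and below without running the full iteration to convergence.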

  10. Fast katz and commuters : efficient estimation of social relatedness in large networks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    On, Byung-Won; Lakshmanan, Laks V. S.; Greif, Chen

    Motivated by social network data mining problems such as link prediction and collaborative filtering, significant research effort has been devoted to computing topological measures including the Katz score and the commute time. Existing approaches typically approximate all pairwise relationships simultaneously. In this paper, we are interested in computing: the score for a single pair of nodes, and the top-k nodes with the best scores from a given source node. For the pairwise problem, we apply an iterative algorithm that computes upper and lower bounds for the measures we seek. This algorithm exploits a relationship between the Lanczos process and a quadrature rule. For the top-k problem, we propose an algorithm that only accesses a small portion of the graph and is related to techniques used in personalized PageRank computing. To test the scalability and accuracy of our algorithms we experiment with three real-world networks and find that these algorithms run in milliseconds to seconds without any preprocessing.

  11. Meeting report from the fourth meeting of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Waltemath, Dagmar; Bergmann, Frank T.; Chaouiya, Claudine; Czauderna, Tobias; Gleeson, Padraig; Goble, Carole; Golebiewski, Martin; Hucka, Michael; Juty, Nick; Krebs, Olga; Le Novère, Nicolas; Mi, Huaiyu; Moraru, Ion I.; Myers, Chris J.; Nickerson, David; Olivier, Brett G.; Rodriguez, Nicolas; Schreiber, Falk; Smith, Lucian; Zhang, Fengkai; Bonnet, Eric

    2014-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of community standards and formats in computational systems biology and related fields. This report summarizes the topics and activities of the fourth edition of the annual COMBINE meeting, held in Paris during September 16-20 2013, and attended by a total of 96 people. This edition pioneered a first day devoted to modeling approaches in biology, which attracted a broad audience of scientists thanks to a panel of renowned speakers. During subsequent days, discussions were held on many subjects including the introduction of new features in the various COMBINE standards, new software tools that use the standards, and outreach efforts. Significant emphasis went into work on extensions of the SBML format, and also into community-building. This year’s edition once again demonstrated that the COMBINE community is thriving, and still manages to help coordinate activities between different standards in computational systems biology.

  12. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. 
We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.

  13. Computerizing the Accounting Curriculum.

    ERIC Educational Resources Information Center

    Nash, John F.; England, Thomas G.

    1986-01-01

    Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)

  14. Computers in Schools: White Boys Only?

    ERIC Educational Resources Information Center

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  15. Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    1988-01-01

    Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)

  16. Computers for the Faculty: How on a Limited Budget.

    ERIC Educational Resources Information Center

    Arman, Hal; Kostoff, John

    An informal investigation of the use of computers at Delta College (DC) in Michigan revealed reasonable use of computers by faculty in disciplines such as mathematics, business, and technology, but very limited use in the humanities and social sciences. In an effort to increase faculty computer usage, DC decided to make computers available to any…

  17. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Treesearch

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  18. Biomechanics of Head, Neck, and Chest Injury Prevention for Soldiers: Phase 2 and 3

    DTIC Science & Technology

    2016-08-01

    understanding of the biomechanics of the head and brain. Task 2.3 details the computational modeling efforts conducted to evaluate the response of the...section also details the progress made on the development of a testing apparatus to evaluate cervical spine implants in survivable loading scenarios...computational modeling efforts conducted to evaluate the response of the cervical spine and the effects of cervical arthrodesis and arthroplasty during

  19. Limits on fundamental limits to computation.

    PubMed

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  20. Computational electronics and electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C C

    The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.

  1. Office workers' computer use patterns are associated with workplace stressors.

    PubMed

    Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J

    2014-11-01

    This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s to 5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors.

  2. Progress Toward Affordable High Fidelity Combustion Simulations Using Filtered Density Functions for Hypersonic Flows in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Drozda, Tomasz G.; Quinlan, Jesse R.; Pisciuneri, Patrick H.; Yilmaz, S. Levent

    2012-01-01

    Significant progress has been made in the development of subgrid scale (SGS) closures based on a filtered density function (FDF) for large eddy simulations (LES) of turbulent reacting flows. The FDF is the counterpart of the probability density function (PDF) method, which has proven effective in Reynolds averaged simulations (RAS). However, while systematic progress is being made advancing the FDF models for relatively simple flows and lab-scale flames, the application of these methods in complex geometries and high-speed, wall-bounded flows with shocks remains a challenge. The key difficulties are the significant computational cost associated with solving the FDF transport equation and numerically stiff finite rate chemistry. For LES/FDF methods to make a more significant impact in practical applications, a pragmatic approach must be taken that significantly reduces the computational cost while maintaining high modeling fidelity. An example of one such ongoing effort is at the NASA Langley Research Center, where the first generation FDF models, namely the scalar filtered mass density function (SFMDF), are being implemented into VULCAN, a production-quality RAS and LES solver widely used for design of high speed propulsion flowpaths. 
This effort leverages internal and external collaborations to reduce the overall computational cost of high fidelity simulations in VULCAN by: implementing high order methods that allow reduction in the total number of computational cells without loss in accuracy; implementing first generation of high fidelity scalar PDF/FDF models applicable to high-speed compressible flows; coupling RAS/PDF and LES/FDF into a hybrid framework to efficiently and accurately model the effects of combustion in the vicinity of the walls; developing efficient Lagrangian particle tracking algorithms to support robust solutions of the FDF equations for high speed flows; and utilizing finite rate chemistry parametrization, such as flamelet models, to reduce the number of transported reactive species and remove numerical stiffness. This paper briefly introduces the SFMDF model (highlighting key benefits and challenges), and discusses particle tracking for flows with shocks, the hybrid coupled RAS/PDF and LES/FDF model, flamelet generated manifolds (FGM) model, and the Irregularly Portioned Lagrangian Monte Carlo Finite Difference (IPLMCFD) methodology for scalable simulation of high-speed reacting compressible flows.
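One of the cost reductions listed above is finite-rate chemistry parametrization via flamelet models: instead of transporting many reactive species, the solver transports a few scalars and looks up the precomputed flame state. A minimal sketch of that lookup idea, with entirely hypothetical table values:

```python
# Flamelet-style table lookup (hypothetical values): transport one scalar
# (mixture fraction Z) and recover the flame state, e.g. temperature, by
# piecewise-linear interpolation in a precomputed table.
import bisect

# Hypothetical precomputed flamelet table: mixture fraction -> temperature (K).
Z_TABLE = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
T_TABLE = [300.0, 1200.0, 2100.0, 1700.0, 900.0, 300.0]

def lookup_temperature(z):
    """Interpolate temperature at mixture fraction z, clamped to [0, 1]."""
    z = min(max(z, 0.0), 1.0)
    i = bisect.bisect_right(Z_TABLE, z) - 1
    if i >= len(Z_TABLE) - 1:
        return T_TABLE[-1]
    frac = (z - Z_TABLE[i]) / (Z_TABLE[i + 1] - Z_TABLE[i])
    return T_TABLE[i] + frac * (T_TABLE[i + 1] - T_TABLE[i])

print(lookup_temperature(0.3))  # midway between the 1200 K and 2100 K entries
```

The table lookup replaces stiff chemical source-term integration at every cell, which is the sense in which flamelet parametrization "removes numerical stiffness."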

  3. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  4. An Approach for Dynamic Grids

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Liou, Meng-Sing; Hindman, Richard G.

    1994-01-01

    An approach is presented for the generation of two-dimensional, structured, dynamic grids. The grid motion may be due to the motion of the boundaries of the computational domain or to the adaptation of the grid to the transient, physical solution. A time-dependent grid is computed through the time integration of the grid speeds which are computed from a system of grid speed equations. The grid speed equations are derived from the time-differentiation of the grid equations so as to ensure that the dynamic grid maintains the desired qualities of the static grid. The grid equations are the Euler-Lagrange equations derived from a variational statement for the grid. The dynamic grid method is demonstrated for a model problem involving boundary motion, an inviscid flow in a converging-diverging nozzle during startup, and a viscous flow over a flat plate with an impinging shock wave. It is shown that the approach is more accurate for transient flows than an approach in which the grid speeds are computed using a finite difference with respect to time of the grid. However, the approach requires significantly more computational effort.
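The simpler alternative the paper compares against, computing grid speeds by finite-differencing grid positions in time rather than integrating grid speed equations, can be sketched in a few lines (hypothetical 1-D grid, not the paper's code):

```python
# Sketch of the comparison approach (not the paper's method): estimate
# grid-point speeds by a first-order backward difference of grid positions.

def grid_speeds(x_old, x_new, dt):
    """Backward-difference estimate of the speed of each grid node."""
    return [(xn - xo) / dt for xo, xn in zip(x_old, x_new)]

# Hypothetical 1-D grid whose right boundary stretches outward in one step.
x_old = [0.0, 0.25, 0.5, 0.75, 1.0]
x_new = [0.0, 0.26, 0.52, 0.78, 1.04]
speeds = grid_speeds(x_old, x_new, dt=0.01)
print(speeds)
```

This difference approximation is only first-order accurate in time, which is consistent with the paper's observation that integrating derived grid speed equations is more accurate for transient flows, at extra computational cost.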

  5. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  6. Reduction of community alcohol problems: computer simulation experiments in three counties.

    PubMed

    Holder, H D; Blose, J O

    1987-03-01

    A series of alcohol abuse prevention strategies was evaluated using computer simulation for three counties in the United States: Wake County, North Carolina, Washington County, Vermont and Alameda County, California. A system dynamics model composed of a network of interacting variables was developed for the pattern of alcoholic beverage consumption in a community. The relationship of community drinking patterns to various stimulus factors was specified in the model based on available empirical research. Stimulus factors included disposable income, alcoholic beverage prices, advertising exposure, minimum drinking age and changes in cultural norms. After a generic model was developed and validated on the national level, a computer-based system dynamics model was developed for each county, and a series of experiments was conducted to project the potential impact of specific prevention strategies. The project concluded that prevention efforts can both lower current levels of alcohol abuse and reduce projected increases in alcohol-related problems. Without such efforts, already high levels of alcohol-related family disruptions in the three counties could be expected to rise an additional 6% and drinking-related work problems 1-5%, over the next 10 years after controlling for population growth. Of the strategies tested, indexing the price of alcoholic beverages to the consumer price index in conjunction with the implementation of a community educational program with well-defined target audiences has the best potential for significant problem reduction in all three counties.
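A toy stock-and-flow sketch conveys the system-dynamics idea behind one tested strategy, indexing alcoholic beverage prices to inflation. All parameters below (inflation rate, price elasticity) are hypothetical, not the model's calibrated values:

```python
# Toy system-dynamics sketch (hypothetical parameters): per-capita consumption
# responds to the cumulative real price through a constant elasticity. Without
# indexing, inflation erodes the real price and consumption drifts upward.

def simulate(years, indexed, inflation=0.04, elasticity=-0.5):
    nominal_price = 1.0
    price_level = 1.0
    history = [100.0]                    # consumption index, year 0 = 100
    for _ in range(years):
        price_level *= 1 + inflation
        if indexed:
            nominal_price = price_level  # price keeps pace with inflation
        real_price_ratio = nominal_price / price_level
        # constant-elasticity response: lower real price -> higher consumption
        history.append(100.0 * real_price_ratio ** elasticity)
    return history

unindexed = simulate(10, indexed=False)
indexed = simulate(10, indexed=True)
print(round(unindexed[-1], 1), round(indexed[-1], 1))
```

In this sketch, indexing holds the consumption index flat over the decade while the unindexed scenario drifts upward, the qualitative pattern behind the projected increases the study reports.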

  7. Enabling large-scale viscoelastic calculations via neural network acceleration

    NASA Astrophysics Data System (ADS)

    Robinson DeVries, P.; Thompson, T. B.; Meade, B. J.

    2017-12-01

    One of the most significant challenges involved in efforts to understand the effects of repeated earthquake cycle activity is the computational cost of large-scale viscoelastic earthquake cycle models. Deep artificial neural networks (ANNs) can be used to discover new, compact, and accurate computational representations of viscoelastic physics. Once found, these efficient ANN representations may replace computationally intensive viscoelastic codes and accelerate large-scale viscoelastic calculations by more than 50,000%. This magnitude of acceleration enables the modeling of geometrically complex faults over thousands of earthquake cycles across wider ranges of model parameters and at larger spatial and temporal scales than have been previously possible. Perhaps most interestingly from a scientific perspective, ANN representations of viscoelastic physics may lead to basic advances in the understanding of the underlying model phenomenology. We demonstrate the potential of artificial neural networks to illuminate fundamental physical insights with specific examples.
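The surrogate-model idea can be sketched in miniature (this is not the authors' network or physics; the "expensive" function is a stand-in): train a small network on samples from a costly model, then evaluate the cheap network in its place.

```python
# Minimal surrogate-model sketch: a one-hidden-layer tanh network trained by
# full-batch gradient descent to mimic an "expensive" function (a stand-in
# for a viscoelastic solver; hypothetical setup throughout).
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    return np.sin(3 * x)          # stand-in for the costly computation

x = rng.uniform(-1, 1, size=(256, 1))
y = expensive_model(x)

W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)   # 16 hidden tanh units
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)

def mse():
    return float(((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2).mean())

mse_before = mse()
lr = 0.05
for _ in range(2000):
    h = np.tanh(x @ W1 + b1)                 # forward pass
    pred = h @ W2 + b2
    err = pred - y
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)      # output-layer gradients
    dh = (err @ W2.T) * (1 - h ** 2)                 # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse_after = mse()
print(f"MSE before: {mse_before:.3f}, after: {mse_after:.4f}")
```

Once trained, each surrogate evaluation is a handful of matrix products, which is the source of the large speedups the paper describes.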

  8. Computational fluid dynamics (CFD) in the design of a water-jet-drive system

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto

    1994-01-01

    NASA/Marshall Space Flight Center (MSFC) has an ongoing effort to transfer to industry the technologies developed at MSFC for rocket propulsion systems. The Technology Utilization (TU) Office at MSFC promotes these efforts and accepts requests for assistance from industry. One such solicitation involves a request from North American Marine Jet, Inc. (NAMJ) for assistance in the design of a water-jet-drive system to fill a gap in NAMJ's product line. NAMJ provided MSFC with a baseline axial flow impeller design as well as the relevant working parameters (rpm, flow rate, etc.). This baseline design was analyzed using CFD, and significant deficiencies were identified. Four additional analyses were performed involving MSFC changes to the geometric and operational parameters of the baseline case. Subsequently, the impeller was redesigned by NAMJ and analyzed by MSFC. This new configuration performs significantly better than the baseline design. Similar cooperative activities are planned for the design of the jet-drive inlet.

  9. Imaging atherosclerosis with hybrid [18F]fluorodeoxyglucose positron emission tomography/computed tomography imaging: what Leonardo da Vinci could not see.

    PubMed

    Cocker, Myra S; Mc Ardle, Brian; Spence, J David; Lum, Cheemun; Hammond, Robert R; Ongaro, Deidre C; McDonald, Matthew A; Dekemp, Robert A; Tardif, Jean-Claude; Beanlands, Rob S B

    2012-12-01

    Prodigious efforts and landmark discoveries have led to significant advances in our understanding of atherosclerosis. Despite these efforts, atherosclerosis remains globally a leading cause of mortality and reduced quality of life. With surges in the prevalence of obesity and diabetes, atherosclerosis is expected to have an even more pronounced impact upon the global burden of disease. It is imperative to develop strategies for the early detection of disease. Positron emission tomography (PET) imaging utilizing [(18)F]fluorodeoxyglucose (FDG) may provide a non-invasive means of characterizing inflammatory activity within atherosclerotic plaque, thus serving as a surrogate biomarker for detecting vulnerable plaque. The aim of this review is to explore the rationale for performing FDG imaging, provide an overview of the mechanism of action, and summarize findings from the early application of FDG PET imaging in the clinical setting to evaluate vascular disease. Alternative imaging biomarkers and approaches are briefly discussed.

  10. Information Security: Governmentwide Guidance Needed to Assist Agencies in Implementing Cloud Computing

    DTIC Science & Technology

    2010-07-01

    Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet technologies, raises new information security concerns. The report addresses (1) models of cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to ...

  11. Computer- and web-based interventions to promote healthy eating among children and adolescents: a systematic review.

    PubMed

    Hamel, Lauren M; Robbins, Lorraine B

    2013-01-01

    To: (1) determine the effect of computer- and web-based interventions on improving eating behavior (e.g. increasing fruit and vegetable consumption; decreasing fat consumption) and/or diet-related physical outcomes (e.g. body mass index) among children and adolescents; and (2) examine what elements enhance success. Children and adolescents are the heaviest they have ever been. Excess weight can carry into adulthood and result in chronic health problems. Because of the capacity to reach large audiences of children and adolescents to promote healthy eating, computer- and web-based interventions hold promise for helping to curb this serious trend. However, evidence to support this approach is lacking. Systematic review using guidelines from the Cochrane Effective Practice and Organisation of Care Group. The following databases were searched for studies from 1998-2011: CINAHL; PubMed; Cochrane; PsycINFO; ERIC; and Proquest. Fifteen randomized controlled trials or quasi-experimental studies were analysed in a systematic review. Although a majority of interventions resulted in statistically significant positive changes in eating behavior and/or diet-related physical outcomes, interventions that included post intervention follow-up, ranging from 3-18 months, showed that changes were not maintained. Elements, such as conducting the intervention at school or using individually tailored feedback, may enhance success. Computer- and web-based interventions can improve eating behavior and diet-related physical outcomes among children and adolescents, particularly when conducted in schools and individually tailored. These interventions can complement and support nursing efforts to give preventive care; however, maintenance efforts are recommended. © 2012 Blackwell Publishing Ltd.

  12. Determining Training Device Requirements in Army Aviation Systems

    NASA Technical Reports Server (NTRS)

    Poumade, M. L.

    1984-01-01

    A decision making methodology which applies the systems approach to the training problem is discussed. Training is viewed as a total system instead of a collection of individual devices and unrelated techniques. The core of the methodology is the use of optimization techniques such as the transportation algorithm and multiobjective goal programming with training task and training device specific data. The role of computers, especially automated data bases and computer simulation models, in the development of training programs is also discussed. The approach can provide significant training enhancement and cost savings over the more traditional, intuitive form of training development and device requirements process. While given from an aviation perspective, the methodology is equally applicable to other training development efforts.

  13. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  14. Computational Investigations for Undergraduate Organic Chemistry: Predicting the Mechanism of the Ritter Reaction

    NASA Astrophysics Data System (ADS)

    Hessley, Rita K.

    2000-02-01

    In an effort to engage students more deeply in their laboratory work and provide them with valuable learning experiences in the applications and limitations of computational chemistry as a research tool, students are instructed to carry out a computational pre-lab exercise. Before carrying out a laboratory experiment that investigates the mechanism for the formation of N-t-butylbenzamide, students construct and obtain heats of formation for reactants, products, postulated reaction intermediates, and one transition state structure for each proposed mechanism. This is designed as a companion to an open-ended laboratory experiment that hones skills learned early in most traditional organic chemistry courses. The incorporation of a preliminary computational exercise enables students to move beyond guessing what the outcome of the reaction will be. It challenges them to test what they believe they "know" about such fundamental concepts as stability of carbocations, or the significance and utility of thermodynamic data relative to kinetic data. On the basis of their computations and their own experimental data, students then verify or dispute their hypothesis, finally arriving at a defensible and logical conclusion about the course of the reaction mechanism. The manner of implementation of the exercise and typical computational data are described.
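The comparison at the heart of such a pre-lab exercise, ranking postulated intermediates by computed energies, reduces to simple enthalpy bookkeeping. A sketch with entirely hypothetical heats of formation (not real semi-empirical output) shows the arithmetic students perform:

```python
# Illustrative sketch with hypothetical numbers: compare two postulated
# cationic intermediates by reaction enthalpy from heats of formation,
# dH_rxn = sum(dHf products) - sum(dHf reactants).

# Hypothetical heats of formation (kcal/mol); not real computed data.
HEAT_OF_FORMATION = {
    "tertiary cation": 169.0,
    "primary cation": 183.0,
    "alcohol": -74.7,
    "H2O": -57.8,
    "H+": 0.0,     # hypothetical bookkeeping reference
}

def reaction_enthalpy(reactants, products):
    return (sum(HEAT_OF_FORMATION[p] for p in products)
            - sum(HEAT_OF_FORMATION[r] for r in reactants))

# Protonation/ionization to the tertiary vs. primary carbocation:
dH_tert = reaction_enthalpy(["alcohol", "H+"], ["tertiary cation", "H2O"])
dH_prim = reaction_enthalpy(["alcohol", "H+"], ["primary cation", "H2O"])
print(dH_tert < dH_prim)   # tertiary path lower in enthalpy in this sketch
```

With these hypothetical values, the tertiary-cation step is the lower-enthalpy path, the kind of quantitative support students would use to defend a proposed mechanism.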

  15. Emerging Neuromorphic Computing Architectures & Enabling Hardware for Cognitive Information Processing Applications

    DTIC Science & Technology

    2010-06-01

    Report period: April 2009 – January 2010. The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications ... belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored ...

  16. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  17. A New Biogeochemical Computational Framework Integrated within the Community Land Model

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.

    2012-12-01

    Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate changes. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land model (CLM), however, faces three major challenges: 1) extensive efforts in modifying modeling structures and rewriting computer programs to incorporate biogeochemical processes with increasing complexity, 2) expensive computational cost to solve the governing equations due to numerical stiffness inherited from large variations in the rates of biogeochemical processes, and 3) lack of an efficient framework to systematically evaluate various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test out different mechanistic process representations and datasets and gain new insight on the behavior of the terrestrial ecosystems in response to climate change in a systematic way.
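The proposed stiffness remedy, replacing fast processes with algebraic equations and decoupling them from slow ones, can be sketched with a hypothetical two-species system (this is an illustration of the general technique, not the CLM framework's code):

```python
# Fast/slow decoupling sketch (hypothetical two-pool system): a fast
# reversible exchange A <-> B is replaced by its algebraic equilibrium,
# so only the slow loss process is integrated, with a large time step.

K_EQ = 4.0      # hypothetical fast-equilibrium ratio B/A
K_SLOW = 0.1    # hypothetical slow first-order loss rate of B (1/day)

def step(total, dt):
    """Advance total mass of A+B; fast partition solved algebraically."""
    a = total / (1.0 + K_EQ)         # equilibrium partition replaces stiff ODE
    b = total - a
    return total - K_SLOW * b * dt   # integrate only the slow process

total = 10.0
for _ in range(100):                 # 100 steps of dt = 0.1 day
    total = step(total, 0.1)
print(round(total, 3))
```

The stable step size is now set by the slow rate alone; the stiff fast exchange never enters the time integration, which is the intent of the criteria described above.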

  18. Computer Augmented Video Education.

    ERIC Educational Resources Information Center

    Sousa, M. B.

    1979-01-01

    Describes project CAVE (Computer Augmented Video Education), an ongoing effort at the U.S. Naval Academy to present lecture material on videocassette tape, reinforced by drill and practice through an interactive computer system supported by a 12 channel closed circuit television distribution and production facility. (RAO)

  19. Computer Guided Instructional Design.

    ERIC Educational Resources Information Center

    Merrill, M. David; Wood, Larry E.

    1984-01-01

    Describes preliminary efforts to create the Lesson Design System, a computer-guided instructional design system written in Pascal for Apple microcomputers. Its content outline, strategy, display, and online lesson editors correspond roughly to instructional design phases of content and strategy analysis, display creation, and computer programing…

  20. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...

  1. Computer Modeling of Acceleration Effects on Cerebral Oxygen Saturation

    DTIC Science & Technology

    2007-04-01

    Acceleration poses a significant physiological threat to high-performance aircraft pilots. Noninvasive cerebral measurements that penetrate the cranium and enter the cerebral cortex (Hongo et al.) make such monitoring possible. The primary focus of this effort was to build a model of acceleration effects on cerebral oxygen saturation.

  2. Langley's Computational Efforts in Sonic-Boom Softening of the Boeing HSCT

    NASA Technical Reports Server (NTRS)

    Fouladi, Kamran

    1999-01-01

    NASA Langley's computational efforts in the sonic-boom softening of the Boeing high-speed civil transport are discussed in this paper. In these efforts, an optimization process using a higher order Euler method for analysis was employed to reduce the sonic boom of a baseline configuration through fuselage camber and wing dihedral modifications. Fuselage modifications did not provide any improvements, but the dihedral modifications were shown to be an important tool for the softening process. The study also included aerodynamic and sonic-boom analyses of the baseline and some of the proposed "softened" configurations. Comparisons of two Euler methodologies and two propagation programs for sonic-boom predictions are also discussed in the present paper.

  3. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diachin, L F; Garaizar, F X; Henson, V E

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely: (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  4. Overview of NASA/OAST efforts related to manufacturing technology

    NASA Technical Reports Server (NTRS)

    Saunders, N. T.

    1976-01-01

    An overview of some of NASA's current efforts related to manufacturing technology and some possible directions for the future are presented. The topics discussed are: computer-aided design, composite structures, and turbine engine components.

  5. Data Network Weather Service Reporting - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Frey

    2012-08-30

    A final report is made of a three-year effort to develop a new forecasting paradigm for computer network performance. This effort was made in coordination with Fermilab's construction of its e-Weather Center.

  6. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  7. Optimizing R with SparkR on a commodity cluster for biomedical research.

    PubMed

    Sedlmayr, Martin; Würfl, Tobias; Maier, Christian; Häberle, Lothar; Fasching, Peter; Prokosch, Hans-Ulrich; Christoph, Jan

    2016-12-01

    Medical researchers are challenged today by the enormous amount of data collected in healthcare. Analysis methods such as genome-wide association studies (GWAS) are often computationally intensive and thus require enormous resources to be performed in a reasonable amount of time. While dedicated clusters and public clouds may deliver the desired performance, their use requires upfront financial efforts or anonymous data, which is often not possible for preliminary or occasional tasks. We explored the possibilities to build a private, flexible cluster for processing scripts in R based on commodity, non-dedicated hardware of our department. For this, a GWAS-calculation in R on a single desktop computer, a Message Passing Interface (MPI)-cluster, and a SparkR-cluster were compared with regards to the performance, scalability, quality, and simplicity. The original script had a projected runtime of three years on a single desktop computer. Optimizing the script in R already yielded a significant reduction in computing time (2 weeks). By using R-MPI and SparkR, we were able to parallelize the computation and reduce the time to less than three hours (2.6 h) on already available, standard office computers. While MPI is a proven approach in high-performance clusters, it requires rather static, dedicated nodes. SparkR and its Hadoop siblings allow for a dynamic, elastic environment with automated failure handling. SparkR also scales better with the number of nodes in the cluster than MPI due to optimized data communication. R is a popular environment for clinical data analysis. The new SparkR solution offers elastic resources and allows supporting big data analysis using R even on non-dedicated resources with minimal change to the original code. To unleash the full potential, additional efforts should be invested to customize and improve the algorithms, especially with regards to data distribution. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
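The speedup described above rests on GWAS being embarrassingly parallel: each variant is tested independently, so the per-variant work can simply be mapped across workers. A Python sketch of that structure (hypothetical data; a single-machine stand-in for the SparkR/MPI setup, where the map would be distributed across nodes):

```python
# Embarrassingly parallel GWAS-style sketch (hypothetical data): each genetic
# variant gets an independent association statistic, so the per-variant work
# is a plain map. Spark or MPI would distribute this map across a cluster;
# a local pool shows the same structure on one machine.
from concurrent.futures import ThreadPoolExecutor
import random

random.seed(42)
N_SUBJECTS, N_VARIANTS = 200, 50
phenotype = [random.gauss(0, 1) for _ in range(N_SUBJECTS)]
genotypes = [[random.choice((0, 1, 2)) for _ in range(N_SUBJECTS)]
             for _ in range(N_VARIANTS)]

def association(variant):
    """Per-variant statistic: correlation of genotype with phenotype."""
    g = genotypes[variant]
    n = N_SUBJECTS
    mg = sum(g) / n
    mp = sum(phenotype) / n
    cov = sum((gi - mg) * (pi - mp) for gi, pi in zip(g, phenotype)) / n
    vg = sum((gi - mg) ** 2 for gi in g) / n
    vp = sum((pi - mp) ** 2 for pi in phenotype) / n
    return cov / (vg * vp) ** 0.5

with ThreadPoolExecutor(max_workers=4) as pool:
    stats = list(pool.map(association, range(N_VARIANTS)))
print(len(stats))
```

Because variants share no state, the same `association` function could be shipped unchanged to Spark executors or MPI ranks, which is why the authors saw near-linear scaling with node count.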

  8. Implementing Equal Access Computer Labs.

    ERIC Educational Resources Information Center

    Clinton, Janeen; And Others

    This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…

  9. The Effort Paradox: Effort Is Both Costly and Valued.

    PubMed

    Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y

    2018-04-01

    According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Computational Issues Associated with Temporally Deforming Geometries Such as Thrust Vectoring Nozzles

    NASA Technical Reports Server (NTRS)

    Boyalakuntla, Kishore; Soni, Bharat K.; Thornburg, Hugh J.; Yu, Robert

    1996-01-01

    During the past decade, computational simulation of fluid flow around complex configurations has progressed significantly and many notable successes have been reported; however, unsteady, time-dependent solutions are not easily obtainable. The present effort involves unsteady, time-dependent simulation of temporally deforming geometries. Grid generation for a complex configuration can be a time-consuming process, and temporally varying geometries necessitate the regeneration of such grids for every time step. Traditional grid generation techniques have been tried and demonstrated to be inadequate for such simulations. Non-Uniform Rational B-spline (NURBS) based techniques provide a compact and accurate representation of the geometry. This definition can be coupled with a distribution mesh for a user-defined spacing. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. A thrust vectoring nozzle has been chosen to demonstrate the capability, as it is of current interest in the aerospace industry for better maneuverability of fighter aircraft in close combat and in post-stall regimes. This current effort is the first step toward multidisciplinary design optimization, which involves coupling the aerodynamic, heat transfer, and structural analysis techniques. Applications include simulation of temporally deforming bodies and aeroelastic problems.
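The compactness claim for NURBS-type representations can be made concrete with the simplest case, a rational quadratic Bezier curve, which represents a circular arc exactly with just three control points and weights (this is a generic textbook construction, not the paper's code):

```python
# Minimal rational-curve sketch: a rational quadratic Bezier with standard
# control points and weights reproduces a quarter of the unit circle exactly,
# so the geometry needs no dense point list to be regenerated each time step.
import math

def rational_bezier2(t, pts, wts):
    """Evaluate a rational quadratic Bezier curve at parameter t in [0, 1]."""
    b = [(1 - t) ** 2, 2 * t * (1 - t), t ** 2]      # Bernstein basis
    denom = sum(bi * wi for bi, wi in zip(b, wts))
    x = sum(bi * wi * p[0] for bi, wi, p in zip(b, wts, pts)) / denom
    y = sum(bi * wi * p[1] for bi, wi, p in zip(b, wts, pts)) / denom
    return x, y

# Quarter of the unit circle: classic control points and weights.
pts = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
wts = [1.0, math.sqrt(0.5), 1.0]
x, y = rational_bezier2(0.5, pts, wts)
print(math.hypot(x, y))   # distance from origin; points lie on the circle
```

Moving the boundary then means moving a handful of control points, after which any desired surface grid can be re-evaluated from the same compact definition.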

  11. X-Ray Astronomy

    NASA Technical Reports Server (NTRS)

    Wu, S. T.

    2000-01-01

    Dr. S. N. Zhang has led a seven-member group (Dr. Yuxin Feng, Mr. Xuejun Sun, Mr. Yongzhong Chen, Mr. Jun Lin, Mr. Yangsen Yao, and Ms. Xiaoling Zhang). This group has carried out the following activities: continued data analysis from the space astrophysics missions CGRO, RXTE, ASCA, and Chandra. Significant scientific results have been produced from their work. They discovered the three-layered accretion disk structure around black holes in X-ray binaries; their paper on this discovery is to appear in the prestigious Science magazine. They have also developed a new method for energy spectral analysis of black hole X-ray binaries; four papers on this topic were presented at the most recent Atlanta AAS meeting. They have also carried out Monte Carlo simulations of X-ray detectors, in support of the hardware development efforts at Marshall Space Flight Center (MSFC). These computation-intensive simulations have been carried out entirely on the computers at UAH. They have also carried out extensive simulations for astrophysical applications, taking advantage of the Monte Carlo simulation codes developed previously at MSFC and further improved at UAH for detector simulations. One refereed paper and one contribution to conference proceedings have resulted from this effort.

  12. 76 FR 28443 - President's National Security Telecommunications Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... Government's use of cloud computing; the Federal Emergency Management Agency's NS/EP communications... Commercial Satellite Mission Assurance; and the way forward for the committee's cloud computing effort. The...

  13. Home

    Science.gov Websites

    System Award for developing a tool that has had a lasting influence on computing. Project Jupyter evolved from IPython, an effort pioneered by Fernando Pérez.

  14. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

  15. Development and application of computational aerothermodynamics flowfield computer codes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj

    1994-01-01

    Research was performed in the area of computational modeling and application of hypersonic, high-enthalpy, thermo-chemical nonequilibrium flow (aerothermodynamics) problems. A number of computational fluid dynamics (CFD) codes were developed and applied to simulate high-altitude rocket plumes, the Aeroassist Flight Experiment (AFE), hypersonic base flow for planetary probes, the single expansion ramp nozzle (SERN) connected with the National Aerospace Plane, hypersonic drag devices, hypersonic ramp flows, ballistic range models, shock tunnel facility nozzles, transient and steady flows in the shock tunnel facility, arc-jet flows, thermochemical nonequilibrium flows around simple and complex bodies, axisymmetric ionized flows of interest to re-entry, unsteady shock-induced combustion phenomena, high-enthalpy pulsed facility simulations, and unsteady shock boundary layer interactions in shock tunnels. Computational modeling involved developing appropriate numerical schemes for the flows of interest and developing, applying, and validating appropriate thermochemical process models. As part of improving the accuracy of the numerical predictions, adaptive grid algorithms were explored, and a user-friendly, self-adaptive code (SAGE) was developed. Aerothermodynamic flows of interest included energy transfer due to strong radiation, and a significant level of effort was spent in developing computational codes for calculating radiation and for radiation modeling. In addition, computational tools were developed and applied to predict the radiative heat flux and spectra that reach the model surface.

  16. A neuronal model of a global workspace in effortful cognitive tasks.

    PubMed

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  17. A specific role for serotonin in overcoming effort cost.

    PubMed

    Meyniel, Florent; Goodwin, Guy M; Deakin, Jf William; Klinge, Corinna; MacFadyen, Christine; Milligan, Holly; Mullings, Emma; Pessiglione, Mathias; Gaillard, Raphaël

    2016-11-08

    Serotonin is implicated in many aspects of behavioral regulation. Theoretical attempts to unify the multiple roles assigned to serotonin proposed that it regulates the impact of costs, such as delay or punishment, on action selection. Here, we show that serotonin also regulates other types of action costs such as effort. We compared behavioral performance in 58 healthy humans treated during 8 weeks with either placebo or the selective serotonin reuptake inhibitor escitalopram. The task involved trading handgrip force production against monetary benefits. Participants in the escitalopram group produced more effort and thereby achieved a higher payoff. Crucially, our computational analysis showed that this effect was underpinned by a specific reduction of effort cost, and not by any change in the weight of monetary incentives. This specific computational effect sheds new light on the physiological role of serotonin in behavioral regulation and on the clinical effect of drugs for depression. ISRCTN75872983.

  18. DARPA-funded efforts in the development of novel brain-computer interface technologies.

    PubMed

    Miranda, Robbin A; Casebeer, William D; Hein, Amy M; Judy, Jack W; Krotkov, Eric P; Laabs, Tracy L; Manzo, Justin E; Pankratz, Kent G; Pratt, Gill A; Sanchez, Justin C; Weber, Douglas J; Wheeler, Tracey L; Ling, Geoffrey S F

    2015-04-15

    The Defense Advanced Research Projects Agency (DARPA) has funded innovative scientific research and technology developments in the field of brain-computer interfaces (BCI) since the 1970s. This review highlights some of DARPA's major advances in the field of BCI, particularly those made in recent years. Two broad categories of DARPA programs are presented with respect to the ultimate goals of supporting the nation's warfighters: (1) BCI efforts aimed at restoring neural and/or behavioral function, and (2) BCI efforts aimed at improving human training and performance. The programs discussed are synergistic and complementary to one another, and, moreover, promote interdisciplinary collaborations among researchers, engineers, and clinicians. Finally, this review includes a summary of some of the remaining challenges for the field of BCI, as well as the goals of new DARPA efforts in this domain. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Computer Technology and Social Issues.

    ERIC Educational Resources Information Center

    Garson, G. David

    Computing involves social issues and political choices. Issues such as privacy, computer crime, gender inequity, disemployment, and electronic democracy versus "Big Brother" are addressed in the context of efforts to develop a national public policy for information technology. A broad range of research and case studies are examined in an…

  20. Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.

  1. Experimental Evaluation and Workload Characterization for High-Performance Computer Architectures

    NASA Technical Reports Server (NTRS)

    El-Ghazawi, Tarek A.

    1995-01-01

    This research was conducted in the context of the Joint NSF/NASA Initiative on Evaluation (JNNIE). JNNIE is an inter-agency research program that goes beyond typical benchmarking to provide in-depth evaluations and an understanding of the factors that limit the scalability of high-performance computing systems. Many NSF and NASA centers have participated in the effort. Our research effort was an integral part of implementing JNNIE in the context of the NASA ESS grand challenge applications. Our work under this program comprised three distinct but related activities: the evaluation of NASA ESS high-performance computing testbeds using the wavelet decomposition application; the evaluation of NASA ESS testbeds using astrophysical simulation applications; and the development of an experimental model for workload characterization for understanding workload requirements. In this report, we provide a summary of findings covering all three parts, a list of the publications that resulted from this effort, and three appendices with the details of each study, using a key publication developed under the respective work.

  2. Quadratic Programming for Allocating Control Effort

    NASA Technical Reports Server (NTRS)

    Singh, Gurkirpal

    2005-01-01

    A computer program calculates an optimal allocation of control effort in a system that includes redundant control actuators. The program implements an iterative (but otherwise single-stage) algorithm of the quadratic-programming type. In general, in the quadratic-programming problem, one seeks the values of a set of variables that minimize a quadratic cost function, subject to a set of linear equality and inequality constraints. In this program, the cost function combines control effort (typically quantified in terms of energy or fuel consumed) and control residuals (differences between commanded and sensed values of variables to be controlled). In comparison with prior control-allocation software, this program offers approximately equal accuracy but much greater computational efficiency. In addition, this program offers flexibility, robustness to actuation failures, and a capability for selective enforcement of control requirements. The computational efficiency of this program makes it suitable for such complex, real-time applications as controlling redundant aircraft actuators or redundant spacecraft thrusters. The program is written in the C language for execution in a UNIX operating system.
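    The quadratic-programming formulation described above can be illustrated with a minimal bound-constrained least-squares allocator solved by projected gradient descent. This is a generic sketch under invented data, not the program's actual algorithm: the effectiveness matrix B, the commanded accelerations d, and the actuator limits are all assumptions for the example.

```python
import numpy as np

def allocate(B, d, lo, hi, reg=1e-6, iters=5000):
    """Find actuator commands u minimizing ||B u - d||^2 + reg * ||u||^2
    subject to lo <= u <= hi, by projected gradient descent."""
    u = np.clip(np.zeros(B.shape[1]), lo, hi)
    # A step of 1/L, with L the Lipschitz constant of the gradient, is safe.
    step = 1.0 / (2.0 * (np.linalg.norm(B, 2) ** 2 + reg))
    for _ in range(iters):
        grad = 2.0 * B.T @ (B @ u - d) + 2.0 * reg * u
        u = np.clip(u - step * grad, lo, hi)   # project onto the box constraints
    return u

# Three redundant actuators driving two controlled axes (invented numbers).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
d = np.array([1.0, 1.0])
u = allocate(B, d, lo=0.0, hi=1.0)
```

    The small regularization term plays the role of the control-effort cost in the abstract's combined cost function; the clipping step enforces the inequality constraints.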

  3. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johanna H Oxstrand; Katya L Le Blanc

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially for errors associated with procedure use. As a step toward the goal of improving procedure-use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room, mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design computer-based procedures to do this. The underlying philosophy of the research effort is “Stop – Start – Continue”: what features from the use of paper-based procedures should not be incorporated (Stop), what should be kept (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use to help identify desirable features for computer-based procedure prototypes.
    Affordances such as note-taking, markups, sharing procedures between coworkers, the use of multiple procedures at once, etc., were considered. The model describes which affordances associated with paper-based procedures should be transferred to computer-based procedures, as well as which features should not be incorporated. The model also provides a means to identify what new features, not present in paper-based procedures, need to be added to computer-based procedures to further enhance performance. The next step is to use the requirements and specifications to develop concepts and prototypes of computer-based procedures. User tests and other data collection efforts will be conducted to ensure that the real issues with field procedures and their usage are being addressed and solved in the best manner possible. This paper describes the baseline study, the construction of the model of procedure use, and the requirements and specifications for computer-based procedures that were developed based on the model. It also addresses how the model and the insights gained from it were used to develop concepts and prototypes for computer-based procedures.

  4. Influence of lifestyle patterns on perceptions of obesity and overweight among expatriates in Abha city of Kingdom of Saudi Arabia

    PubMed Central

    Zaman, Gaffar Sarwar

    2015-01-01

    Background: We evaluated how lifestyle patterns such as watching TV, working with a computer, and idle sitting time influence perceptions of obesity, and how beliefs about overweight are associated with obesity and overweight, among expatriates in Abha. Materials and Methods: The method used in this study was a cross-sectional survey with a self-administered paper-based questionnaire. The survey collected information on lifestyle choices and the risk factors that contribute to obesity. In addition, height and weight were measured. Results: A large proportion of our study subjects spent over 2 h/day without any physical activity, specifically accounting for over 2 h/day each in viewing TV, using a computer, and spending idle time. This lack of physical activity was significantly associated with overweight. While the overweight subjects were aware of a very wide range of options for treating their condition, a significant number believed that self-directed effort in managing their diet and exercise regimen was the best way to reduce their overweight. Interestingly, very few overweight subjects considered medication or surgery a potential therapeutic option, and 75% of the overweight subjects considered overweight to be of no or only slight concern for wellbeing. Conclusions: Overweight and obesity among expatriates within Saudi Arabia pose an important public health problem. The lack of awareness about the potential impact of obesity on health and optimal treatment options is a serious concern, which needs to be addressed by appropriate public health programs at the national level. PMID:26283823

  5. Dynamic Interaction of Long Suspension Bridges with Running Trains

    NASA Astrophysics Data System (ADS)

    XIA, H.; XU, Y. L.; CHAN, T. H. T.

    2000-10-01

    This paper presents an investigation of the dynamic interaction of long suspension bridges with running trains. A three-dimensional finite element model is used to represent a long suspension bridge. Each 4-axle vehicle in a train is modelled by a 27-degree-of-freedom dynamic system. The dynamic interaction between the bridge and train is realized through the contact forces between the wheels and track. By applying a mode superposition technique to the bridge only and taking the measured track irregularities as known quantities, the number of degrees of freedom (d.o.f.) of the bridge-train system is significantly reduced and the coupled equations of motion are efficiently solved. The proposed formulation and the associated computer program are then applied to a real long suspension bridge carrying a railway within the bridge deck. The dynamic response of the bridge-train system and the derail and offload factors related to the running safety of the train are computed. The results show that the formulation presented in this paper can predict the dynamic behaviors of both bridge and train well, with reasonable computational effort, and that the dynamic interaction between the long suspension bridge and train is not significant.
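    The mode-superposition reduction applied to the bridge can be sketched as follows. This is a generic illustration with an arbitrary stiffness matrix and a unit mass matrix (an assumption for the sketch, not the bridge model itself): projecting onto a truncated set of mass-normalized mode shapes diagonalizes the stiffness, so only a few decoupled modal coordinates need to be integrated.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 4                      # full d.o.f. count, retained modes
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)       # symmetric positive-definite stiffness

w2, Phi = np.linalg.eigh(K)       # squared modal frequencies and mode shapes
Phi_r = Phi[:, :m]                # keep the lowest m modes only

K_modal = Phi_r.T @ K @ Phi_r     # reduced stiffness: diagonal, entries w2[:m]
```

    With a non-trivial mass matrix the same reduction uses the generalized eigenproblem K φ = ω² M φ, but the principle, replacing n coupled equations by m decoupled ones, is identical.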

  6. Time accurate application of the MacCormack 2-4 scheme on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Hudson, Dale A.; Long, Lyle N.

    1995-01-01

    Many recent computational efforts in turbulence and acoustics research have used higher-order numerical algorithms. One popular method has been the explicit MacCormack 2-4 scheme, which is second-order accurate in time, fourth-order accurate in space, and stable for CFL numbers below 2/3. Current research has shown that the method can give accurate results but exhibits significant Gibbs phenomena at sharp discontinuities. The impact of adding Jameson-type second-, third-, and fourth-order artificial viscosity was examined here. Category 2 problems, the nonlinear traveling wave and the Riemann problem, were computed using a CFL number of 0.25. This research found that dispersion errors can be significantly reduced or nearly eliminated by using a combination of second- and third-order terms in the damping. Use of second- and fourth-order terms reduced the magnitude of the dispersion errors, but not as effectively as the second- and third-order combination. The program was coded using Thinking Machines' CM Fortran, a variant of Fortran 90/High Performance Fortran, and was executed on a 2K CM-200. Simple extrapolation boundary conditions were used for both problems.
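    For reference, the predictor-corrector structure underlying the scheme can be sketched with the classic second-order MacCormack method (the 2-2 scheme, not the 2-4 variant discussed above) applied to linear advection with periodic boundaries. The grid size and CFL number below are arbitrary choices for the sketch.

```python
import numpy as np

def maccormack_advect(u, c, steps):
    """Advance u_t + a*u_x = 0 with the classic MacCormack scheme.
    c is the CFL number a*dt/dx; periodic boundaries via np.roll."""
    for _ in range(steps):
        # predictor: forward difference of the flux
        us = u - c * (np.roll(u, -1) - u)
        # corrector: backward difference on the predicted field, then average
        u = 0.5 * (u + us - c * (us - np.roll(us, 1)))
    return u

N = 200
x = np.linspace(0.0, 1.0, N, endpoint=False)
u0 = np.sin(2 * np.pi * x)
c = 0.5                                       # CFL number
u = maccormack_advect(u0.copy(), c, steps=int(N / c))   # one full period
```

    For a linear flux this reduces to the Lax-Wendroff scheme; the Gibbs oscillations the abstract mentions appear when the initial data contain sharp discontinuities, which is what the artificial-viscosity terms are meant to damp.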

  7. Concurrent processing simulation of the space station

    NASA Technical Reports Server (NTRS)

    Gluck, R.; Hale, A. L.; Sunkel, John W.

    1989-01-01

    The development of a new capability for the time-domain simulation of multibody dynamic systems, and its application to the study of large-angle rotational maneuvers of the Space Station, is described. The effort was divided into three sequential tasks, each requiring significant advancement of the state of the art: (1) the development of an explicit mathematical model, via symbol manipulation, of a flexible multibody dynamic system; (2) the development of a methodology for balancing the computational load of an explicit mathematical model for concurrent processing; and (3) the implementation and successful simulation of the above on a prototype Custom Architectured Parallel Processing System (CAPPS) containing eight processors. The throughput rate achieved by the CAPPS, operating at only 70 percent efficiency, was 3.9 times greater than that obtained sequentially by the IBM 3090 supercomputer simulating the same problem. More significantly, analysis of the results leads to the conclusion that the relative cost effectiveness of concurrent versus sequential digital computation will grow substantially as the computational load increases. This is a welcome development in an era when very complex and cumbersome mathematical models of large space vehicles must be used as substitutes for full-scale testing, which has become impractical.

  8. Deformation of Soft Tissue and Force Feedback Using the Smoothed Particle Hydrodynamics

    PubMed Central

    Liu, Xuemei; Wang, Ruiyi; Li, Yunhua; Song, Dongdong

    2015-01-01

    We study the deformation and haptic feedback of soft tissue in virtual surgery based on a liver model, using a force-feedback device named PHANTOM OMNI developed by SensAble in the USA. Although a significant amount of research effort has been dedicated to simulating the behavior of soft tissue and implementing force feedback, it remains a challenging problem. This paper introduces a meshfree method for deformation simulation of soft tissue and force computation based on a viscoelastic mechanical model and smoothed particle hydrodynamics (SPH). First, the viscoelastic model captures the mechanical characteristics of soft tissue, which greatly improves realism. Second, SPH is meshless and self-adapting, providing higher precision than mesh-based methods for force-feedback computation. Finally, an SPH method based on a dynamic interaction area is proposed to improve the real-time performance of the simulation. The results reveal that the SPH methodology is suitable for simulating soft tissue deformation and calculating force feedback, and that SPH based on a dynamic local interaction area has significantly higher computational efficiency than standard SPH. Our algorithm has a bright prospect in the area of virtual surgery. PMID:26417380
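    The density summation at the core of any SPH method can be sketched in one dimension with the standard cubic-spline kernel. This is a generic illustration, unrelated to the paper's liver model; the particle spacing and smoothing length are invented for the example.

```python
import numpy as np

def cubic_spline_1d(r, h):
    """Standard 1D cubic-spline SPH kernel with support radius 2h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)                    # 1D normalization constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(x, mass, h):
    """Density at each particle: rho_i = sum_j m_j * W(x_i - x_j, h)."""
    dx = x[:, None] - x[None, :]
    return (mass * cubic_spline_1d(dx, h)).sum(axis=1)
```

    The "dynamic interaction area" idea in the paper amounts to restricting the j-sum to particles near the haptic tool rather than evaluating the full pairwise matrix as done here.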

  9. Near-Source Modeling Updates: Building Downwash & Near-Road

    EPA Science Inventory

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  10. 76 FR 17424 - President's National Security Telecommunications Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-29

    ... discuss and vote on the Communications Resiliency Report and receive an update on the cloud computing... Communications Resiliency Report III. Update on the Cloud Computing Scoping Effort IV. Closing Remarks Dated...

  11. Conjugate Gradient Algorithms For Manipulator Simulation

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Scheid, Robert E.

    1991-01-01

    Report discusses applicability of conjugate-gradient algorithms to computation of forward dynamics of robotic manipulators. Rapid computation of forward dynamics essential to teleoperation and other advanced robotic applications. Part of continuing effort to find algorithms meeting requirements for increased computational efficiency and speed. Method used for iterative solution of systems of linear equations.
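    The conjugate-gradient iteration discussed in the report has the following standard form for a symmetric positive-definite system. This is a textbook sketch, not the report's manipulator-dynamics code.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x              # residual
    p = r.copy()               # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)  # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # make the next direction A-conjugate
        rs = rs_new
    return x
```

    In the forward-dynamics setting the role of A is played by the (symmetric positive-definite) joint-space inertia matrix, which is why conjugate-gradient methods are a natural candidate.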

  12. Using Information Technology in Mathematics Education.

    ERIC Educational Resources Information Center

    Tooke, D. James, Ed.; Henderson, Norma, Ed.

    This collection of essays examines the history and impact of computers in mathematics and mathematics education from the early, computer-assisted instruction efforts through LOGO, the constructivist educational software for K-9 schools developed in the 1980s, to MAPLE, the computer algebra system for mathematical problem solving developed in the…

  13. Cooperation Support in Computer-Aided Authoring and Learning.

    ERIC Educational Resources Information Center

    Muhlhauser, Max; Rudebusch, Tom

    This paper discusses the use of Computer Supported Cooperative Work (CSCW) techniques for computer-aided learning (CAL); the work was started in the context of project Nestor, a joint effort of German universities about cooperative multimedia authoring/learning environments. There are four major categories of cooperation for CAL: author/author,…

  14. 2016 Annual Report - Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Jim; Papka, Michael E.; Cerny, Beth A.

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  15. Computer-Based Training: Capitalizing on Lessons Learned

    ERIC Educational Resources Information Center

    Bedwell, Wendy L.; Salas, Eduardo

    2010-01-01

    Computer-based training (CBT) is a methodology for providing systematic, structured learning; a useful tool when properly designed. CBT has seen a resurgence given the serious games movement, which is at the forefront of integrating primarily entertainment computer-based games into education and training. This effort represents a multidisciplinary…

  16. "Computer Science Can Feed a Lot of Dreams"

    ERIC Educational Resources Information Center

    Educational Horizons, 2014

    2014-01-01

    Pat Yongpradit is the director of education at Code.org. He leads all education efforts, including professional development and curriculum creation, and he builds relationships with school districts. Pat joined "Educational Horizons" to talk about why it is important to teach computer science--even for non-computer science teachers. This…

  17. Understanding the Internet.

    ERIC Educational Resources Information Center

    Oblinger, Diana

    The Internet is an international network linking hundreds of smaller computer networks in North America, Europe, and Asia. Using the Internet, computer users can connect to a variety of computers with little effort or expense. The potential for use by college faculty is enormous. The largest problem faced by most users is understanding what such…

  18. "Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.

    ERIC Educational Resources Information Center

    Brown, John Seely; And Others

    Interim work is documented describing efforts to modify computer techniques used to recognize and process English language requests to an instructional simulator. The conversion from a hand-coded to a table driven technique are described in detail. Other modifications to a simulation based computer assisted instruction program to allow a gaming…

  19. Pseudo-point transport technique: a new method for solving the Boltzmann transport equation in media with highly fluctuating cross sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhai, B.

    A new method for solving radiation transport problems is presented. The heart of the technique is a new cross-section processing procedure for the calculation of group-to-point and point-to-group cross-section sets. The method is ideally suited to problems that involve media with highly fluctuating cross sections, where the results of traditional multigroup calculations are beclouded by the group-averaging procedures employed. The extensive computational effort that would be required to numerically evaluate the double integrals in the multigroup treatment prohibits iterating to optimize the energy boundaries. On the other hand, the use of point-to-point techniques (as in the stochastic technique) is often prohibitively expensive due to the large computer storage requirement. The pseudo-point code is a hybrid of the two aforementioned methods (group-to-group and point-to-point), hence the name pseudo-point, that reduces the computational effort of the former and the large core requirements of the latter. The pseudo-point code generates the group-to-point or point-to-group transfer matrices and can be coupled with existing transport codes to calculate pointwise energy-dependent fluxes. This approach yields much more detail than is available from conventional energy-group treatments. Due to the speed of this code, several iterations can be performed (at affordable computing cost) to optimize the energy boundaries and the weighting functions. The pseudo-point technique is demonstrated by solving six problems, each depicting a certain aspect of the technique. The results are presented as flux versus energy at various spatial intervals. The sensitivity of the technique to the energy grid and the savings in computational effort are clearly demonstrated.
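    The group-averaging step that the pseudo-point method aims to improve on can be sketched as a flux-weighted collapse of pointwise cross sections. This is a generic illustration with invented data, not the code's actual processing; for a sharply fluctuating sigma, the result depends strongly on where the group boundaries fall, which is the motivation for iterating on them.

```python
import numpy as np

def collapse_to_groups(E_pts, sigma_pts, flux_pts, group_edges):
    """Flux-weighted group-averaged cross sections:
    sigma_g = sum(sigma_i * phi_i) / sum(phi_i) over the points in group g."""
    sigma_g = []
    for lo, hi in zip(group_edges[:-1], group_edges[1:]):
        mask = (E_pts >= lo) & (E_pts < hi)
        sigma_g.append((sigma_pts[mask] * flux_pts[mask]).sum()
                       / flux_pts[mask].sum())
    return np.array(sigma_g)
```

    A quick sanity check: when the pointwise cross section is constant, the collapse must return that constant regardless of the weighting flux.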

  20. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The primary challenge motivating this project is the widening gap between the ability to compute information and the ability to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis on only a small fraction of the data they calculate, resulting in a substantial likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnership with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  1. Web Service Model for Plasma Simulations with Automatic Post Processing and Generation of Visual Diagnostics*

    NASA Astrophysics Data System (ADS)

    Exby, J.; Busby, R.; Dimitrov, D. A.; Bruhwiler, D.; Cary, J. R.

    2003-10-01

    We present our design and initial implementation of a web service model for running particle-in-cell (PIC) codes remotely from a web browser interface. PIC codes have grown significantly in complexity and now often require parallel execution on multiprocessor computers, which in turn requires sophisticated post-processing and data analysis. A significant amount of time and effort is required for a physicist to develop all the necessary skills, at the expense of actually doing research. Moreover, parameter studies with a computationally intensive code justify the systematic management of results with an efficient way to communicate them among a group of remotely located collaborators. Our initial implementation uses the OOPIC Pro code [1], Linux, Apache, MySQL, Python, and PHP. The Interactive Data Language is used for visualization. [1] D.L. Bruhwiler et al., Phys. Rev. ST-AB 4, 101302 (2001). * This work is supported by DOE grant # DE-FG02-03ER83857 and by Tech-X Corp. ** Also University of Colorado.

  2. Unbiased Rare Event Sampling in Spatial Stochastic Systems Biology Models Using a Weighted Ensemble of Trajectories

    PubMed Central

    Donovan, Rory M.; Tapia, Jose-Juan; Sullivan, Devin P.; Faeder, James R.; Murphy, Robert F.; Dittrich, Markus; Zuckerman, Daniel M.

    2016-01-01

    The long-term goal of connecting scales in biological simulation can be facilitated by scale-agnostic methods. We demonstrate that the weighted ensemble (WE) strategy, initially developed for molecular simulations, applies effectively to spatially resolved cell-scale simulations. The WE approach runs an ensemble of parallel trajectories with assigned weights and uses a statistical resampling strategy of replicating and pruning trajectories to focus computational effort on difficult-to-sample regions. The method can also generate unbiased estimates of non-equilibrium and equilibrium observables, sometimes with significantly less aggregate computing time than would be possible using standard parallelization. Here, we use WE to orchestrate particle-based kinetic Monte Carlo simulations, which include spatial geometry (e.g., of organelles, plasma membrane) and biochemical interactions among mobile molecular species. We study a series of models exhibiting spatial, temporal and biochemical complexity and show that although WE has important limitations, it can achieve performance significantly exceeding standard parallel simulation—by orders of magnitude for some observables. PMID:26845334
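    The core WE resampling step — replicating high-weight trajectories and pruning low-weight ones while conserving total probability — can be sketched as follows (a simplified illustration with hypothetical bin labels and a fixed per-bin target, not the authors' implementation):

    ```python
    import random

    def we_resample(walkers, target_per_bin=4):
        """One weighted-ensemble resampling pass: within each occupied bin,
        replicate high-weight trajectories and prune low-weight ones down to
        target_per_bin walkers, conserving the bin's total weight."""
        bins = {}
        for state, weight, bin_id in walkers:
            bins.setdefault(bin_id, []).append((state, weight))
        new_walkers = []
        for bin_id, members in bins.items():
            total = sum(w for _, w in members)
            states = [s for s, _ in members]
            weights = [w for _, w in members]
            # Survivors are drawn proportionally to weight, then each carries
            # an equal share of the bin's weight (a simplified resampling rule).
            survivors = random.choices(states, weights=weights, k=target_per_bin)
            share = total / target_per_bin
            new_walkers.extend((s, share, bin_id) for s in survivors)
        return new_walkers

    walkers = [("a", 0.5, 0), ("b", 0.3, 0), ("c", 0.2, 1)]
    resampled = we_resample(walkers)
    print(sum(w for _, w, _ in resampled))  # total weight stays 1.0 (up to fp error)
    ```

    Conserving the total weight at every resampling step is what keeps the resulting estimates of observables unbiased.
    
    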

  3. Computer Modeling of High-Intensity Cs-Sputter Ion Sources

    NASA Astrophysics Data System (ADS)

    Brown, T. A.; Roberts, M. L.; Southon, J. R.

    The grid-point mesh program NEDLab has been used to model the interior of the high-intensity Cs-sputter source used in routine operations at the Center for Accelerator Mass Spectrometry (CAMS), with the goal of improving negative ion output. NEDLab has several features that are important to realistic modeling of such sources. First, space-charge effects are incorporated in the calculations through an automated ion-trajectories/Poisson-electric-fields successive-iteration process. Second, space-charge distributions can be averaged over successive iterations to suppress model instabilities. Third, space-charge constraints on ion emission from surfaces can be incorporated through Child's-law-based algorithms. Fourth, the energy of ions emitted from a surface can be randomly chosen from within a thermal energy distribution. And finally, ions can be emitted from a surface at randomized angles. The results of our modeling effort indicate that significant modification of the interior geometry of the source will double Cs+ ion production from our spherical ionizer and produce a significant increase in negative ion output from the source.
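    The Child's-law constraint mentioned above caps the current density that space charge allows a surface to emit; a minimal sketch of the planar Child-Langmuir law (illustrative voltage and gap values, not CAMS source parameters):

    ```python
    import math

    EPS0 = 8.8541878128e-12  # vacuum permittivity (F/m)
    Q_E = 1.602176634e-19    # elementary charge (C)

    def child_langmuir_j(voltage, gap, mass):
        """Space-charge-limited current density (A/m^2) of a planar diode:
        J = (4/9) * eps0 * sqrt(2q/m) * V^(3/2) / d^2 (Child-Langmuir law)."""
        return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * Q_E / mass) * voltage**1.5 / gap**2

    # Illustrative values only: singly charged Cs+ (~132.9 u) across a
    # hypothetical 5 kV, 5 mm extraction gap.
    m_cs = 132.905 * 1.66053906660e-27  # kg
    j = child_langmuir_j(5e3, 5e-3, m_cs)
    print(f"{j:.1f} A/m^2")
    ```
    
    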

  4. Compressive Spectral Method for the Simulation of the Nonlinear Gravity Waves

    PubMed Central

    Bayındır, Cihan

    2016-01-01

    In this paper an approach for decreasing the computational effort required for spectral simulations of fully nonlinear ocean waves is introduced. The proposed approach utilizes the compressive sampling algorithm and depends on the idea of using a smaller number of spectral components compared to the classical spectral method. After performing the time integration with a smaller number of spectral components and using the compressive sampling technique, it is shown that the ocean wave field can be reconstructed with significantly better efficiency than the classical spectral method. For the sparse ocean wave model in the frequency domain, fully nonlinear ocean waves with a JONSWAP spectrum are considered. By implementation of a high-order spectral method it is shown that the proposed methodology can simulate linear and fully nonlinear ocean waves with negligible difference in accuracy and with great efficiency, reducing the computation time significantly, especially for long time evolutions. PMID:26911357
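    The premise — that a wave field sparse in the frequency domain is fully captured by a small number of spectral components — can be illustrated with a toy example (this sketches only the sparsity idea, not the paper's compressive-sampling reconstruction or its JONSWAP model):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 256  # full spectral resolution
    K = 5    # active spectral components: the field is sparse in Fourier space

    freqs = rng.choice(N // 2 - 1, size=K, replace=False) + 1
    x = sum(np.cos(2 * np.pi * f * np.arange(N) / N) for f in freqs)

    X = np.fft.fft(x)
    # Keep only the largest-magnitude spectral bins (2K: the +/- frequency
    # pairs), discarding the rest -- the "smaller number of components".
    keep = np.argsort(np.abs(X))[-2 * K:]
    X_sparse = np.zeros_like(X)
    X_sparse[keep] = X[keep]
    x_rec = np.fft.ifft(X_sparse).real

    rel_err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)
    print(f"kept {2 * K}/{N} components, relative error {rel_err:.2e}")
    ```

    Because the toy field is exactly K-sparse, 2K of 256 bins reconstruct it to machine precision; compressive sampling goes further by recovering such a sparse spectrum from a reduced set of samples.
    
    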

  5. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur; Modiano, David

    1995-01-01

    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  6. Innovative computer-aided methods for the discovery of new kinase ligands.

    PubMed

    Abuhammad, Areej; Taha, Mutasem

    2016-04-01

    Recent evidence points to significant roles played by protein kinases in cell signaling and cellular proliferation. Faulty protein kinases are involved in cancer, diabetes and chronic inflammation. Efforts are continuously carried out to discover new inhibitors for selected protein kinases. In this review, we discuss two new computer-aided methodologies we developed to mine virtual databases for new bioactive compounds. One method is ligand-based exploration of the pharmacophoric space of inhibitors of any particular biotarget followed by quantitative structure-activity relationship-based selection of the best pharmacophore(s). The second approach is structure-based, assuming that potent ligands come into contact with binding-site spots distinct from those contacted by weakly potent ligands. Both approaches yield pharmacophores useful as 3D search queries for the discovery of new bioactive (kinase) inhibitors.

  7. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
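    For a toy network, the exact shortest-path-length distribution can be obtained by complete enumeration, as sketched below; conditional factoring reaches the same exact distribution while decomposing the network to avoid most of this exponential work (hypothetical arc distributions, not an example from the paper):

    ```python
    from itertools import product

    # Toy directed network s -> t; each arc length is a discrete random variable.
    # Complete enumeration costs the product of all arc supports; conditional
    # factoring conditions on a few key arcs and decomposes the rest.
    arcs = {
        ("s", "a"): [(1, 0.5), (3, 0.5)],
        ("a", "t"): [(1, 0.7), (2, 0.3)],
        ("s", "t"): [(3, 0.6), (5, 0.4)],
    }

    def shortest_s_t(length):
        # Only two s->t routes exist in this toy network.
        return min(length[("s", "a")] + length[("a", "t")], length[("s", "t")])

    names = list(arcs)
    dist = {}
    for combo in product(*(arcs[a] for a in names)):
        length = {a: v for a, (v, _) in zip(names, combo)}
        prob = 1.0
        for _, p in combo:
            prob *= p
        d = shortest_s_t(length)
        dist[d] = dist.get(d, 0.0) + prob
    print(sorted(dist.items()))  # exact distribution of the shortest path length
    ```
    
    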

  8. Productivity increase through implementation of CAD/CAE workstation

    NASA Technical Reports Server (NTRS)

    Bromley, L. K.

    1985-01-01

    The Tracking and Communication Division's computer-aided design/computer-aided engineering system is now operational. The system is utilized in an effort to automate certain tasks that were previously performed manually. These tasks include detailed test configuration diagrams of systems under certification test in the ESTL, floorplan layouts of future planned laboratory reconfigurations, and other graphical documentation of division activities. The significant time savings achieved with this CAD/CAE system are examined: (1) input of drawings and diagrams; (2) editing of initial drawings; (3) accessibility of the data; and (4) added versatility. It is shown that the Applicon CAD/CAE system, with its ease of input and editing, the accessibility of data, and its added versatility, has made more efficient many of the necessary but often time-consuming tasks associated with engineering design and testing.

  9. Enhancing survey data collection among youth and adults: use of handheld and laptop computers.

    PubMed

    Bobula, James A; Anderson, Lori S; Riesch, Susan K; Canty-Mitchell, Janie; Duncan, Angela; Kaiser-Krueger, Heather A; Brown, Roger L; Angresano, Nicole

    2004-01-01

    Tobacco use, alcohol and other drug use, early sexual behavior, dietary practices, physical inactivity, and activities that contribute to unintentional and intentional injuries are a significant threat to the health of young people. These behaviors have immediate and long-term consequences and contribute to diminished health, educational, and social outcomes. Research suggests that health risk behaviors exhibited during adolescence and adulthood have their origins earlier in childhood and that preventive interventions are less successful after the risk behaviors have begun. Therefore, efforts to prevent health risk behaviors are best initiated in late childhood or early adolescence. However, to document the efficacy of these efforts, reliable, valid, and parent/child-friendly systems of data collection are required. Computerized data collection for research has been found to improve privacy, confidentiality, and portability over the paper-and-pencil method, which, in turn, enhances the reliability of sensitive data such as alcohol use or sexual activity. We developed programming tools for the personal computer and a handheld personal digital assistant to offer a comprehensive set of user interface design elements, relational databases, and ample programming languages so that adults could answer 261 items and youth 346 items. The purpose of the article was to describe an innovative handheld computer-assisted survey interview method of collecting sensitive data with children aged 9 to 11. The method was developed as part of a large multisite, national study to prevent substance use.

  10. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  11. New Parallel computing framework for radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, M.A.; /Michigan State U., NSCL; Mokhov, N.V.

    A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it is used with and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
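    Merging checkpoint files amounts to combining tallies weighted by the number of histories in each run; a minimal sketch (hypothetical checkpoint layout and values; the framework's actual file format is not described here):

    ```python
    # Combining Monte Carlo runs: each tally mean is weighted by the number
    # of particle histories behind it, so merged results are equivalent to
    # one longer run.
    def merge_checkpoints(checkpoints):
        total = sum(c["histories"] for c in checkpoints)
        merged_mean = sum(c["histories"] * c["mean"] for c in checkpoints) / total
        return {"histories": total, "mean": merged_mean}

    runs = [
        {"histories": 1_000_000, "mean": 4.20},
        {"histories": 3_000_000, "mean": 4.28},
    ]
    merged = merge_checkpoints(runs)
    print(merged)
    ```
    
    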

  12. CFD Based Design of a Filming Injector for N+3 Combustors

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Mongia, Hukam; Lee, Phil

    2016-01-01

    An effort was undertaken to perform CFD analysis of fluid flow in Lean-Direct Injection (LDI) combustors with axial swirl-venturi elements for next-generation LDI-3 combustor design. The National Combustion Code (NCC) was used to perform non-reacting and two-phase reacting flow computations for a newly-designed pre-filming type fuel injector LDI-3 injector, in a single-injector and a five-injector array configuration. All computations were performed with a consistent approach of mesh-optimization, spray-modeling, ignition and kinetics-modeling. Computational predictions of the aerodynamics of the single-injector were used to arrive at an optimized main-injector design that meets effective area and fuel-air mixing criteria. Emissions (EINOx) characteristics were predicted for a medium-power engine cycle condition, and will be compared with data when it is made available from experimental measurements. The use of a PDF-like turbulence-chemistry interaction model with NCC's Time-Filtered Navier-Stokes (TFNS) solver is shown to produce a significant impact on the CFD results, when compared with a laminar-chemistry TFNS approach for the five-injector computations.

  13. Efficient computation of the phylogenetic likelihood function on multi-gene alignments and multi-core architectures.

    PubMed

    Stamatakis, Alexandros; Ott, Michael

    2008-12-27

    The continuous accumulation of sequence data, for example, due to novel wet-laboratory techniques such as pyrosequencing, coupled with the increasing popularity of multi-gene phylogenies and emerging multi-core processor architectures that face problems of cache congestion, poses new challenges with respect to the efficient computation of the phylogenetic maximum-likelihood (ML) function. Here, we propose two approaches that can significantly speed up likelihood computations that typically represent over 95 per cent of the computational effort conducted by current ML or Bayesian inference programs. Initially, we present a method and an appropriate data structure to efficiently compute the likelihood score on 'gappy' multi-gene alignments. By 'gappy' we denote sampling-induced gaps owing to missing sequences in individual genes (partitions), i.e. not real alignment gaps. A first proof-of-concept implementation in RAxML indicates that this approach can accelerate inferences on large and gappy alignments by approximately one order of magnitude. Moreover, we present insights and initial performance results on multi-core architectures obtained during the transition from an OpenMP-based to a Pthreads-based fine-grained parallelization of the ML function.
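    The gain from the 'gappy' data structure can be sketched with a toy cost model: likelihood work scales with the sequences actually present in each gene partition rather than with the full taxon set padded with gaps (hypothetical partition sizes; the real likelihood kernel is far more involved than this count):

    ```python
    # Each gene partition covers only a subset of the taxa. The "gappy"
    # approach does likelihood work on the induced subset per partition,
    # instead of padding every gene with gap-only rows for absent taxa.
    partitions = {
        "gene1": {"taxa": ["A", "B", "C"], "sites": 120},
        "gene2": {"taxa": ["A", "C"], "sites": 80},
        "gene3": {"taxa": ["B", "C", "D", "E"], "sites": 200},
    }
    all_taxa = ["A", "B", "C", "D", "E"]

    def work_units(per_partition):
        # Toy cost model: likelihood effort ~ (#sequences) x (#sites).
        return sum(len(taxa) * part["sites"] for taxa, part in per_partition)

    gappy = work_units((p["taxa"], p) for p in partitions.values())
    padded = work_units((all_taxa, p) for p in partitions.values())
    print(gappy, padded)  # induced-subset work vs. gap-padded work
    ```

    The more taxon sampling varies across genes, the larger the gap between the two counts grows, which is why the speedup is most dramatic on large, sparsely sampled multi-gene alignments.
    
    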

  14. Promoting Physical Activity through Hand-Held Computer Technology

    PubMed Central

    King, Abby C.; Ahn, David K.; Oliveira, Brian M.; Atienza, Audie A.; Castro, Cynthia M.; Gardner, Christopher D.

    2009-01-01

    Background Efforts to achieve population-wide increases in walking and similar moderate-intensity physical activities potentially can be enhanced through relevant applications of state-of-the-art interactive communication technologies. Yet few systematic efforts to evaluate the efficacy of hand-held computers and similar devices for enhancing physical activity levels have occurred. The purpose of this first-generation study was to evaluate the efficacy of a hand-held computer (i.e., personal digital assistant [PDA]) for increasing moderate intensity or more vigorous (MOD+) physical activity levels over 8 weeks in mid-life and older adults relative to a standard information control arm. Design Randomized, controlled 8-week experiment. Data were collected in 2005 and analyzed in 2006-2007. Setting/Participants Community-based study of 37 healthy, initially underactive adults aged 50 years and older who were randomized and completed the 8-week study (intervention=19, control=18). Intervention Participants received an instructional session and a PDA programmed to monitor their physical activity levels twice per day and provide daily and weekly individualized feedback, goal setting, and support. Controls received standard, age-appropriate written physical activity educational materials. Main Outcome Measure Physical activity was assessed via the Community Healthy Activities Model Program for Seniors (CHAMPS) questionnaire at baseline and 8 weeks. Results Relative to controls, intervention participants reported significantly greater 8-week mean estimated caloric expenditure levels and minutes per week in MOD+ activity (p<0.04). Satisfaction with the PDA was reasonably high in this largely PDA-naive sample. Conclusions Results from this first-generation study indicate that hand-held computers may be effective tools for increasing initial physical activity levels among underactive adults. PMID:18201644

  15. Loss of Productivity Due to Neck/Shoulder Symptoms and Hand/Arm Symptoms: Results from the PROMO-Study

    PubMed Central

    IJmker, Stefan; Blatter, Birgitte M.; de Korte, Elsbeth M.

    2007-01-01

    Introduction The objective of the present study is to describe the extent of productivity loss among computer workers with neck/shoulder symptoms and hand/arm symptoms, and to examine associations between pain intensity, various physical and psychosocial factors and productivity loss in computer workers with neck/shoulder and hand/arm symptoms. Methods A cross-sectional design was used. The study population consisted of 654 computer workers with neck/shoulder or hand/arm symptoms from five different companies. Descriptive statistics were used to describe the occurrence of self-reported productivity loss. Logistic regression analyses were used to examine the associations. Results Productivity loss was involved in 26% of all cases reporting symptoms, most often in cases reporting both symptoms (36%). Productivity loss involved sickness absence in 11% of the arm/hand cases, 32% of the neck/shoulder cases and 43% of the cases reporting both symptoms. The multivariate analyses showed statistically significant odds ratios for pain intensity (OR: 1.26; CI: 1.12–1.41), for high effort/no low reward (OR: 2.26; CI: 1.24–4.12), for high effort/low reward (OR: 1.95; CI: 1.09–3.50), and for low job satisfaction (OR: 3.10; CI: 1.44–6.67). Physical activity in leisure time, full-time work and overcommitment were not associated with productivity loss. Conclusion In most computer workers with neck/shoulder symptoms or hand/arm symptoms, productivity loss derives from decreased performance at work and not from sickness absence. Favorable psychosocial work characteristics might prevent productivity loss in symptomatic workers. PMID:17636455
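    The reported odds ratios come from logistic-regression coefficients via OR = exp(β), with confidence limits exp(β ± 1.96·SE); a sketch using a hypothetical coefficient chosen to roughly match the paper's pain-intensity estimate (the actual β and SE are not reported):

    ```python
    import math

    def odds_ratio_ci(beta, se, z=1.96):
        """Convert a logistic-regression coefficient and its standard error
        into an odds ratio with a 95% confidence interval:
        OR = exp(beta), CI = (exp(beta - z*se), exp(beta + z*se))."""
        return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

    # Hypothetical beta/SE, illustrative only.
    orr, (lo, hi) = odds_ratio_ci(0.231, 0.059)
    print(f"OR {orr:.2f} (CI {lo:.2f}-{hi:.2f})")
    ```
    
    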

  16. A comparison of native GPU computing versus OpenACC for implementing flow-routing algorithms in hydrological applications

    NASA Astrophysics Data System (ADS)

    Rueda, Antonio J.; Noguera, José M.; Luque, Adrián

    2016-02-01

    In recent years GPU computing has gained wide acceptance as a simple low-cost solution for speeding up computationally expensive processing in many scientific and engineering applications. However, in most cases accelerating a traditional CPU implementation for a GPU is a non-trivial task that requires a thorough refactorization of the code and specific optimizations that depend on the architecture of the device. OpenACC is a promising technology that aims at reducing the effort required to accelerate C/C++/Fortran code on an attached multicore device. With this technology, the CPU code essentially only has to be augmented with a few compiler directives to identify the areas to be accelerated and the way in which data has to be moved between the CPU and GPU. Its potential benefits are multiple: better code readability, less development time, lower risk of errors and less dependency on the underlying architecture and future evolution of the GPU technology. Our aim with this work is to evaluate the pros and cons of using OpenACC against native GPU implementations in computationally expensive hydrological applications, using the classic D8 algorithm of O'Callaghan and Mark for river network extraction as a case study. We implemented the flow accumulation step of this algorithm on the CPU, using OpenACC and two different CUDA versions, comparing the length and complexity of the code and its performance with different datasets. We find that although OpenACC cannot match the performance of a CUDA-optimized implementation (×3.5 slower on average), it provides a significant performance improvement over a CPU implementation (×2-6) with far simpler code and less implementation effort.
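    The D8 flow-accumulation step being accelerated can be sketched serially as follows (a minimal illustration on a toy elevation grid, not the authors' CPU/OpenACC/CUDA code):

    ```python
    import numpy as np

    # Toy elevation grid; each cell routes all its flow to its steepest-descent
    # (D8) neighbor, and accumulation counts upstream contributing cells.
    elev = np.array([[9., 8., 7.],
                     [8., 6., 5.],
                     [7., 5., 3.]])
    rows, cols = elev.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]

    def d8_receiver(r, c):
        """Steepest-descent neighbor (distance-weighted), or None at a pit/outlet."""
        best, steepest = None, 0.0
        for dr, dc in offsets:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                slope = (elev[r, c] - elev[rr, cc]) / (dr * dr + dc * dc) ** 0.5
                if slope > steepest:
                    best, steepest = (rr, cc), slope
        return best

    acc = np.ones_like(elev)  # every cell contributes itself
    for idx in np.argsort(-elev, axis=None):  # high to low: donors finish first
        r, c = divmod(int(idx), cols)
        rec = d8_receiver(r, c)
        if rec is not None:
            acc[rec] += acc[r, c]
    print(acc)  # the outlet cell accumulates all 9 cells
    ```

    The data dependence visible in the high-to-low sweep (a cell cannot finish before its donors) is precisely what makes parallelizing this step non-trivial on a GPU.
    
    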

  17. Re-Computation of Numerical Results Contained in NACA Report No. 496

    NASA Technical Reports Server (NTRS)

    Perry, Boyd, III

    2015-01-01

    An extensive examination of NACA Report No. 496 (NACA 496), "General Theory of Aerodynamic Instability and the Mechanism of Flutter," by Theodore Theodorsen, is described. The examination included checking equations and solution methods and re-computing interim quantities and all numerical examples in NACA 496. The checks revealed that NACA 496 contains computational shortcuts (time- and effort-saving devices for engineers of the time) and clever artifices (employed in its solution methods), but, unfortunately, also contains numerous tripping points (aspects of NACA 496 that have the potential to cause confusion) and some errors. The re-computations were performed employing the methods and procedures described in NACA 496, but using modern computational tools. With some exceptions, the magnitudes and trends of the original results were in fair-to-very-good agreement with the re-computed results. The exceptions included what are speculated to be computational errors in the original in some instances and transcription errors in the original in others. Independent flutter calculations were performed and, in all cases, including those where the original and re-computed results differed significantly, were in excellent agreement with the re-computed results. Appendix A contains NACA 496; Appendix B contains a Matlab(Registered) program that performs the re-computation of results; Appendix C presents three alternate solution methods, with examples, for the two-degree-of-freedom solution method of NACA 496; Appendix D contains the three-degree-of-freedom solution method (outlined in NACA 496 but never implemented), with examples.
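    Central to NACA 496 is Theodorsen's function C(k) = F(k) + iG(k); a sketch using R. T. Jones' classic two-pole rational approximation (the exact function is built from Hankel functions, C(k) = H1^(2)(k) / (H1^(2)(k) + i*H0^(2)(k)), and is what the report's tables evaluate):

    ```python
    def theodorsen_jones(k):
        """R. T. Jones' two-pole rational approximation of Theodorsen's
        function C(k) = F(k) + iG(k), accurate to a few percent in k."""
        s = 1j * k  # evaluate the rational form on the imaginary axis
        return 1 - 0.165 * s / (s + 0.0455) - 0.335 * s / (s + 0.3)

    # C(k) -> 1 as k -> 0 (quasi-steady) and -> 0.5 as k -> infinity.
    for k in (0.01, 0.1, 1.0, 10.0):
        C = theodorsen_jones(k)
        print(f"k={k:5.2f}  F={C.real:.4f}  G={C.imag:+.4f}")
    ```
    
    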

  18. Application Portable Parallel Library

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" here also includes heterogeneous collection of networked computers.) Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.

  19. A comparative study on different methods of automatic mesh generation of human femurs.

    PubMed

    Viceconti, M; Bellingeri, L; Cristofolini, L; Toni, A

    1998-01-01

    The aim of this study was to evaluate comparatively five methods for automating mesh generation (AMG) when used to mesh a human femur. The five AMG methods considered were: mapped mesh, which provides hexahedral elements through a direct mapping of the elements onto the geometry; tetra mesh, which generates tetrahedral elements from a solid model of the object geometry; voxel mesh, which builds cubic 8-node elements directly from CT images; and hexa mesh, which automatically generates hexahedral elements from a surface definition of the femur geometry. The various methods were tested against two reference models: a simplified geometric model and a proximal femur model. The first model was useful for assessing the inherent accuracy of the meshes created by the AMG methods, since an analytical solution was available for the elastic problem of the simplified geometric model. The femur model was used to test the AMG methods under a more realistic condition. The femoral geometry was derived from a reference model (the "standardized femur") and the finite element analysis predictions were compared to experimental measurements. All methods were evaluated in terms of the human and computer effort needed to carry out the complete analysis, and in terms of accuracy. The comparison demonstrated that each tested method deserves attention and may be the best for specific situations. The mapped AMG method requires a significant human effort but is very accurate and allows tight control of the mesh structure. The tetra AMG method requires a solid model of the object to be analysed but is widely available and accurate. The hexa AMG method requires a significant computer effort but can also be used on polygonal models and is very accurate. The voxel AMG method requires a huge number of elements to reach an accuracy comparable to that of the other methods, but it does not require any pre-processing of the CT dataset to extract the geometry and in some cases may be the only viable solution.

  20. Baseline Intraocular Pressure Is Associated With Subjective Sensitivity to Physical Exertion in Young Males.

    PubMed

    Vera, Jesús; Jiménez, Raimundo; García, José Antonio; Perales, José Cesar; Cárdenas, David

    2018-03-01

    The purposes of this study were to (a) investigate the effect of physical effort (cycling for 60 min at 60 ± 5% of individually computed reserve heart-rate capacity), combined with 2 different levels of cognitive demand (2-back, oddball), on intraocular pressure (IOP) and subjective judgments of perceived exertion (ratings of perceived exertion [RPE]), affect (Affective Valence subscale of the Self-Assessment Manikin [SAM]), and mental workload (National Aeronautics and Space Administration Task Load Index [NASA-TLX]); and (b) ascertain whether baseline IOP, measured before exercise, is associated with individual differences in subjective assessments of effort and affective response during exercise. Seventeen participants (M age = 23.28 ± 2.37 years) performed 2 physical/cognitive dual tasks, matched in physical demand but with different mental requirements (2-back, oddball). We assessed IOP before exercise, after 2 min of active recovery, and after 15 min of passive recovery, and we also collected RPE and SAM measures during the sessions (28 measurement points). We used NASA-TLX and cognitive performance as checks of the mental manipulation. (a) Intraocular pressure increased after concomitant physical/mental effort, with the effect reaching statistical significance after the 2-back task (p = .002, d = 0.35) but not after the oddball condition (p = .092, d = 0.29). (b) Baseline IOP was associated with subjective sensitivity to effort and showed statistical significance for the oddball condition (p = .03, ηp² = .622) but not for the 2-back task (F < 1). Results suggest a relationship between IOP and physical/cognitive effort, which could have implications for the management of glaucoma. Additionally, a rapid measure of IOP could be used as a marker of individual effort sensitivity in applied settings.
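    A target at a given percentage of reserve heart-rate capacity is conventionally computed with the Karvonen formula; a sketch with hypothetical resting and maximal heart rates (the paper does not state its exact calculation):

    ```python
    def karvonen_target(hr_rest, hr_max, intensity):
        """Target heart rate at a fraction of heart-rate reserve (Karvonen):
        HR_target = HR_rest + intensity * (HR_max - HR_rest)."""
        return hr_rest + intensity * (hr_max - hr_rest)

    # Hypothetical 23-year-old with HR_max estimated as 220 - age.
    hr_rest, hr_max = 65, 220 - 23
    target = karvonen_target(hr_rest, hr_max, 0.60)  # 60% of reserve
    print(f"{target:.1f} bpm")
    ```
    
    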

  1. A robust variant of block Jacobi-Davidson for extracting a large number of eigenpairs: Application to grid-based real-space density functional theory

    NASA Astrophysics Data System (ADS)

    Lee, M.; Leiter, K.; Eisner, C.; Breuer, A.; Wang, X.

    2017-09-01

    In this work, we investigate a block Jacobi-Davidson (J-D) variant suitable for sparse symmetric eigenproblems where a substantial number of extremal eigenvalues are desired (e.g., ground-state real-space quantum chemistry). Most J-D algorithm variations tend to slow down as the number of desired eigenpairs increases due to frequent orthogonalization against a growing list of solved eigenvectors. In our specification of block J-D, all of the steps of the algorithm are performed in clusters, including the linear solves, which allows us to greatly reduce computational effort with blocked matrix-vector multiplies. In addition, we move orthogonalization against locked eigenvectors and working eigenvectors outside of the inner loop but retain the single Ritz vector projection corresponding to the index of the correction vector. Furthermore, we minimize the computational effort by constraining the working subspace to the current vectors being updated and the latest set of corresponding correction vectors. Finally, we incorporate accuracy thresholds based on the precision required by the Fermi-Dirac distribution. The net result is a significant reduction in the computational effort against most previous block J-D implementations, especially as the number of wanted eigenpairs grows. We compare our approach with another robust implementation of block J-D (JDQMR) and the state-of-the-art Chebyshev filter subspace (CheFSI) method for various real-space density functional theory systems. Versus CheFSI, for first-row elements, our method yields competitive timings for valence-only systems and 4-6× speedups for all-electron systems with up to 10× reduced matrix-vector multiplies. For all-electron calculations on larger elements (e.g., gold) where the wanted spectrum is quite narrow compared to the full spectrum, we observe 60× speedup with 200× fewer matrix-vector multiplies vs. CheFSI.
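    The benefit of blocked matrix-vector multiplies — streaming the matrix through memory once per block rather than once per vector — can be illustrated directly (a generic NumPy sketch, not the authors' implementation):

    ```python
    import time
    import numpy as np

    # One blocked matrix-matrix product reads A from memory once for the
    # whole block of vectors, versus once per vector for separate matvecs.
    rng = np.random.default_rng(0)
    n, b = 1500, 16
    A = rng.standard_normal((n, n))
    V = rng.standard_normal((n, b))

    t0 = time.perf_counter()
    one_by_one = np.column_stack([A @ V[:, j] for j in range(b)])
    t1 = time.perf_counter()
    blocked = A @ V  # same result, one pass over A
    t2 = time.perf_counter()

    assert np.allclose(one_by_one, blocked)
    print(f"loop: {t1 - t0:.4f}s  blocked: {t2 - t1:.4f}s")
    ```

    The same trade underlies the blocked inner linear solves in the algorithm: amortizing memory traffic over the block is what keeps the cost per eigenpair from growing as more eigenpairs are requested.
    
    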

  2. A robust variant of block Jacobi-Davidson for extracting a large number of eigenpairs: Application to grid-based real-space density functional theory.

    PubMed

    Lee, M; Leiter, K; Eisner, C; Breuer, A; Wang, X

    2017-09-21

    In this work, we investigate a block Jacobi-Davidson (J-D) variant suitable for sparse symmetric eigenproblems where a substantial number of extremal eigenvalues are desired (e.g., ground-state real-space quantum chemistry). Most J-D algorithm variations tend to slow down as the number of desired eigenpairs increases due to frequent orthogonalization against a growing list of solved eigenvectors. In our specification of block J-D, all of the steps of the algorithm are performed in clusters, including the linear solves, which allows us to greatly reduce computational effort with blocked matrix-vector multiplies. In addition, we move orthogonalization against locked eigenvectors and working eigenvectors outside of the inner loop but retain the single Ritz vector projection corresponding to the index of the correction vector. Furthermore, we minimize the computational effort by constraining the working subspace to the current vectors being updated and the latest set of corresponding correction vectors. Finally, we incorporate accuracy thresholds based on the precision required by the Fermi-Dirac distribution. The net result is a significant reduction in the computational effort against most previous block J-D implementations, especially as the number of wanted eigenpairs grows. We compare our approach with another robust implementation of block J-D (JDQMR) and the state-of-the-art Chebyshev filter subspace (CheFSI) method for various real-space density functional theory systems. Versus CheFSI, for first-row elements, our method yields competitive timings for valence-only systems and 4-6× speedups for all-electron systems with up to 10× reduced matrix-vector multiplies. For all-electron calculations on larger elements (e.g., gold) where the wanted spectrum is quite narrow compared to the full spectrum, we observe 60× speedup with 200× fewer matrix-vector multiplies vs. CheFSI.

  3. Photogrammetry Measurements During a Tanking Test on the Space Shuttle External Tank, ET-137

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.; Schmidt, Tim; Tyson, John; Oliver, Stanley T.; Melis, Matthew E.; Ruggeri, Charles

    2012-01-01

On November 5, 2010, a significant foam liberation threat was observed as the Space Shuttle STS-133 launch effort was scrubbed because of a hydrogen leak at the ground umbilical carrier plate. Further investigation revealed the presence of multiple cracks at the tops of stringers in the intertank region of the Space Shuttle External Tank. As part of an instrumented tanking test conducted on December 17, 2010, a three-dimensional digital image correlation photogrammetry system was used to measure radial deflections and overall deformations of a section of the intertank region. This paper describes the experimental challenges that were overcome in order to implement the photogrammetry measurements for the tanking test in support of STS-133. The technique consisted of configuring and installing two pairs of custom stereo camera bars containing calibrated cameras on the 215-ft level of the fixed service structure of Launch Pad 39-A. The cameras were remotely operated from the Launch Control Center 3.5 miles away during the 8-hour test, which began before sunrise and lasted through sunset. The complete deformation time history was successfully computed from the acquired images and proved to play a crucial role in the computer modeling validation efforts supporting the successful completion of the root cause analysis of the cracked stringer problem by the Space Shuttle Program. The resulting data included full-field fringe plots, time-history analyses of extracted data, section-line spatial analyses, and differential stringer peak-to-valley motion. Sample results are included with discussion. The data showed that new stringer cracks did not form in the panel examined, and that large displacements of the external tank occurred because of the loads imposed by its filling. The measurements acquired were also used to validate computer modeling efforts completed by NASA Marshall Space Flight Center (MSFC).

  4. Infrared Algorithm Development for Ocean Observations with EOS/MODIS

    NASA Technical Reports Server (NTRS)

    Brown, Otis B.

    1997-01-01

Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.

  5. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, provide a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  6. Tiger LDRD final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steich, D J; Brugger, S T; Kallman, J S

    2000-02-01

This final report describes our efforts on the Three-Dimensional Massively Parallel CEM Technologies LDRD project (97-ERD-009). A significant need exists for more advanced time domain computational electromagnetics modeling. Bookkeeping details and modifying inflexible software constitute the vast majority of the effort required to address such needs, and the required effort escalates rapidly as problem complexity increases: for example, hybrid meshes requiring hybrid numerics on massively parallel platforms (MPPs). This project attempts to alleviate these limitations by investigating flexible abstractions for these numerical algorithms on MPPs using object-oriented methods, providing a programming environment that insulates the physics from the bookkeeping. The three major design iterations during the project, known as TIGER-I to TIGER-III, are discussed. Each version of TIGER is briefly described along with lessons learned during development and implementation. An Application Programming Interface (API) of the object-oriented interface for TIGER-III is included in three appendices, which contain the Utilities, Entity-Attribute, and Mesh libraries developed during the project. The API libraries represent a snapshot of our latest attempt at insulating the physics from the bookkeeping.

  7. Towards an Entropy Stable Spectral Element Framework for Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Parsani, Matteo; Fisher, Travis C.; Nielsen, Eric J.

    2016-01-01

    Entropy stable (SS) discontinuous spectral collocation formulations of any order are developed for the compressible Navier-Stokes equations on hexahedral elements. Recent progress on two complementary efforts is presented. The first effort is a generalization of previous SS spectral collocation work to extend the applicable set of points from tensor product, Legendre-Gauss-Lobatto (LGL) to tensor product Legendre-Gauss (LG) points. The LG and LGL point formulations are compared on a series of test problems. Although being more costly to implement, it is shown that the LG operators are significantly more accurate on comparable grids. Both the LGL and LG operators are of comparable efficiency and robustness, as is demonstrated using test problems for which conventional FEM techniques suffer instability. The second effort generalizes previous SS work to include the possibility of p-refinement at non-conforming interfaces. A generalization of existing entropy stability machinery is developed to accommodate the nuances of fully multi-dimensional summation-by-parts (SBP) operators. The entropy stability of the compressible Euler equations on non-conforming interfaces is demonstrated using the newly developed LG operators and multi-dimensional interface interpolation operators.
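The LGL-versus-LG accuracy contrast in the abstract traces back to quadrature strength: n Legendre-Gauss points integrate polynomials exactly up to degree 2n-1, while n Legendre-Gauss-Lobatto points (which must include the endpoints ±1) reach only degree 2n-3. A quick NumPy check of that textbook difference, independent of the paper's solver:

```python
import numpy as np
from numpy.polynomial import legendre as leg

def lobatto_nodes_weights(n):
    """n Legendre-Gauss-Lobatto nodes and weights on [-1, 1]: the two
    endpoints plus the roots of P'_{n-1}, with the classical weights
    w_i = 2 / (n (n-1) P_{n-1}(x_i)^2)."""
    c = np.zeros(n)
    c[-1] = 1.0                      # Legendre-series coefficients of P_{n-1}
    x = np.concatenate(([-1.0], leg.legroots(leg.legder(c)), [1.0]))
    w = 2.0 / (n * (n - 1) * leg.legval(x, c) ** 2)
    return x, w

f = lambda x: x**6                   # exact integral on [-1, 1] is 2/7
xg, wg = leg.leggauss(4)             # 4 Gauss points: exact through degree 7
xl, wl = lobatto_nodes_weights(4)    # 4 Lobatto points: exact only through degree 5
gauss_val = wg @ f(xg)               # recovers 2/7 to machine precision
lobatto_val = wl @ f(xl)             # 26/75, off by about 0.06
```

The same extra two degrees of quadrature exactness per point count is what makes LG collocation more accurate than LGL on comparable grids, at the cost of boundary terms being handled by interpolation rather than collocated endpoints.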

  8. Staff | Computational Science | NREL

    Science.gov Websites

Develops and leads laboratory-wide efforts in high-performance computing and energy-efficient data centers. Staff listed include: Professional IV - High Perf Computing, Jim.Albin@nrel.gov, 303-275-4069; Ananthan, Shreyas, Senior Scientist - High-Performance Algorithms and Modeling, Shreyas.Ananthan@nrel.gov, 303-275-4807; Bendl, Kurt, IT Professional IV-High

  9. Evaluating the Implementation of International Computing Curricular in African Universities: A Design-Reality Gap Approach

    ERIC Educational Resources Information Center

    Dasuki, Salihu Ibrahim; Ogedebe, Peter; Kanya, Rislana Abdulazeez; Ndume, Hauwa; Makinde, Julius

    2015-01-01

Efforts are being made by universities in developing countries to ensure that their graduates are not left behind in the competitive global information society; thus, many have adopted international computing curricula for their computing degree programs. However, adopting these international curricula seems to be very challenging for developing countries…

  10. Automated computer grading of hardwood lumber

    Treesearch

    P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber

    1988-01-01

    This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...

  11. Distance Learning and Cloud Computing: "Just Another Buzzword or a Major E-Learning Breakthrough?"

    ERIC Educational Resources Information Center

    Romiszowski, Alexander J.

    2012-01-01

    "Cloud computing is a model for the enabling of ubiquitous, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and other services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." This…

  12. The Use of Computers in the Math Classroom.

    ERIC Educational Resources Information Center

    Blass, Barbara; And Others

    In an effort to increase faculty use and knowledge of computers, Oakland Community College (OCC), in Michigan, developed a Summer Technology Institute (STI), and a Computer Technology Grants (CTG) project beginning in 1989. The STI involved 3-day forums during summers 1989, 1990, and 1991 to expose faculty to hardware and software applications.…

  13. Commentary: It Is Not Only about the Computers--An Argument for Broadening the Conversation

    ERIC Educational Resources Information Center

    DeWitt, Scott W.

    2006-01-01

    In 2002 the members of the National Technology Leadership Initiative (NTLI) framed seven conclusions relating to handheld computers and ubiquitous computing in schools. While several of the conclusions are laudable efforts to increase research and professional development, the factual and conceptual bases for this document are seriously flawed.…

  14. The Relationship between Computational Fluency and Student Success in General Studies Mathematics

    ERIC Educational Resources Information Center

    Hegeman, Jennifer; Waters, Gavin

    2012-01-01

    Many developmental mathematics programs emphasize computational fluency with the assumption that this is a necessary contributor to student success in general studies mathematics. In an effort to determine which skills are most essential, scores on a computational fluency test were correlated with student success in general studies mathematics at…

  15. Computational procedure for finite difference solution of one-dimensional heat conduction problems reduces computer time

    NASA Technical Reports Server (NTRS)

    Iida, H. T.

    1966-01-01

    Computational procedure reduces the numerical effort whenever the method of finite differences is used to solve ablation problems for which the surface recession is large relative to the initial slab thickness. The number of numerical operations required for a given maximum space mesh size is reduced.
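The abstract gives the improved procedure only in outline, but the baseline it improves on, an explicit finite-difference march for one-dimensional heat conduction, can be sketched as follows. The grid size, time step, and boundary temperatures are illustrative assumptions, and no surface-recession (ablation) handling is included.

```python
import numpy as np

def heat_ftcs(u0, alpha, dx, dt, steps, t_left, t_right):
    """Explicit (FTCS) finite differences for u_t = alpha * u_xx with
    fixed boundary temperatures. Stable only for r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    if r > 0.5:
        raise ValueError("explicit scheme unstable for r > 0.5")
    u = np.asarray(u0, dtype=float).copy()
    u[0], u[-1] = t_left, t_right
    for _ in range(steps):
        u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        u[0], u[-1] = t_left, t_right      # re-impose Dirichlet boundaries
    return u

# Illustrative run: a unit slab initially at 0 with its right face held at 1
n = 51
dx = 1.0 / (n - 1)
u = heat_ftcs(np.zeros(n), alpha=1.0, dx=dx, dt=0.4 * dx**2,
              steps=20000, t_left=0.0, t_right=1.0)
```

Run long enough, the profile relaxes to the linear steady state; the operation count per step scales with the number of mesh points, which is exactly the cost that a shrinking-slab (ablation) procedure tries to keep small.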

  16. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    ERIC Educational Resources Information Center

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  17. Workstyle risk factors for work related musculoskeletal symptoms among computer professionals in India.

    PubMed

    Sharan, Deepak; Parijat, Prakriti; Sasidharan, Ajeesh Padinjattethil; Ranganathan, Rameshkumar; Mohandoss, Mathankumar; Jose, Jeena

    2011-12-01

Work-related musculoskeletal disorders are common in computer professionals. Workstyle may be one of the risk factors in the development of musculoskeletal discomfort. The objective of this retrospective study was to examine the prevalence of adverse workstyle in computer professionals from India and to evaluate whether workstyle factors were predictors of pain and loss of productivity. Office workers from various information technology (IT) companies in India responded to the short-form workstyle questionnaire and a pain questionnaire. Correlation analyses were conducted to examine the associations between different variables, followed by a multivariate logistic regression to identify the unique predictors of pain and loss of productivity. 4,500 participants responded to the workstyle and pain questionnaires. 22% of participants were found to be at high risk of an adverse workstyle, and 63% reported pain symptoms. The social reactivity, lack of breaks, and deadlines/pressure subscales of the workstyle questionnaire were significantly correlated with pain and loss of productivity. Regression analyses revealed that workstyle factors and duration of computer use per day were significant predictors of pain. Workstyle seems to be a mediating factor for musculoskeletal pain, discomfort, and loss of productivity. Based on the study findings, it is recommended that intervention efforts directed towards prevention of musculoskeletal disorders should focus on psychosocial work factors such as adverse workstyle in addition to biomechanical risk factors.
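The regression step described above can be sketched generically. The sketch fits a logistic model by Newton's method (iteratively reweighted least squares) on synthetic data; the variable names, effect sizes, and data are invented for illustration and are not the study's.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Logistic regression by Newton's method (IRLS); returns
    [intercept, coefficients...]."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(Xb @ w)))      # predicted probabilities
        grad = Xb.T @ (y - p)                    # log-likelihood gradient
        H = (Xb * (p * (1.0 - p))[:, None]).T @ Xb   # observed information
        w += np.linalg.solve(H, grad)
    return w

# Synthetic data: daily computer hours and a 0-10 "workstyle" score
# (hypothetical variables) driving the probability of pain symptoms.
rng = np.random.default_rng(0)
n = 2000
hours = rng.uniform(2.0, 12.0, n)
workstyle = rng.uniform(0.0, 10.0, n)
logit = -6.0 + 0.4 * hours + 0.3 * workstyle
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
w = fit_logistic(np.column_stack([hours, workstyle]), y)
```

With enough observations, the fitted coefficients recover the positive association of both predictors with the outcome, which is the qualitative pattern the study reports for computer-use duration and workstyle.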

  18. Computational protein design-the next generation tool to expand synthetic biology applications.

    PubMed

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near future, computational protein design will vastly expand the functional capabilities of synthetic cells.

  19. Design and implementation of a Windows NT network to support CNC activities

    NASA Technical Reports Server (NTRS)

    Shearrow, C. A.

    1996-01-01

The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) capabilities, and the development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's with respect to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM approach capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes an assessment of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers to be used, the software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.

  20. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

  1. Experimental design applications for modeling and assessing carbon dioxide sequestration in saline aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, John

    2014-11-29

This project was a computer modeling effort to couple reservoir simulation and experimental design/response surface methodology (ED/RSM), using sensitivity analysis, uncertainty analysis, and optimization methods, to assess geologic, geochemical, geomechanical, and rock-fluid effects and factors on CO2 injectivity, capacity, and plume migration. The project objective was to develop proxy models to simplify the highly complex coupled geochemical and geomechanical models in the utilization and storage of CO2 in the subsurface. The goals were to investigate and prove the feasibility of the ED/RSM processes and engineering development, and to bridge the gaps regarding the uncertainty and unknowns of the many interacting geochemical and geomechanical parameters in the development and operation of anthropogenic CO2 sequestration and storage sites. The bottleneck in this workflow is the high computational effort of reactive transport simulation models and the large number of input variables to optimize with ED/RSM techniques. The project was not to develop the reactive transport, geomechanical, or ED/RSM software, but to use what was commercially and/or publicly available as a proof of concept to generate proxy or surrogate models. A detailed geologic and petrographic mineral assemblage and geologic structure of the doubly plunging anticline was defined using the USDOE RMOTC formations of interest data (e.g., Lower Sundance, Crow Mountain, Alcova Limestone, and Red Peak). The assemblage of 23 minerals was primarily developed from literature data and petrophysical (well log) analysis. The assemblage and structure were input into a commercial reactive transport simulator to predict the effects of CO2 injection and complex reactions with the reservoir rock. Significant impediments were encountered during the execution phase of the project. The only known commercial reactive transport simulator was incapable of simulating the complex geochemistry modeled in this project.
Significant effort and project funding were expended to determine the limitations of both the commercial simulator and the Lawrence Berkeley National Laboratory (LBNL) R&D simulator, TOUGHREACT, available to the project. A simplified layer-cake model approximating the volume of the RMOTC targeted reservoirs was defined, with 1-3 minerals eventually modeled with limited success. Modeling reactive transport in porous media requires significant computational power; in this project, up to 24 processors were used to model a limited mineral set of 1-3 minerals. In addition, the geomechanical aspects of injecting CO2 into closed, semi-open, and open systems with various well completion methods were simulated. Enhanced Oil Recovery (EOR) as a storage method was not modeled. A robust and stable simulation dataset, or base case, was developed and used to create a master dataset with embedded instructions for input to the ED/RSM software. Little success was achieved toward the objective of the project using the commercial simulator or the LBNL simulator versions available during the time of this project. Several hundred realizations were run with the commercial simulator and ED/RSM software, most having convergence problems and terminating prematurely. A proxy model for full-field CO2 injection, sequestration, utilization, and storage could not be developed with the software available for this project. Though the chemistry is reasonably well known and understood, based on the amount of effort and the huge computational time required, predicting CO2 sequestration storage capacity in geologic formations to within the program goals of ±30% proved unsuccessful.
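The proxy-model idea at the center of this project, replacing an expensive reactive-transport run with a cheap fitted response surface, can be sketched in generic form. Everything below is an assumption for illustration: the two input factors, the stand-in "simulator" function, and the quadratic surface fitted by least squares; a real study would sample a designed experiment over many more factors.

```python
import numpy as np

def quad_design(X):
    """Full quadratic response-surface basis in two factors:
    [1, x1, x2, x1^2, x2^2, x1*x2]."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

def fit_proxy(X, y):
    """Least-squares fit of the quadratic surface to simulator output."""
    beta, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)
    return beta

def simulator(X):
    """Hypothetical stand-in for an expensive reactive-transport run:
    an 'injectivity' response as a smooth function of two scaled factors."""
    x1, x2 = X[:, 0], X[:, 1]
    return 3.0 + 2.0 * x1 - x2 + 0.5 * x1**2 + 0.25 * x1 * x2

rng = np.random.default_rng(0)
X_train = rng.uniform(-1.0, 1.0, size=(30, 2))    # sampled design points
beta = fit_proxy(X_train, simulator(X_train))      # cheap proxy model
X_new = rng.uniform(-1.0, 1.0, size=(100, 2))
proxy_pred = quad_design(X_new) @ beta             # predictions without the simulator
```

Once fitted, the proxy is evaluated thousands of times per second, which is what makes sensitivity analysis and optimization over the design space tractable when each true simulator run takes hours.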

  2. F‐GHG Emissions Reduction Efforts: FY2015 Supplier Profiles

    EPA Pesticide Factsheets

    The Supplier Profiles outlined in this document detail the efforts of large‐area flat panel suppliers to reduce their F‐GHG emissions in manufacturing facilities that make today’s large‐area panels used for products such as TVs and computer monitors.

  3. F‐GHG Emissions Reduction Efforts: FY2016 Supplier Profiles

    EPA Pesticide Factsheets

    The Supplier Profiles outlined in this document detail the efforts of large‐area flat panel suppliers to reduce their F‐GHG emissions in manufacturing facilities that make today’s large‐area panels used for products such as TVs and computer monitors.

  4. Operating manual for coaxial injection combustion model. [for the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.

    1974-01-01

An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the Space Shuttle Main Engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.

  5. Preface: SciDAC 2005

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2005-01-01

    On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. 
Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. 
A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad, significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based.
Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support. We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. 
We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.

  6. Solving coiled-coil protein structures

    DOE PAGES

    Dauter, Zbigniew

    2015-02-26

    With the availability of more than 100,000 entries stored in the Protein Data Bank (PDB) that can be used as search models, molecular replacement (MR) is currently the most popular method of solving crystal structures of macromolecules. Significant methodological efforts have been directed in recent years towards making this approach more powerful and practical. This resulted in the creation of several computer programs, highly automated and user friendly, that are able to successfully solve many structures even by researchers who, although interested in structures of biomolecules, are not very experienced in crystallography.

  7. Development of ROTC (Reserve Officers’ Training Corps) Data Sets and Evaluation of Their Usefulness for Officer Longitudinal Research Data Base

    DTIC Science & Technology

    1987-08-01

Two ROTC/OLRDB data sets result from this effort. They reside at the National Institutes of Health (NIH) computer facility. They were both built and... 1985 8,326 3,836 (46%); Total 31,967 18,617 (58%)... Research use of these data sets would benefit from further documentation for some data... In addition to the existing files, there would appear to be significant benefit from the inclusion of additional years of OLRDB data with the newly formed ROTC

  8. Automated Test for NASA CFS

    NASA Technical Reports Server (NTRS)

McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy

    2015-01-01

    The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baart, T. A.; Vandersypen, L. M. K.; Kavli Institute of Nanoscience, Delft University of Technology, P.O. Box 5046, 2600 GA Delft

    We report the computer-automated tuning of gate-defined semiconductor double quantum dots in GaAs heterostructures. We benchmark the algorithm by creating three double quantum dots inside a linear array of four quantum dots. The algorithm sets the correct gate voltages for all the gates to tune the double quantum dots into the single-electron regime. The algorithm only requires (1) prior knowledge of the gate design and (2) the pinch-off value of the single gate T that is shared by all the quantum dots. This work significantly alleviates the user effort required to tune multiple quantum dot devices.

  10. The medical communications officer. A resource for data collection, quality management and medical control.

    PubMed

    Gunderson, Michael; Barnard, Jeff; McPherson, John; Kearns, Conrad T

    2002-08-01

    Pinellas County EMS' Medical Communications Officers provide a wide variety of services to patients, field clinicians, managers and their medical director. The concurrent data collection processes used in the MCO program for performance measurement of resuscitation efforts, intubations, submersion incidents and aeromedical transports for trauma cases have been very effective in the integration of data from multiple computer databases and telephone follow-ups with field crews and receiving emergency department staffs. This has facilitated significant improvements in the performance of these and many other aspects of our EMS system.

  11. Inverse finite-size scaling for high-dimensional significance analysis

    NASA Astrophysics Data System (ADS)

    Xu, Yingying; Puranen, Santeri; Corander, Jukka; Kabashima, Yoshiyuki

    2018-06-01

We propose an efficient procedure for significance determination in high-dimensional dependence learning based on surrogate data testing, termed inverse finite-size scaling (IFSS). The IFSS method is based on our discovery of a universal scaling property of random matrices which enables inference about signal behavior from much smaller scale surrogate data than the dimensionality of the original data. As a motivating example, we demonstrate the procedure for ultra-high-dimensional Potts models with on the order of 10^10 parameters. IFSS reduces the computational effort of the data-testing procedure by several orders of magnitude, making it very efficient for practical purposes. This approach thus holds considerable potential for generalization to other types of complex models.

  12. Aviation security : vulnerabilities still exist in the aviation security system

    DOT National Transportation Integrated Search

    2000-04-06

    The testimony today discusses the Federal Aviation Administration's (FAA) efforts to implement and improve security in two key areas: air traffic control computer systems and airport passenger screening checkpoints. Computer systems-and the informati...

  13. Psychological Issues in Online Adaptive Task Allocation

    NASA Technical Reports Server (NTRS)

    Morris, N. M.; Rouse, W. B.; Ward, S. L.; Frey, P. R.

    1984-01-01

    Adaptive aiding is an idea that offers potential for improvement over many current approaches to aiding in human-computer systems. The expected return of tailoring the system to fit the user could be in the form of improved system performance and/or increased user satisfaction. Issues such as the manner in which information is shared between human and computer, the appropriate division of labor between them, and the level of autonomy of the aid are explored. A simulated visual search task was developed. Subjects are required to identify targets in a moving display while performing a compensatory sub-critical tracking task. By manipulating characteristics of the situation such as imposed task-related workload and effort required to communicate with the computer, it is possible to create conditions in which interaction with the computer would be more or less desirable. The results of preliminary research using this experimental scenario are presented, and future directions for this research effort are discussed.

  14. Advanced transportation system studies. Alternate propulsion subsystem concepts: Propulsion database

    NASA Technical Reports Server (NTRS)

    Levack, Daniel

    1993-01-01

    The Advanced Transportation System Studies alternate propulsion subsystem concepts propulsion database interim report is presented. The objective of the database development task is to produce a propulsion database which is easy to use and modify while also being comprehensive in the level of detail available. The database is to be available on the Macintosh computer system. The task is to extend across all three years of the contract. Consequently, a significant fraction of the effort in this first year of the task was devoted to the development of the database structure to ensure a robust base for the following years' efforts. Nonetheless, significant point design propulsion system descriptions and parametric models were also produced. Each of the two propulsion databases, parametric propulsion database and propulsion system database, are described. The descriptions include a user's guide to each code, write-ups for models used, and sample output. The parametric database has models for LOX/H2 and LOX/RP liquid engines, solid rocket boosters using three different propellants, a hybrid rocket booster, and a NERVA derived nuclear thermal rocket engine.

  15. Molecular dynamics simulations in hybrid particle-continuum schemes: Pitfalls and caveats

    NASA Astrophysics Data System (ADS)

    Stalter, S.; Yelash, L.; Emamy, N.; Statt, A.; Hanke, M.; Lukáčová-Medvid'ová, M.; Virnau, P.

    2018-03-01

    Heterogeneous multiscale methods (HMM) combine molecular accuracy of particle-based simulations with the computational efficiency of continuum descriptions to model flow in soft matter liquids. In these schemes, molecular simulations typically pose a computational bottleneck, which we investigate in detail in this study. We find that it is preferable to simulate many small systems as opposed to a few large systems, and that a choice of a simple isokinetic thermostat is typically sufficient while thermostats such as Lowe-Andersen allow for simulations at elevated viscosity. We discuss suitable choices for time steps and finite-size effects which arise in the limit of very small simulation boxes. We also argue that if colloidal systems are considered as opposed to atomistic systems, the gap between microscopic and macroscopic simulations regarding time and length scales is significantly smaller. We propose a novel reduced-order technique for the coupling to the macroscopic solver, which allows us to approximate a non-linear stress-strain relation efficiently and thus further reduce computational effort of microscopic simulations.

  16. Optimizing a mobile robot control system using GPU acceleration

    NASA Astrophysics Data System (ADS)

    Tuck, Nat; McGuinness, Michael; Martin, Fred

    2012-01-01

    This paper describes our attempt to optimize a robot control program for the Intelligent Ground Vehicle Competition (IGVC) by running computationally intensive portions of the system on a commodity graphics processing unit (GPU). The IGVC Autonomous Challenge requires a control program that performs a number of different computationally intensive tasks ranging from computer vision to path planning. For the 2011 competition our Robot Operating System (ROS) based control system would not run comfortably on the multicore CPU on our custom robot platform. The process of profiling the ROS control program and selecting appropriate modules for porting to run on a GPU is described. A GPU-targeting compiler, Bacon, is used to speed up development and help optimize the ported modules. The impact of the ported modules on overall performance is discussed. We conclude that GPU optimization can free a significant amount of CPU resources with minimal effort for expensive user-written code, but that replacing heavily-optimized library functions is more difficult, and a much less efficient use of time.

  17. Aeromechanics Analysis of a Boundary Layer Ingesting Fan

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Reddy, T. S. R.; Herrick, Gregory P.; Shabbir, Aamir; Florea, Razvan V.

    2013-01-01

Boundary layer ingesting propulsion systems have the potential to significantly reduce fuel burn, but these systems must overcome the challenges related to aeromechanics: fan flutter stability and forced response dynamic stresses. High-fidelity computational analysis of the fan aeromechanics is integral to the ongoing effort to design a boundary layer ingesting inlet and fan for fabrication and wind-tunnel test. A three-dimensional, time-accurate, Reynolds-averaged Navier-Stokes computational fluid dynamics code is used to study the aerothermodynamic and aeromechanical behavior of the fan in response to both clean and distorted inflows. The computational aeromechanics analyses performed in this study show an intermediate design iteration of the fan to be flutter-free at the design conditions analyzed with both clean and distorted inflows. Dynamic stresses from forced response have been calculated for the design rotational speed. Additional work is ongoing to expand the analyses to off-design conditions and to on-resonance conditions.

  18. Applications of Phase-Based Motion Processing

    NASA Technical Reports Server (NTRS)

    Branch, Nicholas A.; Stewart, Eric C.

    2018-01-01

Image pyramids provide useful information for determining structural response at low cost using commercially available cameras. The current effort applies previous work on the complex steerable pyramid to analyze and identify imperceptible linear motions in video. Instead of implicitly computing motion spectra through phase analysis of the complex steerable pyramid and magnifying the associated motions, we present a visual technique and the necessary software to display the phase changes of high-frequency signals within video. The present technique quickly identifies the regions of largest motion within a video with a single phase visualization and without the artifacts of motion magnification, but requires use of the computationally intensive Fourier transform. While Riesz pyramids present an alternative to the computationally intensive complex steerable pyramid for motion magnification, the Riesz formulation contains significant noise, and motion magnification still produces large amounts of data that cannot be quickly assessed by the human eye. Thus, user-friendly software is presented for quickly identifying structural response through optical flow and phase visualization in both Python and MATLAB.

  19. Validation of a Computational Fluid Dynamics (CFD) Code for Supersonic Axisymmetric Base Flow

    NASA Technical Reports Server (NTRS)

    Tucker, P. Kevin

    1993-01-01

The ability to accurately and efficiently calculate the flow structure in the base region of bodies of revolution in supersonic flight is a significant step in CFD code validation for applications ranging from base heating for rockets to drag for projectiles. The FDNS code is used to compute such a flow, and the results are compared to benchmark-quality experimental data. Flowfield calculations are presented for a cylindrical afterbody at M = 2.46 and angle of attack α = 0. Grid-independent solutions are compared to mean velocity profiles in the separated wake area and downstream of the reattachment point. Additionally, quantities such as turbulent kinetic energy and shear layer growth rates are compared to the data. Finally, the computed base pressures are compared to the measured values. An effort is made to elucidate the role of turbulence models in the flowfield predictions. The level of turbulent eddy viscosity, and its origin, are used to contrast the various turbulence models and compare the results to the experimental data.

  20. Processing LHC data in the UK

    PubMed Central

    Colling, D.; Britton, D.; Gordon, J.; Lloyd, S.; Doyle, A.; Gronbech, P.; Coles, J.; Sansum, A.; Patrick, G.; Jones, R.; Middleton, R.; Kelsey, D.; Cass, A.; Geddes, N.; Clark, P.; Barnby, L.

    2013-01-01

    The Large Hadron Collider (LHC) is one of the greatest scientific endeavours to date. The construction of the collider itself and the experiments that collect data from it represent a huge investment, both financially and in terms of human effort, in our hope to understand the way the Universe works at a deeper level. Yet the volumes of data produced are so large that they cannot be analysed at any single computing centre. Instead, the experiments have all adopted distributed computing models based on the LHC Computing Grid. Without the correct functioning of this grid infrastructure the experiments would not be able to understand the data that they have collected. Within the UK, the Grid infrastructure needed by the experiments is provided by the GridPP project. We report on the operations, performance and contributions made to the experiments by the GridPP project during the years of 2010 and 2011—the first two significant years of the running of the LHC. PMID:23230163

  1. Brain computer interface learning for systems based on electrocorticography and intracortical microelectrode arrays.

    PubMed

    Hiremath, Shivayogi V; Chen, Weidong; Wang, Wei; Foldes, Stephen; Yang, Ying; Tyler-Kabara, Elizabeth C; Collinger, Jennifer L; Boninger, Michael L

    2015-01-01

    A brain-computer interface (BCI) system transforms neural activity into control signals for external devices in real time. A BCI user needs to learn to generate specific cortical activity patterns to control external devices effectively. We call this process BCI learning, and it often requires significant effort and time. Therefore, it is important to study this process and develop novel and efficient approaches to accelerate BCI learning. This article reviews major approaches that have been used for BCI learning, including computer-assisted learning, co-adaptive learning, operant conditioning, and sensory feedback. We focus on BCIs based on electrocorticography and intracortical microelectrode arrays for restoring motor function. This article also explores the possibility of brain modulation techniques in promoting BCI learning, such as electrical cortical stimulation, transcranial magnetic stimulation, and optogenetics. Furthermore, as proposed by recent BCI studies, we suggest that BCI learning is in many ways analogous to motor and cognitive skill learning, and therefore skill learning should be a useful metaphor to model BCI learning.

  2. Final Report. Institute for Ultrascale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  3. Results of solar electric thrust vector control system design, development and tests

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1973-01-01

Efforts to develop and test a thrust vector control system (TVCS) for a solar-energy-powered ion engine array are described. The results of solar electric propulsion system technology (SEPST) III real-time tests of present versions of TVCS hardware in combination with computer-simulated attitude dynamics of a solar electric multi-mission spacecraft (SEMMS) Phase A-type spacecraft configuration are summarized. Work on an improved solar electric TVCS, based on the use of a state estimator, is described. SEPST III tests of TVCS hardware have generally proved successful, and the dynamic response of the system is close to predictions. It appears that, if TVCS electronic hardware can be effectively replaced by control computer software, a significant advantage in control capability and flexibility can be gained in future developmental testing, with practical implications for flight systems as well. Finally, it is concluded from computer simulations that TVCS stabilization using rate estimation promises a substantial performance improvement over the present design.

  4. Zonal and tesseral harmonic coefficients for the geopotential function, from zero to 18th order

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, J. C.

    1976-01-01

    Zonal and tesseral harmonic coefficients for the geopotential function are usually tabulated in normalized form to provide immediate information as to the relative significance of the coefficients in the gravity model. The normalized form of the geopotential coefficients cannot be used for computational purposes unless the gravity model has been modified to receive them. This modification is usually not done because the absolute or unnormalized form of the coefficients can be obtained from the simple mathematical relationship that relates the two forms. This computation can be quite tedious for hand calculation, especially for the higher order terms, and can be costly in terms of storage and execution time for machine computation. In this report, zonal and tesseral harmonic coefficients for the geopotential function are tabulated in absolute or unnormalized form. The report is designed to be used as a ready reference for both hand and machine calculation to save the user time and effort.
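The "simple mathematical relationship" between the two forms is the standard normalization factor for spherical harmonic coefficients. As an illustrative sketch (not code from the report itself), the unnormalized coefficient can be recovered from the fully normalized one as follows:

```python
import math

def denormalize(c_bar, n, m):
    """Convert a fully normalized geopotential coefficient of degree n
    and order m to its unnormalized (absolute) form by multiplying with
    the standard normalization factor
        N(n, m) = sqrt((2 - delta_0m) * (2n + 1) * (n - m)! / (n + m)!)."""
    delta = 1 if m == 0 else 0  # Kronecker delta, nonzero for zonal terms
    factor = math.sqrt((2 - delta) * (2 * n + 1)
                       * math.factorial(n - m) / math.factorial(n + m))
    return c_bar * factor
```

For example, the zonal term of degree 2 (n = 2, m = 0) carries a factor of sqrt(5), so a normalized value is multiplied by about 2.236 to obtain the absolute coefficient used directly in a gravity model. As the abstract notes, the factorials make this tedious by hand for high degree and order.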

  5. Systematic assignment of thermodynamic constraints in metabolic network models

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    Background The availability of genome sequences for many organisms enabled the reconstruction of several genome-scale metabolic network models. Currently, significant efforts are put into the automated reconstruction of such models. For this, several computational tools have been developed that particularly assist in identifying and compiling the organism-specific lists of metabolic reactions. In contrast, the last step of the model reconstruction process, which is the definition of the thermodynamic constraints in terms of reaction directionalities, still needs to be done manually. No computational method exists that allows for an automated and systematic assignment of reaction directions in genome-scale models. Results We present an algorithm that – based on thermodynamics, network topology and heuristic rules – automatically assigns reaction directions in metabolic models such that the reaction network is thermodynamically feasible with respect to the production of energy equivalents. It first exploits all available experimentally derived Gibbs energies of formation to identify irreversible reactions. As these thermodynamic data are not available for all metabolites, in a next step, further reaction directions are assigned on the basis of network topology considerations and thermodynamics-based heuristic rules. Briefly, the algorithm identifies reaction subsets from the metabolic network that are able to convert low-energy co-substrates into their high-energy counterparts and thus net produce energy. Our algorithm aims at disabling such thermodynamically infeasible cyclic operation of reaction subnetworks by assigning reaction directions based on a set of thermodynamics-derived heuristic rules. We demonstrate our algorithm on a genome-scale metabolic model of E. coli. 
The introduced systematic direction assignment yielded 130 irreversible reactions (out of 920 total reactions), which corresponds to about 70% of all irreversible reactions that are required to disable thermodynamically infeasible energy production. Conclusion Although not being fully comprehensive, our algorithm for systematic reaction direction assignment could define a significant number of irreversible reactions automatically with low computational effort. We envision that the presented algorithm is a valuable part of a computational framework that assists the automated reconstruction of genome-scale metabolic models. PMID:17123434
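The first step of the algorithm, fixing directions from experimentally derived Gibbs energies, can be sketched as follows. This is a minimal illustration under assumed threshold logic; the function name, the uncertainty handling, and the kJ/mol values are hypothetical, not the authors' implementation:

```python
def assign_direction(delta_g, uncertainty):
    """Assign a reaction direction from its transformed Gibbs energy of
    reaction (kJ/mol) and an uncertainty estimate.

    A reaction is fixed as irreversible only when the sign of delta_g is
    unambiguous given the uncertainty; otherwise it is left reversible,
    deferring to the later topology-based heuristic rules."""
    if delta_g + uncertainty < 0:
        return "forward"      # thermodynamically irreversible, forward only
    if delta_g - uncertainty > 0:
        return "backward"     # irreversible in the reverse direction
    return "reversible"       # undetermined: handled by heuristic rules
```

In this sketch, a reaction with delta_g of -30 +/- 5 kJ/mol is fixed forward, while one with -2 +/- 5 kJ/mol stays reversible, mirroring the paper's point that missing or ambiguous thermodynamic data must be resolved by the network-topology heuristics.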

  6. Automated Boundary Conditions for Wind Tunnel Simulations

    NASA Technical Reports Server (NTRS)

    Carlson, Jan-Renee

    2018-01-01

Computational fluid dynamic (CFD) simulations of models tested in wind tunnels require a high level of fidelity and accuracy, particularly for the purposes of CFD validation efforts. Considerable effort is required to ensure the proper characterization of both the physical geometry of the wind tunnel and the recreation of the correct flow conditions inside the wind tunnel. The typical trial-and-error effort used for determining the boundary condition values for a particular tunnel configuration is time- and computer-resource intensive. This paper describes a method for calculating and updating the back pressure boundary condition in wind tunnel simulations by using a proportional-integral-derivative controller. The controller methodology and equations are discussed, and simulations using the controller to set a tunnel Mach number in the NASA Langley 14- by 22-Foot Subsonic Tunnel are demonstrated.
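The controller described above is a standard discrete proportional-integral-derivative loop. A minimal sketch follows; the gains, time step, and the toy "plant" response standing in for the CFD solver are illustrative assumptions, not the paper's values:

```python
class PIDController:
    """Discrete PID controller for driving a measured quantity (e.g., the
    test-section Mach number) toward a setpoint by adjusting a boundary
    condition value (e.g., the back pressure)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Toy closed loop: each "solver iteration" nudges a fictitious Mach
# reading in proportion to the controller's correction.
pid = PIDController(kp=0.8, ki=0.2, kd=0.0, dt=1.0)
mach, target = 0.10, 0.20
for _ in range(50):
    mach += 0.5 * pid.update(target, mach)  # crude stand-in for the tunnel response
```

In the actual simulations, the correction would instead update the outflow back-pressure boundary condition between solver iterations until the computed Mach number settles at the target.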

  7. Neuromorphic neural interfaces: from neurophysiological inspiration to biohybrid coupling with nervous systems

    NASA Astrophysics Data System (ADS)

    Broccard, Frédéric D.; Joshi, Siddharth; Wang, Jun; Cauwenberghs, Gert

    2017-08-01

    Objective. Computation in nervous systems operates with different computational primitives, and on different hardware, than traditional digital computation and is thus subjected to different constraints from its digital counterpart regarding the use of physical resources such as time, space and energy. In an effort to better understand neural computation on a physical medium with similar spatiotemporal and energetic constraints, the field of neuromorphic engineering aims to design and implement electronic systems that emulate in very large-scale integration (VLSI) hardware the organization and functions of neural systems at multiple levels of biological organization, from individual neurons up to large circuits and networks. Mixed analog/digital neuromorphic VLSI systems are compact, consume little power and operate in real time independently of the size and complexity of the model. Approach. This article highlights the current efforts to interface neuromorphic systems with neural systems at multiple levels of biological organization, from the synaptic to the system level, and discusses the prospects for future biohybrid systems with neuromorphic circuits of greater complexity. Main results. Single silicon neurons have been interfaced successfully with invertebrate and vertebrate neural networks. This approach allowed the investigation of neural properties that are inaccessible with traditional techniques while providing a realistic biological context not achievable with traditional numerical modeling methods. At the network level, populations of neurons are envisioned to communicate bidirectionally with neuromorphic processors of hundreds or thousands of silicon neurons. Recent work on brain-machine interfaces suggests that this is feasible with current neuromorphic technology. Significance. Biohybrid interfaces between biological neurons and VLSI neuromorphic systems of varying complexity have started to emerge in the literature. 
Primarily intended as a computational tool for investigating fundamental questions related to neural dynamics, the sophistication of current neuromorphic systems now allows direct interfaces with large neuronal networks and circuits, resulting in potentially interesting clinical applications for neuroengineering systems, neuroprosthetics and neurorehabilitation.

  8. Development of a GPU Compatible Version of the Fast Radiation Code RRTMG

    NASA Astrophysics Data System (ADS)

    Iacono, M. J.; Mlawer, E. J.; Berthiaume, D.; Cady-Pereira, K. E.; Suarez, M.; Oreopoulos, L.; Lee, D.

    2012-12-01

    The absorption of solar radiation and emission/absorption of thermal radiation are crucial components of the physics that drive Earth's climate and weather. Therefore, accurate radiative transfer calculations are necessary for realistic climate and weather simulations. Efficient radiation codes have been developed for this purpose, but their accuracy requirements still necessitate that as much as 30% of the computational time of a GCM is spent computing radiative fluxes and heating rates. The overall computational expense constitutes a limitation on a GCM's predictive ability if it becomes an impediment to adding new physics to or increasing the spatial and/or vertical resolution of the model. The emergence of Graphics Processing Unit (GPU) technology, which will allow the parallel computation of multiple independent radiative calculations in a GCM, will lead to a fundamental change in the competition between accuracy and speed. Processing time previously consumed by radiative transfer will now be available for the modeling of other processes, such as physics parameterizations, without any sacrifice in the accuracy of the radiative transfer. Furthermore, fast radiation calculations can be performed much more frequently and will allow the modeling of radiative effects of rapid changes in the atmosphere. The fast radiation code RRTMG, developed at Atmospheric and Environmental Research (AER), is utilized operationally in many dynamical models throughout the world. We will present the results from the first stage of an effort to create a version of the RRTMG radiation code designed to run efficiently in a GPU environment. This effort will focus on the RRTMG implementation in GEOS-5. 
RRTMG has an internal pseudo-spectral vector of length of order 100 that, when combined with the much greater length of the global horizontal grid vector from which the radiation code is called in GEOS-5, makes RRTMG/GEOS-5 particularly suited to achieving a significant speed improvement through GPU technology. This large number of independent cases will allow us to take full advantage of the computational power of the latest GPUs, ensuring that all thread cores in the GPU remain active, a key criterion for obtaining significant speedup. The CUDA (Compute Unified Device Architecture) Fortran compiler developed by PGI and Nvidia will allow us to construct this parallel implementation on the GPU while remaining in the Fortran language. This implementation will scale very well across various CUDA-supported GPUs such as the recently released Fermi Nvidia cards. We will present the computational speed improvements of the GPU-compatible code relative to the standard CPU-based RRTMG with respect to a very large and diverse suite of atmospheric profiles. This suite will also be utilized to demonstrate the minimal impact of the code restructuring on the accuracy of radiation calculations. The GPU-compatible version of RRTMG will be directly applicable to future versions of GEOS-5, but it is also likely to provide significant associated benefits for other GCMs that employ RRTMG.

  9. Effects of instructional strategies using cross sections on the recognition of anatomical structures in correlated CT and MR images.

    PubMed

    Khalil, Mohammed K; Paas, Fred; Johnson, Tristan E; Su, Yung K; Payer, Andrew F

    2008-01-01

This research is an effort to best utilize interactive anatomical images for instructional purposes based on cognitive load theory. Three studies explored the differential effects of three computer-based instructional strategies that use anatomical cross-sections to enhance the interpretation of radiological images. These strategies include: (1) cross-sectional images of the head that can be superimposed on radiological images, (2) transparent highlighting of anatomical structures in radiological images, and (3) cross-sectional images of the head with radiological images presented side-by-side. Data collected included: (1) time spent on instruction and on solving test questions, (2) mental effort during instruction and test, and (3) students' performance in identifying anatomical structures in radiological images. Participants were 28 freshmen medical students (15 males and 13 females) and 208 biology students (190 females and 18 males). All studies used a posttest-only control group design, and the collected data were analyzed by either t test or ANOVA. In self-directed computer-based environments, the strategies that used cross sections to improve students' ability to recognize anatomic structures in radiological images showed no significant positive effects. However, when the complexity of the instructional materials was increased, cross-sectional images imposed a higher cognitive load, as indicated by higher investment of mental effort. There is not enough evidence to claim that the simultaneous combination of cross sections and radiological images has no effect on the identification of anatomical structures in radiological images for novices. Further research that controls for students' learning and cognitive styles is needed to reach an informative conclusion.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heffelfinger, Grant S.; Martino, Anthony; Rintoul, Mark Daniel

    This SAND report provides the technical progress through October 2004 of the Sandia-led project, "Carbon Sequestration in Synechococcus Sp.: From Molecular Machines to Hierarchical Modeling," funded by the DOE Office of Science Genomes to Life Program. Understanding, predicting, and perhaps manipulating carbon fixation in the oceans has long been a major focus of biological oceanography and has more recently been of interest to a broader audience of scientists and policy makers. It is clear that the oceanic sinks and sources of CO2 are important terms in the global environmental response to anthropogenic atmospheric inputs of CO2 and that oceanic microorganisms play a key role in this response. However, the relationship between this global phenomenon and the biochemical mechanisms of carbon fixation in these microorganisms is poorly understood. In this project, we will investigate the carbon sequestration behavior of Synechococcus Sp., an abundant marine cyanobacterium known to be important to environmental responses to carbon dioxide levels, through experimental and computational methods. This project is a combined experimental and computational effort with emphasis on developing and applying new computational tools and methods. Our experimental effort will provide the biology and data to drive the computational efforts and includes significant investment in developing new experimental methods for uncovering protein partners, characterizing protein complexes, and identifying new binding domains. We will also develop and apply new data measurement and statistical methods for analyzing microarray experiments. Computational tools will be essential to our efforts to discover and characterize the function of the molecular machines of Synechococcus. To this end, molecular simulation methods will be coupled with knowledge discovery from diverse biological data sets for high-throughput discovery and characterization of protein-protein complexes.
In addition, we will develop a set of novel capabilities for inference of regulatory pathways in microbial genomes across multiple sources of information through the integration of computational and experimental technologies. These capabilities will be applied to Synechococcus regulatory pathways to characterize their interaction map and identify component proteins in these pathways. We will also investigate methods for combining experimental and computational results with visualization and natural language tools to accelerate discovery of regulatory pathways. The ultimate goal of this effort is to develop and apply new experimental and computational methods needed to generate a new level of understanding of how the Synechococcus genome affects carbon fixation at the global scale. Anticipated experimental and computational methods will provide ever-increasing insight about the individual elements and steps in the carbon fixation process; however, relating an organism's genome to its cellular response in the presence of varying environments will require systems biology approaches. Thus, a primary goal for this effort is to integrate the genomic data generated from experiments and lower-level simulations with data from the existing body of literature into a whole-cell model. We plan to accomplish this by developing and applying a set of tools for capturing the carbon fixation behavior of Synechococcus at different levels of resolution. Finally, the explosion of data being produced by high-throughput experiments requires data analysis and models that are more computationally complex, more heterogeneous, and require coupling to ever-increasing amounts of experimentally obtained data in varying formats. These challenges are unprecedented in high-performance scientific computing and necessitate the development of a companion computational infrastructure to support this effort.
More information about this project, including a copy of the original proposal, can be found at www.genomes-to-life.org. Acknowledgment: We gratefully acknowledge the contributions of the GTL Project Team as follows: Grant S. Heffelfinger1*, Anthony Martino2, Andrey Gorin3, Ying Xu10,3, Mark D. Rintoul1, Al Geist3, Matthew Ennis1, Hashimi Al-Hashimi8, Nikita Arnold3, Andrei Borziak3, Bianca Brahamsha6, Andrea Belgrano12, Praveen Chandramohan3, Xin Chen9, Pan Chongle3, Paul Crozier1, PguongAn Dam10, George S. Davidson1, Robert Day3, Jean Loup Faulon2, Damian Gessler12, Arlene Gonzalez2, David Haaland1, William Hart1, Victor Havin3, Tao Jiang9, Howland Jones1, David Jung3, Ramya Krishnamurthy3, Yooli Light2, Shawn Martin1, Rajesh Munavalli3, Vijaya Natarajan3, Victor Olman10, Frank Olken4, Brian Palenik6, Byung Park3, Steven Plimpton1, Diana Roe2, Nagiza Samatova3, Arie Shoshani4, Michael Sinclair1, Alex Slepoy1, Shawn Stevens8, Chris Stork1, Charlie Strauss5, Zhengchang Su10, Edward Thomas1, Jerilyn A. Timlin1, Xiufeng Wan11, HongWei Wu10, Dong Xu11, Gong-Xin Yu3, Grover Yip8, Zhaoduo Zhang2, Erik Zuiderweg8. *Author to whom correspondence should be addressed (gsheffe@sandia.gov). Affiliations: 1. Sandia National Laboratories, Albuquerque, NM 2. Sandia National Laboratories, Livermore, CA 3. Oak Ridge National Laboratory, Oak Ridge, TN 4. Lawrence Berkeley National Laboratory, Berkeley, CA 5. Los Alamos National Laboratory, Los Alamos, NM 6. University of California, San Diego 7. University of Illinois, Urbana/Champaign 8. University of Michigan, Ann Arbor 9. University of California, Riverside 10. University of Georgia, Athens 11. University of Missouri, Columbia 12. National Center for Genome Resources, Santa Fe, NM. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  11. Genomes to Life Project Quarterly Report April 2005.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heffelfinger, Grant S.; Martino, Anthony; Rintoul, Mark Daniel

    2006-02-01

    This SAND report provides the technical progress through April 2005 of the Sandia-led project, "Carbon Sequestration in Synechococcus Sp.: From Molecular Machines to Hierarchical Modeling," funded by the DOE Office of Science Genomics:GTL Program. Understanding, predicting, and perhaps manipulating carbon fixation in the oceans has long been a major focus of biological oceanography and has more recently been of interest to a broader audience of scientists and policy makers. It is clear that the oceanic sinks and sources of CO2 are important terms in the global environmental response to anthropogenic atmospheric inputs of CO2 and that oceanic microorganisms play a key role in this response. However, the relationship between this global phenomenon and the biochemical mechanisms of carbon fixation in these microorganisms is poorly understood. In this project, we will investigate the carbon sequestration behavior of Synechococcus Sp., an abundant marine cyanobacterium known to be important to environmental responses to carbon dioxide levels, through experimental and computational methods. This project is a combined experimental and computational effort with emphasis on developing and applying new computational tools and methods. Our experimental effort will provide the biology and data to drive the computational efforts and includes significant investment in developing new experimental methods for uncovering protein partners, characterizing protein complexes, and identifying new binding domains. We will also develop and apply new data measurement and statistical methods for analyzing microarray experiments. Computational tools will be essential to our efforts to discover and characterize the function of the molecular machines of Synechococcus. To this end, molecular simulation methods will be coupled with knowledge discovery from diverse biological data sets for high-throughput discovery and characterization of protein-protein complexes.
In addition, we will develop a set of novel capabilities for inference of regulatory pathways in microbial genomes across multiple sources of information through the integration of computational and experimental technologies. These capabilities will be applied to Synechococcus regulatory pathways to characterize their interaction map and identify component proteins in these pathways. We will also investigate methods for combining experimental and computational results with visualization and natural language tools to accelerate discovery of regulatory pathways. The ultimate goal of this effort is to develop and apply new experimental and computational methods needed to generate a new level of understanding of how the Synechococcus genome affects carbon fixation at the global scale. Anticipated experimental and computational methods will provide ever-increasing insight about the individual elements and steps in the carbon fixation process; however, relating an organism's genome to its cellular response in the presence of varying environments will require systems biology approaches. Thus, a primary goal for this effort is to integrate the genomic data generated from experiments and lower-level simulations with data from the existing body of literature into a whole-cell model. We plan to accomplish this by developing and applying a set of tools for capturing the carbon fixation behavior of Synechococcus at different levels of resolution. Finally, the explosion of data being produced by high-throughput experiments requires data analysis and models that are more computationally complex, more heterogeneous, and require coupling to ever-increasing amounts of experimentally obtained data in varying formats. These challenges are unprecedented in high-performance scientific computing and necessitate the development of a companion computational infrastructure to support this effort.
More information about this project can be found at www.genomes-to-life.org. Acknowledgment: We gratefully acknowledge the contributions of: Grant Heffelfinger1*, Anthony Martino2, Brian Palenik6, Andrey Gorin3, Ying Xu10,3, Mark Daniel Rintoul1, Al Geist3, Matthew Ennis1, with Pratul Agrawal3, Hashim Al-Hashimi8, Andrea Belgrano12, Mike Brown1, Xin Chen9, Paul Crozier1, PguongAn Dam10, Jean-Loup Faulon2, Damian Gessler12, David Haaland1, Victor Havin4, C.F. Huang5, Tao Jiang9, Howland Jones1, David Jung3, Katherine Kang14, Michael Langston15, Shawn Martin1, Shawn Means1, Vijaya Natarajan4, Roy Nielson5, Frank Olken4, Victor Olman10, Ian Paulsen14, Steve Plimpton1, Andreas Reichsteiner5, Nagiza Samatova3, Arie Shoshani4, Michael Sinclair1, Alex Slepoy1, Shawn Stevens8, Charlie Strauss5, Zhengchang Su10, Ed Thomas1, Jerilyn Timlin1, Wim Vermaas13, Xiufeng Wan11, HongWei Wu10, Dong Xu11, Grover Yip8, Erik Zuiderweg8. *Author to whom correspondence should be addressed (gsheffe@sandia.gov). Affiliations: 1. Sandia National Laboratories, Albuquerque, NM 2. Sandia National Laboratories, Livermore, CA 3. Oak Ridge National Laboratory, Oak Ridge, TN 4. Lawrence Berkeley National Laboratory, Berkeley, CA 5. Los Alamos National Laboratory, Los Alamos, NM 6. University of California, San Diego 7. University of Illinois, Urbana/Champaign 8. University of Michigan, Ann Arbor 9. University of California, Riverside 10. University of Georgia, Athens 11. University of Missouri, Columbia 12. National Center for Genome Resources, Santa Fe, NM 13. Arizona State University 14. The Institute for Genomic Research 15. University of Tennessee. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Genome-Wide Analysis of Gene-Gene and Gene-Environment Interactions Using Closed-Form Wald Tests.

    PubMed

    Yu, Zhaoxia; Demetriou, Michael; Gillen, Daniel L

    2015-09-01

    Despite the successful discovery of hundreds of variants for complex human traits using genome-wide association studies, the degree to which genes and environmental risk factors jointly affect disease risk is largely unknown. One obstacle toward this goal is that the computational effort required for testing gene-gene and gene-environment interactions is enormous. As a result, numerous computationally efficient tests have recently been proposed. However, the validity of these methods often relies on unrealistic assumptions such as additive main effects, main effects at only one variable, no linkage disequilibrium between the two single-nucleotide polymorphisms (SNPs) in a pair, or gene-environment independence. Here, we derive closed-form and consistent estimates for interaction parameters and propose to use Wald tests for testing interactions. The Wald tests are asymptotically equivalent to the likelihood ratio tests (LRTs), largely considered to be the gold-standard tests but generally too computationally demanding for genome-wide interaction analysis. Simulation studies show that the proposed Wald tests perform very similarly to the LRTs but are much more computationally efficient. Applying the proposed tests to a genome-wide study of multiple sclerosis, we identify interactions within the major histocompatibility complex region. In this application, we find that (1) focusing on pairs where both SNPs are marginally significant leads to more significant interactions than focusing on pairs where at least one SNP is marginally significant; and (2) parsimonious parameterization of interaction effects might decrease, rather than increase, statistical power. © 2015 Wiley Periodicals, Inc.
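The 1-df Wald statistic the abstract relies on can be illustrated with a small sketch. This is not the authors' implementation: it is a generic Wald test for a gene-environment interaction in logistic regression, fit by Newton-Raphson on simulated data with hypothetical effect sizes.

```python
import math
import numpy as np

def wald_interaction(y, g, e, iters=25):
    """Wald test for a gene-environment interaction in logistic regression.

    Model: logit P(y=1) = b0 + b1*g + b2*e + b3*g*e; tests H0: b3 = 0.
    Returns the interaction estimate, the 1-df Wald statistic, and its p-value.
    """
    X = np.column_stack([np.ones_like(g), g, e, g * e]).astype(float)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):                           # Newton-Raphson (IRLS)
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        H = X.T @ (X * (p * (1.0 - p))[:, None])     # Fisher information
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    cov = np.linalg.inv(X.T @ (X * (p * (1.0 - p))[:, None]))
    stat = beta[3] ** 2 / cov[3, 3]                  # Wald chi-square, 1 df
    pval = math.erfc(math.sqrt(stat / 2.0))          # P(chi2_1 > stat)
    return beta[3], stat, pval

# Simulated data with a true interaction of 0.5 on the log-odds scale.
rng = np.random.default_rng(0)
n = 5000
g = rng.integers(0, 3, n).astype(float)      # SNP genotype coded 0/1/2
e = rng.binomial(1, 0.5, n).astype(float)    # binary exposure
logit = -1.0 + 0.2 * g + 0.3 * e + 0.5 * g * e
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit))).astype(float)
b3, stat, pval = wald_interaction(y, g, e)
print(f"beta_GxE={b3:.3f}  Wald={stat:.1f}  p={pval:.2e}")
```

The computational appeal is that the statistic needs only the fitted coefficient and one diagonal entry of the inverse information matrix, with no second model fit as an LRT would require.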

  13. An improved multilevel Monte Carlo method for estimating probability distribution functions in stochastic oil reservoir simulations

    DOE PAGES

    Lu, Dan; Zhang, Guannan; Webster, Clayton G.; ...

    2016-12-30

    In this paper, we develop an improved multilevel Monte Carlo (MLMC) method for estimating cumulative distribution functions (CDFs) of a quantity of interest coming from numerical approximation of large-scale stochastic subsurface simulations. Compared with Monte Carlo (MC) methods, which require a very large number of high-fidelity model executions to achieve a prescribed accuracy when computing statistical expectations, MLMC methods were originally proposed to significantly reduce the computational cost through the use of multifidelity approximations. The improved performance of MLMC methods depends strongly on the decay of the variance of the integrand as the level increases. However, the main challenge in estimating CDFs is that the integrand is a discontinuous indicator function whose variance decays slowly. To address this difficulty, we approximate the integrand using a smoothing function that accelerates the decay of the variance. In addition, we design a novel a posteriori optimization strategy to calibrate the smoothing function, so as to balance the computational gain and the approximation error. The combined techniques are integrated into a general, practical algorithm that can be applied to a wide range of subsurface problems for high-dimensional uncertainty quantification, such as the fine-grid oil reservoir model considered in this effort. The numerical results reveal that, with the use of the calibrated smoothing function, the improved MLMC technique significantly reduces the computational complexity compared to the standard MC approach. Finally, we discuss several factors that affect the performance of the MLMC method and provide guidance for effective and efficient usage in practice.
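The telescoping idea behind MLMC CDF estimation can be sketched on a toy problem (not the authors' reservoir code): a hypothetical scalar model whose discretization bias shrinks geometrically with level, a tanh-smoothed indicator in place of the discontinuous one, and illustrative choices of smoothing width and sample counts.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def smooth_ind(x, delta):
    """Smoothed indicator: a tanh approximation of 1{x <= 0} with width delta."""
    return 0.5 * (1.0 - np.tanh(x / delta))

def q_level(z, level):
    """Hypothetical level-l approximation of the quantity of interest Q(z) = z;
    the discretization bias shrinks geometrically as the level increases."""
    return z + 0.5 ** level * np.cos(z)

def mlmc_cdf(q, levels=6, n0=200_000, delta=0.05):
    """Telescoping MLMC estimate of the CDF value P(Q <= q)."""
    z = rng.standard_normal(n0)
    est = smooth_ind(q_level(z, 0) - q, delta).mean()   # cheap coarse level
    for l in range(1, levels + 1):                      # correction terms
        n = max(n0 // 2 ** l, 1000)                     # fewer fine samples
        z = rng.standard_normal(n)                      # shared randomness
        est += (smooth_ind(q_level(z, l) - q, delta)
                - smooth_ind(q_level(z, l - 1) - q, delta)).mean()
    return est

q = 0.3
exact = 0.5 * (1.0 + math.erf(q / math.sqrt(2.0)))      # Phi(q), since Q -> Z
print(f"MLMC estimate {mlmc_cdf(q):.4f} vs exact {exact:.4f}")
```

Each correction term evaluates the fine and coarse approximations on the same samples, so its variance reflects only the level-to-level difference; smoothing keeps that difference small near the threshold, which is the effect the paper's calibrated smoothing function is designed to amplify.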

  14. A Noise-Assisted Reprogrammable Nanomechanical Logic Gate

    DTIC Science & Technology

    2009-01-01

    effort toward scalable mechanical computation. This effort can be traced back to at least 1822, when Charles Babbage presented a mechanical…

  15. Enabling GEODSS for Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Wootton, S.

    2016-09-01

    The Ground-Based Electro-Optical Deep Space Surveillance (GEODSS) System has been in operation since the mid-1980s. While GEODSS has been the Space Surveillance Network's (SSN's) workhorse for deep space surveillance, it has not undergone a significant modernization since the 1990s, and it continues to operate on a mostly obsolete, legacy data processing baseline. The System Program Office (SPO) responsible for GEODSS, SMC/SYGO, has a number of advanced Space Situational Awareness (SSA)-related efforts in progress, in the form of innovative optical capabilities, data processing algorithms, and hardware upgrades. These efforts are in various stages of evaluation and acquisition. The advanced capabilities rely upon a modern computing environment in which to integrate, but GEODSS does not yet have one. The SPO is also executing a Service Life Extension Program (SLEP) to modernize the various subsystems within GEODSS, along with a parallel effort to implement a complete, modern software re-architecture. The goal is to use a modern, service-based architecture to provide expedient integration as well as easier and more sustainable expansion. This presentation will describe these modernization efforts in more detail and discuss how adopting such modern paradigms and practices will help ensure the GEODSS system remains relevant and sustainable far beyond 2027.

  16. How the Brain Decides When to Work and When to Rest: Dissociation of Implicit-Reactive from Explicit-Predictive Computational Processes

    PubMed Central

    Meyniel, Florent; Safra, Lou; Pessiglione, Mathias

    2014-01-01

    A pervasive cost-benefit problem is how to allocate effort over time, i.e., deciding when to work and when to rest. An economic decision perspective would suggest that the duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty. PMID:24743711
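The accumulate-to-bound mechanism the abstract describes can be sketched as a toy simulation; all parameter values below are illustrative assumptions, not the authors' fitted values.

```python
import numpy as np

def simulate_bouts(difficulty, recovery, upper=1.0, lower=0.0,
                   dt=0.01, t_max=120.0):
    """Toy two-bound cost-evidence model.

    Cost evidence rises at a rate proportional to task difficulty during
    effort and dissipates at rate `recovery` during rest; hitting the upper
    bound stops effort, and returning to the lower bound resumes it.
    Returns the lists of completed effort and rest bout durations (seconds).
    """
    cost, working, bout = lower, True, 0.0
    effort_durs, rest_durs = [], []
    for _ in range(int(t_max / dt)):
        cost += (difficulty if working else -recovery) * dt
        bout += dt
        if working and cost >= upper:
            effort_durs.append(bout); bout, working = 0.0, False
        elif not working and cost <= lower:
            rest_durs.append(bout); bout, working = 0.0, True
    return effort_durs, rest_durs

easy_e, easy_r = simulate_bouts(difficulty=0.2, recovery=0.5)
hard_e, hard_r = simulate_bouts(difficulty=0.5, recovery=0.5)
# Harder tasks fill the cost-evidence reservoir faster, so effort bouts
# shorten, while rest bouts (set only by the dissipation rate) are unchanged.
print(np.mean(easy_e), np.mean(hard_e), np.mean(easy_r), np.mean(hard_r))
```

This reproduces the qualitative dissociation in the abstract: varying the accumulation slope changes effort duration but not rest duration, while varying the dissipation rate would do the reverse.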

  17. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Wes

    2016-07-24

    The primary challenge motivating this team's work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis on only a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, an approach known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by DOE science projects. In large part, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry; engaged in software technology R&D; and formed close partnerships with DOE science code teams to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.

  18. Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters

    PubMed Central

    Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika

    2015-01-01

    Background: Adolescent depression and suicide are pressing public health concerns, and identifying key differences among suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods: Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Then, participants completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety, as well as a computerized effort-cost computation task. Results: Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high-value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters; however, suicide ideators exhibit a win-stay approach when receiving rewards on previous trials. Limitations: Findings should be considered in light of limitations, including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions: Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323

  19. DIGGING DEEPER INTO DEEP DATA: MOLECULAR DOCKING AS A HYPOTHESIS-DRIVEN BIOPHYSICAL INTERROGATION SYSTEM IN COMPUTATIONAL TOXICOLOGY.

    EPA Science Inventory

    Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US-EPA's computational toxicology program.

  20. Globus Quick Start Guide. Globus Software Version 1.1

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.

  1. Recrystallization and Grain Growth Kinetics in Binary Alpha Titanium-Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Trump, Anna Marie

    Titanium alloys are used in a variety of important naval and aerospace applications and often undergo thermomechanical processing, which leads to recrystallization and grain growth. Both of these processes have a significant impact on the mechanical properties of the material; therefore, understanding their kinetics is crucial to predicting the final properties. Three alloys with varying concentrations of aluminum are studied, which allows direct quantification of the effect of aluminum content on the kinetics of recrystallization and grain growth. Aluminum is the most common alpha-stabilizing alloying element used in titanium alloys; however, its effect on these processes has not been previously studied. This work is also part of a larger Integrated Computational Materials Engineering (ICME) effort whose goal is to combine computational and experimental efforts to develop computationally efficient models that predict materials microstructure and properties based on processing history. The static recrystallization kinetics are measured using an electron backscatter diffraction (EBSD) technique, and a significant retardation in the kinetics is observed with increasing aluminum concentration. An analytical model is then used to capture these results and successfully predicts the effect of solute concentration on the time to 50% recrystallization. The model reveals that this solute effect is due to a combination of a decrease in grain boundary mobility and a decrease in driving force with increasing aluminum concentration. The effect of microstructural inhomogeneities is also experimentally quantified, and the results are validated with a phase field model for recrystallization. These microstructural inhomogeneities explain the experimentally measured Avrami exponent, which is lower than the theoretical value calculated by the JMAK model.
Similar to the effect seen in recrystallization, the addition of aluminum also significantly slows down the grain growth kinetics. This is generally attributed to the solute drag effect due to segregation of solute atoms at the grain boundaries; however, aluminum segregation is not observed in these alloys. The mechanism for this result is explained and is used to validate the prediction of an existing model for solute drag.
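For reference, the JMAK (Avrami) relation invoked by the abstract can be sketched numerically; the rate constant and exponent below are illustrative, not values fitted to the alloys studied.

```python
import numpy as np

# JMAK (Avrami) kinetics: recrystallized fraction X(t) = 1 - exp(-(k*t)**n).
# An "Avrami plot" of ln(-ln(1-X)) against ln(t) is a straight line whose
# slope is the Avrami exponent n; experimental slopes below the theoretical
# value signal deviations like the microstructural inhomogeneities above.
def jmak(t, k, n):
    return 1.0 - np.exp(-(k * t) ** n)

t = np.linspace(1.0, 100.0, 50)
X = jmak(t, k=0.05, n=2.0)                  # illustrative k and n
slope, _ = np.polyfit(np.log(t), np.log(-np.log(1.0 - X)), 1)

# Time to 50% recrystallization: (k*t50)**n = ln 2  =>  t50 = ln2**(1/n) / k
t50 = np.log(2.0) ** (1.0 / 2.0) / 0.05
print(f"recovered Avrami exponent n = {slope:.3f}, t50 = {t50:.1f}")
```

The t50 expression is the quantity the analytical model in the abstract predicts as a function of solute concentration, via its effect on k.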

  2. HART-II: Prediction of Blade-Vortex Interaction Loading

    NASA Technical Reports Server (NTRS)

    Lim, Joon W.; Tung, Chee; Yu, Yung H.; Burley, Casey L.; Brooks, Thomas; Boyd, Doug; vanderWall, Berend; Schneider, Oliver; Richard, Hugues; Raffel, Markus

    2003-01-01

    During the HART-I data analysis, the need for comprehensive wake data was identified, including vortex creation and aging and vortex re-development after blade-vortex interaction. In October 2001, the US Army AFDD, NASA Langley, the German DLR, the French ONERA, and the Dutch DNW performed the HART-II test as an international joint effort. The main objective was to focus on rotor wake measurement using a PIV technique, along with comprehensive data on blade deflections, airloads, and acoustics. Three prediction teams made preliminary correlation efforts with the HART-II data: a joint US team of US Army AFDD and NASA Langley, the German DLR, and the French ONERA. The predicted results showed significant improvements over the HART-I predictions computed several years earlier, indicating a better understanding of complicated wake modeling in comprehensive rotorcraft analysis. All three teams demonstrated satisfactory prediction capabilities in general, though prediction accuracy varied slightly across disciplines.

  3. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  4. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE PAGES

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...

    2017-03-28

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  5. A Comparison of Approaches for Solving Hard Graph-Theoretic Problems

    DTIC Science & Technology

    2015-05-01

    collaborative effort “Adiabatic Quantum Computing Applications Research” (14-RI-CRADA-02) between the Information Directorate and Lock-…methods are explored and consist of a parallel computing approach using Matlab, a quantum annealing approach using the D-Wave computer, and lastly using satisfiability modulo theory (SMT) and corresponding SMT…

  6. Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.

  7. Computers on Wheels: An Alternative to Each One Has One

    ERIC Educational Resources Information Center

    Grant, Michael M.; Ross, Steven M.; Wang, Weiping; Potter, Allison

    2005-01-01

    Four fifth-grade classrooms embarked on a modified ubiquitous computing initiative in the fall of 2003. Two 15-computer wireless laptop carts were shared among the four classrooms in an effort to integrate technology across the curriculum and effect change in student learning and teacher pedagogy. This initiative--in contrast to other one-to-one…

  8. Apple Seeks To Regain Its Stature in World of Academic Computing.

    ERIC Educational Resources Information Center

    Young, Jeffrey R.; Blumenstyk, Goldie

    1998-01-01

    Managers of Apple Computer, the company that pioneered campus personal computing and later lost most of its share of the market, are again focusing energies on academic buyers. Campus technology officials, even those fond of Apples, are greeting the company's efforts with caution. Some feel it may be too late for Apple to regain a significant…

  9. Crossbar Nanocomputer Development

    DTIC Science & Technology

    2012-04-01

    their utilization. Areas such as neuromorphic computing, signal processing, arithmetic processing, and crossbar computing are only some of the...due to its intrinsic, network-on-chip flexibility to re-route around defects. Preliminary efforts in crossbar computing have been demonstrated by...they approach their scaling limits [2]. Other applications that memristive devices are suited for include FPGA [3], encryption [4], and neuromorphic

  10. A new generation in computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, R.E.

    1983-11-01

    The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care, and retailing are outlined. An analysis is given of research efforts in the US, Japan, U.K., and Europe. Fifth-generation programming languages are detailed.

  11. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  12. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  13. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  14. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  15. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  16. Reading Teachers' Beliefs and Utilization of Computer and Technology: A Case Study

    ERIC Educational Resources Information Center

    Remetio, Jessica Espinas

    2014-01-01

    Many researchers believe that computers have the ability to help improve the reading skills of students. In an effort to improve the poor reading scores of students on state tests, as well as improve students' overall academic performance, computers and other technologies have been installed in Frozen Bay School classrooms. As the success of these…

  17. Attitudes of Design Students toward Computer Usage in Design

    ERIC Educational Resources Information Center

    Pektas, Sule Tasli; Erkip, Feyzan

    2006-01-01

    The success of efforts to integrate technology with design education is largely affected by the attitudes of students toward technology. This paper presents the findings of a research on the attitudes of design students toward the use of computers in design and its correlates. Computer Aided Design (CAD) tools are the most widely used computer…

  18. Using an Online Homework System to Submit Accounting Homework: Role of Cognitive Need, Computer Efficacy, and Perception

    ERIC Educational Resources Information Center

    Peng, Jacob C.

    2009-01-01

    The author investigated whether students' effort in working on homework problems was affected by their need for cognition, their perception of the system, and their computer efficacy when instructors used an online system to collect accounting homework. Results showed that individual intrinsic motivation and computer efficacy are important factors…

  19. Education:=Coding+Aesthetics; Aesthetic Understanding, Computer Science Education, and Computational Thinking

    ERIC Educational Resources Information Center

    Good, Jonathon; Keenan, Sarah; Mishra, Punya

    2016-01-01

    The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…

  20. New Perspectives on Neuroengineering and Neurotechnologies: NSF-DFG Workshop Report.

    PubMed

    Moritz, Chet T; Ruther, Patrick; Goering, Sara; Stett, Alfred; Ball, Tonio; Burgard, Wolfram; Chudler, Eric H; Rao, Rajesh P N

    2016-07-01

    To identify and overcome barriers to creating new neurotechnologies capable of restoring both motor and sensory function in individuals with neurological conditions. This report builds upon the outcomes of a joint workshop between the US National Science Foundation and the German Research Foundation on New Perspectives in Neuroengineering and Neurotechnology convened in Arlington, VA, USA, November 13-14, 2014. The participants identified key technological challenges for recording and manipulating neural activity, decoding, and interpreting brain data in the presence of plasticity, and early considerations of ethical and social issues pertinent to the adoption of neurotechnologies. The envisaged progress in neuroengineering requires tightly integrated hardware and signal processing efforts, advances in understanding of physiological adaptations to closed-loop interactions with neural devices, and an open dialog with stakeholders and potential end-users of neurotechnology. The development of new neurotechnologies (e.g., bidirectional brain-computer interfaces) could significantly improve the quality of life of people living with the effects of brain or spinal cord injury, or other neurodegenerative diseases. Focused efforts aimed at overcoming the remaining barriers at the electrode tissue interface, developing implantable hardware with on-board computation, and refining stimulation methods to precisely activate neural tissue will advance both our understanding of brain function and our ability to treat currently intractable disorders of the nervous system.

  1. Numerical and experimental investigation of melting with internal heat generation within cylindrical enclosures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amber Shrivastava; Brian Williams; Ali S. Siahpush

    2014-06-01

    There have been significant efforts by the heat transfer community to investigate the melting phenomenon of materials. These efforts have included the analytical development of equations to represent melting, numerical development of computer codes to assist in modeling the phenomena, and collection of experimental data. The understanding of the melting phenomenon has application in several areas of interest, for example, the melting of a Phase Change Material (PCM) used as a thermal storage medium as well as the melting of the fuel bundle in a nuclear power plant during an accident scenario. The objective of this research is two-fold. First, a numerical investigation, using computational fluid dynamics (CFD), of melting with internal heat generation for a vertical cylindrical geometry is presented. Second, to the best of the authors' knowledge, there are very few engineering experimental results available for the case of melting with Internal Heat Generation (IHG). An experiment was performed to produce such data using resistive, or Joule, heating as the IHG mechanism. The numerical results are compared against the experimental results and show favorable agreement. Uncertainties in the numerical and experimental analysis are discussed. Based on the numerical and experimental analysis, recommendations are made for future work.

  2. Human-computer interaction in multitask situations

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decision-making situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables, including the number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed, and the practical issues involved in designing human-computer systems for multitask situations are considered.
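    The dynamic allocation of responsibility described above can be caricatured with a toy event-driven simulation: action-evoking events arrive at random, the computer serves a task when it is free, and the human takes overflow and redoes tasks the computer got wrong. All rates, the error probability, and the overflow policy below are illustrative assumptions, not Rouse's model:

```python
import random

def simulate(n_events=10000, arrival_rate=1.0, human_rate=0.8,
             computer_rate=2.0, p_computer_error=0.05, seed=0):
    """Toy human-computer task-allocation model (illustrative only).

    Events arrive with exponential interarrival times; the computer
    takes a task if it is free, otherwise the human does.  A computer
    error sends the task back to the human.  Returns the fraction of
    tasks first handled by the computer and the fraction redone."""
    rng = random.Random(seed)
    t = 0.0
    human_free = computer_free = 0.0   # time each server next becomes free
    by_computer = redone = 0
    for _ in range(n_events):
        t += rng.expovariate(arrival_rate)
        if computer_free <= t:
            by_computer += 1
            computer_free = t + rng.expovariate(computer_rate)
            if rng.random() < p_computer_error:   # computer error: human redoes it
                redone += 1
                human_free = max(human_free, computer_free) + rng.expovariate(human_rate)
        else:                                     # computer busy: human takes it
            human_free = max(human_free, t) + rng.expovariate(human_rate)
    return by_computer / n_events, redone / n_events
```

    Varying `p_computer_error` or the speed mismatch `computer_rate / human_rate` reproduces, in miniature, the kind of sensitivity study the abstract describes.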

  3. Robotics-Centered Outreach Activities: An Integrated Approach

    ERIC Educational Resources Information Center

    Ruiz-del-Solar, Javier

    2010-01-01

    Nowadays, universities are making extensive efforts to attract prospective students to the fields of electrical, electronic, and computer engineering. Thus, outreach is becoming increasingly important, and activities with schoolchildren are being extensively carried out as part of this effort. In this context, robotics is a very attractive and…

  4. Automated Lumber Processing

    Treesearch

    Powsiri Klinkhachorn; J. Moody; Philip A. Araman

    1995-01-01

    For the past few decades, researchers have devoted time and effort to apply automation and modern computer technologies towards improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs, while maintaining an acceptable margin of profit. This paper describes the effort of one such endeavor...

  5. Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Gumbert, Clyde

    2017-01-01

    The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design under uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems and the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to predict the output uncertainty predicted by the high-fidelity model at a significant reduction in computational cost.
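    In one dimension, non-intrusive polynomial chaos with point collocation amounts to sampling the random input, evaluating the model at those points, and solving a least-squares system for the coefficients of an orthogonal polynomial basis. The sketch below (a single Gaussian input and probabilists' Hermite polynomials) is a generic illustration, not the paper's multifidelity formulation:

```python
import numpy as np

def pce_coeffs(model, order=4, n_samples=50, seed=0):
    """Non-intrusive polynomial chaos by point collocation (sketch).

    Sample a standard-normal input, evaluate `model` (any scalar
    function of one Gaussian variable), and regress the outputs onto
    probabilists' Hermite polynomials He_0..He_order."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_samples)
    # build the basis via the recurrence He_k = x*He_{k-1} - (k-1)*He_{k-2}
    H = np.zeros((n_samples, order + 1))
    H[:, 0] = 1.0
    if order >= 1:
        H[:, 1] = xi
    for k in range(2, order + 1):
        H[:, k] = xi * H[:, k - 1] - (k - 1) * H[:, k - 2]
    y = np.array([model(x) for x in xi])
    coeffs, *_ = np.linalg.lstsq(H, y, rcond=None)
    return coeffs  # coeffs[0] estimates the mean of the output
```

    For a model that is itself a low-order polynomial of the input, the recovered coefficients match it exactly, which makes this a convenient sanity check before applying the expansion to an expensive solver.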

  6. Axisymmetric computational fluid dynamics analysis of a film/dump-cooled rocket nozzle plume

    NASA Technical Reports Server (NTRS)

    Tucker, P. K.; Warsi, S. A.

    1993-01-01

    Prediction of convective base heating rates for a new launch vehicle presents significant challenges to analysts concerned with base environments. The present effort seeks to augment classical base heating scaling techniques via a detailed investigation of the exhaust plume shear layer of a single H2/O2 Space Transportation Main Engine (STME). Use of fuel-rich turbine exhaust to cool the STME nozzle presented concerns regarding potential recirculation of these gases to the base region with attendant increase in the base heating rate. A pressure-based full Navier-Stokes computational fluid dynamics (CFD) code with finite rate chemistry is used to predict plumes for vehicle altitudes of 10 kft and 50 kft. Levels of combustible species within the plume shear layers are calculated in order to assess assumptions made in the base heating analysis.

  7. Kinetic Monte Carlo Simulation of Oxygen Diffusion in Ytterbium Disilicate

    NASA Astrophysics Data System (ADS)

    Good, Brian

    2015-03-01

    Ytterbium disilicate is of interest as a potential environmental barrier coating for aerospace applications, notably for use in next generation jet turbine engines. In such applications, the diffusion of oxygen and water vapor through these coatings is undesirable if high temperature corrosion is to be avoided. In an effort to understand the diffusion process in these materials, we have performed kinetic Monte Carlo simulations of vacancy-mediated oxygen diffusion in Ytterbium Disilicate. Oxygen vacancy site energies and diffusion barrier energies are computed using Density Functional Theory. We find that many potential diffusion paths involve large barrier energies, but some paths have barrier energies smaller than one electron volt. However, computed vacancy formation energies suggest that the intrinsic vacancy concentration is small in the pure material, with the result that the material is unlikely to exhibit significant oxygen permeability.
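    A residence-time kinetic Monte Carlo step of the kind used in such simulations picks one hop in proportion to its Arrhenius rate and advances the clock by an exponentially distributed waiting time. The attempt frequency `nu` and the example barriers below are placeholder values, not the DFT-computed energies of the study:

```python
import math
import random

def kmc_step(barriers_eV, T, nu=1e13, rng=random):
    """One kinetic Monte Carlo step over a list of hop barriers (eV).

    Rates follow an Arrhenius form r_i = nu * exp(-E_i / kT); one hop is
    chosen with probability proportional to its rate, and the simulation
    clock advances by an exponential waiting time with mean 1/sum(r)."""
    k_B = 8.617e-5                       # Boltzmann constant, eV/K
    rates = [nu * math.exp(-E / (k_B * T)) for E in barriers_eV]
    total = sum(rates)
    r = rng.random() * total             # pick event i with prob rates[i]/total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            chosen = i
            break
    else:                                # guard against floating-point rounding
        chosen = len(rates) - 1
    dt = -math.log(1.0 - rng.random()) / total   # residence time
    return chosen, dt
```

    At a fixed temperature a path whose barrier is a few tenths of an eV lower is selected exponentially more often, which is why the sub-one-electron-volt paths noted in the abstract dominate the diffusion kinetics.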

  8. Audio-visual affective expression recognition

    NASA Astrophysics Data System (ADS)

    Huang, Thomas S.; Zeng, Zhihong

    2007-11-01

    Automatic affective expression recognition has attracted increasing attention from researchers in different disciplines, and will significantly contribute to a new paradigm for human-computer interaction (affect-sensitive interfaces, socially intelligent environments) and advance research in affect-related fields including psychology, psychiatry, and education. Multimodal information integration is a process that enables humans to assess affective states robustly and flexibly. In order to understand the richness and subtleness of human emotion behavior, the computer should be able to integrate information from multiple sensors. We introduce in this paper our efforts toward machine understanding of audio-visual affective behavior, based on both deliberate and spontaneous displays. Some promising methods are presented to integrate information from both audio and visual modalities. Our experiments show the advantage of audio-visual fusion in affective expression recognition over audio-only or visual-only approaches.

  9. Positioning Continuing Education Computer Programs for the Corporate Market.

    ERIC Educational Resources Information Center

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  10. Researching and Reducing the Health Burden of Stroke

    MedlinePlus

    ... the result of continuing research to map the brain and interface it with a computer to enable stroke patients to regain function. How important is the new effort to map the human brain? The brain is more complex than any computer ...

  11. Brain transcriptome atlases: a computational perspective.

    PubMed

    Mahfouz, Ahmed; Huisman, Sjoerd M H; Lelieveldt, Boudewijn P F; Reinders, Marcel J T

    2017-05-01

    The immense complexity of the mammalian brain is largely reflected in the underlying molecular signatures of its billions of cells. Brain transcriptome atlases provide valuable insights into gene expression patterns across different brain areas throughout the course of development. Such atlases allow researchers to probe the molecular mechanisms which define neuronal identities, neuroanatomy, and patterns of connectivity. Despite the immense effort put into generating such atlases, to answer fundamental questions in neuroscience, an even greater effort is needed to develop methods to probe the resulting high-dimensional multivariate data. We provide a comprehensive overview of the various computational methods used to analyze brain transcriptome atlases.

  12. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure; Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  13. Space shuttle low cost/risk avionics study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    All work breakdown structure elements containing any avionics related effort were examined for pricing the life cycle costs. The analytical, testing, and integration efforts are included for the basic onboard avionics and electrical power systems. The design and procurement of special test equipment and maintenance and repair equipment are considered. Program management associated with these efforts is described. Flight test spares and labor and materials associated with the operations and maintenance of the avionics systems throughout the horizontal flight test are examined. It was determined that cost savings can be achieved by using existing hardware, maximizing orbiter-booster commonality, specifying new equipments to MIL quality standards, basing redundancy on cost effective analysis, minimizing software complexity and reducing cross strapping and computer-managed functions, utilizing compilers and floating point computers, and evolving the design as dictated by the horizontal flight test schedules.

  14. A phenomenographic study of the ways of understanding conditional and repetition structures in computer programming languages

    NASA Astrophysics Data System (ADS)

    Bucks, Gregory Warren

    Computers have become an integral part of how engineers complete their work, allowing them to collect and analyze data, model potential solutions, and aid in production through automation and robotics. In addition, computers are essential elements of the products themselves, from tennis shoes to construction materials. An understanding of how computers function, both at the hardware and software level, is essential for the next generation of engineers. Despite the need for engineers to develop a strong background in computing, little opportunity is given for engineering students to develop these skills. Learning to program is widely seen as a difficult task, requiring students to develop not only an understanding of specific concepts, but also a way of thinking. In addition, students are forced to learn a new tool, in the form of the programming environment employed, along with these concepts and thought processes. Because of this, many students will not develop sufficient proficiency in programming, even after progressing through the traditional introductory programming sequence. This is a significant problem, especially in the engineering disciplines, where very few students receive more than one or two semesters' worth of instruction in an already crowded engineering curriculum. To address these issues, new pedagogical techniques must be investigated in an effort to enhance the ability of engineering students to develop strong computing skills. However, these efforts are hindered by the lack of published assessment instruments available for probing an individual's understanding of programming concepts across programming languages. Traditionally, programming knowledge has been assessed by producing written code in a specific language. This can be an effective method, but does not lend itself well to comparing the pedagogical impact of different programming environments, languages, or paradigms.
This dissertation presents a phenomenographic research study exploring the different ways of understanding held by individuals of two programming concepts: conditional structures and repetition structures. This work lays the foundation for the development of language independent assessment instruments, which can ultimately be used to assess the pedagogical implications of various programming environments.

  15. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  16. Airline Safety and Economy

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This video documents efforts at NASA Langley Research Center to improve safety and economy in aircraft. Featured are the cockpit weather information needs computer system, which relays real time weather information to the pilot, and efforts to improve techniques to detect structural flaws and corrosion, such as the thermal bond inspection system.

  17. MUMPS Based Integration of Disparate Computer-Assisted Medical Diagnosis Modules

    DTIC Science & Technology

    1989-12-12

    The Abdominal and Chest Pain modules use a Bayesian approach, while the Ophthalmology module uses a Rule Based approach. In the current effort, MUMPS is used to develop an...

  18. Interactive Electronic Storybooks for Kindergartners to Promote Vocabulary Growth

    ERIC Educational Resources Information Center

    Smeets, Daisy J. H.; Bus, Adriana G.

    2012-01-01

    The goals of this study were to examine (a) whether extratextual vocabulary instructions embedded in electronic storybooks facilitated word learning over reading alone and (b) whether instructional formats that required children to invest more effort were more effective than formats that required less effort. A computer-based "assistant" was added…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan; Piro, Markus H.A.

    Thermochimica is a software library that determines a unique combination of phases and their compositions at thermochemical equilibrium. Thermochimica can be used for stand-alone calculations or it can be directly coupled to other codes. This release of the software does not have a graphical user interface (GUI) and it can be executed from the command line or from an Application Programming Interface (API). Also, it is not intended for thermodynamic model development or for constructing phase diagrams. The main purpose of the software is to be directly coupled with a multi-physics code to provide material properties and boundary conditions for various physical phenomena. Significant research efforts have been dedicated to enhance computational performance through advanced algorithm development, such as improved estimation techniques and non-linear solvers. Various useful parameters can be provided as output from Thermochimica, such as: determination of which phases are stable at equilibrium, the mass of solution species and phases at equilibrium, mole fractions of solution phase constituents, thermochemical activities (which are related to partial pressures for gaseous species), chemical potentials of solution species and phases, and integral Gibbs energy (referenced relative to standard state). The overall goal is to provide an open source computational tool to enhance the predictive capability of multi-physics codes without significantly impeding computational performance.

  20. Coupled molecular dynamics and continuum electrostatic method to compute the ionization pKa's of proteins as a function of pH. Test on a large set of proteins.

    PubMed

    Vorobjev, Yury N; Scheraga, Harold A; Vila, Jorge A

    2018-02-01

    A computational method, to predict the pKa values of the ionizable residues Asp, Glu, His, Tyr, and Lys of proteins, is presented here. Calculation of the electrostatic free-energy of the proteins is based on an efficient version of a continuum dielectric electrostatic model. The conformational flexibility of the protein is taken into account by carrying out molecular dynamics simulations of 10 ns in implicit water. The accuracy of the proposed method of calculation of pKa values is estimated from a test set of experimental pKa data for 297 ionizable residues from 34 proteins. The pKa-prediction test shows that, on average, 57, 86, and 95% of all predictions have an error lower than 0.5, 1.0, and 1.5 pKa units, respectively. This work contributes to our general understanding of the importance of protein flexibility for an accurate computation of pKa, providing critical insight about the significance of the multiple neutral states of acid and histidine residues for pKa-prediction, and may spur significant progress in our effort to develop a fast and accurate electrostatic-based method for pKa-predictions of proteins as a function of pH.
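    The accuracy figures quoted above (the share of predictions within 0.5, 1.0, and 1.5 pKa units of experiment) are a simple error-threshold tally. A minimal, generic sketch of that bookkeeping, with hypothetical input values:

```python
def accuracy_within(predicted, experimental, thresholds=(0.5, 1.0, 1.5)):
    """Fraction of predicted pKa values whose absolute error against
    experiment falls below each threshold."""
    errors = [abs(p - e) for p, e in zip(predicted, experimental)]
    n = len(errors)
    return {t: sum(err < t for err in errors) / n for t in thresholds}
```

    Running this over a full prediction set produces exactly the kind of cumulative-accuracy summary reported in the abstract.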

  1. Hidden Markov induced Dynamic Bayesian Network for recovering time evolving gene regulatory networks

    NASA Astrophysics Data System (ADS)

    Zhu, Shijia; Wang, Yadong

    2015-12-01

    Dynamic Bayesian Networks (DBN) have been widely used to recover gene regulatory relationships from time-series data in computational systems biology. Their standard assumption is ‘stationarity’, and therefore several research efforts have recently been proposed to relax this restriction. However, those methods suffer from three challenges: long running time, low accuracy, and reliance on parameter settings. To address these problems, we propose a novel non-stationary DBN model by extending each hidden node of a Hidden Markov Model into a DBN (called HMDBN), which properly handles the underlying time-evolving networks. Correspondingly, an improved structural EM algorithm is proposed to learn the HMDBN. It dramatically reduces the search space, thereby substantially improving computational efficiency. Additionally, we derived a novel generalized Bayesian Information Criterion under the non-stationary assumption (called BWBIC), which can help significantly improve the reconstruction accuracy and largely reduce over-fitting. Moreover, the re-estimation formulas for all parameters of our model are derived, enabling us to avoid reliance on parameter settings. Compared to the state-of-the-art methods, the experimental evaluation of our proposed method on both synthetic and real biological data demonstrates consistently high prediction accuracy and significantly improved computational efficiency, even with no prior knowledge and parameter settings.

  2. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic

    PubMed Central

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-01-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333

  3. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.

    PubMed

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-09-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.

  4. Challenging the Myth of Disability.

    ERIC Educational Resources Information Center

    Brightman, Alan

    1989-01-01

    Discussion of the rhetoric of disability, including physical, hearing, and visual impairments, highlights possible benefits that computer technology can provide. Designing for disabled individuals is discussed, and product development efforts by Apple Computer to increase microcomputer access to disabled children and adults are described. (LRW)

  5. Brief History of Computer-Assisted Instruction at the Institute for Mathematical Studies in the Social Sciences.

    ERIC Educational Resources Information Center

    Stanford Univ., CA. Inst. for Mathematical Studies in Social Science.

    In 1963, the Institute began a program of research and development in computer-assisted instruction (CAI). Their efforts have been funded at various times by the Carnegie Corporation of New York, The National Science Foundation and the United States Office of Education. Starting with a medium-sized computer and six student stations, the Institute…

  6. Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems

    DTIC Science & Technology

    2002-08-01

    AFRL-IF-RS-TR-2002-190, Final Technical Report, August 2002: Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems. Authors: Narayan R. Aluru, Jacob White. …Computer-Aided Design (CAD) tools for microfluidic components and systems were developed in this effort. Innovative numerical methods and algorithms for mixed…

  7. A Case Study on Collective Cognition and Operation in Team-Based Computer Game Design by Middle-School Children

    ERIC Educational Resources Information Center

    Ke, Fengfeng; Im, Tami

    2014-01-01

    This case study examined team-based computer-game design efforts by children with diverse abilities to explore the nature of their collective design actions and cognitive processes. Ten teams of middle-school children, with a high percentage of minority students, participated in a 6-week computer-assisted math-game-design program. Essential…

  8. Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers

    ERIC Educational Resources Information Center

    Newman, Thomas R.

    2017-01-01

    The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…

  9. Using Computer Games to Train Information Warfare Teams

    DTIC Science & Technology

    2004-01-01

    Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2004, Paper No. 1729. Using Computer Games to…responses they will experience on real missions is crucial. 3D computer games have proved themselves to be highly effective in engaging players motivationally and emotionally. This effort, therefore, uses gaming technology to provide realistic simulations. These games are augmented with…

  10. High-Performance Computing: High-Speed Computer Networks in the United States, Europe, and Japan. Report to Congressional Requesters.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Information Management and Technology Div.

    This report was prepared in response to a request from the Senate Committee on Commerce, Science, and Transportation, and from the House Committee on Science, Space, and Technology, for information on efforts to develop high-speed computer networks in the United States, Europe (limited to France, Germany, Italy, the Netherlands, and the United…

  11. Terrestrial implications of mathematical modeling developed for space biomedical research

    NASA Technical Reports Server (NTRS)

    Lujan, Barbara F.; White, Ronald J.; Leonard, Joel I.; Srinivasan, R. Srini

    1988-01-01

    This paper summarizes several related research projects supported by NASA which seek to apply computer models to space medicine and physiology. These efforts span a wide range of activities, including mathematical models used for computer simulations of physiological control systems; power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and computer-aided diagnosis programs.

  12. Writing. A Research-Based Writing Program for Students with High Access to Computers. ACOT Report #2.

    ERIC Educational Resources Information Center

    Hiebert, Elfrieda H.; And Others

    This report summarizes the curriculum development and research effort that took place at the Cupertino Apple Classrooms of Tomorrow (ACOT) site from January through June 1987. Based on the premise that computers make revising and editing much easier, the four major objectives emphasized by the computer-intensive writing program are fluency,…

  13. Small Computer Applications for Base Supply.

    DTIC Science & Technology

    1984-03-01

    Research on small computer utilization at base-level organizations: this research effort studies whether small computers and commercial software can assist…

  14. Technology for Kids' Desktops: How One School Brought Its Computers Out of the Lab and into Classrooms.

    ERIC Educational Resources Information Center

    Bozzone, Meg A.

    1997-01-01

    Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)

  15. Twenty Years of Girls into Computing Days: Has It Been Worth the Effort?

    ERIC Educational Resources Information Center

    Craig, Annemieke; Lang, Catherine; Fisher, Julie

    2008-01-01

    The first documented day-long program to encourage girls to consider computing as a career was held in 1987 in the U.K. Over the last 20 years these one-day events, labeled "Girls into Computing" days, have been conducted by academics and professionals to foster female-student interest in information technology (IT) degrees and careers.…

  16. Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities. AI Memo 417.

    ERIC Educational Resources Information Center

    Carr, Brian

    Wusor II is the second intelligent computer aided instruction (ICAI) program that has been developed to monitor the progress of, and offer suggestions to, students playing Wumpus, a computer game designed to teach logical thinking and problem solving. From the earlier efforts with Wusor I, it was possible to produce a rule-based expert which…

  17. Computational Lipidomics and Lipid Bioinformatics: Filling In the Blanks.

    PubMed

    Pauling, Josch; Klipp, Edda

    2016-12-22

    Lipids are highly diverse metabolites of pronounced importance in health and disease. While metabolomics is a broad field under the omics umbrella that may also relate to lipids, lipidomics is an emerging field which specializes in the identification, quantification, and functional interpretation of complex lipidomes. Today, it is possible to identify and distinguish lipids in a high-resolution, high-throughput manner and simultaneously with considerable structural detail. However, doing so may produce thousands of mass spectra in a single experiment, which has created a high demand for specialized computational support to analyze these spectral libraries. The computational biology and bioinformatics community has so far established methodology in genomics, transcriptomics, and proteomics, but there are many (combinatorial) challenges when it comes to the structural diversity of lipids and their identification, quantification, and interpretation. This review gives an overview and outlook on lipidomics research and illustrates ongoing computational and bioinformatics efforts. These efforts are important and necessary steps to advance the lipidomics field alongside the analytical chemistry, biochemistry, biomedical, and biology communities and to close the gap in available computational methodology between lipidomics and other omics sub-branches.

  18. Energy scaling advantages of resistive memory crossbar based computation and its application to sparse coding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Sapan; Quach, Tu -Thach; Parekh, Ojas

    In this study, the exponential increase in data over the last decade presents a significant challenge to analytics efforts that seek to process and interpret such data for various applications. Neural-inspired computing approaches are being developed in order to leverage the computational properties of the analog, low-power data processing observed in biological systems. Analog resistive memory crossbars can perform a parallel read or a vector-matrix multiplication as well as a parallel write or a rank-1 update with high computational efficiency. For an N × N crossbar, these two kernels can be O(N) more energy efficient than a conventional digital memory-based architecture. If the read operation is noise limited, the energy to read a column can be independent of the crossbar size (O(1)). These two kernels form the basis of many neuromorphic algorithms such as image, text, and speech recognition. For instance, these kernels can be applied to a neural sparse coding algorithm to give an O(N) reduction in energy for the entire algorithm when run with finite precision. Sparse coding is a rich problem with a host of applications including computer vision, object tracking, and more generally unsupervised learning.
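The two crossbar kernels named above map directly onto dense linear algebra: a "parallel read" is a vector-matrix multiply of input voltages against the conductance matrix, and a "parallel write" is a rank-1 outer-product update. A minimal sketch in pure Python; the conductance values and learning rate are illustrative assumptions, not device parameters:

```python
# Sketch of the two resistive-crossbar kernels described in the abstract.
# Conductance values are illustrative, not device-accurate.

def crossbar_read(voltages, G):
    """Parallel read: column currents y_j = sum_i V_i * G[i][j]."""
    n_cols = len(G[0])
    return [sum(v * row[j] for v, row in zip(voltages, G)) for j in range(n_cols)]

def crossbar_write(G, x, y, lr=1.0):
    """Parallel write: rank-1 update G[i][j] += lr * x_i * y_j, in place."""
    for i, xi in enumerate(x):
        for j, yj in enumerate(y):
            G[i][j] += lr * xi * yj
    return G

G = [[0.1, 0.2],
     [0.3, 0.4]]                            # hypothetical conductances (siemens)
currents = crossbar_read([1.0, 2.0], G)     # ≈ [0.7, 1.0]
crossbar_write(G, [1.0, 2.0], [0.5, 0.5])   # one rank-1 weight update
```

In a physical crossbar both loops happen in a single analog step, which is where the O(N) energy advantage over a digital memory fetch comes from.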

  19. Methane Adsorption in Zr-Based MOFs: Comparison and Critical Evaluation of Force Fields

    PubMed Central

    2017-01-01

    The search for nanoporous materials that are highly performing for gas storage and separation is one of the contemporary challenges in material design. The computational tools to aid these experimental efforts are widely available, and adsorption isotherms are routinely computed for huge sets of (hypothetical) frameworks. Clearly the computational results depend on the interactions between the adsorbed species and the adsorbent, which are commonly described using force fields. In this paper, an extensive comparison and in-depth investigation of several force fields from literature is reported for the case of methane adsorption in the Zr-based Metal–Organic Frameworks UiO-66, UiO-67, DUT-52, NU-1000, and MOF-808. Significant quantitative differences in the computed uptake are observed when comparing different force fields, but most qualitative features are common which suggests some predictive power of the simulations when it comes to these properties. More insight into the host–guest interactions is obtained by benchmarking the force fields with an extensive number of ab initio computed single molecule interaction energies. This analysis at the molecular level reveals that especially ab initio derived force fields perform well in reproducing the ab initio interaction energies. Finally, the high sensitivity of uptake predictions on the underlying potential energy surface is explored. PMID:29170687
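Most of the force fields compared in studies like this describe the host-guest dispersion interaction with a 12-6 Lennard-Jones term; a sketch of that pair energy, where the ε and σ defaults are placeholder methane-like values, not fitted UiO-66/guest parameters:

```python
# 12-6 Lennard-Jones pair energy U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6),
# the dispersion-repulsion form common to the compared force fields.
# eps and sigma below are placeholders, not fitted MOF-methane parameters.

def lj_energy(r, eps=0.29, sigma=3.73):   # kcal/mol, angstrom (assumed units)
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

u_zero = lj_energy(3.73)                  # U(sigma) is exactly 0
u_min = lj_energy(3.73 * 2 ** (1 / 6))    # well minimum: U(r_min) == -eps
```

Benchmarking a force field against ab initio single-molecule interaction energies, as done in the paper, amounts to comparing sums of such pair terms with the quantum-mechanical reference at the same geometries.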

  20. Porting marine ecosystem model spin-up using transport matrices to GPUs

    NASA Astrophysics Data System (ADS)

    Siewertsen, E.; Piwonski, J.; Slawig, T.

    2013-01-01

    We have ported an implementation of the spin-up for marine ecosystem models based on transport matrices to graphics processing units (GPUs). The original implementation was designed for distributed-memory architectures and uses the Portable, Extensible Toolkit for Scientific Computation (PETSc) library that is based on the Message Passing Interface (MPI) standard. The spin-up computes a steady seasonal cycle of ecosystem tracers with climatological ocean circulation data as forcing. Since the transport is linear with respect to the tracers, the resulting operator is represented by matrices. Each iteration of the spin-up involves two matrix-vector multiplications and the evaluation of the used biogeochemical model. The original code was written in C and Fortran. On the GPU, we use the Compute Unified Device Architecture (CUDA) standard, a customized version of PETSc and a commercial CUDA Fortran compiler. We describe the extensions to PETSc and the modifications of the original C and Fortran codes that had to be done. Here we make use of freely available libraries for the GPU. We analyze the computational effort of the main parts of the spin-up for two exemplar ecosystem models and compare the overall computational time to those necessary on different CPUs. The results show that a consumer GPU can compete with a significant number of cluster CPUs without further code optimization.
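The spin-up iteration described above (two matrix-vector products plus an evaluation of the biogeochemical model per step) can be sketched as a fixed-point loop. The 2x2 matrices and the linear relaxation "biogeochemistry" below are toy assumptions standing in for the climatological transport matrices and the real tracer model:

```python
# Sketch of a transport-matrix spin-up step: v <- A_imp (A_exp v + dt * q(v)).
# Matrices and the linear "biogeochemistry" q are toy assumptions.

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def spinup(v, A_exp, A_imp, q, dt, n_iters):
    for _ in range(n_iters):
        w = [a + dt * b for a, b in zip(matvec(A_exp, v), q(v))]
        v = matvec(A_imp, w)
    return v

A_exp = [[0.9, 0.1], [0.1, 0.9]]   # explicit transport: toy conservative mixing
A_imp = [[1.0, 0.0], [0.0, 1.0]]   # implicit step taken as identity for the sketch
q = lambda v: [0.5 * (1.0 - x) for x in v]   # relax each tracer toward 1.0
v = spinup([0.0, 2.0], A_exp, A_imp, q, dt=0.1, n_iters=500)
# v approaches the steady state [1.0, 1.0]
```

On a GPU the two `matvec` calls become sparse matrix-vector kernels and `q` a pointwise kernel, which is why the method ports well.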

  1. Energy scaling advantages of resistive memory crossbar based computation and its application to sparse coding

    DOE PAGES

    Agarwal, Sapan; Quach, Tu -Thach; Parekh, Ojas; ...

    2016-01-06

    In this study, the exponential increase in data over the last decade presents a significant challenge to analytics efforts that seek to process and interpret such data for various applications. Neural-inspired computing approaches are being developed in order to leverage the computational properties of the analog, low-power data processing observed in biological systems. Analog resistive memory crossbars can perform a parallel read or a vector-matrix multiplication as well as a parallel write or a rank-1 update with high computational efficiency. For an N × N crossbar, these two kernels can be O(N) more energy efficient than a conventional digital memory-based architecture. If the read operation is noise limited, the energy to read a column can be independent of the crossbar size (O(1)). These two kernels form the basis of many neuromorphic algorithms such as image, text, and speech recognition. For instance, these kernels can be applied to a neural sparse coding algorithm to give an O(N) reduction in energy for the entire algorithm when run with finite precision. Sparse coding is a rich problem with a host of applications including computer vision, object tracking, and more generally unsupervised learning.

  2. The Computational Infrastructure for Geodynamics as a Community of Practice

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Kellogg, L. H.

    2016-12-01

    Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individuals or small groups of researchers to develop scientifically sound software are impossible to sustain, duplicate effort, and make it difficult for scientists to adopt state-of-the-art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/).
Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.

  3. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design, and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PCs were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.

  4. Factors Affecting Radiologist's PACS Usage.

    PubMed

    Forsberg, Daniel; Rosipko, Beverly; Sunshine, Jeffrey L

    2016-12-01

    The purpose of this study was to determine whether any of the factors radiologist, examination category, time of week, and week affect PACS usage, with PACS usage defined as the sequential order of computer commands issued by a radiologist in a PACS during interpretation and dictation. We initially hypothesized that only radiologist and examination category would have significant effects on PACS usage. Command logs covering 8 weeks of PACS usage were analyzed. For each command trace (describing performed activities of an attending radiologist interpreting a single examination), the PACS usage variables number of commands, number of command classes, bigram repetitiveness, and time to read were extracted. Generalized linear models were used to determine the significance of the factors on the PACS usage variables. The statistical results confirmed the initial hypothesis that radiologist and examination category affect PACS usage and that the factors week and time of week to a large extent have no significant effect. As such, this work provides direction for continued efforts to analyze system data to better understand PACS utilization, which in turn can provide input to enable optimal utilization and configuration of corresponding systems. These continued efforts were, in this work, exemplified by a more detailed analysis using PACS usage profiles, which revealed insights directly applicable to improving PACS utilization through modified system configuration.
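The per-trace usage variables the study extracts (number of commands, number of command classes, bigram repetitiveness) can be derived from a command trace in a few lines. A sketch in pure Python; the example trace and the exact definition of repetitiveness (share of bigram occurrences belonging to repeated bigrams) are assumptions, not the paper's implementation:

```python
# Sketch: derive the PACS usage variables named in the abstract from one
# command trace. The trace and the bigram-repetitiveness definition
# (occurrences of repeated bigrams / total bigrams) are assumptions.
from collections import Counter

def usage_variables(trace):
    bigrams = list(zip(trace, trace[1:]))        # consecutive command pairs
    counts = Counter(bigrams)
    repeated = sum(c for c in counts.values() if c > 1)
    return {
        "n_commands": len(trace),
        "n_command_classes": len(set(trace)),
        "bigram_repetitiveness": repeated / len(bigrams) if bigrams else 0.0,
    }

trace = ["open", "zoom", "scroll", "zoom", "scroll", "zoom", "scroll", "dictate"]
vars_ = usage_variables(trace)
```

Variables like these, computed per trace, are what the factors (radiologist, examination category, week) would then be regressed against in a generalized linear model.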

  5. High resolution flow field prediction for tail rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.

    1989-01-01

    The prediction of tail rotor noise due to the impingement of the main rotor wake poses a significant challenge to current analysis methods in rotorcraft aeroacoustics. This paper describes the development of a new treatment of the tail rotor aerodynamic environment that permits highly accurate resolution of the incident flow field with modest computational effort relative to alternative models. The new approach incorporates an advanced full-span free wake model of the main rotor in a scheme which reconstructs high-resolution flow solutions from preliminary, computationally inexpensive simulations with coarse resolution. The heart of the approach is a novel method for using local velocity correction terms to capture the steep velocity gradients characteristic of the vortex-dominated incident flow. Sample calculations have been undertaken to examine the principal types of interactions between the tail rotor and the main rotor wake and to examine the performance of the new method. The results of these sample problems confirm the success of this approach in capturing the high-resolution flows necessary for analysis of rotor-wake/rotor interactions with dramatically reduced computational cost. Computations of radiated sound are also carried out that explore the role of various portions of the main rotor wake in generating tail rotor noise.

  6. Quantitative Assessment of Optical Coherence Tomography Imaging Performance with Phantom-Based Test Methods And Computational Modeling

    NASA Astrophysics Data System (ADS)

    Agrawal, Anant

    Optical coherence tomography (OCT) is a powerful medical imaging modality that uniquely produces high-resolution cross-sectional images of tissue using low energy light. Its clinical applications and technological capabilities have grown substantially since its invention about twenty years ago, but efforts have been limited to develop tools to assess performance of OCT devices with respect to the quality and content of acquired images. Such tools are important to ensure information derived from OCT signals and images is accurate and consistent, in order to support further technology development, promote standardization, and benefit public health. The research in this dissertation investigates new physical and computational models which can provide unique insights into specific performance characteristics of OCT devices. Physical models, known as phantoms, are fabricated and evaluated in the interest of establishing standardized test methods to measure several important quantities relevant to image quality. (1) Spatial resolution is measured with a nanoparticle-embedded phantom and model eye which together yield the point spread function under conditions where OCT is commonly used. (2) A multi-layered phantom is constructed to measure the contrast transfer function along the axis of light propagation, relevant for cross-sectional imaging capabilities. (3) Existing and new methods to determine device sensitivity are examined and compared, to better understand the detection limits of OCT. A novel computational model based on the finite-difference time-domain (FDTD) method, which simulates the physics of light behavior at the sub-microscopic level within complex, heterogeneous media, is developed to probe device and tissue characteristics influencing the information content of an OCT image. 
This model is first tested in simple geometric configurations to understand its accuracy and limitations, then a highly realistic representation of a biological cell, the retinal cone photoreceptor, is created and its resulting OCT signals studied. The phantoms and their associated test methods have successfully yielded novel types of data on the specific performance parameters of interest, which can feed standardization efforts within the OCT community. The level of signal detail provided by the computational model is unprecedented and gives significant insights into the effects of subcellular structures on OCT signals. Together, the outputs of this research effort serve as new tools in the toolkit to examine the intricate details of how and how well OCT devices produce information-rich images of biological tissue.

  7. Future of Assurance: Ensuring that a System is Trustworthy

    NASA Astrophysics Data System (ADS)

    Sadeghi, Ahmad-Reza; Verbauwhede, Ingrid; Vishik, Claire

    Significant efforts are put into defining and implementing strong security measures for all components of the computing environment. It is equally important to be able to evaluate the strength and robustness of these measures and establish trust among the components of the computing environment based on parameters and attributes of these elements and best practices associated with their production and deployment. Today the inventory of techniques used for security assurance and to establish trust (audit, security-conscious development process, cryptographic components, external evaluation) is somewhat limited. These methods have their indisputable strengths and have contributed significantly to the advancement in the area of security assurance. However, shorter product and technology development cycles and the sheer complexity of modern digital systems and processes have begun to decrease the efficiency of these techniques. Moreover, these approaches and technologies address only some aspects of security assurance and, for the most part, evaluate assurance in a general design rather than an instance of a product. Additionally, various components of the computing environment participating in the same processes enjoy different levels of security assurance, making it difficult to ensure adequate levels of protection end-to-end. Finally, most evaluation methodologies rely on the knowledge and skill of the evaluators, making reliable assessments of trustworthiness of a system even harder to achieve. The paper outlines some issues in security assurance that apply across the board, with the focus on the trustworthiness and authenticity of hardware components, and evaluates current approaches to assurance.

  8. AlaScan: A Graphical User Interface for Alanine Scanning Free-Energy Calculations.

    PubMed

    Ramadoss, Vijayaraj; Dehez, François; Chipot, Christophe

    2016-06-27

    Computation of the free-energy changes that underlie molecular recognition and association has gained significant importance due to its considerable potential in drug discovery. The massive increase of computational power in recent years substantiates the application of more accurate theoretical methods for the calculation of binding free energies. The impact of such advances is the application of parent approaches, like computational alanine scanning, to investigate in silico the effect of amino-acid replacement in protein-ligand and protein-protein complexes, or probe the thermostability of individual proteins. Because human effort represents a significant cost that precludes the routine use of this form of free-energy calculations, minimizing manual intervention constitutes a stringent prerequisite for any such systematic computation. With this objective in mind, we propose a new plug-in, referred to as AlaScan, developed within the popular visualization program VMD to automate the major steps in alanine-scanning calculations, employing free-energy perturbation as implemented in the widely used molecular dynamics code NAMD. The AlaScan plug-in can be utilized upstream, to prepare input files for selected alanine mutations. It can also be utilized downstream to perform the analysis of different alanine-scanning calculations and to report the free-energy estimates in a user-friendly graphical user interface, allowing favorable mutations to be identified at a glance. The plug-in also assists the end-user in assessing the reliability of the calculation through rapid visual inspection.
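Free-energy perturbation of the kind AlaScan drives through NAMD rests on the Zwanzig relation, ΔF = -kT ln⟨exp(-ΔU/kT)⟩₀, averaged over energy differences sampled in the reference state. A minimal estimator; the sample values are invented for illustration and real calculations stage the mutation over many intermediate λ windows:

```python
# Minimal Zwanzig free-energy-perturbation estimator:
#   dF = -kT * ln( mean( exp(-dU_i / kT) ) )
# over energy differences dU_i sampled in the reference ensemble.
# The sample values below are invented for illustration.
import math

def fep_delta_f(dU_samples, kT=0.596):   # kT ~ 0.596 kcal/mol near 300 K
    boltz = [math.exp(-du / kT) for du in dU_samples]
    return -kT * math.log(sum(boltz) / len(boltz))

# Sanity check: if every sample gives the same dU, the estimate equals that dU.
dF = fep_delta_f([2.0, 2.0, 2.0])   # ≈ 2.0
```

A plug-in like AlaScan automates generating the per-mutation inputs and then aggregating many such per-window estimates into one reported ΔΔG.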

  9. Telecommunications: Working To Enhance Global Understanding and Peace Education.

    ERIC Educational Resources Information Center

    Schrum, Lynne M.

    This paper describes educational activities that make use of microcomputers and information networks to link elementary and secondary students electronically using telecommunications, i.e., communication across distances using personal computers, modems, telephone lines, and computer networks. Efforts to promote global understanding and awareness…

  10. Measuring Impact of EPAs Computational Toxicology Research (BOSC)

    EPA Science Inventory

    Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...

  11. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1988-01-01

    The initial effort was concentrated on developing the quasi-analytical approach for two-dimensional transonic flow. To keep the problem computationally efficient and straightforward, only the two-dimensional flow was considered and the problem was modeled using the transonic small perturbation equation.

  12. Aerodynamic Characterization of a Modern Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Holland, Scott D.; Blevins, John A.

    2011-01-01

    A modern launch vehicle is by necessity an extremely integrated design. The accurate characterization of its aerodynamic characteristics is essential to determine design loads, to design flight control laws, and to establish performance. The NASA Ares Aerodynamics Panel has been responsible for technical planning, execution, and vetting of the aerodynamic characterization of the Ares I vehicle. An aerodynamics team supporting the Panel consists of wind tunnel engineers, computational engineers, database engineers, and other analysts that address topics such as uncertainty quantification. The team resides at three NASA centers: Langley Research Center, Marshall Space Flight Center, and Ames Research Center. The Panel has developed strategies to synergistically combine both the wind tunnel efforts and the computational efforts with the goal of validating the computations. Selected examples highlight key flow physics and, where possible, the fidelity of the comparisons between wind tunnel results and the computations. Lessons learned summarize what has been gleaned during the project and can be useful for other vehicle development projects.

  13. Research in the design of high-performance reconfigurable systems

    NASA Technical Reports Server (NTRS)

    Mcewan, S. D.; Spry, A. J.

    1985-01-01

    Computer aided design and computer aided manufacturing have the potential for greatly reducing the cost and lead time in the development of VLSI components. This potential paves the way for the design and fabrication of a wide variety of economically feasible high level functional units. It was observed that current computer systems have only a limited capacity to absorb new VLSI component types other than memory, microprocessors, and a relatively small number of other parts. The first purpose is to explore a system design which is capable of effectively incorporating a considerable number of VLSI part types and will both increase the speed of computation and reduce the attendant programming effort. A second purpose is to explore design techniques for VLSI parts which when incorporated by such a system will result in speeds and costs which are optimal. The proposed work may lay the groundwork for future efforts in the extensive simulation and measurements of the system's cost effectiveness and lead to prototype development.

  14. Clinical nursing informatics. Developing tools for knowledge workers.

    PubMed

    Ozbolt, J G; Graves, J R

    1993-06-01

    Current research in clinical nursing informatics is proceeding along three important dimensions: (1) identifying and defining nursing's language and structuring its data; (2) understanding clinical judgment and how computer-based systems can facilitate and not replace it; and (3) discovering how well-designed systems can transform nursing practice. A number of efforts are underway to find and use language that accurately represents nursing and that can be incorporated into computer-based information systems. These efforts add to understanding nursing problems, interventions, and outcomes, and provide the elements for databases from which nursing's costs and effectiveness can be studied. Research on clinical judgment focuses on how nurses (perhaps with different levels of expertise) assess patient needs, set goals, and plan and deliver care, as well as how computer-based systems can be developed to aid these cognitive processes. Finally, investigators are studying not only how computers can help nurses with the mechanics and logistics of processing information but also and more importantly how access to informatics tools changes nursing care.

  15. Relation between efficiency and energy cost with coordination in aquatic locomotion.

    PubMed

    Figueiredo, Pedro; Toussaint, Huub M; Vilas-Boas, João Paulo; Fernandes, Ricardo J

    2013-03-01

    The aim of this study was to establish the relationships between the intracycle velocity variation (IVV) and Froude efficiency (η(T)), energy cost (C), and index of coordination (IdC) throughout a 200-m freestyle race. Ten male international level swimmers performed a maximum 200 m front crawl swim. Performance was recorded with four below- and two above-water synchronized cameras. Oxygen consumption was measured continuously during the effort, and blood samples were collected before and after the test. IdC, body center of mass IVV (x, y and z), and η(T) were also calculated. To assess C, swimmers also performed 50, 100 and 150 m swims at the same pace as the corresponding 200-m splits, so that blood lactate could be sampled after each 50-m lap of the 200-m effort. Swimmers attained a stable IVV (x, y, and z), while fatigue development along the 200-m effort induced a decrease in velocity, stroke length, stroke frequency, and η(T), and an increase in IdC. Direct relationships between C and IdC were found for the second and fourth laps: R = 0.63 and R = 0.69 (P < 0.05), respectively. When partial correlations were computed, IdC and η(T) were also significantly correlated in the first lap (R = -0.63, P < 0.05). IdC and η(T) were significantly related in the within-subjects correlation (R = -0.45, P = 0.01), as were IdC and C in the between-subjects correlation (R = 0.66, P = 0.04). Coordination patterns thus changed during the 200-m event in response to the task constraints, as reflected in the changes in the other studied parameters, allowing IVV to remain stable throughout the effort.
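    The first-order partial correlations reported above (two variables correlated while controlling for a third) follow the standard formula r_xy·z = (r_xy − r_xz·r_yz) / √((1 − r_xz²)(1 − r_yz²)). A minimal sketch, with invented data purely for illustration (the study's actual statistics are not reproduced here):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

For example, `partial_corr([1, 2, 3, 4], [2, 4, 6, 8], [1, 1, 2, 2])` returns 1.0, since x and y remain perfectly correlated once z is controlled for.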

  16. Structural integrity of a confinement vessel for testing nuclear fuels for space propulsion

    NASA Astrophysics Data System (ADS)

    Bergmann, V. L.

    Nuclear propulsion systems for rockets could significantly reduce the travel time to distant destinations in space. However, long before such a concept can become reality, a significant effort must be invested in analysis and ground testing to guide the development of nuclear fuels. Any testing in support of development of nuclear fuels for space propulsion must be safely contained to prevent the release of radioactive materials. This paper describes analyses performed to assess the structural integrity of a test confinement vessel. The confinement structure, a stainless steel pressure vessel with bolted flanges, was designed for operating static pressures in accordance with the ASME Boiler and Pressure Vessel Code. In addition to the static operating pressures, the confinement barrier must withstand static overpressures from off-normal conditions without releasing radioactive material. Results from axisymmetric finite element analyses are used to evaluate the response of the confinement structure under design and accident conditions. For the static design conditions, the stresses computed from the ASME code are compared with the stresses computed by the finite element method.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friese, Ryan; Khemka, Bhavesh; Maciejewski, Anthony A

    Rising costs of energy consumption and an ongoing effort for increases in computing performance are leading to a significant need for energy-efficient computing. Before systems such as supercomputers, servers, and datacenters can begin operating in an energy-efficient manner, the energy consumption and performance characteristics of the system must be analyzed. In this paper, we provide an analysis framework that will allow a system administrator to investigate the trade-offs between system energy consumption and utility earned by a system (as a measure of system performance). We model these trade-offs as a bi-objective resource allocation problem. We use a popular multi-objective genetic algorithm to construct Pareto fronts to illustrate how different resource allocations can cause a system to consume significantly different amounts of energy and earn different amounts of utility. We demonstrate our analysis framework using real data collected from online benchmarks, and further provide a method to create larger data sets that exhibit similar heterogeneity characteristics to real data sets. This analysis framework can provide system administrators with insight to make intelligent scheduling decisions based on the energy and utility needs of their systems.
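    The core of the Pareto-front idea above can be illustrated with a minimal sketch: given candidate allocations scored as (energy consumed, utility earned), keep only the non-dominated ones, i.e. those for which no other allocation uses no more energy while earning no less utility. This is a generic illustration, not the paper's genetic algorithm:

```python
def pareto_front(points):
    """Return the non-dominated (energy, utility) points,
    minimizing energy and maximizing utility."""
    front = []
    for e, u in points:
        dominated = any(e2 <= e and u2 >= u and (e2, u2) != (e, u)
                        for e2, u2 in points)
        if not dominated:
            front.append((e, u))
    return sorted(front)
```

For example, `pareto_front([(1, 1), (2, 3), (3, 2), (2, 1)])` returns `[(1, 1), (2, 3)]`: the point (3, 2) is dominated by (2, 3), and (2, 1) by (1, 1).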

  18. Development of a SMA-Based Slat-Cove Filler for Reduction of Aeroacoustic Noise Associated With Transport-Class Aircraft Wings

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Kidd, Reggie T.; Hartl, Darren J.; Scholten, William D.

    2013-01-01

    Airframe noise is a significant part of the overall noise produced by typical, transport-class aircraft during the approach and landing phases of flight. Leading-edge slat noise is a prominent source of airframe noise. The concept of a slat-cove filler was proposed in previous work as an effective means of mitigating slat noise. Bench-top models were deployed at 75% scale to study the feasibility of producing a functioning slat-cove filler. Initial results from several concepts led to a more-focused effort investigating a deformable structure based upon pseudoelastic SMA materials. The structure stows in the cavity between the slat and main wing during cruise and deploys simultaneously with the slat to guide the aerodynamic flow suitably for low noise. A qualitative parametric study of SMA-enabled, slat-cove filler designs was performed on the bench-top. Computational models were developed and analyses were performed to assess the displacement response under representative aerodynamic load. The bench-top and computational results provide significant insight into design trades and an optimal design.

  19. Carbon Nanotubes for Space Applications

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya

    2000-01-01

    The potential of nanotube technology for NASA missions is significant and is properly recognized by NASA management. Ames has done much pioneering research in the last five years on carbon nanotube growth, characterization, atomic force microscopy, sensor development and computational nanotechnology. NASA Johnson Space Center has focused on laser ablation production of nanotubes and composites development. These in-house efforts, along with strategic collaboration with academia and industry, are geared towards meeting the agency's mission requirements. This viewgraph presentation (including an explanation for each slide) outlines the research focus for Ames nanotechnology, including details on carbon nanotubes' properties, applications, and synthesis.

  20. Burnout among clinical dental students at Jordanian universities.

    PubMed

    Badran, D H; Al-Ali, M H; Duaibis, R B; Amin, W M

    2010-04-01

    Dentistry is a profession demanding physical and mental efforts as well as people contact, which can result in burnout. The level of burnout among 307 clinical dental students in 2 Jordanian universities was evaluated using the Maslach Burnout Inventory survey. Scores for the inventory's 3 subscales were calculated and the mean values for the students' groups were computed separately. Dental students in both universities suffered high levels of emotional exhaustion and depersonalization. The dental students at the University of Jordan demonstrated a significantly higher level of emotional exhaustion than their counterparts at the Jordan University of Science and Technology.

  1. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    There are no author-identified significant results in this report. Research efforts have focused on: (1) location, acquisition, and preparation of baseline information necessary for the computer analysis, and (2) refinement of techniques for analysis of MSS data obtained from ERTS-1. Analysis of the first frame of data collected by the ERTS-1 multispectral scanner system over the Lake Texoma area has proven very valuable for determining the best procedures to follow in working with and analyzing ERTS data. Progress on the following projects is described: (1) cover type mapping; (2) geomorphology; and (3) hydrologic feature surveys.

  2. Protein Simulation Data in the Relational Model.

    PubMed

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.
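    A dimensional (star-schema) design of the kind described, with a fact table of per-frame measurements joined to a simulation dimension, might look as follows. The table and column names here are hypothetical, and SQLite stands in for the paper's SQL Server implementation:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Hypothetical dimension table: one row per simulation.
cur.execute("""CREATE TABLE dim_simulation (
    sim_id INTEGER PRIMARY KEY,
    protein TEXT,
    temperature_k REAL)""")
# Hypothetical fact table: one row per trajectory frame.
cur.execute("""CREATE TABLE fact_frame (
    sim_id INTEGER REFERENCES dim_simulation(sim_id),
    frame INTEGER,
    rmsd REAL,
    PRIMARY KEY (sim_id, frame))""")
cur.execute("INSERT INTO dim_simulation VALUES (1, '1ENH', 298.0)")
cur.executemany("INSERT INTO fact_frame VALUES (?, ?, ?)",
                [(1, i, 0.1 * i) for i in range(5)])
# Aggregate over the fact table, joining back to the dimension.
row = cur.execute("""SELECT s.protein, AVG(f.rmsd)
                     FROM fact_frame f
                     JOIN dim_simulation s USING (sim_id)
                     GROUP BY s.protein""").fetchone()
```

The split between a narrow dimension table and a large fact table is what lets per-frame data scale while keeping per-simulation metadata queryable.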

  3. Protein Simulation Data in the Relational Model

    PubMed Central

    Simms, Andrew M.; Daggett, Valerie

    2011-01-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646

  4. Development of a CRAY 1 version of the SINDA program. [thermo-structural analyzer program

    NASA Technical Reports Server (NTRS)

    Juba, S. M.; Fogerson, P. E.

    1982-01-01

    The SINDA thermal analyzer program was transferred from the UNIVAC 1110 computer to a CYBER and then to a CRAY 1. Significant changes to the code of the program were required in order to execute efficiently on the CYBER and CRAY. The program was tested on the CRAY using a thermal math model of the shuttle which was too large to run on either the UNIVAC or CYBER. An effort was then begun to further modify the code of SINDA in order to make effective use of the vector capabilities of the CRAY.

  5. A platform for evolving intelligently interactive adversaries.

    PubMed

    Fogel, David B; Hays, Timothy J; Johnson, Douglas R

    2006-07-01

    Entertainment software developers face significant challenges in designing games with broad appeal. One of the challenges concerns creating nonplayer (computer-controlled) characters that can adapt their behavior in light of the current and prospective situation, possibly emulating human behaviors. This adaptation should be inherently novel, unrepeatable, yet within the bounds of realism. Evolutionary algorithms provide a suitable method for generating such behaviors. This paper provides background on the entertainment software industry, and details a prior and current effort to create a platform for evolving nonplayer characters with genetic and behavioral traits within a World War I combat flight simulator.

  6. Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.

    2011-01-01

    This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.

  7. An application of interactive computer graphics technology to the design of dispersal mechanisms

    NASA Technical Reports Server (NTRS)

    Richter, B. J.; Welch, B. H.

    1977-01-01

    Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.

  8. Structural behavior of composites with progressive fracture

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Murthy, P. L. N.; Chamis, C. C.

    1989-01-01

    The objective of the study is to unify several computational tools developed for the prediction of progressive damage and fracture with efforts for the prediction of the overall response of damaged composite structures. In particular, a computational finite element model for the damaged structure is developed using a computer program as a byproduct of the analysis of progressive damage and fracture. Thus, a single computational investigation can predict progressive fracture and the resulting variation in structural properties of angleplied composites.

  9. Multicore: Fallout from a Computing Evolution

    ScienceCinema

    Yelick, Kathy [Director, NERSC]

    2017-12-09

    July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  10. Intelligent single switch wheelchair navigation.

    PubMed

    Ka, Hyun W; Simpson, Richard; Chung, Younghyun

    2012-11-01

    We have developed an intelligent single switch scanning interface and wheelchair navigation assistance system, called intelligent single switch wheelchair navigation (ISSWN), to improve driving safety, comfort and efficiency for individuals who rely on single switch scanning as a control method. ISSWN combines a standard powered wheelchair with a laser rangefinder, a single switch scanning interface and a computer. It provides the user with context sensitive and task specific scanning options that reduce driving effort based on an interpretation of sensor data together with user input. Trials performed by 9 able-bodied participants showed that the system significantly improved driving safety and efficiency in a navigation task by significantly reducing the number of switch presses to 43.5% of traditional single switch wheelchair navigation (p < 0.001). All participants made a significant improvement (39.1%; p < 0.001) in completion time after only two trials.

  11. Advanced computational tools for 3-D seismic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  12. Computational Modeling of River Flow, Sediment Transport, and Bed Evolution Using Remotely Sensed Data

    DTIC Science & Technology

    2011-01-01

    mile reach from Lewiston Lake to the North Fork of the Trinity, which includes the sites above. As of this writing, all data has been analyzed and...collection effort, probably a bathymetric LiDAR effort on the Kootenai River near Bonner’s Ferry, Idaho . Detailed multibeam acoustic surveys already

  13. Neural Network Research: A Personal Perspective,

    DTIC Science & Technology

    1988-03-01

    problems in computer science and technology today. Still others do both. Whatever the focus, here is a field ready to challenge and reward the sustained efforts of a wide variety of gifted people.

  14. An Integrated In Vitro and Computational Approach to Define the Exposure-Dose-Toxicity Relationships In High-Throughput Screens

    EPA Science Inventory

    Research efforts by the US Environmental Protection Agency have set out to develop alternative testing programs to prioritize limited testing resources toward chemicals that likely represent the greatest hazard to human health and the environment. Efforts such as EPA’s ToxCast r...

  15. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1988-01-01

    Research focused on two major areas. The first effort addressed the design and implementation of a technique that allows for the visualization of the real time variation of physical properties. The second effort focused on the design and implementation of an on-line help system with components designed for both authors and users of help information.

  16. Assignment Choice: Do Students Choose Briefer Assignments or Finishing What They Started?

    ERIC Educational Resources Information Center

    Hawthorn-Embree, Meredith L.; Skinner, Christopher H.; Parkhurst, John; O'Neil, Michael; Conley, Elisha

    2010-01-01

    Academic skill development requires engagement in effortful academic behaviors. Although students may be more likely to choose to engage in behaviors that require less effort, they also may be motivated to complete assignments that they have already begun. Seventh-grade students (N = 88) began a mathematics computation worksheet, but were stopped…

  17. A Meta-Analysis of Writing Instruction for Students in the Elementary Grades

    ERIC Educational Resources Information Center

    Graham, Steve; McKeown, Debra; Kiuhara, Sharlene; Harris, Karen R.

    2012-01-01

    In an effort to identify effective instructional practices for teaching writing to elementary grade students, we conducted a meta-analysis of the writing intervention literature, focusing our efforts on true and quasi-experiments. We located 115 documents that included the statistics for computing an effect size (ES). We calculated an average…

  18. Joint Force Cyberspace Component Command: Establishing Cyberspace Operations Unity of Effort for the Joint Force Commander

    DTIC Science & Technology

    2015-05-01

    According to the article, “the hackers targeted big-name makers of nuclear and solar technology, stealing confidential business information...As JTF-GNO synchronized efforts to disinfect and protect over 2.5 million computers in 3,500 DoD organizations spanning 99 countries, Defense

  19. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  20. Application of a range of turbulence energy models to the determination of M4 tidal current profiles

    NASA Astrophysics Data System (ADS)

    Xing, Jiuxing; Davies, Alan M.

    1996-04-01

    A fully nonlinear, three-dimensional hydrodynamic model of the Irish Sea, using a range of turbulence energy sub-models, is used to examine the influence of the turbulence closure method upon the vertical variation of the current profile of the fundamental and higher harmonics of the tide in the region. Computed tidal current profiles are compared with previous calculations using a spectral model with eddy viscosity related to the flow field. The model has a sufficiently fine grid to resolve the advection terms, in particular the advection of turbulence and momentum. Calculations show that the advection of turbulence energy does not have a significant influence upon the current profile of either the fundamental or higher harmonic of the tide, although the advection of momentum is important in the region of headlands. The simplification of the advective terms by including them only in their vertically integrated form does not appear to make a significant difference to current profiles, but does reduce the computational effort by a significant amount. Computed current profiles, both for the fundamental and the higher harmonic, determined with a prognostic equation for turbulence energy and an algebraic mixing length formula are as accurate as those determined with a model using two prognostic equations (the so-called q2-q2l model), provided the mixing length is specified correctly. A simple, flow-dependent eddy viscosity with a parabolic vertical variation also performs equally well.
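    The parabolic eddy viscosity mentioned above is commonly written as ν(z) = κ u* z (1 − z/h), vanishing at the bed and the surface and peaking at mid-depth. A small sketch assuming this classic form (the paper's exact formulation may differ):

```python
KAPPA = 0.41  # von Karman constant

def parabolic_eddy_viscosity(z, u_star, h):
    """Parabolic eddy viscosity profile nu(z) = kappa * u* * z * (1 - z/h):
    zero at the bed (z = 0) and the surface (z = h), maximal at z = h/2.

    z      height above the bed (m)
    u_star friction velocity (m/s)
    h      water depth (m)
    """
    return KAPPA * u_star * z * (1.0 - z / h)
```

For a 20 m water column with u* = 0.05 m/s, the mid-depth value is 0.41 × 0.05 × 10 × 0.5 ≈ 0.1 m²/s.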

  1. Networks in ATLAS

    NASA Astrophysics Data System (ADS)

    McKee, Shawn; ATLAS Collaboration

    2017-10-01

    Networks have played a critical role in high-energy physics (HEP), enabling us to access and effectively utilize globally distributed resources to meet the needs of our physicists. Because of their importance in enabling our grid computing infrastructure, many physicists have taken leading roles in research and education (R&E) networking, participating in, and even convening, network related meetings and research programs with the broader networking community worldwide. This has led to HEP benefiting from excellent global networking capabilities for little to no direct cost. However, as other science domains ramp up their need for similar networking, it becomes less clear that this situation will continue unchanged. What this means for ATLAS in particular needs to be understood. ATLAS has evolved its computing model since the LHC started based upon its experience with using globally distributed resources. The most significant theme of those changes has been increased reliance upon, and use of, its networks. We will report on a number of networking initiatives in ATLAS including participation in the global perfSONAR network monitoring and measuring efforts of WLCG and OSG, the collaboration with the LHCOPN/LHCONE effort, the integration of network awareness into PanDA, the use of the evolving ATLAS analytics framework to better understand our networks and the changes in our DDM system to allow remote access to data. We will also discuss new efforts underway that are exploring the inclusion and use of software defined networks (SDN) and how ATLAS might benefit from:
    • Orchestration and optimization of distributed data access and data movement.
    • Better control of workflows, end to end.
    • Enabling prioritization of time-critical vs. normal tasks.
    • Improvements in the efficiency of resource usage.

  2. Exemplar for simulation challenges: Large-deformation micromechanics of Sylgard 184/glass microballoon syntactic foams.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Judith Alice; Long, Kevin Nicholas

    2018-05-01

    Sylgard® 184/Glass Microballoon (GMB) potting material is currently used in many NW systems. Analysts need a macroscale constitutive model that can predict material behavior under complex loading and damage evolution. To address this need, ongoing modeling and experimental efforts have focused on study of damage evolution in these materials. Micromechanical finite element simulations that resolve individual GMB and matrix components promote discovery and better understanding of the material behavior. With these simulations, we can study the role of the GMB volume fraction, time-dependent damage, behavior under confined vs. unconfined compression, and the effects of partial damage. These simulations are challenging and push the boundaries of capability even with the high performance computing tools available at Sandia. We summarize the major challenges and the current state of this modeling effort, as an exemplar of micromechanical modeling needs that can motivate advances in future computing efforts.

  3. Animals and the 3Rs in toxicology research and testing: The way forward.

    PubMed

    Stokes, W S

    2015-12-01

    Despite efforts to eliminate the use of animals in testing and the availability of many accepted alternative methods, animals are still widely used for toxicological research and testing. While research using in vitro and computational models has dramatically increased in recent years, such efforts have not yet measurably impacted animal use for regulatory testing and are not likely to do so for many years or even decades. Until regulatory authorities have accepted test methods that can totally replace animals and these are fully implemented, large numbers of animals will continue to be used and many will continue to experience significant pain and distress. In order to positively impact the welfare of these animals, accepted alternatives must be implemented, and efforts must be directed at eliminating pain and distress and reducing animal numbers. Animal pain and distress can be reduced by earlier predictive humane endpoints, pain-relieving medications, and supportive clinical care, while sequential testing and routine use of integrated testing and decision strategies can reduce animal numbers. Applying advances in science and technology to the development of scientifically sound alternative testing models and strategies can improve animal welfare and further reduce and replace animal use. © The Author(s) 2015.

  4. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  5. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    ERIC Educational Resources Information Center

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
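    The costing method described above, which sums direct staff costs at each pay level with space, utility, and public-relations costs, can be sketched as a simple formula. The cost categories and figures below are illustrative assumptions, not values from the article:

```python
def distribution_cost(staff_hours, hourly_rates, space_cost, utility_cost,
                      pr_cost):
    """Total cost of tax form distribution: staff time at each pay level
    (hours paired with hourly rates) plus space, utility, and
    public-relations costs."""
    staff_cost = sum(h * r for h, r in zip(staff_hours, hourly_rates))
    return staff_cost + space_cost + utility_cost + pr_cost
```

For example, 10 clerical hours at $20/h and 5 librarian hours at $35/h, plus $100 space, $25 utilities, and $50 public relations, total $550.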

  6. Educational Technology and the Restructuring Movement: Lessons from Research on Computers in Classrooms.

    ERIC Educational Resources Information Center

    Kell, Diane; And Others

    This paper presents findings from a recently completed study of the use of computers in primary classrooms as one source of evidence concerning the role technology can play in school restructuring efforts. The sites for the study were selected by Apple Computer, Inc. in the spring of 1988 and included 43 classrooms in 10 schools in 6 large, mostly…

  7. Active Computer Network Defense: An Assessment

    DTIC Science & Technology

    2001-04-01

    sufficient base of knowledge in information technology can be assumed to be working on some form of computer network warfare, even if only defensive in...the Defense Information Infrastructure (DII) to attack. Transmission Control Protocol/ Internet Protocol (TCP/IP) networks are inherently resistant to...aims to create this part of information superiority, and computer network defense is one of its fundamental components. Most of these efforts center

  8. Disappearing Happy Little Sheep: Changing the Culture of Computing Education by Infusing the Cultures of Games and Fine Arts

    ERIC Educational Resources Information Center

    Decker, Adrienne; Phelps, Andrew; Egert, Christopher A.

    2017-01-01

    This article explores the critical need to articulate computing as a creative discipline and the potential for gender and ethnic diversity that such efforts enable. By embracing a culture shift within the discipline and using games as a medium of discourse, we can engage students and faculty in a broader definition of computing. The transformative…

  9. Global 30m Height Above the Nearest Drainage

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Winsemius, Hessel; Schellekens, Jaap; Erickson, Tyler; Gao, Hongkai; Savenije, Hubert; van de Giesen, Nick

    2016-04-01

    Variability of the Earth's surface is the primary characteristic affecting the flow of surface and subsurface water. Digital elevation models, usually represented as height maps above some well-defined vertical datum, are widely used to compute hydrologic parameters such as local flow directions, drainage area, drainage network pattern, and many others. Deriving these parameters at a global scale usually requires significant effort. One hydrological characteristic introduced in the last decade is Height Above the Nearest Drainage (HAND): a digital elevation model normalized with respect to the nearest drainage. This parameter has been shown to be useful for many hydrological and more general-purpose applications, such as landscape hazard mapping, landform classification, remote sensing, and rainfall-runoff modeling. One of the essential characteristics of HAND is its ability to capture heterogeneities in local environments that are difficult to measure or model otherwise. While many applications of HAND have been published in the academic literature, no studies analyze its variability on a global scale, especially using higher-resolution DEMs such as the new one arc-second (approximately 30m) resolution version of SRTM. In this work, we present the first global version of HAND, computed using a mosaic of two DEMs: 30m SRTM and the Viewfinderpanorama DEM (90m). The lower-resolution DEM was used to cover latitudes above 60 degrees north and below 56 degrees south, where SRTM is not available. We compute HAND using the unmodified input DEMs to ensure consistency with the original elevation model. We parallelized processing by generating a homogenized, equal-area version of the HydroBASINS catchments; the resulting catchment boundaries were used to partition processing of the 30m-resolution DEM. To compute HAND, a new version of D8 local drainage directions as well as flow accumulation were calculated. The latter was used to estimate river heads using fixed and variable thresholding methods. The resulting HAND dataset was analyzed with respect to its spatial variability and the global distribution of the main landform types: valley, ecotone, slope, and plateau. HAND was computed with PCRaster software running on the Google Compute Engine platform under Ubuntu Linux. Google Earth Engine was used to mosaic and clip the original DEMs and to provide access to the final product. The effort took about three months of computing time on an eight-core virtual machine.
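The D8 and HAND steps described above can be sketched on a small grid; this is an illustrative toy, not the PCRaster implementation used in the study, and the function names and pit handling are assumptions:

```python
import numpy as np

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_directions(dem):
    """For each cell, the index (0-7) of the steepest-descent neighbor, or -1 for pits."""
    rows, cols = dem.shape
    flow = np.full((rows, cols), -1, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_drop, best = 0.0, -1
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    # drop per unit distance (diagonal neighbors are sqrt(2) away)
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best_drop, best = drop, k
            flow[r, c] = best
    return flow

def hand(dem, drainage, flow):
    """Height Above Nearest Drainage: follow the D8 path until a drainage cell."""
    rows, cols = dem.shape
    out = np.full((rows, cols), np.nan)
    for r in range(rows):
        for c in range(cols):
            rr, cc = r, c
            for _ in range(rows * cols):  # bounded walk
                if drainage[rr, cc]:
                    out[r, c] = dem[r, c] - dem[rr, cc]
                    break
                k = flow[rr, cc]
                if k < 0:
                    break  # pit: HAND left undefined without pit filling
                rr, cc = rr + OFFSETS[k][0], cc + OFFSETS[k][1]
    return out
```

On a tilted 3x3 DEM draining into its rightmost column, each cell's HAND is simply its elevation minus that of the column it drains into.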

  10. Selective laser sintering of calcium phosphate materials for orthopedic implants

    NASA Astrophysics Data System (ADS)

    Lee, Goonhee

    Two technologies, Solid Freeform Fabrication (SFF) and bioceramics are combined in this work to prepare bone replacement implants with complex geometry. SFF has emerged as a crucial technique for rapid prototyping in the last decade. Selective Laser Sintering (SLS) is one of the established SFF manufacturing processes that can build three-dimensional objects directly from computer models without part-specific tooling or human intervention. Meanwhile, there have been great efforts to develop implantable materials that can assist in regeneration of bone defects and injuries. However, little attention has been focused in shaping bones from these materials. The main thrust of this research was to develop a process that can combine those two separate efforts. The specific objective of this research is to develop a process that can construct bone replacement material of complex geometry from synthetic calcium phosphate materials by using the SLS process. The achievement of this goal can have a significant impact on the quality of health care in the sense that complete custom-fit bone and tooth structures suitable for implantation can be prepared within 24--48 hours of receipt of geometric information obtained either from patient Computed Tomographic (CT) data, from Computer Aided Design (CAD) software or from other imaging systems such as Magnetic Resonance Imaging (MRI) and Holographic Laser Range Imaging (HLRI). In this research, two different processes have been developed. First is the SLS fabrication of porous bone implants. In this effort, systematic procedures have been established and calcium phosphate implants were successfully fabricated from various sources of geometric information. These efforts include material selection and preparation, SLS process parameter optimization, and development of post-processing techniques within the 48-hour time frame. 
Post-processing allows accurate control of geometry and of the chemistry of calcium phosphate, as well as control of micro- and macro-pore structure, to maximize bone healing and provide sufficient mechanical strength. It also permits complete removal of the polymeric binders that remain after the SLS process. In collaboration with the University of Texas Health Science Center at San Antonio and BioMedical Enterprises, Inc., porous implants based on anatomical geometry have been successfully implanted in rabbits and dogs. These histologic animal studies reveal excellent biocompatibility and show great potential for commercial custom-fit implant manufacture. The second research effort involves fabrication of fully dense bone for application in dental restoration and load-bearing orthopedic functions. Calcium phosphate glass melts, proven to be biocompatible in the first effort, were cast into carbon molds, and processes were developed for preparing the molds. These carbon molds of anatomic shape can be prepared either by Computer Numerical Control (CNC) milling of slab stock or by SLS processing of thermoset-coated graphite powder. The CNC milling method provides accurate mold dimensions in a short period of time, but the achievable geometries are limited; generally two mold pieces are required for complex shapes. The SLS method provides green molds of very complex shape; however, these must undergo pyrolysis of the thermoset binder to withstand calcium phosphate melt temperatures (1100°C), and noticeable shrinkage was observed during pyrolysis. The cast glass was annealed to develop polycrystalline calcium phosphate. This process also exhibits great potential.

  11. A collaborative institutional model for integrating computer applications in the medical curriculum.

    PubMed Central

    Friedman, C. P.; Oxford, G. S.; Juliano, E. L.

    1991-01-01

    The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM) to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty-initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first- and second-year students do much of their academic work, and (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705

  12. The applicability of a computer model for predicting head injury incurred during actual motor vehicle collisions.

    PubMed

    Moran, Stephan G; Key, Jason S; McGwin, Gerald; Keeley, Jason W; Davidson, James S; Rue, Loring W

    2004-07-01

    Head injury is a significant cause of both morbidity and mortality. Motor vehicle collisions (MVCs) are the most common source of head injury in the United States. No studies have conclusively determined the applicability of computer models for accurate prediction of head injuries sustained in actual MVCs. This study sought to determine the applicability of such models for predicting head injuries sustained by MVC occupants. The Crash Injury Research and Engineering Network (CIREN) database was queried for restrained drivers who sustained a head injury. These collisions were modeled using occupant dynamic modeling (MADYMO) software, and head injury scores were generated. The computer-generated head injury scores then were evaluated with respect to the actual head injuries sustained by the occupants to determine the applicability of MADYMO computer modeling for predicting head injury. Five occupants meeting the selection criteria for the study were selected from the CIREN database. The head injury scores generated by MADYMO were lower than expected given the actual injuries sustained. In only one case did the computer analysis predict a head injury of a severity similar to that actually sustained by the occupant. Although computer modeling accurately simulates experimental crash tests, it may not be applicable for predicting head injury in actual MVCs. Many complicating factors surrounding actual MVCs make accurate computer modeling difficult. Future modeling efforts should consider variables such as age of the occupant and should account for a wider variety of crash scenarios.
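Occupant-dynamics codes such as MADYMO commonly report head injury severity via the Head Injury Criterion (HIC), a windowed integral of resultant head acceleration. A minimal sketch follows; the window length, sampling, and function name are illustrative assumptions, not details from the study:

```python
import numpy as np

def hic(accel_g, dt, max_window_s=0.015):
    """Head Injury Criterion: maximum over windows [t1, t2] with t2 - t1 <= max_window_s
    of (t2 - t1) * (mean acceleration over the window, in g's) ** 2.5."""
    n = len(accel_g)
    # cumulative trapezoid integral of a(t) dt
    cum = np.concatenate([[0.0], np.cumsum((accel_g[1:] + accel_g[:-1]) * 0.5 * dt)])
    max_steps = int(round(max_window_s / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_steps, n - 1) + 1):
            T = (j - i) * dt
            avg = (cum[j] - cum[i]) / T  # mean acceleration over the window
            best = max(best, T * avg ** 2.5)
    return best
```

For a constant 50 g pulse, every full 15 ms window gives 0.015 * 50^2.5 (about 265), well below the commonly cited injury threshold of 1000.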

  13. Application of computational physics within Northrop

    NASA Technical Reports Server (NTRS)

    George, M. W.; Ling, R. T.; Mangus, J. F.; Thompkins, W. T.

    1987-01-01

    An overview of Northrop programs in computational physics is presented. These programs depend on access to today's supercomputers, such as the Numerical Aerodynamical Simulator (NAS), and future growth depends on the continuing evolution of computational engines. The descriptions here concentrate on the following areas: computational fluid dynamics (CFD), computational electromagnetics (CEM), computer architectures, and expert systems. Current efforts and future directions in these areas are presented. The impact of advances in the CFD area is described, and parallels are drawn to analogous developments in CEM. The relationship between advances in these areas and the development of advanced (parallel) architectures and expert systems is also presented.

  14. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  15. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  16. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digitally processing noiselike microwave radiometer signals.

  17. Experimental and Analytical Studies for a Computational Materials Program

    NASA Technical Reports Server (NTRS)

    Knauss, W. G.

    1999-01-01

    The studies supported by Grant NAG1-1780 were directed at providing physical data on polymer behavior that would form the basis for computationally modeling these types of materials. Because of ongoing work in polymer characterization, this grant supported part of a larger effort in this regard. Work addressed two related aspects of the time-dependent mechanical response of these materials: creep properties under different volumetric changes (nonlinearly viscoelastic behavior), and the time or frequency dependence of dilatational material behavior. The details of these endeavors are outlined sufficiently in the two appended publications, so no further description of the effort is necessary.

  18. A structure adapted multipole method for electrostatic interactions in protein dynamics

    NASA Astrophysics Data System (ADS)

    Niedermeier, Christoph; Tavan, Paul

    1994-07-01

    We present an algorithm for rapid approximate evaluation of electrostatic interactions in molecular dynamics simulations of proteins. Traditional algorithms require computational work of order O(N²) for a system of N particles. Truncation methods that try to avoid this effort entail intolerably large errors in forces, energies, and other observables. Hierarchical multipole expansion algorithms, which can account for the electrostatics to numerical accuracy, scale as O(N log N), or even O(N) if augmented by a sophisticated scheme for summing up forces. To further reduce the computational effort we propose an algorithm that also uses a hierarchical multipole scheme but considers only the first two multipole moments (i.e., charges and dipoles). Our strategy is based on the consideration that numerical accuracy may not be necessary to reproduce protein dynamics with sufficient correctness. As opposed to previous methods, our scheme for hierarchical decomposition is adjusted to structural and dynamical features of the particular protein considered, rather than chosen rigidly as a cubic grid. Compared to truncation methods, we reduce errors in the computation of electrostatic forces by a factor of 10 with only marginal additional effort.
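The idea of keeping only the first two multipole moments can be illustrated by comparing a truncated expansion about a cluster centroid against the direct Coulomb sum. The units and cluster construction below are illustrative, not the authors' scheme:

```python
import numpy as np

def exact_potential(q, pos, p):
    """Direct Coulomb sum (Gaussian units): sum of q_i / |p - r_i|."""
    d = np.linalg.norm(p - pos, axis=1)
    return np.sum(q / d)

def multipole_potential(q, pos, p):
    """Expansion about the cluster centroid, truncated after the dipole term."""
    center = pos.mean(axis=0)
    Q = q.sum()                                     # monopole: total charge
    D = (q[:, None] * (pos - center)).sum(axis=0)   # dipole moment
    r = p - center
    rn = np.linalg.norm(r)
    return Q / rn + np.dot(D, r) / rn ** 3
```

For a compact cluster viewed from far away, the neglected quadrupole and higher terms decay quickly, so the two retained moments already reproduce the potential to within a fraction of a percent.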

  19. Soviet Cybernetics Review, Volume 3, Number 11.

    ERIC Educational Resources Information Center

    Holland, Wade B.

    Soviet efforts in designing third-generation computers are discussed in two featured articles which describe (1) the development and production of integrated circuits, and their role in computers; and (2) the use of amorphous chalcogenide glass in lasers, infrared devices, and semiconductors. Other articles discuss production-oriented branch…

  20. 76 FR 34965 - Cybersecurity, Innovation, and the Internet Economy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-15

    ... disrupt computing systems. These threats are exacerbated by the interconnected and interdependent architecture of today's computing environment. Theoretically, security deficiencies in one area may provide... does the move to cloud-based services have on education and research efforts in the I3S? 45. What is...

  1. NASA Aerodynamics Program Annual Report 1991

    DTIC Science & Technology

    1992-04-01

    ...results have been compared with wind tunnel data; a correlation effort on an AH-1G Cobra helicopter model at different ... has been completed. ...The computational studies have shown the trapped vortex to be a viable ..., a significant discovery. Preliminary water-channel and wind-tunnel tests have shown the

  2. 9,250 Apples for the Teacher.

    ERIC Educational Resources Information Center

    Uston, Ken

    1983-01-01

    Discusses Apple Computer Inc.'s plan to donate an Apple IIe to eligible elementary/secondary schools in California, dealer incentives for conducting orientation sessions for school personnel, and school uses of the computer (including peer tutoring and teacher education). Also discusses similar efforts of other microcomputer manufacturers. (JN)

  3. Computational Systems for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Soni, Bharat; Haupt, Tomasz; Koomullil, Roy; Luke, Edward; Thompson, David

    2002-01-01

    In this paper, we briefly describe our efforts to develop complex simulation systems. We focus first on four key infrastructure items: enterprise computational services, simulation synthesis, geometry modeling and mesh generation, and a fluid flow solver for arbitrary meshes. We conclude by presenting three diverse applications developed using these technologies.

  4. 75 FR 38595 - Guidance to States Regarding Driver History Record Information Security, Continuity of Operation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-02

    ... Standards and Technology's (NIST) Computer Security Division maintains a Computer Security Resource Center... Regarding Driver History Record Information Security, Continuity of Operation Planning, and Disaster... (SDLAs) to support their efforts at maintaining the security of information contained in the driver...

  5. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
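For context, the fault tree side of such a comparison reduces, for independent basic events, to simple gate algebra: AND gates multiply probabilities and OR gates combine complements. The failure probabilities below are hypothetical, not values from the study:

```python
def gate_and(*p):
    """Probability that all independent basic events occur."""
    out = 1.0
    for x in p:
        out *= x
    return out

def gate_or(*p):
    """Probability that at least one independent basic event occurs."""
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Hypothetical top event: the computing system fails if both redundant
# channels fail, where each channel fails on a CPU fault OR a bus fault.
channel = gate_or(1e-4, 5e-5)
top = gate_and(channel, channel)
```

This algebra is what makes fault trees quick to apply by hand, at the cost of ignoring mission phases and outcome gradations that performability captures.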

  6. Recombination of open-f-shell tungsten ions

    NASA Astrophysics Data System (ADS)

    Krantz, C.; Badnell, N. R.; Müller, A.; Schippers, S.; Wolf, A.

    2017-03-01

    We review experimental and theoretical efforts aimed at a detailed understanding of the recombination of electrons with highly charged tungsten ions characterised by an open 4f sub-shell. Highly charged tungsten occurs as a plasma contaminant in ITER-like tokamak experiments, where it acts as an unwanted cooling agent. Modelling of the charge state populations in a plasma requires reliable thermal rate coefficients for charge-changing electron collisions. The electron recombination of medium-charged tungsten species with open 4f sub-shells is especially challenging to compute reliably. Storage-ring experiments have been conducted that yielded recombination rate coefficients at high energy resolution and with well-understood systematics. Significant deviations from simplified, but prevalent, computational models have been found. A new class of ab initio numerical calculations has been developed that provides reliable predictions of the total plasma recombination rate coefficients for these ions.

  7. Overview of aerothermodynamic loads definition study

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.

    1989-01-01

    Over the years, NASA has been conducting the Advanced Earth-to-Orbit (AETO) Propulsion Technology Program to provide the knowledge, understanding, and design methodology that will allow the development of advanced Earth-to-orbit propulsion systems with high performance, extended service life, automated operations, and diagnostics for in-flight health monitoring. The objective of the Aerothermodynamic Loads Definition Study is to develop methods to more accurately predict the operating environment in AETO propulsion systems, such as the Space Shuttle Main Engine (SSME) powerhead. The approach taken consists of two parts: to modify, apply, and disseminate existing computational fluid dynamics tools in response to current needs, and to develop new technology that will enable more accurate computation of the time-averaged and unsteady aerothermodynamic loads in the SSME powerhead. The software tools are detailed. Significant progress was made in the area of turbomachinery, where there is an overlap between the AETO efforts and research in the aeronautical gas turbine field.

  8. Portable color multimedia training systems based on monochrome laptop computers (CBT-in-a-briefcase), with spinoff implications for video uplink and downlink in spaceflight operations

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1994-01-01

    This report describes efforts to use digital motion video compression technology to develop a highly portable device that would convert 1990-91 era IBM-compatible and/or Macintosh notebook computers into full-color, motion-video capable multimedia training systems. An architecture was conceived that would permit direct conversion of existing laser-disk-based multimedia courses with little or no reauthoring. The project did not physically demonstrate certain critical video keying techniques, but their implementation should be feasible. This investigation of digital motion video has spawned two significant spaceflight projects at MSFC: one to downlink multiple high-quality video signals from Spacelab, and the other to uplink videoconference-quality video in realtime and high-quality video off-line, plus investigate interactive, multimedia-based techniques for enhancing onboard science operations. Other airborne or spaceborne spinoffs are possible.

  9. Iterative CT reconstruction using coordinate descent with ordered subsets of data

    NASA Astrophysics Data System (ADS)

    Noo, F.; Hahn, K.; Schöndube, H.; Stierstorfer, K.

    2016-04-01

    Image reconstruction based on iterative minimization of a penalized weighted least-squares criterion has become an important topic of research in X-ray computed tomography. This topic is motivated by increasing evidence that such a formalism may enable a significant reduction in dose imparted to the patient while maintaining or improving image quality. One important issue associated with this iterative image reconstruction concept is slow convergence and the associated computational effort. For this reason, there is interest in finding methods that produce approximate versions of the targeted image with a small number of iterations and an acceptable level of discrepancy. We introduce here a novel method to produce such approximations: ordered subsets in combination with iterative coordinate descent. Preliminary results demonstrate that this method can produce, within 10 iterations and using only a constant image as initial condition, satisfactory reconstructions that retain the noise properties of the targeted image.
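The combination of ordered subsets with coordinate descent can be illustrated on a toy unweighted least-squares problem: each coordinate update sees only one interleaved subset of rows, with the subset gradient scaled up as a surrogate for the full gradient. This is a plausible sketch under those assumptions, not the authors' exact update:

```python
import numpy as np

def os_icd(A, b, n_subsets=4, n_iters=50, beta=0.0):
    """Coordinate descent for penalized least squares (toy, unweighted), where
    each coordinate update uses one ordered subset of rows; the subset gradient
    and curvature are scaled by n_subsets as surrogates for the full quantities."""
    m, n = A.shape
    x = np.zeros(n)
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]  # interleaved rows
    for _ in range(n_iters):
        for rows in subsets:
            As, bs = A[rows], b[rows]
            r = As @ x - bs                                   # residual on this subset
            for j in range(n):
                g = n_subsets * (As[:, j] @ r) + beta * x[j]  # surrogate gradient
                h = n_subsets * (As[:, j] @ As[:, j]) + beta  # surrogate curvature
                step = g / h
                x[j] -= step
                r -= step * As[:, j]                          # keep residual in sync
    return x
```

On a consistent noise-free system every subset shares the same minimizer, so cycling through subsets converges to the exact solution.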

  10. Simulation Enabled Safeguards Assessment Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  11. Ubiquitous information for ubiquitous computing: expressing clinical data sets with openEHR archetypes.

    PubMed

    Garde, Sebastian; Hovenga, Evelyn; Buck, Jasmin; Knaup, Petra

    2006-01-01

    Ubiquitous computing requires ubiquitous access to information and knowledge. With the release of openEHR Version 1.0 there is a common model available to solve some of the problems related to accessing information and knowledge by improving semantic interoperability between clinical systems. Considerable work has been undertaken by various bodies to standardise Clinical Data Sets. Notwithstanding their value, several problems remain unsolved with Clinical Data Sets without the use of a common model underpinning them. This paper outlines these problems, such as incompatible basic data types and overlapping and incompatible definitions of clinical content. A solution based on openEHR archetypes is motivated, and an approach to transform existing Clinical Data Sets into archetypes is presented. To avoid significant overlaps and unnecessary effort during archetype development, archetype development needs to be coordinated nationwide and beyond, and also across the various health professions, in a formalized process.

  12. A summary of computational experience at GE Aircraft Engines for complex turbulent flows in gas turbines

    NASA Astrophysics Data System (ADS)

    Zerkle, Ronald D.; Prakash, Chander

    1995-03-01

    This viewgraph presentation summarizes some CFD experience at GE Aircraft Engines for flows in the primary gaspath of a gas turbine engine and in turbine blade cooling passages. It is concluded that application of the standard k-epsilon turbulence model with wall functions is not adequate for accurate CFD simulation of aerodynamic performance and heat transfer in the primary gas path of a gas turbine engine. New models are required in the near-wall region which include more physics than wall functions. The two-layer modeling approach appears attractive because of its relatively low computational cost. In addition, improved CFD simulation of film cooling and turbine blade internal cooling passages will require anisotropic turbulence models. New turbulence models must be practical in order to have a significant impact on the engine design process. A coordinated turbulence modeling effort between NASA centers would be beneficial to the gas turbine industry.

  13. A summary of computational experience at GE Aircraft Engines for complex turbulent flows in gas turbines

    NASA Technical Reports Server (NTRS)

    Zerkle, Ronald D.; Prakash, Chander

    1995-01-01

    This viewgraph presentation summarizes some CFD experience at GE Aircraft Engines for flows in the primary gaspath of a gas turbine engine and in turbine blade cooling passages. It is concluded that application of the standard k-epsilon turbulence model with wall functions is not adequate for accurate CFD simulation of aerodynamic performance and heat transfer in the primary gas path of a gas turbine engine. New models are required in the near-wall region which include more physics than wall functions. The two-layer modeling approach appears attractive because of its relatively low computational cost. In addition, improved CFD simulation of film cooling and turbine blade internal cooling passages will require anisotropic turbulence models. New turbulence models must be practical in order to have a significant impact on the engine design process. A coordinated turbulence modeling effort between NASA centers would be beneficial to the gas turbine industry.

  14. Recent Developments in Computed Tomography for Urolithiasis: Diagnosis and Characterization

    PubMed Central

    Mc Laughlin, P. D.; Crush, L.; Maher, M. M.; O'Connor, O. J.

    2012-01-01

    Objective. To critically evaluate the current literature in an effort to establish the current role of radiologic imaging, advances in computed tomography (CT) and standard film radiography in the diagnosis, and characterization of urinary tract calculi. Conclusion. CT has a valuable role when utilized prudently during surveillance of patients following endourological therapy. In this paper, we outline the basic principles relating to the effects of exposure to ionizing radiation as a result of CT scanning. We discuss the current developments in low-dose CT technology, which have resulted in significant reductions in CT radiation doses (to approximately one-third of what they were a decade ago) while preserving image quality. Finally, we will discuss an important recent development now commercially available on the latest generation of CT scanners, namely, dual energy imaging, which is showing promise in urinary tract imaging as a means of characterizing the composition of urinary tract calculi. PMID:22952473

  15. Toward Improved Modeling of Spectral Solar Irradiance for Solar Energy Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit

    This study introduces the National Renewable Energy Laboratory's (NREL's) recent efforts to extend the capability of the Fast All-sky Radiation Model for Solar applications (FARMS) by computing spectral solar irradiances over both horizontal and inclined surfaces. A new model is developed by computing the optical thickness of the atmosphere using a spectral irradiance model for clear-sky conditions, SMARTS2. A comprehensive lookup table (LUT) of cloud bidirectional transmittance distribution functions (BTDFs) is precomputed for 2002 wavelength bands using an atmospheric radiative transfer model, libRadtran. The solar radiation transmitted through the atmosphere is given by considering all possible paths of photon transmission and the relevant scattering and absorption attenuation. Our results indicate that this new model has an accuracy that is similar to that of state-of-the-art radiative transfer models, but it is significantly more efficient.

  16. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review.

    PubMed

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to interobserver variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literature. In the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role in describing the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast, fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation.

  17. In silico discovery of metal-organic frameworks for precombustion CO2 capture using a genetic algorithm

    PubMed Central

    Chung, Yongchul G.; Gómez-Gualdrón, Diego A.; Li, Peng; Leperi, Karson T.; Deria, Pravas; Zhang, Hongda; Vermeulen, Nicolaas A.; Stoddart, J. Fraser; You, Fengqi; Hupp, Joseph T.; Farha, Omar K.; Snurr, Randall Q.

    2016-01-01

    Discovery of new adsorbent materials with a high CO2 working capacity could help reduce CO2 emissions from newly commissioned power plants using precombustion carbon capture. High-throughput computational screening efforts can accelerate the discovery of new adsorbents but sometimes require significant computational resources to explore the large space of possible materials. We report the in silico discovery of high-performing adsorbents for precombustion CO2 capture by applying a genetic algorithm to efficiently search a large database of metal-organic frameworks (MOFs) for top candidates. High-performing MOFs identified from the in silico search were synthesized and activated and show a high CO2 working capacity and a high CO2/H2 selectivity. One of the synthesized MOFs shows a higher CO2 working capacity than any MOF reported in the literature under the operating conditions investigated here. PMID:27757420
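A genetic algorithm over a materials database, as described above, can be sketched in a few lines. The toy fitness function below stands in for the simulated CO2 working capacity of a MOF; the two "genes" (a pore-size-like and a heat-of-adsorption-like descriptor), the optimum at (12, 25), and all numbers are invented for illustration, not taken from the study:

```python
import random

# Toy fitness standing in for the computed CO2 working capacity used in the
# actual screening; the optimum at (12, 25) is invented for illustration.
def working_capacity(genome):
    pore, qst = genome
    return -((pore - 12.0) ** 2) - 0.5 * (qst - 25.0) ** 2

def crossover(a, b):
    # uniform crossover: each gene comes from one parent at random
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(genome, rate=0.3, step=1.0):
    return tuple(g + random.uniform(-step, step) if random.random() < rate else g
                 for g in genome)

def genetic_search(pop_size=40, generations=60, seed=0):
    random.seed(seed)
    pop = [(random.uniform(0.0, 30.0), random.uniform(0.0, 60.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=working_capacity, reverse=True)
        elite = pop[: pop_size // 4]  # elitism: keep the top quartile
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=working_capacity)

best = genetic_search()
```

The appeal for screening, as the abstract notes, is that the algorithm evaluates only a small evolving population per generation instead of exhaustively simulating every entry in the database.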

  18. LAMMPS strong scaling performance optimization on Blue Gene/Q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffman, Paul; Jiang, Wei; Romero, Nichols A.

    2014-11-12

    LAMMPS "Large-scale Atomic/Molecular Massively Parallel Simulator" is an open-source molecular dynamics package from Sandia National Laboratories. Significant performance improvements in strong-scaling and time-to-solution for this application on IBM's Blue Gene/Q have been achieved through computational optimizations of the OpenMP versions of the short-range Lennard-Jones term of the CHARMM force field and the long-range Coulombic interaction implemented with the PPPM (particle-particle-particle mesh) algorithm, enhanced by runtime parameter settings controlling thread utilization. Additionally, MPI communication performance improvements were made to the PPPM calculation by re-engineering the parallel 3D FFT to use MPICH collectives instead of point-to-point. Performance testing was done using an 8.4-million atom simulation scaling up to 16 racks on the Mira system at Argonne Leadership Computing Facility (ALCF). Speedups resulting from this effort were in some cases over 2x.
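Strong-scaling results like these are conventionally summarized as speedup and parallel efficiency relative to a baseline run on the smallest partition. A minimal sketch of that bookkeeping, with invented timings rather than the paper's measurements:

```python
# Speedup and parallel efficiency for a strong-scaling study: fixed problem
# size, increasing node counts. Timings below are illustrative only.
def scaling_table(times):
    """times: {node_count: wall_clock_seconds} -> {node_count: (speedup, efficiency)}"""
    base_nodes = min(times)
    base_time = times[base_nodes]
    table = {}
    for nodes, t in sorted(times.items()):
        speedup = base_time / t
        efficiency = speedup / (nodes / base_nodes)
        table[nodes] = (speedup, efficiency)
    return table

runs = {1024: 100.0, 2048: 52.0, 4096: 28.0, 8192: 16.0}  # seconds (made up)
table = scaling_table(runs)
# e.g. an 8x increase in nodes giving a 6.25x speedup is 78% parallel efficiency
```

Falling efficiency at high node counts is exactly where communication optimizations such as the collective-based 3D FFT pay off.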

  19. Climatic response variability and machine learning: development of a modular technology framework for predicting bio-climatic change in Pacific Northwest ecosystems

    NASA Astrophysics Data System (ADS)

    Seamon, E.; Gessler, P. E.; Flathers, E.

    2015-12-01

    The creation and use of large amounts of data in scientific investigations has become common practice. Data collection and analysis efforts for large scientific computing are increasing not only in volume but also in number, and the methods and analysis procedures are evolving toward greater complexity (Bell, 2009; Clarke, 2009; Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011; Turner, 2014; Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge the gap between these perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) increasing growth in the volume and amount of data, 2) a growing data-intensive science base with challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap: systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive, science-based communication. Our research attempts to address this gap by developing a modular technology framework for data science integration, with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, interactive Python, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bio-climatic variation as they relate to Pacific Northwest ecosystem regions.
Our preliminary results, using historical NetCDF climate data for calibration across the inland Pacific Northwest region (Abatzoglou and Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods for bioclimatic indicators.

  20. Aerothermal Ground Testing of Flexible Thermal Protection Systems for Hypersonic Inflatable Aerodynamic Decelerators

    NASA Technical Reports Server (NTRS)

    Bruce, Walter E., III; Mesick, Nathaniel J.; Ferlemann, Paul G.; Siemers, Paul M., III; DelCorso, Joseph A.; Hughes, Stephen J.; Tobin, Steven A.; Kardell, Matthew P.

    2012-01-01

    Flexible TPS development involves the ground testing and analysis necessary to characterize performance of FTPS candidates prior to flight testing. This paper provides an overview of the analysis and ground testing efforts performed over the last year at the NASA Langley Research Center and in the Boeing Large-Core Arc Tunnel (LCAT). In the LCAT test series, material layups were subjected to aerothermal loads commensurate with peak re-entry conditions enveloping a range of HIAD mission trajectories. The FTPS layups were tested over a heat flux range from 20 to 50 W/cm2 with associated surface pressures of 3 to 8 kPa. To support the testing effort, a significant redesign of the existing shear (wedge) model holder from previous testing efforts was undertaken to develop a new test technique for supporting and evaluating the FTPS in the high-temperature arc jet flow. Since the FTPS test samples typically experience a geometry change during testing, computational fluid dynamics (CFD) models of the arc jet flow field and test model were developed to support the testing effort. The CFD results were used to help determine the test conditions experienced by the test samples as the surface geometry changes. This paper includes an overview of the Boeing LCAT facility, the general approach for testing FTPS, the CFD analysis methodology and results, the model holder design and test methodology, and selected thermal results of several FTPS layups.

  1. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study

    PubMed Central

    Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-01-01

    Background Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. Objective The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users’ verbal responses, more closely mirroring a human-delivered motivational intervention. Methods We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Results Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. 
At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Conclusions Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. PMID:28659259

  2. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude, thus enabling their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD and the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
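For reference, the Green-Kubo formalism invoked here relates the isotropic thermal conductivity to the equilibrium heat-flux autocorrelation function; the standard textbook expression (not specific to this work) is

```latex
\kappa \;=\; \frac{V}{3\,k_{\mathrm{B}}T^{2}}
  \int_{0}^{\infty} \bigl\langle \mathbf{J}(0)\cdot\mathbf{J}(t) \bigr\rangle \,\mathrm{d}t
```

where V is the simulation-cell volume, T the temperature, and J the heat flux. The slow time and size convergence of this integral in short ab initio MD runs is precisely the cost that the interpolation scheme described in the abstract is designed to avoid.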

  3. Physics-based enzyme design: predicting binding affinity and catalytic activity.

    PubMed

    Sirin, Sarah; Pearlman, David A; Sherman, Woody

    2014-12-01

    Computational enzyme design is an emerging field that has yielded promising success stories, but where numerous challenges remain. Accurate methods to rapidly evaluate possible enzyme design variants could provide significant value when combined with experimental efforts by reducing the number of variants needed to be synthesized and speeding the time to reach the desired endpoint of the design. To that end, extending our computational methods to model the fundamental physical-chemical principles that regulate activity in a protocol that is automated and accessible to a broad population of enzyme design researchers is essential. Here, we apply a physics-based implicit solvent MM-GBSA scoring approach to enzyme design and benchmark the computational predictions against experimentally determined activities. Specifically, we evaluate the ability of MM-GBSA to predict changes in affinity for a steroid binder protein, catalytic turnover for a Kemp eliminase, and catalytic activity for α-Gliadin peptidase variants. Using the enzyme design framework developed here, we accurately rank the most experimentally active enzyme variants, suggesting that this approach could provide enrichment of active variants in real-world enzyme design applications. © 2014 Wiley Periodicals, Inc.
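The "enrichment of active variants" claimed at the end is commonly quantified with an enrichment factor over the score-ranked list: the hit rate among top-ranked variants divided by the overall hit rate. A sketch with invented scores and activity labels (lower score = better predicted binding, as with MM-GBSA energies):

```python
# Invented (score, is_active) pairs for eight hypothetical enzyme variants;
# in practice the scores would come from MM-GBSA and the labels from assays.
variants = {
    "V1": (-45.2, True),  "V2": (-30.1, False),
    "V3": (-44.8, True),  "V4": (-28.7, False),
    "V5": (-41.0, True),  "V6": (-25.3, False),
    "V7": (-33.9, False), "V8": (-40.2, True),
}

def enrichment_factor(data, top_fraction=0.5):
    ranked = sorted(data.items(), key=lambda kv: kv[1][0])  # best score first
    n_top = max(1, int(len(ranked) * top_fraction))
    actives_total = sum(1 for _, active in data.values() if active)
    actives_top = sum(1 for _, (_, active) in ranked[:n_top] if active)
    hit_rate_top = actives_top / n_top
    hit_rate_all = actives_total / len(data)
    return hit_rate_top / hit_rate_all

ef = enrichment_factor(variants)  # → 2.0: all four actives land in the top half
```

An enrichment factor above 1 means synthesizing only the top-scored variants would yield proportionally more active enzymes than synthesizing at random, which is the practical value the abstract describes.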

  4. Structure and application of an interface program between a geographic-information system and a ground-water flow model

    USGS Publications Warehouse

    Van Metre, P.C.

    1990-01-01

    A computer-program interface between a geographic-information system and a groundwater flow model links two unrelated software systems for use in developing the flow models. The interface program allows the modeler to compile and manage geographic components of a groundwater model within the geographic information system. A significant savings of time and effort is realized in developing, calibrating, and displaying the groundwater flow model. Four major guidelines were followed in developing the interface program: (1) no changes to the groundwater flow model code were to be made; (2) a data structure was to be designed within the geographic information system that follows the same basic data structure as the groundwater flow model; (3) the interface program was to be flexible enough to support all basic data options available within the model; and (4) the interface program was to be as efficient as possible in terms of computer time used and online-storage space needed. Because some programs in the interface are written in control-program language, the interface will run only on a computer with the PRIMOS operating system. (USGS)

  5. Using Neural Networks to Improve the Performance of Radiative Transfer Modeling Used for Geometry Dependent LER Calculations

    NASA Astrophysics Data System (ADS)

    Fasnacht, Z.; Qin, W.; Haffner, D. P.; Loyola, D. G.; Joiner, J.; Krotkov, N. A.; Vasilkov, A. P.; Spurr, R. J. D.

    2017-12-01

    In order to estimate surface reflectance used in trace gas retrieval algorithms, radiative transfer models (RTM) such as the Vector Linearized Discrete Ordinate Radiative Transfer Model (VLIDORT) can be used to simulate the top of the atmosphere (TOA) radiances with advanced models of surface properties. With large volumes of satellite data, these model simulations can become computationally expensive. Look up table interpolation can improve the computational cost of the calculations, but the non-linear nature of the radiances requires a dense node structure if interpolation errors are to be minimized. In order to reduce our computational effort and improve the performance of look-up tables, neural networks can be trained to predict these radiances. We investigate the impact of using look-up table interpolation versus a neural network trained using the smart sampling technique, and show that neural networks can speed up calculations and reduce errors while using significantly less memory and RTM calls. In future work we will implement a neural network in operational processing to meet growing demands for reflectance modeling in support of high spatial resolution satellite missions.
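The trade-off motivating the neural network, dense lookup tables versus surrogate models, comes from the fact that linear interpolation error on a non-linear radiance shrinks only with the square of the node spacing, so low error demands many RTM calls. A toy illustration with a stand-in function (not actual VLIDORT output):

```python
import numpy as np

# Stand-in for a non-linear TOA radiance as a function of one geometry
# parameter; purely illustrative.
def radiance(x):
    return np.exp(-3.0 * x) + 0.1 * np.sin(8.0 * x)

x_eval = np.linspace(0.0, 1.0, 1001)   # dense evaluation grid ("truth")
y_true = radiance(x_eval)

def lut_max_error(n_nodes):
    """Build an n_nodes lookup table and report its worst interpolation error."""
    nodes = np.linspace(0.0, 1.0, n_nodes)          # each node = one RTM call
    approx = np.interp(x_eval, nodes, radiance(nodes))
    return float(np.max(np.abs(approx - y_true)))

sparse_err = lut_max_error(6)    # coarse table: visible interpolation error
dense_err = lut_max_error(200)   # dense table: small error, ~33x more RTM calls
```

A trained network can, in favorable cases, match the dense table's accuracy from far fewer training samples, which is the memory and RTM-call saving the abstract reports.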

  6. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.

  7. Computer-Aided Molecular Design of Bis-phosphine Oxide Lanthanide Extractants

    DOE PAGES

    McCann, Billy W.; Silva, Nuwan De; Windus, Theresa L.; ...

    2016-02-17

    Computer-aided molecular design and high-throughput screening of viable host architectures can significantly reduce the effort required to design novel ligands for efficient extraction of rare earth elements. This paper presents a computational approach to the deliberate design of bis-phosphine oxide host architectures that are structurally organized for complexation of trivalent lanthanides. Molecule-building software, HostDesigner, was interfaced with molecular mechanics software, PCModel, providing a tool for generating and screening millions of potential R2(O)P-link-P(O)R2 ligand geometries. The molecular mechanics ranking of ligand structures is consistent with both the solution-phase free energies of complexation obtained with density functional theory and the performance of known bis-phosphine oxide extractants. For the case where the link is -CH2-, evaluation of the ligand geometry provides the first characterization of a steric origin for the 'anomalous aryl strengthening' effect. The design approach has identified a number of novel bis-phosphine oxide ligands that are better organized for lanthanide complexation than previously studied examples.

  8. Unsteady Three-Dimensional Simulation of a Shear Coaxial GO2/GH2 Rocket Injector with RANS and Hybrid-RANS-LES/DES Using Flamelet Models

    NASA Technical Reports Server (NTRS)

    Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.

    2015-01-01

    Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex, multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was a comprehensive validation effort conducted by Tucker et al. [1], in which a number of different groups modeled the GO2/GH2 single-element configuration of Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, from both steady and unsteady Reynolds-averaged Navier-Stokes (U/RANS) calculations, large-eddy simulations (LES), detached eddy simulations (DES), and various combinations. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancement of validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time with a three-dimensional grid. Comparisons to the Pal et al.
heat flux data will be made for both RANS and Hybrid RANS-LES/detached eddy simulations (DES). Computational costs will be reported, along with a comparison of accuracy and cost to much less expensive two-dimensional RANS simulations of the same geometry.

  9. Design of a modular digital computer system DRL 4 and 5. [design of airborne/spaceborne computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.

  10. Fault Tolerant Software Technology for Distributed Computer Systems

    DTIC Science & Technology

    1989-03-01

    Final Technical Report, 1989: "Fault Tolerant Software Technology for Distributed Computer Systems," a two-year effort performed at Georgia Institute of Technology as part of the Clouds Project.

  11. Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)

    ScienceCinema

    Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)

    2018-05-07

    Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  12. The Impact of a Library Flood on Computer Operations.

    ERIC Educational Resources Information Center

    Myles, Barbara

    2000-01-01

    Describes the efforts at Boston Public Library to recover from serious flooding that damaged computer equipment. Discusses vendor help in assessing the damage; the loss of installation disks; hiring consultants to help with financial matters; effects on staff; repairing and replacing damaged equipment; insurance issues; and disaster recovery…

  13. A Man-Machine System for Contemporary Counseling Practice: Diagnosis and Prediction.

    ERIC Educational Resources Information Center

    Roach, Arthur J.

    This paper looks at present and future capabilities for diagnosis and prediction in computer-based guidance efforts and reviews the problems and potentials which will accompany the implementation of such capabilities. In addition to necessary procedural refinement in prediction, future developments in computer-based educational and career…

  14. Other Cosmic Ray Links

    Science.gov Websites

    curriculum for its course Physics In and Through Cosmology. The Distributed Observatory aims to become the world's largest cosmic ray telescope, using the distributed sensing and computing power of the world's cell phones. Modeled after the distributed computing efforts of SETI@Home and Folding@Home, the

  15. Evaluation of Complex Human Performance: The Promise of Computer-Based Simulation

    ERIC Educational Resources Information Center

    Newsom, Robert S.; And Others

    1978-01-01

    For the training and placement of professional workers, multiple-choice instruments are the norm for wide-scale measurement and evaluation efforts. These instruments contain fundamental problems. Computer-based management simulations may provide solutions to these problems, appear scoreable and reliable, offer increased validity, and are better…

  16. Modifications Of Hydrostatic-Bearing Computer Program

    NASA Technical Reports Server (NTRS)

    Hibbs, Robert I., Jr.; Beatty, Robert F.

    1991-01-01

    Several modifications made to enhance utility of HBEAR, computer program for analysis and design of hydrostatic bearings. Modifications make program applicable to more realistic cases and reduce time and effort necessary to arrive at a suitable design. Uses search technique to iterate on size of orifice to obtain required pressure ratio.

  17. Recruiting Women into Computer Science and Information Systems

    ERIC Educational Resources Information Center

    Broad, Steven; McGee, Meredith

    2014-01-01

    While many technical disciplines have reached or are moving toward gender parity in the number of bachelors degrees in those fields, the percentage of women graduating in computer science remains stubbornly low. Many recent efforts to address this situation have focused on retention of undergraduate majors or graduate students, recruiting…

  18. The Classroom, Board Room, Chat Room, and Court Room: School Computers at the Crossroads.

    ERIC Educational Resources Information Center

    Stewart, Michael

    2000-01-01

    In schools' efforts to maximize technology's benefits, ethical considerations have often taken a back seat. Computer misuse is growing exponentially and assuming many forms: unauthorized data access, hacking, piracy, information theft, fraud, virus creation, harassment, defamation, and discrimination. Integrated-learning activities will help…

  19. UNIX Micros for Students Majoring in Computer Science and Personal Information Retrieval.

    ERIC Educational Resources Information Center

    Fox, Edward A.; Birch, Sandra

    1986-01-01

    Traces the history of Virginia Tech's requirement that incoming freshmen majoring in computer science each acquire a microcomputer running the UNIX operating system; explores rationale for the decision; explains system's key features; and describes program implementation and research and development efforts to provide personal information…

  20. Verification and Validation of Monte Carlo N-Particle 6 for Computing Gamma Protection Factors

    DTIC Science & Technology

    2015-03-26

    methods for evaluating RPFs, which it used for the subsequent 30 years. These approaches included computational modeling, radioisotopes, and a high...
