The Role of Visualization in Computer Science Education
ERIC Educational Resources Information Center
Fouh, Eric; Akbar, Monika; Shaffer, Clifford A.
2012-01-01
Computer science core instruction attempts to provide a detailed understanding of dynamic processes such as the working of an algorithm or the flow of information between computing entities. Such dynamic processes are not well explained by static media such as text and images, and are difficult to convey in lecture. The authors survey the history…
First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)
NASA Technical Reports Server (NTRS)
Denning, P. J.
1986-01-01
The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.
Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment
ERIC Educational Resources Information Center
Lin, Jing-Wen
2016-01-01
This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…
Using Ontologies for Knowledge Management: An Information Systems Perspective.
ERIC Educational Resources Information Center
Jurisica, Igor; Mylopoulos, John; Yu, Eric
1999-01-01
Surveys some of the basic concepts that have been used in computer science for the representation of knowledge and summarizes some of their advantages and drawbacks. Relates these techniques to information sciences theory and practice. Concepts are classified in four broad ontological categories: static ontology, dynamic ontology, intentional…
Evaluating Implementations of Service Oriented Architecture for Sensor Network via Simulation
2011-04-01
Subject: COMPUTER SCIENCE. Approved: Boleslaw Szymanski, Thesis Adviser. Rensselaer Polytechnic Institute, Troy, New York, April 2011 (For Graduation May 2011). …The simulation supports distributed and centralized composition with a type hierarchy and multiple-service statically-located nodes in a 2-dimensional space. The second simulation…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablonowski, Christiane
The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
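To make the idea of static analysis of block-based programs concrete, here is a minimal sketch in Python of the kind of check such a framework can perform, operating on a simplified JSON form of a project. The block names and JSON layout are illustrative assumptions only, not Hairball's actual API or Scratch's file format.

```python
import json

EVENT_HATS = {"whenGreenFlag", "whenKeyPressed", "whenClicked"}  # simplified block names

def dead_scripts(project_json: str):
    """Report scripts that can never run because they lack an event hat block.

    Works on a simplified JSON form {"sprites": [{"name": ..., "scripts":
    [["blockName", ...], ...]}]}. Hairball itself parses real Scratch project
    files, so treat this purely as an illustration of the idea.
    """
    project = json.loads(project_json)
    findings = []
    for sprite in project.get("sprites", []):
        for script in sprite.get("scripts", []):
            if script and script[0] not in EVENT_HATS:
                findings.append((sprite["name"], script[0]))
    return findings

example = json.dumps({"sprites": [
    {"name": "Cat", "scripts": [["whenGreenFlag", "moveSteps", "sayHello"],
                                ["moveSteps", "turnRight"]]},
]})
print(dead_scripts(example))   # -> [('Cat', 'moveSteps')]
```

Checks of this sort run in milliseconds over a whole class's submissions, which is what makes rapid curriculum iteration practical.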
ERIC Educational Resources Information Center
Akpinar, Ercan
2014-01-01
This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30…
2011-07-13
Anton A. Stoorvogel, Håvard Fjær Grip; School of Electrical Engineering and Computer Science, Washington State University, Pullman, WA 99164-2752. …of a double integrator controlled by a saturating linear static state feedback…
NASA Astrophysics Data System (ADS)
de Oliveira, José Martins, Jr.; Mangini, F. Salvador; Carvalho Vila, Marta Maria Duarte; Chaud, Marco Vinícius
2013-05-01
This work presents an alternative and non-conventional technique for evaluating physico-chemical properties of pharmaceutical dosage forms; i.e., we used computed tomography (CT) as a nondestructive technique to visualize internal structures of pharmaceutical dosage forms and to conduct static and dynamic studies. The studies were conducted involving static and dynamic situations through the use of tomographic images generated by the scanner at the University of Sorocaba - Uniso. We have shown that through the use of tomographic images it is possible to conduct studies of porosity, densities, analysis of morphological parameters and studies of dissolution. Our results are in agreement with the literature, showing that CT is a powerful tool for use in the pharmaceutical sciences.
NASA Astrophysics Data System (ADS)
Hassan, Hesham Galal
This thesis explores the proper principles and rules for creating excellent infographics that communicate information successfully and effectively. Not only does this thesis examine the creation of infographics, it also tries to answer which format, static or animated infographics, is the most effective when used as a teaching-aid framework for complex science subjects, and whether compelling infographics in the preferred format facilitate the learning experience. The methodology includes the creation of an infographic in two formats (static and animated) on a fairly complex science subject (Phases of the Moon); the two versions were then tested for their efficacy as a whole and compared in terms of information comprehension and retention. My hypothesis predicts that the creation of an infographic using the animated format would be more effective in communicating a complex science subject (Phases of the Moon), specifically when using 3D computer animation to visualize the topic. This would also help different types of learners to easily comprehend science subjects. Most of the animated infographics produced nowadays are created for marketing and business purposes and do not implement the analytical design principles required for creating excellent information design. I believe that science learners are still in need of more variety in their methods of learning information, and that infographics can be of great assistance. The results of this thesis study suggest that using properly designed infographics would be of great help in teaching complex science subjects that involve spatial and temporal data. This could facilitate learning science subjects and consequently impact the interest of young learners in STEM.
NASA Astrophysics Data System (ADS)
Akpınar, Ercan
2014-08-01
This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30 students, and the control group of 27 students. The control group received normal instruction in which the teacher provided instruction by means of lecture, discussion and homework, whereas in the experiment group dynamic and interactive animations based on POE were used as a presentation tool. Data collection tools used in the study were a static electricity concept test and open-ended questions. The static electricity concept test was used as a pre-test before the implementation, as a post-test at the end of the implementation and as a delayed test approximately 6 weeks after the implementation. Open-ended questions were used at the end of the implementation and approximately 6 weeks after the implementation. Results indicated that the interactive animations used as presentation tools were more effective on the students' understanding of static electricity concepts compared to normal instruction.
Static Scheduler for Hard Real-Time Tasks on Multiprocessor Systems
1992-09-01
Correlative visualization techniques for multidimensional data
NASA Technical Reports Server (NTRS)
Treinish, Lloyd A.; Goettsche, Craig
1989-01-01
Critical to the understanding of data is the ability to provide pictorial or visual representation of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, there are other computer science domains outside of computer graphics that are required to make visualization effective, such as data management. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformation, etc. to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, which can range from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses will illustrate the importance of an effective pipeline in a data visualization system.
Design Aids for Real-Time Systems (DARTS)
NASA Technical Reports Server (NTRS)
Szulewski, P. A.
1982-01-01
Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.
Adaptations in Electronic Structure Calculations in Heterogeneous Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talamudupula, Sai
Modern quantum chemistry deals with electronic structure calculations of unprecedented complexity and accuracy. They demand full power of high-performance computing and must be in tune with the given architecture for superior efficiency. To make such applications resource-aware, it is desirable to enable their static and dynamic adaptations using some external software (middleware), which may monitor both system availability and application needs, rather than mix science with system-related calls inside the application. The present work investigates scientific application interlinking with middleware based on the example of the computational chemistry package GAMESS and middleware NICAN. The existing synchronous model is limited by the possible delays due to the middleware processing time under the sustainable runtime system conditions. Proposed asynchronous and hybrid models aim at overcoming this limitation. When linked with NICAN, the fragment molecular orbital (FMO) method is capable of adapting statically and dynamically its fragment scheduling policy based on the computing platform conditions. Significant execution time and throughput gains have been obtained due to such static adaptations when the compute nodes have very different core counts. Dynamic adaptations are based on the main memory availability at run time. NICAN prompts FMO to postpone scheduling certain fragments, if there is not enough memory for their immediate execution. Hence, FMO may be able to complete the calculations whereas without such adaptations it aborts.
NASA Astrophysics Data System (ADS)
Nejat, Cyrus; Nejat, Narsis; Nejat, Najmeh
2014-06-01
This research project is part of Narsis Nejat's Master of Science thesis, carried out at Shiraz University. The goals of this research are to build a computer model that evaluates the thermal power, electrical power, emitted/absorbed dose, and emitted/absorbed dose rate for static Radioisotope Thermoelectric Generators (RTGs). The work includes a comprehensive study of the types of RTG systems, and in particular of RTG fuels derived from both natural and artificial isotopes; calculation of the permissible dose for the radioisotope selected from the above; and conceptual design modeling and comparison between several NASA-made RTGs and the project computer model, pointing out the strengths and weaknesses of using this model in the nuclear industry for simulation. In RTGs, heat is converted to electricity by two major methods: static conversion and dynamic conversion. The model created for this project covers RTGs in which heat is converted to electricity statically. The model gives good approximations when compared with the SNAP-3, SNAP-19, MHW, and GPHS RTGs in terms of electrical power, efficiency, specific power, mission type, and the fuel mass required to accomplish the mission.
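As a hedged sketch of the quantities such a static-RTG model evaluates (notation mine, not taken from the thesis): the thermal power of the fuel decays with its half-life, and the electrical output follows from the thermoelectric conversion efficiency,

\[
Q_{th}(t) = m_f\, p_0\, e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{t_{1/2}}, \qquad P_e(t) = \eta_{TE}\, Q_{th}(t),
\]

where $m_f$ is the fuel mass, $p_0$ its specific thermal power, $t_{1/2}$ the isotope half-life, and $\eta_{TE}$ the (few-percent) static thermoelectric conversion efficiency; the specific power used for comparing designs such as SNAP-19 or GPHS is then $P_e$ divided by the total generator mass.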
Student leadership in small group science inquiry
NASA Astrophysics Data System (ADS)
Oliveira, Alandeom W.; Boz, Umit; Broadwell, George A.; Sadler, Troy D.
2014-09-01
Background: Science educators have sought to structure collaborative inquiry learning through the assignment of static group roles. This structural approach to student grouping oversimplifies the complexities of peer collaboration and overlooks the highly dynamic nature of group activity. Purpose: This study addresses this issue of oversimplification of group dynamics by examining the social leadership structures that emerge in small student groups during science inquiry. Sample: Two small student groups investigating the burning of a candle under a jar participated in this study. Design and method: We used a mixed-method research approach that combined computational discourse analysis (computational quantification of social aspects of small group discussions) with microethnography (qualitative, in-depth examination of group discussions). Results: While in one group social leadership was decentralized (i.e., students shared control over topics and tasks), the second group was dominated by a male student (centralized social leadership). Further, decentralized social leadership was found to be paralleled by higher levels of student cognitive engagement. Conclusions: It is argued that computational discourse analysis can provide science educators with a powerful means of developing pedagogical models of collaborative science learning that take into account the emergent nature of group structures and highly fluid nature of student collaboration.
Francisco Rodríguez y Silva; Juan Ramón Molina Martínez; Miguel Ángel Herrera Machuca; Jesús Mª Rodríguez Leal
2013-01-01
Progress made in recent years in fire science, particularly as applied to forest fire protection, coupled with the increased power offered by mathematical processors integrated into computers, has led to important developments in the field of dynamic and static simulation of forest fires. Furthermore, and similarly, econometric models applied to economic...
ERIC Educational Resources Information Center
Price, Norman T.
2013-01-01
The availability and sophistication of visual display images, such as simulations, for use in science classrooms has increased exponentially; however, it can be difficult for teachers to use these images to encourage and engage active student thinking. There is a need to describe flexible discussion strategies that use visual media to engage active…
NASA Technical Reports Server (NTRS)
So, Kenneth T.; Hall, John B., Jr.; Thompson, Clifford D.
1987-01-01
NASA's Langley and Goddard facilities have evaluated the effects of animal science experiments on the Space Station's Environmental Control and Life Support System (ECLSS) by means of computer-aided analysis, assuming an animal colony consisting of 96 rodents and eight squirrel monkeys. Thirteen ECLSS options were established for the reclamation of metabolic oxygen and waste water. Minimum cost and weight impacts on the ECLSS are found to accrue to the system's operation in off-nominal mode, using electrochemical CO2 removal and a static feed electrolyzer for O2 generation.
The FORTRAN static source code analyzer program (SAP) user's guide, revision 1
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Eslinger, S.
1982-01-01
The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.
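To illustrate the kind of scanning such a tool performs, the following is a small Python sketch that tallies leading statement keywords in fixed-form FORTRAN 77 source. It illustrates static statement counting only; it is not SAP's actual algorithm or report format.

```python
import re
from collections import Counter
from pathlib import Path

def count_statements(path):
    """Tally leading statement keywords in fixed-form FORTRAN 77 source.

    Toy illustration: comment lines (C, c, * in column 1) and continuation
    lines (non-blank column 6) are skipped, and the first word of the
    statement field (columns 7-72) is counted. Assignment statements are
    simply counted under their leading identifier.
    """
    counts = Counter()
    for line in Path(path).read_text().splitlines():
        if not line.strip():
            continue
        if line[0] in "Cc*":                        # full-line comment
            continue
        if len(line) > 5 and line[5] not in " 0":   # continuation line
            continue
        body = line[6:72].strip()                   # statement field
        match = re.match(r"[A-Za-z]+", body)
        if match:
            counts[match.group(0).upper()] += 1
    return counts

if __name__ == "__main__":
    # "example.f" is a placeholder path for a FORTRAN 77 source file
    for keyword, n in count_statements("example.f").most_common():
        print(f"{keyword:10s} {n}")
```

Real analyzers of this class go further, computing complexity measures and recognizing the full statement grammar, but the scan-and-count structure is the same.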
Human and Organizational Risk Modeling: Critical Personnel and Leadership in Network Organizations
2006-08-01
Carnegie Mellon University, School of Computer Science, Pittsburgh, PA 15213. …organization can help improve performance and protect against the risk of loss. But the study of critical personnel has traditionally used static structural…
In-Flight Pitot-Static Calibration
NASA Technical Reports Server (NTRS)
Foster, John V. (Inventor); Cunningham, Kevin (Inventor)
2016-01-01
A GPS-based pitot-static calibration system uses global output-error optimization. High data rate measurements of static and total pressure, ambient air conditions, and GPS-based ground speed measurements are used to compute pitot-static pressure errors over a range of airspeed. System identification methods rapidly compute optimal pressure error models with defined confidence intervals.
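A hedged sketch of the relations behind such a calibration (symbols are mine, not from the source): the measured impact pressure is the difference between total and error-contaminated static pressure, GPS ground speed minus an estimated wind gives true airspeed and hence the true impact pressure, and the output-error optimization fits a static-pressure error model to close the gap:

\[
q_{c,i} = p_t - p_{s,i}, \qquad p_s = p_{s,i} - \Delta p_s(\theta), \qquad \vec{V}_{\mathrm{true}} = \vec{V}_{\mathrm{GPS}} - \vec{V}_{\mathrm{wind}},
\]
\[
\hat{\theta} = \arg\min_{\theta} \sum_k \Bigl[\, q_{c,k}\bigl(\vec{V}_{\mathrm{true},k}\bigr) - \bigl(q_{c,i,k} + \Delta p_s(\theta)\bigr) \Bigr]^2,
\]

with confidence intervals on the error-model parameters $\hat{\theta}$ obtained from the estimator covariance over the airspeed sweep.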
Impact of Multimedia and Network Services on an Introductory Level Course
NASA Technical Reports Server (NTRS)
Russ, John C.
1996-01-01
We will demonstrate and describe the impact of our use of multimedia and network connectivity on a sophomore-level introductory course in materials science. This class services all engineering students, resulting in large (more than 150) class sections with no hands-on laboratory. In 1990 we began to develop computer graphics that might substitute for some laboratory or real-world experiences, and demonstrate relationships hard to show with static textbook images or chalkboard drawings. We created a comprehensive series of modules that cover the entire course content. Called VIMS (Visualizations in Materials Science), these are available in the form of a CD-ROM and also via the internet.
NASA Astrophysics Data System (ADS)
Harris, Christopher
In the U.S., science and math are taking the spotlight in education, and rightfully so, as they directly impact economic progression. Curiously absent is computer science, which despite its numerous job opportunities and growth does not receive as much focus. This thesis develops a source code analysis framework using language translation and machine learning classifiers to analyze programs written in Bricklayer, for the purposes of programmatically identifying the relative success or failure of a student's Bricklayer program, helping teachers scale the number of students they can support, and providing better messaging. The thesis uses as a case study a set of student programs to demonstrate the possibilities of the framework.
A Static Aeroelastic Analysis of a Flexible Wing Mini Unmanned Aerial Vehicle
2008-03-27
…is the most favorable because it generally results in the greatest CL,max and is less prone to hysteresis in the lift curve. Carmichael emphasized the…
Visualizing topography: Effects of presentation strategy, gender, and spatial ability
NASA Astrophysics Data System (ADS)
McAuliffe, Carla
2003-10-01
This study investigated the effect of different presentation strategies (2-D static visuals, 3-D animated visuals, and 3-D interactive, animated visuals) and gender on achievement, time spent on the visual treatment, and attitude during a computer-based science lesson about reading and interpreting topographic maps. The study also examined the relationship of spatial ability and prior knowledge to gender, achievement, and time spent on the visual treatment. Students enrolled in high school chemistry-physics were pretested and given two spatial ability tests. They were blocked by gender and randomly assigned to one of three levels of presentation strategy or the control group. After controlling for the effects of spatial ability and prior knowledge with analysis of covariance, three significant differences were found between the versions: (a) the 2-D static treatment group scored significantly higher on the posttest than the control group; (b) the 3-D animated treatment group scored significantly higher on the posttest than the control group; and (c) the 2-D static treatment group scored significantly higher on the posttest than the 3-D interactive animated treatment group. Furthermore, the 3-D interactive animated treatment group spent significantly more time on the visual screens than the 2-D static treatment group. Analyses of student attitudes revealed that most students felt the landform visuals in the computer-based program helped them learn, but not in a way they would describe as fun. Significant differences in attitude were found by treatment and by gender. In contrast to findings from other studies, no gender differences were found on either of the two spatial tests given in this study. Cognitive load, cognitive involvement, and solution strategy are offered as three key factors that may help explain the results of this study. Implications for instructional design include suggestions about the use of 2-D static, 3-D animated and 3-D interactive animations as well as a recommendation about the inclusion of pretests in similar instructional programs. Areas for future research include investigating the effects of combinations of presentation strategies, continuing to examine the role of spatial ability in science achievement, and gaining cognitive insights about what it is that students do when learning to read and interpret topographic maps.
Computer simulations for lab experiences in secondary physics
NASA Astrophysics Data System (ADS)
Murphy, David Shannon
Physical science instruction often involves modeling natural systems, such as electricity, that possess particles which are invisible to the unaided eye. The effect of these particles' motion is observable, but the particles are not directly observable to humans. Simulations have been developed in physics, chemistry and biology that, under certain circumstances, have been found to allow students to gain insight into the operation of the systems they model. This study compared the use of a DC circuit simulation, a modified simulation, static graphics, and traditional bulbs and wires to compare gains in DC circuit knowledge as measured by the DIRECT instrument, a multiple-choice instrument previously developed to assess DC circuit knowledge. Gender, prior DC circuit knowledge and subsets of DC circuit knowledge of students were also compared. The population (n=166) was composed of high school freshman students from an eastern Kentucky public school with a population of 1100 students, and the study followed a quantitative quasi-experimental research design. Differences between treatment groups were not statistically significant. Keywords: Simulations, Static Images, Science Education, DC Circuit Instruction, PhET.
NASA Astrophysics Data System (ADS)
Milojević, Slavka; Stojanovic, Vojislav
2017-04-01
Due to the continuous development of seismic acquisition and processing methods, increasing the signal-to-noise ratio is always a current target. The correct application of the latest software solutions improves the processing results and justifies their development. A correct computation and application of static corrections represents one of the most important tasks in pre-processing. This phase is of great importance for further processing steps. Static corrections are applied to seismic data in order to compensate for the effects of irregular topography, the difference between the elevations of source and receiver points relative to the reduction datum, the low-velocity near-surface layer (weathering correction), or any other factors that influence the spatial and temporal position of seismic traces. The refraction statics method is the most common method for computation of static corrections. It is successful both in resolving long-period statics problems and in determining differences in statics caused by abrupt lateral velocity changes in the near-surface layer. XtremeGeo Flatirons™ is a program whose main purpose is the computation of static corrections through a refraction statics method, and it allows the application of the following procedures: picking of first arrivals, checking of geometry, multiple methods for analysis and modelling of statics, analysis of refractor anisotropy, and tomography (Eikonal tomography). The exploration area is located on the southern edge of the Pannonian Plain, in a plain area with altitudes of 50 to 195 meters. The largest part of the exploration area covers Deliblato Sands, where the geological structure of the terrain and the large differences in altitude significantly affect the calculation of static corrections. XtremeGeo Flatirons™ has powerful visualization and statistical analysis tools, which contribute to a significantly more accurate assessment of the near-surface geometry and therefore more accurately computed static corrections.
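For orientation, a textbook form of the datum (elevation plus weathering) static correction that refraction-statics packages compute, not necessarily the exact formulation used in Flatirons, is

\[
t_{\mathrm{stat}} = -\left( \frac{d_w}{V_w} + \frac{E - d_w - E_{\mathrm{datum}}}{V_r} \right),
\]

where $E$ is the surface elevation at the source or receiver, $d_w$ and $V_w$ the thickness and velocity of the weathered (low-velocity) layer, $V_r$ the replacement velocity below it, and $E_{\mathrm{datum}}$ the reduction datum; the correction is evaluated separately for each source and receiver and summed per trace.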
Foreword: 18th Aps-Sccm and 24th Airapt
NASA Astrophysics Data System (ADS)
Collins, Gilbert; Moore, David S.; Yoo, Choong-Shik
2014-05-01
This second joint conference between the APS Topical Group on Shock Compression of Condensed Matter and the International Association for the Advancement of High Pressure Science and Technology (AIRAPT) demonstrates that static and dynamic compression of condensed matter continues to be a vibrant field of science and engineering. It is also by its nature an interdisciplinary field, incorporating chemistry, materials science, solid mechanics, plasma physics, and condensed matter physics, and utilizes theoretical, computational, and experimental tools. Recent years have brought about many advances in loading platforms, diagnostics, and computations that are leading to the emergence of many new avenues of research. These advances are also breathing new life into traditional topics such as equations of state, phase transformations, and chemistry at extreme conditions. The plenary lectures by Gennady Kanel, Karl Syassen, David Ceperley, Jon Eggert, Duck Young Kim, and Richard Kraus spanned the disciplines of static and dynamic high pressure physics and illustrated the breadth of the field. They also showed that interesting and important problems remain for researchers of the future to solve. The main guiding principle in the organization of this conference was to intertwine static and dynamic experimental studies with computational and theoretical studies of similar materials. To achieve this goal, we arranged the conference to include static, dynamic, and computational components in the same sessions, quite often taking presenters out of their comfort zone. The three special sessions on Deep Carbon Budget (organized by Giulia Galli and Rus Hemley), High Energy Density Materials (organized by Raymond Jeanloz and Jon Eggert), and Dynamic Response of Materials (organized by Yogendra Gupta and John Sarrao) furthered this guiding principle. We also endeavored to represent the breadth of static and dynamic high pressure science and technology, notably beyond that done at national laboratories. To this end, a significant fraction of the plenary, invited and contributed presentations showcased work done in academia, defense laboratories and industry, as well as internationally. Although travel distance and visa issues always present difficulties, the conference had strong representation from a record number of international participants, including sizable groups from Russia and China (thanks to Tony Zocher and Frank Cherne), as well as Japan, the United Kingdom, France, Canada, Germany, Israel, and Italy. It is our sincere hope that international interactions that occurred at the conference will lead to further collaborations in the future. Finally, we strived to increase student participation at the conference. Through the leadership of Scott Alexander and his committee, a new all-day student symposium was held the day before the main conference, with only student attendees and presenters, in order to acclimate the students to conference participation and help them network with their peers. In cooperation with the APS Topical Group and the AIRAPT and with additional support from DTRA and the AWE, the conference was able to provide financial assistance to a large number of students to attend the conference and present their research. This aid helped increase the number of student attendees significantly over previous conferences. Finally, the conference sponsored a networking lunch for students and representatives from a number of laboratories and other institutions, which was well attended.
Seattle proved itself to be an excellent venue for the conference. The international flavor of the city provided ample dining options and numerous activity choices outside of the conference sessions. The major international airport made travel as easy as possible, as Seattle is a convenient central location for attendees from Europe and Asia. The conference was truly a team effort with critical contributions from many individuals. We deeply appreciate their contributions to the success of the conference and the publication of these proceedings. Gilbert (Rip) Collins, David S. Moore, Choong-Shik Yoo
Tutorial videos of bioinformatics resources: online distribution trial in Japan named TogoTV.
Kawano, Shin; Ono, Hiromasa; Takagi, Toshihisa; Bono, Hidemasa
2012-03-01
In recent years, biological web resources such as databases and tools have become more complex because of the enormous amounts of data generated in the field of life sciences. Traditional methods of distributing tutorials include publishing textbooks and posting web documents, but these static contents cannot adequately describe recent dynamic web services. Due to improvements in computer technology, it is now possible to create dynamic content such as video with minimal effort and low cost on most modern computers. The ease of creating and distributing video tutorials instead of static content improves accessibility for researchers, annotators and curators. This article focuses on online video repositories for educational and tutorial videos provided by resource developers and users. It also describes a project in Japan named TogoTV (http://togotv.dbcls.jp/en/) and discusses the production and distribution of high-quality tutorial videos, which would be useful to viewers, with examples. This article intends to stimulate and encourage researchers who develop and use databases and tools to distribute how-to videos as a tool to enhance product usability.
The phase diagram of solid hydrogen at high pressure: A challenge for first principles calculations
NASA Astrophysics Data System (ADS)
Azadi, Sam; Foulkes, Matthew
2015-03-01
We present comprehensive results for the high-pressure phase diagram of solid hydrogen. We focus on the energetically most favorable molecular and atomic crystal structures. To obtain the ground-state static enthalpy and phase diagram, we use semi-local and hybrid density functional theory (DFT) as well as diffusion quantum Monte Carlo (DMC) methods. The closure of the band gap with increasing pressure is investigated utilizing quasi-particle many-body calculations within the GW approximation. The dynamical phase diagram is calculated by adding proton zero-point energies (ZPE) to static enthalpies. Density functional perturbation theory is employed to calculate the proton ZPE and the infra-red and Raman spectra. Our results clearly demonstrate the failure of DFT-based methods to provide an accurate static phase diagram, especially when comparing insulating and metallic phases. Our dynamical phase diagram obtained using fully many-body DMC calculations shows that the molecular-to-atomic phase transition happens at the experimentally accessible pressure of 374 GPa. We claim that going beyond mean-field schemes to obtain derivatives of the total energy and optimize crystal structures at the many-body level is crucial. This work was supported by the UK Engineering and Physical Sciences Research Council under Grant EP/I030190/1, and made use of computing facilities provided by HECToR and by the Imperial College London high performance computing centre.
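In equation form, the procedure described above amounts to comparing, for each candidate structure, the dynamical enthalpy obtained by adding the proton zero-point energy to the static enthalpy (notation mine, summarizing the stated procedure):

\[
H_{\mathrm{dyn}}(P) = H_{\mathrm{static}}(P) + E_{\mathrm{ZPE}}(P), \qquad H_{\mathrm{static}}(P) = E(V) + PV,
\]

with the molecular-to-atomic transition pressure $P_t$ defined by the crossing $H_{\mathrm{dyn}}^{\mathrm{atomic}}(P_t) = H_{\mathrm{dyn}}^{\mathrm{molecular}}(P_t)$, reported here as $P_t \approx 374$ GPa at the DMC level.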
NASA Astrophysics Data System (ADS)
Yang, Hong-Yong; Lu, Lan; Cao, Ke-Cai; Zhang, Si-Ying
2010-04-01
In this paper, the relationship between the network topology and the moving consensus of multi-agent systems is studied. A consensus-prestissimo scale-free network model with static preferential-consensus attachment is presented, based on rewiring links of the regular network. The effects of the static preferential-consensus BA network on the algebraic connectivity of the topology graph are compared with the regular network. The robustness gain to delay is analyzed for variable network topologies with the same scale. The time to reach consensus is studied for the dynamic network with and without communication delays. Computer simulations validate that the speed of convergence of multi-agent systems can be greatly improved in the preferential-consensus BA network model with different configurations.
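For reference, the standard first-order consensus setting these results build on (a textbook formulation, not specific to the preferential-consensus model) is

\[
\dot{x}_i(t) = \sum_{j \in \mathcal{N}_i} a_{ij}\,\bigl(x_j(t) - x_i(t)\bigr),
\]

whose convergence speed on a connected undirected graph is governed by the algebraic connectivity $\lambda_2(L)$ of the graph Laplacian $L$; with a uniform communication delay $\tau$, average consensus is preserved provided $\tau < \pi / \bigl(2\,\lambda_{\max}(L)\bigr)$. Rewiring that raises $\lambda_2$ therefore speeds convergence, while a large $\lambda_{\max}$ limits the tolerable delay.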
Combining Static Analysis and Model Checking for Software Analysis
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)
2003-01-01
We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial order information is safe and the whole state space is explored.
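A minimal sketch of this iteration, with the analysis and model-checking steps supplied as caller-provided callables (hypothetical interfaces, not the authors' tool):

```python
from typing import Callable, FrozenSet, Tuple

def iterative_verify(
    static_analysis: Callable[[FrozenSet], FrozenSet],
    model_check: Callable[[FrozenSet], Tuple[bool, FrozenSet]],
    aliases: FrozenSet = frozenset(),
) -> bool:
    """Fixed-point loop combining static analysis and model checking.

    `static_analysis(aliases)` returns partial-order (independence) facts
    computed under the current aliasing assumptions; `model_check(po)`
    explores the reduced state space and reports (verdict, aliases observed).
    Both are stand-ins for real tools; this is a sketch of the scheme only.
    """
    while True:
        partial_order = static_analysis(aliases)
        verdict, observed = model_check(partial_order)
        if observed <= aliases:           # no new aliasing facts: fixed point
            return verdict                # reduction is now safe
        aliases = aliases | observed      # refine the analysis and repeat
```

Termination follows because the alias set only grows and is bounded; once an iteration adds no new aliasing facts, the partial-order information it was computed from is safe, matching the fixed-point argument above.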
NASA Technical Reports Server (NTRS)
Hancock, Thomas
1993-01-01
This experiment investigated the integrity of static computer memory (floppy disk media) when exposed to the environment of low earth orbit. The experiment attempted to record soft-event upsets (bit-flips) in static computer memory. Typical conditions that exist in low earth orbit that may cause soft-event upsets include: cosmic rays, low level background radiation, charged fields, static charges, and the earth's magnetic field. Over the years several spacecraft have been affected by soft-event upsets (bit-flips), and these events have caused a loss of data or affected spacecraft guidance and control. This paper describes a commercial spin-off that is being developed from the experiment.
Knowledge Acquisition with Static and Animated Pictures in Computer-Based Learning.
ERIC Educational Resources Information Center
Schnotz, Wolfgang; Grzondziel, Harriet
In educational settings, computers provide specific possibilities of visualizing information for instructional purposes. Besides the use of static pictures, computers can present animated pictures which allow exploratory manipulation by the learner and display the dynamic behavior of a system. This paper develops a theoretical framework for…
NASA Astrophysics Data System (ADS)
Vaucouleur, Sebastien
2011-02-01
We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.
Chao, Edmund Y S; Armiger, Robert S; Yoshida, Hiroaki; Lim, Jonathan; Haraguchi, Naoki
2007-03-08
The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the "Virtual Human" reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system, model library and database will impact on orthopaedic education, basic research, device development and application, and clinical patient care related to musculoskeletal joint system reconstruction, trauma management, and rehabilitation.
Mayer, Richard E; Hegarty, Mary; Mayer, Sarah; Campbell, Julie
2005-12-01
In 4 experiments, students received a lesson consisting of computer-based animation and narration or a lesson consisting of paper-based static diagrams and text. The lessons used the same words and graphics in the paper-based and computer-based versions to explain the process of lightning formation (Experiment 1), how a toilet tank works (Experiment 2), how ocean waves work (Experiment 3), and how a car's braking system works (Experiment 4). On subsequent retention and transfer tests, the paper group performed significantly better than the computer group on 4 of 8 comparisons, and there was no significant difference on the rest. These results support the static media hypothesis, in which static illustrations with printed text reduce extraneous processing and promote germane processing as compared with narrated animations.
ERIC Educational Resources Information Center
Lamb, Richard; Cavagnetto, Andy; Akmal, Tariq
2016-01-01
A critical problem with the examination of learning in education is that there is an underlying assumption that the dynamic systems associated with student information processing can be measured using static linear assessments. This static linear approach does not provide sufficient ability to characterize learning. Much of the modern research…
NASA Technical Reports Server (NTRS)
Greathouse, James S.; Schwing, Alan M.
2015-01-01
This paper explores the use of computational fluid dynamics to study the effect of geometric porosity on static stability and drag for NASA's Multi-Purpose Crew Vehicle main parachute. Both of these aerodynamic characteristics are of interest in parachute design, and computational methods promise designers the ability to perform detailed parametric studies and other design iterations with a level of control previously unobtainable using ground or flight testing. The approach presented here uses a canopy structural analysis code to define the inflated parachute shapes on which structured computational grids are generated. These grids are used by the computational fluid dynamics code OVERFLOW and are modeled as rigid, impermeable bodies for this analysis. Comparisons to Apollo drop test data are shown as preliminary validation of the technique. Results include several parametric sweeps through design variables in order to better understand the trade between static stability and drag. Finally, designs that maximize static stability with a minimal loss in drag are suggested for further study in subscale ground and flight testing.
ERIC Educational Resources Information Center
Lin, Huifen; Chen, Tsuiping; Dwyer, Francis M.
2006-01-01
The purpose of this experimental study was to compare the effects of using static visuals versus computer-generated animation to enhance learners' comprehension and retention of a content-based lesson in a computer-based learning environment for learning English as a foreign language (EFL). Fifty-eight students from two EFL reading sections were…
Evaluation of SAR in a human body model due to wireless power transmission in the 10 MHz band.
Laakso, Ilkka; Tsuchida, Shogo; Hirata, Akimasa; Kamimura, Yoshitsugu
2012-08-07
This study discusses a computational method for calculating the specific absorption rate (SAR) due to a wireless power transmission system in the 10 MHz frequency band. A two-step quasi-static method comprised of the method of moments and the scalar potential finite-difference method is proposed. The applicability of the quasi-static approximation for localized exposure in this frequency band is discussed by comparing the SAR in a lossy dielectric cylinder computed with a full-wave electromagnetic analysis and with the quasi-static approximation. From the computational results, the input impedance of the resonant coils was affected by the existence of the cylinder. On the other hand, the magnetic field distributions computed in free space and with the cylinder and an impedance matching circuit included were in good agreement; the maximum difference in the amplitude of the magnetic field was 4.8%. For a cylinder-coil distance of 10 mm, the difference between the peak 10 g averaged SAR in the cylinder computed with the full-wave electromagnetic method and our quasi-static method was 7.8%. These results suggest that the quasi-static approach is applicable for conducting the dosimetry of wireless power transmission in the 10 MHz band. With our two-step quasi-static method, the SAR in the anatomically based model was computed for different exposure scenarios. From those computations, the allowable input power satisfying the limit of a peak 10 g averaged SAR of 2.0 W kg(-1) was 830 W in the worst case exposure scenario with a coil positioned at a distance of 30 mm from the chest.
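For reference, the dosimetric quantity being limited is defined pointwise from the internal electric field and then averaged over any contiguous 10 g of tissue (a standard definition, not specific to this paper):

\[
\mathrm{SAR} = \frac{\sigma\,|E|^2}{\rho},
\]

where $\sigma$ is the tissue conductivity, $E$ the rms internal electric field and $\rho$ the mass density. Since SAR scales linearly with input power, the allowable input power follows as $P_{\mathrm{allow}} = P_{\mathrm{in}} \cdot 2.0\ \mathrm{W\,kg^{-1}} / \mathrm{SAR}_{10\mathrm{g}}(P_{\mathrm{in}})$, which is how a figure such as 830 W is obtained for the worst-case coil position.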
Static aeroelastic analysis and tailoring of a single-element racing car wing
NASA Astrophysics Data System (ADS)
Sadd, Christopher James
This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce of C_L = -1.377 in comparison to C_L = -1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag of C_D = 0.115 in comparison to C_D = 0.143 for the original wing.
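The coupling scheme can be illustrated with a deliberately simple stand-in: a wing idealized as a torsional spring, where the "aerodynamic solve" is a lift-curve evaluation and the "structural solve" a spring deflection, iterated to a fixed point in the same way as the CFD/FE loop described above. All numbers and symbols are illustrative assumptions, not values from the thesis.

```python
def coupled_incidence(q, S, cl_alpha, e, k_theta, alpha_rigid,
                      tol=1e-8, max_iter=200):
    """Fixed-point iteration between a toy 'aerodynamic' and 'structural' solve.

    q           dynamic pressure [Pa]
    S           reference area [m^2]
    cl_alpha    lift-curve slope [1/rad]
    e           offset of the aerodynamic centre from the elastic axis [m]
    k_theta     torsional stiffness of the mounting [N*m/rad]
    alpha_rigid rigid (jig) incidence [rad]; negative for a downforce wing
    """
    theta = 0.0                                            # elastic twist [rad]
    for _ in range(max_iter):
        load = q * S * cl_alpha * (alpha_rigid + theta)    # "CFD" step
        theta_new = load * e / k_theta                     # "FE" step
        if abs(theta_new - theta) < tol:                   # converged shape
            return alpha_rigid + theta_new, load
        theta = theta_new
    raise RuntimeError("no static aeroelastic equilibrium (divergence)")

if __name__ == "__main__":
    alpha_eff, load = coupled_incidence(q=600.0, S=0.3, cl_alpha=5.5,
                                        e=0.02, k_theta=900.0,
                                        alpha_rigid=-0.10)
    print(f"elastic incidence {alpha_eff:.4f} rad, aerodynamic load {load:.1f} N")
```

The same structure scales to the real problem: replace the lift-curve line with a RANS solution on the deflected mesh and the spring line with a finite-element solve, and iterate until the displacement update falls below tolerance.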
Observations on military exploitation of explosives detection technologies
NASA Astrophysics Data System (ADS)
Faust, Anthony A.; de Ruiter, C. J.; Ehlerding, Anneli; McFee, John E.; Svinsås, Eirik; van Rheenen, Arthur D.
2011-06-01
Accurate and timely detection of explosives, energetic materials, and their associated compounds would provide valuable information to military commanders in a wide range of military operations: protection of fast moving convoys from mobile or static IED threats; more deliberate countermine and counter-IED operations during route or area clearance; and static roles such as hasty or deliberate checkpoints, critical infrastructure protection and support to public security. The detection of hidden explosive hazards is an extremely challenging problem, as evidenced by the fact that related research has been ongoing in many countries for at least seven decades and no general purpose solution has yet been found. Technologies investigated have spanned all major scientific fields, with emphasis on the physical sciences, life sciences, engineering, robotics, computer technology and mathematics. This paper will present a limited, operationally-focused overview of the current status of detection technologies. Emphasis will be on those technologies that directly detect the explosive hazard, as opposed to those that detect secondary properties of the threat, such as the casing, associated wires or electronics. Technologies that detect explosives include those based on nuclear radiation and terahertz radiation, as well as trace and biological detection techniques. Current research areas of the authors will be used to illustrate the practical applications.
Comparison of static and dynamic computer-assisted guidance methods in implantology.
Mischkowski, R A; Zinser, M J; Neugebauer, J; Kübler, A C; Zöller, J E
2006-01-01
The planning of dental implant position and its transfer to the operation site can be considered as one of the most important factors for the long-term success of implant-supported prosthetic and epithetic restorations. This study compares computer-assisted fabricated surgical templates as the static method with intra-operative image-guided navigation as the dynamic method for transfer of three-dimensional pre-operative planning. For the static method, the systems Med3D, coDiagnostix/gonyX, and SimPlant were used. For the dynamic method, the systems RoboDent and VectorVision2 were applied. A total of 746 implants were inserted between August 1999 and December 2005 in 206 patients. The static approach was used most frequently, accounting for 611 fixtures in 168 patients. The failure ratios within the first 6 months were 1.31% in the statically controlled insertion group compared to 2.96% in the dynamically controlled insertion group. Complications related to an incorrect position of the implants have not been observed so far in either group. All computer-assisted methods included in this study were successfully applied in a clinical setting after a certain start-up period. The indications for application of computer-assisted methods in implantology are currently given in difficult anatomical situations. Due to uncomplicated handling and low resource demands, the static template technique can be recommended as the method of choice for the majority of all cases falling into this category.
Graphic-based musculoskeletal model for biomechanical analyses and animation.
Chao, Edmund Y S
2003-04-01
The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.
Introduction to Reactor Statics Modules, RS-1. Nuclear Engineering Computer Modules.
ERIC Educational Resources Information Center
Edlund, Milton C.
The nine Reactor Statics Modules are designed to introduce students to the use of numerical methods and digital computers for calculation of neutron flux distributions in space and energy which are needed to calculate criticality, power distribution, and fuel burn-up for both slow neutron and fast neutron fission reactors. The diffusion…
The Roles of Mental Animations and External Animations in Understanding Mechanical Systems
ERIC Educational Resources Information Center
Hegarty, Mary; Kriz, Sarah; Cate, Christina
2003-01-01
The effects of computer animations and mental animation on people's mental models of a mechanical system are examined. In 3 experiments, students learned how a mechanical system works from various instructional treatments including viewing a static diagram of the machine, predicting motion from static diagrams, viewing computer animations, and…
NASA Technical Reports Server (NTRS)
Whitaker, Mike
1991-01-01
Severe precipitation static problems affecting the communication equipment onboard the P-3B aircraft were recently studied. The study was conducted after precipitation static created potential safety-of-flight problems on Naval Reserve aircraft. A specially designed flight test program was conducted in order to measure, record, analyze, and characterize potential precipitation static problem areas. The test program successfully characterized the precipitation static interference problems while the P-3B was flown in moderate to extreme precipitation conditions. Data up to 400 MHz were collected on the effects of engine charging, precipitation static, and extreme cross fields. These data were collected using a computer-controlled acquisition system consisting of a signal generator, RF spectrum and audio analyzers, data recorders, and instrumented static dischargers. The test program is outlined, and the computer-controlled data acquisition system used during flight and ground testing is described in detail. The correlation between the test results recorded during the flight test program and those measured during ground testing is also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlson, Neil; Jibben, Zechariah; Brady, Peter
2017-06-28
Pececillo is a proxy-app for the open source Truchas metal processing code (LA-CC-15-097). It implements many of the physics models used in Truchas: free-surface, incompressible Navier-Stokes fluid dynamics (e.g., water waves); heat transport, material phase change, view factor thermal radiation; species advection-diffusion; quasi-static, elastic/plastic solid mechanics with contact; electromagnetics (Maxwell's equations). The models are simplified versions that retain the fundamental computational complexity of the Truchas models while omitting many non-essential features and modeling capabilities. The purpose is to expose Truchas algorithms in a greatly simplified context where computer science problems related to parallel performance on advanced architectures can be more easily investigated. While Pececillo is capable of performing simulations representative of typical Truchas metal casting, welding, and additive manufacturing simulations, it lacks many of the modeling capabilities needed for real applications.
Computational analysis of unmanned aerial vehicle (UAV)
NASA Astrophysics Data System (ADS)
Abudarag, Sakhr; Yagoub, Rashid; Elfatih, Hassan; Filipovic, Zoran
2017-01-01
A computational analysis has been performed to verify the aerodynamic properties of an Unmanned Aerial Vehicle (UAV). The UAV-SUST has been designed and fabricated at the Department of Aeronautical Engineering at Sudan University of Science and Technology in order to meet the specifications required for surveillance and reconnaissance missions. It is classified as a medium-range and medium-endurance UAV. A commercial CFD solver is used to simulate the steady and unsteady aerodynamic characteristics of the entire UAV. In addition to the Lift Coefficient (CL), Drag Coefficient (CD), Pitching Moment Coefficient (CM), and Yawing Moment Coefficient (CN), the pressure and velocity contours are illustrated. The aerodynamic parameters show very good agreement with the design considerations at angles of attack ranging from zero to 26 degrees. Moreover, the visualization of the velocity field and static pressure contours indicates satisfactory agreement with the proposed design. The turbulence is predicted using the k-ω SST turbulence model within the computational fluid dynamics code.
1990-10-01
…to economic, technological, spatial or logistic concerns, or involve training, man-machine interfaces, or integration into existing systems. Once the… probabilistic reasoning, mixed analysis- and simulation-oriented, mixed computation- and communication-oriented, nonpreemptive static priority scheduling base, nonrandomized, preemptive static priority scheduling base, randomized, simulation-oriented, and static scheduling base. The selection of both…
A 3D inversion for all-space magnetotelluric data with static shift correction
NASA Astrophysics Data System (ADS)
Zhang, Kun
2017-04-01
Based on previous studies of static shift correction and 3D inversion algorithms, we improve the NLCG 3D inversion method and propose a new static shift correction method that works within the inversion. The static shift correction method is based on 3D theory and real data. The static shift can be detected by quantitative analysis of the apparent MT parameters (apparent resistivity and impedance phase) in the high-frequency range, and the correction is completed during inversion. The method is an automatic, computer-based processing technique with no additional cost; it avoids extra field work and indoor processing and gives good results. The 3D inversion algorithm is improved (Zhang et al., 2013) based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). For the algorithm, we added a parallel structure, improved the computational efficiency, reduced the computer memory required, and added topographic and marine factors, so the 3D inversion can run on a general PC with high efficiency and accuracy. All MT data from surface stations, seabed stations, and underground stations can be used in the inversion algorithm.
User's Manual for Aerofcn: a FORTRAN Program to Compute Aerodynamic Parameters
NASA Technical Reports Server (NTRS)
Conley, Joseph L.
1992-01-01
The computer program AeroFcn is discussed. AeroFcn is a utility program that computes the following aerodynamic parameters: geopotential altitude, Mach number, true velocity, dynamic pressure, calibrated airspeed, equivalent airspeed, impact pressure, total pressure, total temperature, Reynolds number, speed of sound, static density, static pressure, static temperature, coefficient of dynamic viscosity, kinematic viscosity, geometric altitude, and specific energy for a standard- or a modified standard-day atmosphere using compressible flow and normal shock relations. Any two parameters that define a unique flight condition are selected, and their values are entered interactively. The remaining parameters are computed, and the solutions are stored in an output file. Multiple cases can be run, and the multiple case solutions can be stored in another output file for plotting. Parameter units, the output format, and primary constants in the atmospheric and aerodynamic equations can also be changed.
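As a rough illustration of the kind of standard-atmosphere and compressible-flow relations AeroFcn evaluates, the Python sketch below computes a few of the listed parameters (static temperature and pressure, speed of sound, true velocity, dynamic pressure, and total pressure) from a geopotential altitude and Mach number. It assumes ISA sea-level constants and the troposphere only; it is not the FORTRAN program itself.

```python
import math

GAMMA, R = 1.4, 287.053                      # gas properties, J/(kg K)
T0, P0, LAPSE = 288.15, 101325.0, 0.0065     # ISA sea level: K, Pa, K/m

def flight_condition(altitude_m, mach):
    """Standard-day troposphere plus isentropic compressible-flow relations."""
    T = T0 - LAPSE * altitude_m                        # static temperature
    p = P0 * (T / T0) ** (9.80665 / (LAPSE * R))       # static pressure
    a = math.sqrt(GAMMA * R * T)                       # speed of sound
    V = mach * a                                       # true velocity
    q = 0.5 * GAMMA * p * mach**2                      # dynamic pressure
    pt = p * (1.0 + 0.5 * (GAMMA - 1.0) * mach**2) ** (GAMMA / (GAMMA - 1.0))
    return {"T_K": T, "p_Pa": p, "a_mps": a, "V_mps": V, "q_Pa": q, "p_total_Pa": pt}

print(flight_condition(10000.0, 0.8))
```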
Hyperswitch Network For Hypercube Computer
NASA Technical Reports Server (NTRS)
Chow, Edward; Madan, Herbert; Peterson, John
1989-01-01
Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.
ERIC Educational Resources Information Center
Mayer, Richard E.; Hegarty, Mary; Mayer, Sarah; Campbell, Julie
2005-01-01
In 4 experiments, students received a lesson consisting of computer-based animation and narration or a lesson consisting of paper-based static diagrams and text. The lessons used the same words and graphics in the paper-based and computer-based versions to explain the process of lightning formation (Experiment 1), how a toilet tank works…
FOCAL PLANE WAVEFRONT SENSING USING RESIDUAL ADAPTIVE OPTICS SPECKLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Codona, Johanan L.; Kenworthy, Matthew, E-mail: jlcodona@gmail.com
2013-04-20
Optical imperfections, misalignments, aberrations, and even dust can significantly limit sensitivity in high-contrast imaging systems such as coronagraphs. An upstream deformable mirror (DM) in the pupil can be used to correct or compensate for these flaws, either to enhance the Strehl ratio or suppress the residual coronagraphic halo. Measurement of the phase and amplitude of the starlight halo at the science camera is essential for determining the DM shape that compensates for any non-common-path (NCP) wavefront errors. Using DM displacement ripples to create a series of probe and anti-halo speckles in the focal plane has been proposed for space-based coronagraphs and successfully demonstrated in the lab. We present the theory and first on-sky demonstration of a technique to measure the complex halo using the rapidly changing residual atmospheric speckles at the 6.5 m MMT telescope using the Clio mid-IR camera. The AO system's wavefront sensor measurements are used to estimate the residual wavefront, allowing us to approximately compute the rapidly evolving phase and amplitude of speckle halo. When combined with relatively short, synchronized science camera images, the complex speckle estimates can be used to interferometrically analyze the images, leading to an estimate of the static diffraction halo with NCP effects included. In an operational system, this information could be collected continuously and used to iteratively correct quasi-static NCP errors or suppress imperfect coronagraphic halos.
Seki, Shiro; Tsuzuki, Seiji; Hayamizu, Kikuko; Serizawa, Nobuyuki; Ono, Shimpei; Takei, Katsuhito; Doi, Hiroyuki; Umebayashi, Yasuhiro
2014-05-01
We have measured physicochemical properties of five alkyltrimethylammonium cation-based room-temperature ionic liquids and compared them with those obtained from computational methods. We have found that static properties (density and refractive index) and transport properties (ionic conductivity, self-diffusion coefficient, and viscosity) of these ionic liquids show close relations with the length of the alkyl chain. In particular, static properties obtained by experimental methods exhibit a trend complementary to that by computational methods (refractive index ∝ [polarizability/molar volume]). Moreover, the self-diffusion coefficient obtained by molecular dynamics (MD) simulation was consistent with the data obtained by the pulsed-gradient spin-echo nuclear magnetic resonance technique, which suggests that computational methods can be supplemental tools to predict physicochemical properties of room-temperature ionic liquids.
Method and apparatus for converting static in-ground vehicle scales into weigh-in-motion systems
Muhs, Jeffrey D.; Scudiere, Matthew B.; Jordan, John K.
2002-01-01
An apparatus and method for converting in-ground static weighing scales for vehicles to weigh-in-motion systems. The apparatus upon conversion includes the existing in-ground static scale, peripheral switches and an electronic module for automatic computation of the weight. By monitoring the velocity, tire position, axle spacing, and real time output from existing static scales as a vehicle drives over the scales, the system determines when an axle of a vehicle is on the scale at a given time, monitors the combined weight output from any given axle combination on the scale(s) at any given time, and from these measurements automatically computes the weight of each individual axle and gross vehicle weight by an integration, integration approximation, and/or signal averaging technique.
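A minimal sketch of the signal-averaging idea follows (not the patented apparatus or its integration-approximation variants): given the static-scale output sampled in time and the axle on/off intervals derived from the peripheral switches, each axle weight is estimated as the average reading while that axle is on the platform. The function and variable names are hypothetical.

```python
import numpy as np

def axle_weights(t, scale_output, axle_windows):
    """Estimate per-axle weights by averaging the static-scale signal over the
    interval when each axle is on the platform.

    t, scale_output : time stamps and instantaneous scale readings (same length)
    axle_windows    : list of (t_on, t_off) intervals from the peripheral switches
    """
    weights = []
    for t_on, t_off in axle_windows:
        mask = (t >= t_on) & (t <= t_off)
        # signal averaging suppresses the dynamic oscillation about the true load
        weights.append(scale_output[mask].mean())
    return np.array(weights)

# toy example: two axles, noisy readings oscillating around 6.0 t and 9.0 t
t = np.linspace(0.0, 4.0, 400)
signal = np.where(t < 2.0, 6.0, 9.0) + 0.3 * np.sin(40 * t)
per_axle = axle_weights(t, signal, [(0.2, 1.8), (2.2, 3.8)])
print(per_axle, "gross:", per_axle.sum())
```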
BurnMan: Towards a multidisciplinary toolkit for reproducible deep Earth science
NASA Astrophysics Data System (ADS)
Myhill, R.; Cottaar, S.; Heister, T.; Rose, I.; Unterborn, C. T.; Dannberg, J.; Martin-Short, R.
2016-12-01
BurnMan (www.burnman.org) is an open-source toolbox to compute thermodynamic and thermoelastic properties as a function of pressure and temperature using published mineral physical parameters and equations-of-state. The framework is user-friendly, written in Python, and modular, allowing the user to implement their own equations of state, endmember and solution model libraries, geotherms, and averaging schemes. Here we introduce various new modules, which can be used to: fit thermodynamic variables to data from high-pressure static and shock wave experiments; calculate equilibrium assemblages given a bulk composition, pressure, and temperature; calculate chemical potentials and oxygen fugacities for given assemblages; compute 3D synthetic seismic models using output from geodynamic models and compare these results with global seismic tomographic models; and create input files for synthetic seismogram codes. Users can contribute scripts that reproduce the results from peer-reviewed articles and practical demonstrations (e.g. Cottaar et al., 2014).
Stability-Constrained Aerodynamic Shape Optimization with Applications to Flying Wings
NASA Astrophysics Data System (ADS)
Mader, Charles Alexander
A set of techniques is developed that allows the incorporation of flight dynamics metrics as an additional discipline in a high-fidelity aerodynamic optimization. Specifically, techniques for including static stability constraints and handling qualities constraints in a high-fidelity aerodynamic optimization are demonstrated. These constraints are developed from stability derivative information calculated using high-fidelity computational fluid dynamics (CFD). Two techniques are explored for computing the stability derivatives from CFD. One technique uses an automatic differentiation adjoint technique (ADjoint) to efficiently and accurately compute a full set of static and dynamic stability derivatives from a single steady solution. The other technique uses a linear regression method to compute the stability derivatives from a quasi-unsteady time-spectral CFD solution, allowing for the computation of static, dynamic and transient stability derivatives. Based on the characteristics of the two methods, the time-spectral technique is selected for further development, incorporated into an optimization framework, and used to conduct stability-constrained aerodynamic optimization. This stability-constrained optimization framework is then used to conduct an optimization study of a flying wing configuration. This study shows that stability constraints have a significant impact on the optimal design of flying wings and that, while static stability constraints can often be satisfied by modifying the airfoil profiles of the wing, dynamic stability constraints can require a significant change in the planform of the aircraft in order for the constraints to be satisfied.
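The linear-regression route to static stability derivatives can be illustrated with a least-squares fit of pitching-moment samples against angle of attack. The data below are invented, and the sketch omits the time-spectral CFD machinery entirely, showing only the regression step that yields C_m_alpha for use in a stability constraint.

```python
import numpy as np

# Hypothetical samples of pitching-moment coefficient versus angle of attack,
# e.g. extracted from a quasi-unsteady CFD solution at several instants.
alpha = np.radians([-2.0, 0.0, 2.0, 4.0, 6.0])
cm = np.array([0.031, 0.002, -0.028, -0.061, -0.090])

# Least-squares fit Cm ~ Cm0 + Cm_alpha * alpha; Cm_alpha < 0 indicates
# static longitudinal stability, which an optimizer can then constrain.
A = np.vstack([np.ones_like(alpha), alpha]).T
(cm0, cm_alpha), *_ = np.linalg.lstsq(A, cm, rcond=None)
print(f"Cm0 = {cm0:.4f}, Cm_alpha = {cm_alpha:.3f} per rad")
```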
NASA Astrophysics Data System (ADS)
Jacek, Laura Lee
This dissertation details an experiment designed to identify gender differences in learning using three experimental treatments: animation, static graphics, and verbal instruction alone. Three learning presentations were used in testing of 332 university students. Statistical analysis was performed using ANOVA, binomial tests for differences of proportion, and descriptive statistics. Results showed that animation significantly improved women's long-term learning over static graphics (p = 0.067), but did not significantly improve men's long-term learning over static graphics. In all cases, women's scores improved with animation over both other forms of instruction for long-term testing, indicating that future research should not abandon the study of animation as a tool that may promote gender equity in science. Short-term test differences were smaller, and not statistically significant. Variation present in short-term scores was related more to presentation topic than treatment. This research also details characteristics of each of the three presentations, to identify variables (e.g. level of abstraction in presentation) affecting score differences within treatments. Differences between men's and women's scores were non-standard between presentations, but these differences were not statistically significant (long-term p = 0.2961, short-term p = 0.2893). In future research, experiments might be better designed to test these presentational variables in isolation, possibly yielding more distinctive differences between presentational scores. Differences in confidence interval overlaps between presentations suggested that treatment superiority may be somewhat dependent on the design or topic of the learning presentation. Confidence intervals greatly overlap in all situations. This undercut, to some degree, the surety of conclusions indicating superiority of one treatment type over the others. However, confidence intervals for animation were smaller, overlapped nearly completely for men and women (there was less overlap between the genders for the other two treatments), and centered around slightly higher means, lending further support to the conclusion that animation helped equalize men's and women's learning. The most important conclusion identified in this research is that gender is an important variable in experimental populations testing animation as a learning device. Averages indicated that both men and women prefer to work with animation over either static graphics or verbal instruction alone.
Many-body excitations and deexcitations in trapped ultracold bosonic clouds
NASA Astrophysics Data System (ADS)
Theisen, Marcus; Streltsov, Alexej I.
2016-11-01
We employ the multiconfigurational time-dependent Hartree for bosons (MCTDHB) method to study excited states of interacting Bose-Einstein condensates confined by harmonic and double-well trap potentials. Two approaches to access excitations, one static and the other dynamic, are investigated and contrasted. In static simulations the low-lying excitations are computed by utilizing a linear-response theory constructed on top of a static MCTDHB solution (LR-MCTDHB). Complementarily, we propose two dynamic protocols that address excitations by propagating the MCTDHB wave function. In particular, we investigate dipolelike oscillations induced by shifting the origin of the confining potential and breathinglike excitations by quenching the frequency of a parabolic part of the trap. To contrast static predictions and dynamic results we compute the time evolution and examine the respective Fourier transforms of several local and nonlocal observables. Namely, we study the expectation value of the position operator
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
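For orientation only, the sketch below computes a first-order worst-case round-off bound for the expression (a + b) * c in IEEE-754 double precision and checks it against the exact rational result. It uses the textbook model of at most one unit-roundoff relative error per operation; it is not PRECiSA's denotational semantics or certificate machinery, and the inputs are arbitrary.

```python
from fractions import Fraction

U = 2.0 ** -53   # unit roundoff for IEEE-754 binary64

def bound_add_mul(a, b, c):
    """Worst-case round-off bound for (a + b) * c: two roundings, each with
    relative error <= U, giving |error| <= (2U + U^2) * |a + b| * |c|."""
    return (2.0 * U + U * U) * abs(a + b) * abs(c)

a, b, c = 0.1, 0.2, 3.7
computed = (a + b) * c
# exact value of the expression over the already-rounded double inputs
exact = (Fraction(a) + Fraction(b)) * Fraction(c)
error = abs(Fraction(computed) - exact)
print(float(error), "<=", bound_add_mul(a, b, c), float(error) <= bound_add_mul(a, b, c))
```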
A comparison of dynamic and static economic models of uneven-aged stand management
Robert G. Haight
1985-01-01
Numerical techniques have been used to compute the discrete-time sequence of residual diameter distributions that maximize the present net worth (PNW) of harvestable volume from an uneven-aged stand. Results contradicted optimal steady-state diameter distributions determined with static analysis. In this paper, optimality conditions for solutions to dynamic and static...
Nonlinear mechanics of non-rigid origami: an efficient computational approach
NASA Astrophysics Data System (ADS)
Liu, K.; Paulino, G. H.
2017-10-01
Origami-inspired designs possess attractive applications to science and engineering (e.g. deployable, self-assembling, adaptable systems). The special geometric arrangement of panels and creases gives rise to unique mechanical properties of origami, such as reconfigurability, making origami designs well suited for tunable structures. Although often being ignored, origami structures exhibit additional soft modes beyond rigid folding due to the flexibility of thin sheets that further influence their behaviour. Actual behaviour of origami structures usually involves significant geometric nonlinearity, which amplifies the influence of additional soft modes. To investigate the nonlinear mechanics of origami structures with deformable panels, we present a structural engineering approach for simulating the nonlinear response of non-rigid origami structures. In this paper, we propose a fully nonlinear, displacement-based implicit formulation for performing static/quasi-static analyses of non-rigid origami structures based on `bar-and-hinge' models. The formulation itself leads to an efficient and robust numerical implementation. Agreement between real models and numerical simulations demonstrates the ability of the proposed approach to capture key features of origami behaviour.
Nonlinear mechanics of non-rigid origami: an efficient computational approach.
Liu, K; Paulino, G H
2017-10-01
Origami-inspired designs possess attractive applications to science and engineering (e.g. deployable, self-assembling, adaptable systems). The special geometric arrangement of panels and creases gives rise to unique mechanical properties of origami, such as reconfigurability, making origami designs well suited for tunable structures. Although often being ignored, origami structures exhibit additional soft modes beyond rigid folding due to the flexibility of thin sheets that further influence their behaviour. Actual behaviour of origami structures usually involves significant geometric nonlinearity, which amplifies the influence of additional soft modes. To investigate the nonlinear mechanics of origami structures with deformable panels, we present a structural engineering approach for simulating the nonlinear response of non-rigid origami structures. In this paper, we propose a fully nonlinear, displacement-based implicit formulation for performing static/quasi-static analyses of non-rigid origami structures based on 'bar-and-hinge' models. The formulation itself leads to an efficient and robust numerical implementation. Agreement between real models and numerical simulations demonstrates the ability of the proposed approach to capture key features of origami behaviour.
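A displacement-based implicit formulation of this kind ultimately reduces to solving R(u) = F_int(u) - F_ext = 0 with a Newton-type iteration. The sketch below shows such an iteration on a one-degree-of-freedom cubic "hinge"; it is a generic illustration of the solution strategy, not the bar-and-hinge model of the paper, and the spring constants are invented.

```python
import numpy as np

def newton_static(residual, tangent, u0, tol=1e-10, max_iter=50):
    """Displacement-based Newton iteration: solve R(u) = F_int(u) - F_ext = 0."""
    u = np.array(u0, dtype=float)
    for it in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            return u, it
        du = np.linalg.solve(tangent(u), -r)   # linearized correction
        u += du
    raise RuntimeError("Newton iteration did not converge")

# toy nonlinear "hinge": internal force k1*u + k3*u**3 balancing an external load
k1, k3, f_ext = 5.0, 2.0, 4.0
residual = lambda u: np.array([k1 * u[0] + k3 * u[0]**3 - f_ext])
tangent = lambda u: np.array([[k1 + 3.0 * k3 * u[0]**2]])
print(newton_static(residual, tangent, [0.0]))
```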
Sequential Test Strategies for Multiple Fault Isolation
NASA Technical Reports Server (NTRS)
Shakeri, M.; Pattipati, Krishna R.; Raghavan, V.; Patterson-Hine, Ann; Kell, T.
1997-01-01
In this paper, we consider the problem of constructing near optimal test sequencing algorithms for diagnosing multiple faults in redundant (fault-tolerant) systems. The computational complexity of solving the optimal multiple-fault isolation problem is super-exponential, that is, it is much more difficult than the single-fault isolation problem, which, by itself, is NP-hard. By employing concepts from information theory and Lagrangian relaxation, we present several static and dynamic (on-line or interactive) test sequencing algorithms for the multiple fault isolation problem that provide a trade-off between the degree of suboptimality and computational complexity. Furthermore, we present novel diagnostic strategies that generate a static diagnostic directed graph (digraph), instead of a static diagnostic tree, for multiple fault diagnosis. Using this approach, the storage complexity of the overall diagnostic strategy reduces substantially. Computational results based on real-world systems indicate that the size of a static multiple fault strategy is strictly related to the structure of the system, and that the use of an on-line multiple fault strategy can diagnose faults in systems with as many as 10,000 failure sources.
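The information-theoretic core of such test sequencing can be illustrated with a one-step greedy rule that picks the next test by expected entropy reduction over a fault prior. The sketch below assumes deterministic single-fault test signatures and invented probabilities; the paper's algorithms go well beyond this (multiple faults, Lagrangian relaxation, digraph strategies).

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def best_next_test(prior, signatures):
    """Pick the test whose outcome is expected to reduce entropy the most.

    prior      : dict fault -> probability
    signatures : dict test -> dict fault -> 0/1 outcome if that fault is present
    """
    base = entropy(prior.values())
    scores = {}
    for test, outcome in signatures.items():
        expected = 0.0
        for value in (0, 1):
            group = {f: p for f, p in prior.items() if outcome[f] == value}
            p_group = sum(group.values())
            if p_group > 0.0:
                expected += p_group * entropy([p / p_group for p in group.values()])
        scores[test] = base - expected      # expected information gain
    return max(scores, key=scores.get), scores

prior = {"f1": 0.4, "f2": 0.3, "f3": 0.2, "f4": 0.1}
signatures = {
    "t1": {"f1": 1, "f2": 1, "f3": 0, "f4": 0},
    "t2": {"f1": 1, "f2": 0, "f3": 0, "f4": 0},
    "t3": {"f1": 0, "f2": 1, "f3": 1, "f4": 1},
}
print(best_next_test(prior, signatures))
```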
Method for Statically Checking an Object-oriented Computer Program Module
NASA Technical Reports Server (NTRS)
Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)
2012-01-01
A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.
The influence of computational assumptions on analysing abdominal aortic aneurysm haemodynamics.
Ene, Florentina; Delassus, Patrick; Morris, Liam
2014-08-01
The variation in computational assumptions for analysing abdominal aortic aneurysm haemodynamics can influence the desired output results and computational cost. Such assumptions for abdominal aortic aneurysm modelling include static/transient pressures, steady/transient flows and rigid/compliant walls. Six computational methods and these various assumptions were simulated and compared within a realistic abdominal aortic aneurysm model with and without intraluminal thrombus. A full transient fluid-structure interaction was required to analyse the flow patterns within the compliant abdominal aortic aneurysms models. Rigid wall computational fluid dynamics overestimates the velocity magnitude by as much as 40%-65% and the wall shear stress by 30%-50%. These differences were attributed to the deforming walls which reduced the outlet volumetric flow rate for the transient fluid-structure interaction during the majority of the systolic phase. Static finite element analysis accurately approximates the deformations and von Mises stresses when compared with transient fluid-structure interaction. Simplifying the modelling complexity reduces the computational cost significantly. In conclusion, the deformation and von Mises stress can be approximately found by static finite element analysis, while for compliant models a full transient fluid-structure interaction analysis is required for acquiring the fluid flow phenomenon. © IMechE 2014.
ERIC Educational Resources Information Center
Donovan, Deborah A.; Borda, Emily J.; Hanley, Daniel M.; Landel, Carolyn C.
2015-01-01
Despite significant pressure to reform science teaching and learning in K12 schools, and a concurrent call to reform undergraduate courses, higher education science content courses have remained relatively static. Higher education science faculty have few opportunities to explore research on how people learn, examine state or national science…
A static data flow simulation study at Ames Research Center
NASA Technical Reports Server (NTRS)
Barszcz, Eric; Howard, Lauri S.
1987-01-01
Demands in computational power, particularly in the area of computational fluid dynamics (CFD), led NASA Ames Research Center to study advanced computer architectures. One architecture being studied is the static data flow architecture based on research done by Jack B. Dennis at MIT. To improve understanding of this architecture, a static data flow simulator, written in Pascal, has been implemented for use on a Cray X-MP/48. A matrix multiply and a two-dimensional fast Fourier transform (FFT), two algorithms used in CFD work at Ames, have been run on the simulator. Execution times can vary by a factor of more than 2 depending on the partitioning method used to assign instructions to processing elements. Service time for matching tokens has proved to be a major bottleneck. Loop control and array address calculation overhead can double the execution time. The best sustained MFLOPS rates were less than 50% of the maximum capability of the machine.
CFD Assessment of Aerodynamic Degradation of a Subsonic Transport Due to Airframe Damage
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Pirzadeh, Shahyar Z.; Atkins, Harold L.; Viken, Sally A.; Morrison, Joseph H.
2010-01-01
A computational study is presented to assess the utility of two NASA unstructured Navier-Stokes flow solvers for capturing the degradation in static stability and aerodynamic performance of a NASA General Transport Model (GTM) due to airframe damage. The approach is to correlate computational results with a substantial subset of experimental data for the GTM undergoing progressive losses to the wing, vertical tail, and horizontal tail components. The ultimate goal is to advance the probability of inserting computational data into the creation of advanced flight simulation models of damaged subsonic aircraft in order to improve pilot training. Results presented in this paper demonstrate good correlations with slope-derived quantities, such as pitch static margin and static directional stability, and incremental rolling moment due to wing damage. This study further demonstrates that high fidelity Navier-Stokes flow solvers could augment flight simulation models with additional aerodynamic data for various airframe damage scenarios.
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Scullin, V. J.
1972-01-01
A general chemical kinetics program is described for complex, homogeneous ideal-gas reactions in any chemical system. Its main features are flexibility and convenience in treating many different reaction conditions. The program solves numerically the differential equations describing complex reaction in either a static system or one-dimensional inviscid flow. Applications include ignition and combustion, shock wave reactions, and general reactions in a flowing or static system. An implicit numerical solution method is used which works efficiently for the extreme conditions of a very slow or a very fast reaction. The theory is described, and the computer program and users' manual are included.
A Feasibility Study of Synthesizing Subsurfaces Modeled with Computational Neural Networks
NASA Technical Reports Server (NTRS)
Wang, John T.; Housner, Jerrold M.; Szewczyk, Z. Peter
1998-01-01
This paper investigates the feasibility of synthesizing substructures modeled with computational neural networks. Substructures are modeled individually with computational neural networks and the response of the assembled structure is predicted by synthesizing the neural networks. A superposition approach is applied to synthesize models for statically determinate substructures while an interface displacement collocation approach is used to synthesize statically indeterminate substructure models. Beam and plate substructures along with components of a complicated Next Generation Space Telescope (NGST) model are used in this feasibility study. In this paper, the limitations and difficulties of synthesizing substructures modeled with neural networks are also discussed.
NASA Technical Reports Server (NTRS)
Mclain, A. G.; Rao, C. S. R.
1976-01-01
A hybrid chemical kinetic computer program was assembled which provides a rapid solution to problems involving flowing or static, chemically reacting, gas mixtures. The computer program uses existing subroutines for problem setup, initialization, and preliminary calculations and incorporates a stiff ordinary differential equation solution technique. A number of check cases were recomputed with the hybrid program and the results were almost identical to those previously obtained. The computational time saving was demonstrated with a propane-oxygen-argon shock tube combustion problem involving 31 chemical species and 64 reactions. Information is presented to enable potential users to prepare an input data deck for the calculation of a problem.
Static Fatigue of a Siliconized Silicon Carbide
1987-03-01
flexural stress rupture and stepped temperature stress rupture (STSR) testing were performed to assess the static fatigue and creep resistances. Isothermal stress rupture experiments were performed at 1200 °C in air for comparison to previous results. STSR experiments were under deadweight... temperature and stress levels that static fatigue and creep processes are active. The applied stresses were computed on the basis of the elastic
NASA Astrophysics Data System (ADS)
Moon, Hye Sun
Visuals are most extensively used as instructional tools in education to present spatially-based information. Recent computer technology allows the generation of 3D animated visuals to extend the presentation in computer-based instruction. Animated visuals in 3D representation not only possess motivational value that promotes positive attitudes toward instruction but also facilitate learning when the subject matter requires dynamic motion and 3D visual cue. In this study, three questions are explored: (1) how 3D graphics affects student learning and attitude, in comparison with 2D graphics; (2) how animated graphics affects student learning and attitude, in comparison with static graphics; and (3) whether the use of 3D graphics, when they are supported by interactive animation, is the most effective visual cues to improve learning and to develop positive attitudes. A total of 145 eighth-grade students participated in a 2 x 2 factorial design study. The subjects were randomly assigned to one of four computer-based instructions: 2D static; 2D animated; 3D static; and 3D animated. The results indicated that: (1) Students in the 3D graphic condition exhibited more positive attitudes toward instruction than those in the 2D graphic condition. No group differences were found between the posttest score of 3D graphic condition and that of 2D graphic condition. However, students in the 3D graphic condition took less time for information retrieval on posttest than those in the 2D graphic condition. (2) Students in the animated graphic condition exhibited slightly more positive attitudes toward instruction than those in the static graphic condition. No group differences were found between the posttest score of animated graphic condition and that of static graphic condition. However, students in the animated graphic condition took less time for information retrieval on posttest than those in the static graphic condition. (3) Students in the 3D animated graphic condition exhibited more positive attitudes toward instruction than those in other treatment conditions (2D static, 2D animated, and 3D static conditions). No group differences were found in the posttest scores among four treatment conditions. However, students in the 3D animated condition took less time for information retrieval on posttest than those in other treatment conditions.
NASA Technical Reports Server (NTRS)
Wang, C. R.; Hingst, W. R.; Porro, A. R.
1991-01-01
The properties of 2-D shock wave/turbulent boundary layer interaction flows were calculated by using a compressible turbulent Navier-Stokes numerical computational code. Interaction flows caused by oblique shock wave impingement on the turbulent boundary layer flow were considered. The oblique shock waves were induced with shock generators at angles of attack less than 10 degs in supersonic flows. The surface temperatures were kept at near-adiabatic (ratio of wall static temperature to free stream total temperature) and cold wall (ratio of wall static temperature to free stream total temperature) conditions. The computational results were studied for the surface heat transfer, velocity temperature correlation, and turbulent shear stress in the interaction flow fields. Comparisons of the computational results with existing measurements indicated that (1) the surface heat transfer rates and surface pressures could be correlated with Holden's relationship, (2) the mean flow streamwise velocity components and static temperatures could be correlated with Crocco's relationship if flow separation did not occur, and (3) the Baldwin-Lomax turbulence model should be modified for turbulent shear stress computations in the interaction flows.
Visual saliency in MPEG-4 AVC video stream
NASA Astrophysics Data System (ADS)
Ammar, M.; Mitrea, M.; Hasnaoui, M.; Le Callet, P.
2015-03-01
Visual saliency maps have already proved their efficiency in a large variety of image/video communication applications, ranging from selective compression and channel coding to watermarking. Such saliency maps are generally based on different visual characteristics (such as color, intensity, orientation, motion, ...) computed from the pixel representation of the visual content. This paper summarizes and extends our previous work devoted to the definition of a saliency map extracted solely from the MPEG-4 AVC stream syntax elements. The MPEG-4 AVC saliency map thus defined is a fusion of a static and a dynamic map. The static saliency map is in turn a combination of intensity, color, and orientation feature maps. Whatever the particular way in which these elementary maps are computed, the fusion techniques allowing their combination play a critical role in the final result and are the focus of the proposed study. A total of 48 fusion formulas (6 for combining static features and, for each of them, 8 to combine static with dynamic features) are investigated. The performance of the obtained maps is evaluated on a public database organized at IRCCyN by computing two objective metrics: the Kullback-Leibler divergence and the area under the curve.
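The sketch below shows three representative fusion rules (weighted sum, product, and max) for combining a static and a dynamic saliency map after normalization. These are generic forms, not the specific 48 formulas evaluated in the paper, and the maps here are random placeholders.

```python
import numpy as np

def normalize(m):
    m = m.astype(float)
    rng = m.max() - m.min()
    return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

def fuse(static_map, dynamic_map, mode="weighted", alpha=0.6):
    """Representative rules for fusing a static and a dynamic saliency map."""
    s, d = normalize(static_map), normalize(dynamic_map)
    if mode == "weighted":
        return alpha * s + (1.0 - alpha) * d
    if mode == "product":
        return s * d
    if mode == "max":
        return np.maximum(s, d)
    raise ValueError(mode)

static_map = np.random.rand(36, 64)    # e.g. intensity/color/orientation fusion
dynamic_map = np.random.rand(36, 64)   # e.g. motion-based map
saliency = fuse(static_map, dynamic_map, mode="weighted")
print(saliency.shape, saliency.min(), saliency.max())
```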
Eskinazi, Ilan; Fregly, Benjamin J
2018-04-01
Concurrent estimation of muscle activations, joint contact forces, and joint kinematics by means of gradient-based optimization of musculoskeletal models is hindered by computationally expensive and non-smooth joint contact and muscle wrapping algorithms. We present a framework that simultaneously speeds up computation and removes sources of non-smoothness from muscle force optimizations using a combination of parallelization and surrogate modeling, with special emphasis on a novel method for modeling joint contact as a surrogate model of a static analysis. The approach allows one to efficiently introduce elastic joint contact models within static and dynamic optimizations of human motion. We demonstrate the approach by performing two optimizations, one static and one dynamic, using a pelvis-leg musculoskeletal model undergoing a gait cycle. We observed convergence on the order of seconds for a static optimization time frame and on the order of minutes for an entire dynamic optimization. The presented framework may facilitate model-based efforts to predict how planned surgical or rehabilitation interventions will affect post-treatment joint and muscle function. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.
Simulations to study the static polarization limit for RHIC lattice
NASA Astrophysics Data System (ADS)
Duan, Zhe; Qin, Qing
2016-01-01
A study of spin dynamics based on simulations with the Polymorphic Tracking Code (PTC) is reported, exploring the dependence of the static polarization limit on various beam parameters and lattice settings for a practical RHIC lattice. It is shown that the behavior of the static polarization limit is dominantly affected by the vertical motion, while the effect of beam-beam interaction is small. In addition, the “nonresonant beam polarization” observed and studied in the lattice-independent model is also observed in this lattice-dependent model. Therefore, this simulation study gives insight into the polarization evolution at fixed beam energies that is not available from simple spin tracking. Supported by the U.S. Department of Energy (DE-AC02-98CH10886), Hundred-Talent Program (Chinese Academy of Sciences), and National Natural Science Foundation of China (11105164)
A simplified gross thrust computing technique for an afterburning turbofan engine
NASA Technical Reports Server (NTRS)
Hamer, M. J.; Kurtenbach, F. J.
1978-01-01
A simplified gross thrust computing technique extended to the F100-PW-100 afterburning turbofan engine is described. The technique uses measured total and static pressures in the engine tailpipe and ambient static pressure to compute gross thrust. Empirically evaluated calibration factors account for three-dimensional effects, the effects of friction and mass transfer, and the effects of simplifying assumptions for solving the equations. Instrumentation requirements and the sensitivity of computed thrust to transducer errors are presented. NASA altitude facility tests on F100 engines (computed thrust versus measured thrust) are presented, and calibration factors obtained on one engine are shown to be applicable to the second engine by comparing the computed gross thrust. It is concluded that this thrust method is potentially suitable for flight test application and engine maintenance on production engines with a minimum amount of instrumentation.
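The underlying one-dimensional relation can be sketched as follows: from the tailpipe total pressure and ambient static pressure, the nozzle exit Mach number and static pressure follow from isentropic (unchoked) or choked-flow relations for a convergent nozzle, and gross thrust is the stream thrust scaled by an empirical calibration factor. The numbers and the factor `k_cal` below are illustrative; this is the textbook relation, not the specific F100 calibration procedure.

```python
import math

GAMMA = 1.4

def gross_thrust(p_total, p_amb, area_exit, k_cal=0.98):
    """One-dimensional gross-thrust estimate from tailpipe total pressure and
    ambient static pressure for a convergent nozzle; k_cal is an empirical
    calibration factor of the kind evaluated against facility data."""
    pr_crit = ((GAMMA + 1.0) / 2.0) ** (GAMMA / (GAMMA - 1.0))   # ~1.893
    if p_total / p_amb < pr_crit:
        # unchoked: flow expands to ambient static pressure
        mach_e = math.sqrt(2.0 / (GAMMA - 1.0) *
                           ((p_total / p_amb) ** ((GAMMA - 1.0) / GAMMA) - 1.0))
        p_exit = p_amb
    else:
        # choked: sonic exit, static pressure set by the critical pressure ratio
        mach_e = 1.0
        p_exit = p_total / pr_crit
    stream_thrust = area_exit * (GAMMA * p_exit * mach_e**2 + (p_exit - p_amb))
    return k_cal * stream_thrust

print(gross_thrust(p_total=2.4e5, p_amb=1.0e5, area_exit=0.45))
```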
Detecting Key Inter-Joint Distances and Anthropometry Effects for Static Gesture…
2013-09-01
Performance tradeoffs in static and dynamic load balancing strategies
NASA Technical Reports Server (NTRS)
Iqbal, M. A.; Saltz, J. H.; Bokhari, S. H.
1986-01-01
The problem of uniformly distributing the load of a parallel program over a multiprocessor system was considered. A program was analyzed whose structure permits the computation of the optimal static solution. Then four strategies for load balancing were described and their performance compared. The strategies are: (1) the optimal static assignment algorithm which is guaranteed to yield the best static solution, (2) the static binary dissection method which is very fast but sub-optimal, (3) the greedy algorithm, a static fully polynomial time approximation scheme, which estimates the optimal solution to arbitrary accuracy, and (4) the predictive dynamic load balancing heuristic which uses information on the precedence relationships within the program and outperforms any of the static methods. It is also shown that the overhead incurred by the dynamic heuristic is reduced considerably if it is started off with a static assignment provided by either of the other three strategies.
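As a generic illustration of a static greedy assignment (not the fully polynomial approximation scheme studied in the report), the sketch below places each task, largest first, on the currently least-loaded processor and reports the resulting makespan. Task costs are invented.

```python
import heapq

def greedy_static_assignment(task_costs, n_procs):
    """Assign tasks to processors by always placing the next-largest task on the
    currently least-loaded processor (longest-processing-time-first heuristic)."""
    loads = [(0.0, p) for p in range(n_procs)]
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_procs)}
    for i, cost in sorted(enumerate(task_costs), key=lambda x: -x[1]):
        load, p = heapq.heappop(loads)         # least-loaded processor
        assignment[p].append(i)
        heapq.heappush(loads, (load + cost, p))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

tasks = [7.0, 3.5, 9.2, 1.1, 4.4, 6.3, 2.8, 5.0]
print(greedy_static_assignment(tasks, 3))
```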
NASA Astrophysics Data System (ADS)
Pan, Edward A.
Science, technology, engineering, and mathematics (STEM) education is a national focus. Engineering education, as part of STEM education, needs to adapt to meet the needs of the nation in a rapidly changing world. Using computer-based visualization tools and corresponding 3D printed physical objects may help nontraditional students succeed in engineering classes. This dissertation investigated how adding physical or virtual learning objects (called manipulatives) to courses that require mental visualization of mechanical systems can aid student performance. Dynamics is one such course, and tends to be taught using lecture and textbooks with static diagrams of moving systems. Students often fail to solve the problems correctly and an inability to mentally visualize the system can contribute to student difficulties. This study found no differences between treatment groups on quantitative measures of spatial ability and conceptual knowledge. There were differences between treatments on measures of mechanical reasoning ability, in favor of the use of physical and virtual manipulatives over static diagrams alone. There were no major differences in student performance between the use of physical and virtual manipulatives. Students used the physical and virtual manipulatives to test their theories about how the machines worked, however their actual time handling the manipulatives was extremely limited relative to the amount of time they spent working on the problems. Students used the physical and virtual manipulatives as visual aids when communicating about the problem with their partners, and this behavior was also seen with Traditional group students who had to use the static diagrams and gesture instead. The explanations students gave for how the machines worked provided evidence of mental simulation; however, their causal chain analyses were often flawed, probably due to attempts to decrease cognitive load. Student opinions about the static diagrams and dynamic models varied by type of model (static, physical, virtual), but were generally favorable. The Traditional group students, however, indicated that the lack of adequate representation of motion in the static diagrams was a problem, and wished they had access to the physical and virtual models.
ERIC Educational Resources Information Center
Petersen, Jacinta E.; Treagust, David F.
2014-01-01
Science in the Australian primary school context is in a state of renewal with the recent implementation of the Australian Curriculum: Science. Despite this curriculum renewal, the results of primary students in science have remained static. Science in Australia has been identified as one of the least taught subjects in the primary school…
Baity-Jesi, Marco; Calore, Enrico; Cruz, Andres; Fernandez, Luis Antonio; Gil-Narvión, José Miguel; Gordillo-Guerrero, Antonio; Iñiguez, David; Maiorano, Andrea; Marinari, Enzo; Martin-Mayor, Victor; Monforte-Garcia, Jorge; Muñoz Sudupe, Antonio; Navarro, Denis; Parisi, Giorgio; Perez-Gaviro, Sergio; Ricci-Tersenghi, Federico; Ruiz-Lorenzo, Juan Jesus; Schifano, Sebastiano Fabio; Tarancón, Alfonso; Tripiccione, Raffaele; Yllanes, David
2017-01-01
We have performed a very accurate computation of the nonequilibrium fluctuation–dissipation ratio for the 3D Edwards–Anderson Ising spin glass, by means of large-scale simulations on the special-purpose computers Janus and Janus II. This ratio (computed for finite times on very large, effectively infinite, systems) is compared with the equilibrium probability distribution of the spin overlap for finite sizes. Our main result is a quantitative statics-dynamics dictionary, which could allow the experimental exploration of important features of the spin-glass phase without requiring uncontrollable extrapolations to infinite times or system sizes. PMID:28174274
NASA Astrophysics Data System (ADS)
Jiang, Feng-Jian; Ye, Jian-Feng; Jiao, Zheng; Jiang, Jun; Ma, Kun; Yan, Xin-Hu; Lv, Hai-Jiang
2018-05-01
We perform a proof-of-principle experiment that uses a single negatively charged nitrogen–vacancy (NV) color center with a nearest neighbor 13C nuclear spin in diamond to detect the strength and direction (including both polar and azimuth angles) of a static vector magnetic field by optical detection magnetic resonance (ODMR) technique. With the known hyperfine coupling tensor between an NV center and a nearest neighbor 13C nuclear spin, we show that the information of static vector magnetic field could be extracted by observing the pulsed continuous wave (CW) spectrum. Project supported by the National Natural Science Foundation of China (Grant Nos. 11305074, 11135002, and 11275083), the Key Program of the Education Department Outstanding Youth Foundation of Anhui Province, China (Grant No. gxyqZD2017080), and the Education Department Natural Science Foundation of Anhui Province, China (Grant No. KJHS2015B09).
Active Flow Control in an Aggressive Transonic Diffuser
NASA Astrophysics Data System (ADS)
Skinner, Ryan W.; Jansen, Kenneth E.
2017-11-01
A diffuser exchanges upstream kinetic energy for higher downstream static pressure by increasing duct cross-sectional area. The resulting stream-wise and span-wise pressure gradients promote extensive separation in many diffuser configurations. The present computational work evaluates active flow control strategies for separation control in an asymmetric, aggressive diffuser of rectangular cross-section at inlet Mach 0.7 and Re 2.19M. Corner suction is used to suppress secondary flows, and steady/unsteady tangential blowing controls separation on both the single ramped face and the opposite flat face. We explore results from both Spalart-Allmaras RANS and DDES turbulence modeling frameworks; the former is found to miss key physics of the flow control mechanisms. Simulated baseline, steady, and unsteady blowing performance is validated against experimental data. Funding was provided by Northrop Grumman Corporation, and this research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
Static tensile and tensile creep testing of five ceramic fibers at elevated temperatures
NASA Technical Reports Server (NTRS)
Zimmerman, Richard S.; Adams, Donald F.
1989-01-01
Static tensile and tensile creep testing of five ceramic fibers at elevated temperature was performed. J.P. Stevens, Co., Astroquartz 9288 glass fiber; Nippon Carbon, Ltd., (Dow Corning) nicalon NLM-102 silicon carbide fiber; and 3M Company Nextel 312, 380, and 480 alumina/silica/boria fibers were supplied in unsized tows. Single fibers were separated from the tows and tested in static tension and tensile creep. Elevated test temperatures ranged from 400 C to 1300 C and varied for each fiber. Room temperature static tension was also performed. Computer software was written to reduce all single fiber test data into engineering constants using ASTM Standard Test Method D3379-75 as a reference. A high temperature furnace was designed and built to perform the single fiber elevated temperature testing up to 1300 C. A computerized single fiber creep apparatus was designed and constructed to perform four fiber creep tests simultaneously at temperatures up to 1300 C. Computer software was written to acquire and reduce all creep data.
Static tensile and tensile creep testing of five ceramic fibers at elevated temperatures
NASA Technical Reports Server (NTRS)
Zimmerman, Richard S.; Adams, Donald F.
1988-01-01
Static tensile and tensile creep testing of five ceramic fibers at elevated temperature was performed. J.P. Stevens, Co., Astroquartz 9288 glass fiber, Nippon Carbon, Ltd., (Dow Corning) Nicalon NLM-102 silicon carbide fiber, and 3M Company Nextel 312, 380, and 480 alumina/silica/boria fibers were supplied in unsized tows. Single fibers were separated from the tows and tested in static tension and tensile creep. Elevated test temperatures ranged from 400 to 1300 C and varied for each fiber. Room temperature static tension was also performed. Computer software was written to reduce all single fiber test data into engineering constants using ASTM Standard Test Method D3379-75 as a reference. A high temperature furnace was designed and built to perform the single fiber elevated temperature testing up to 1300 C. A computerized single fiber creep apparatus was designed and constructed to perform four fiber creep tests simultaneously at temperatures up to 1300 C. Computer software was written to acquire and reduce all creep data.
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Huber, David J.; Bhattacharyya, Rajan
2017-05-01
In this paper, we describe an algorithm and system for optimizing search and detection performance for "items of interest" (IOI) in large-sized images and videos that employ the Rapid Serial Visual Presentation (RSVP) based EEG paradigm and surprise algorithms that incorporate motion processing to determine whether static or video RSVP is used. The system works by first computing a motion surprise map on image sub-regions (chips) of incoming sensor video data and then uses those surprise maps to label the chips as either "static" or "moving". This information tells the system whether to use a static or video RSVP presentation and decoding algorithm in order to optimize EEG-based detection of IOI in each chip. Using this method, we are able to demonstrate classification of a series of image regions from video with an Az (area under the ROC curve) value of 1, indicating perfect classification, over a range of display frequencies and video speeds.
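A simplified stand-in for the chip-labeling step is sketched below: each image chip is labeled "moving" or "static" from a frame-difference energy measure, which would then be used to select the presentation mode. The threshold and chip size are arbitrary, and this is not the surprise algorithm used in the actual system.

```python
import numpy as np

def label_chips(prev_frame, frame, chip=32, threshold=8.0):
    """Label image chips as 'moving' or 'static' from a simple frame-difference
    energy measure (a stand-in for the motion-surprise computation)."""
    h, w = frame.shape
    labels = {}
    for y in range(0, h - chip + 1, chip):
        for x in range(0, w - chip + 1, chip):
            diff = frame[y:y + chip, x:x + chip] - prev_frame[y:y + chip, x:x + chip]
            energy = float(np.mean(np.abs(diff)))
            labels[(y, x)] = "moving" if energy > threshold else "static"
    return labels

rng = np.random.default_rng(0)
prev_frame = rng.integers(0, 256, (128, 128)).astype(float)
frame = prev_frame.copy()
frame[0:32, 0:32] += 40.0          # simulate motion in one chip
labels = label_chips(prev_frame, frame)
print(labels[(0, 0)], labels[(32, 32)])
```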
An easily implemented static condensation method for structural sensitivity analysis
NASA Technical Reports Server (NTRS)
Gangadharan, S. N.; Haftka, R. T.; Nikolaidis, E.
1990-01-01
A black-box approach to static condensation for sensitivity analysis is presented with illustrative examples of a cube and a car structure. The sensitivity of the structural response with respect to joint stiffness parameter is calculated using the direct method, forward-difference, and central-difference schemes. The efficiency of the various methods for identifying joint stiffness parameters from measured static deflections of these structures is compared. The results indicate that the use of static condensation can reduce computation times significantly and the black-box approach is only slightly less efficient than the standard implementation of static condensation. The ease of implementation of the black-box approach recommends it for use with general-purpose finite element codes that do not have a built-in facility for static condensation.
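Static condensation itself is compact enough to state directly: partitioning the stiffness matrix into retained (b) and internal (i) degrees of freedom gives the exact reduced system K_red = K_bb - K_bi K_ii^-1 K_ib and f_red = f_b - K_bi K_ii^-1 f_i for linear statics. A small numpy sketch with an invented three-DOF spring example follows; it illustrates the reduction, not the referenced black-box implementation.

```python
import numpy as np

def condense(K, f, keep):
    """Static (Guyan) condensation: eliminate internal DOFs and return the
    reduced stiffness and load acting on the retained DOFs `keep`."""
    n = K.shape[0]
    drop = [i for i in range(n) if i not in keep]
    Kbb = K[np.ix_(keep, keep)]
    Kbi = K[np.ix_(keep, drop)]
    Kii = K[np.ix_(drop, drop)]
    Kib = K[np.ix_(drop, keep)]
    Kii_inv = np.linalg.inv(Kii)
    K_red = Kbb - Kbi @ Kii_inv @ Kib
    f_red = f[keep] - Kbi @ Kii_inv @ f[drop]
    return K_red, f_red

# invented three-DOF spring chain; condense out the middle DOF
K = np.array([[ 200.0, -100.0,    0.0],
              [-100.0,  200.0, -100.0],
              [   0.0, -100.0,  150.0]])
f = np.array([0.0, 10.0, 5.0])
keep = [0, 2]
K_red, f_red = condense(K, f, keep)
u_keep = np.linalg.solve(K_red, f_red)
u_full = np.linalg.solve(K, f)
print(u_keep, u_full[keep])        # retained displacements agree exactly
```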
2002-01-01
Research and Technology Department, Dynamics and Diagnostics Division, Static High-Pressure Group. Impact of this basic research: the research generates phase and density… Experimental methodology: diamond anvil cells (DAC) with coil heaters (HDAC) are used to achieve high pressures (P) to 10 GPa…
Hodgkiss, Alex; Gilligan, Katie A; Tolmie, Andrew K; Thomas, Michael S C; Farran, Emily K
2018-01-22
Prior longitudinal and correlational research with adults and adolescents indicates that spatial ability is a predictor of science learning and achievement. However, there is little research to date with primary-school aged children that addresses this relationship. Understanding this association has the potential to inform curriculum design and support the development of early interventions. This study examined the relationship between primary-school children's spatial skills and their science achievement. Children aged 7-11 years (N = 123) completed a battery of five spatial tasks, based on a model of spatial ability in which skills fall along two dimensions: intrinsic-extrinsic; static-dynamic. Participants also completed a curriculum-based science assessment. Controlling for verbal ability and age, mental folding (intrinsic-dynamic spatial ability) and spatial scaling (extrinsic-static spatial ability) each emerged as unique predictors of overall science scores, with mental folding a stronger predictor than spatial scaling. These spatial skills combined accounted for 8% of the variance in science scores. When considered by scientific discipline, mental folding uniquely predicted both physics and biology scores, and spatial scaling accounted for additional variance in biology and variance in chemistry scores. The children's embedded figures task (intrinsic-static spatial ability) only accounted for variance in chemistry scores. The patterns of association were consistent across the age range. Spatial skills, particularly mental folding, spatial scaling, and disembedding, are predictive of 7- to 11-year-olds' science achievement. These skills make a similar contribution to performance for each age group. © 2018 The Authors. British Journal of Educational Psychology published by John Wiley & Sons Ltd on behalf of the British Psychological Society.
Lamination residual stresses in hybrid composites, part 1
NASA Technical Reports Server (NTRS)
Daniel, I. M.; Liber, T.
1976-01-01
An experimental investigation was conducted to study lamination residual stresses for various material and loading parameters. The effects of hybridization on residual stresses and residual properties after thermal cycling under load were determined in angle-ply graphite/Kevlar/epoxy and graphite/S-glass/epoxy laminates. Residual strains in the graphite plies are not appreciably affected by the type and number of hybridizing plies. Computed residual stresses at room temperature in the S-glass plies reach values up to seventy-five percent of the transverse strength of the material. Computed residual stresses in the graphite plies exceed the static strength by approximately ten percent. In the case of Kevlar plies, computed residual stresses far exceed the static strength, indicating possible early failure of these plies. Static testing of these hybrids indicates that failure is governed by the ultimate strain of the graphite plies. In thermally cycled hybrids, in general, residual moduli were somewhat lower and residual strengths were higher than initial values.
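As a rough illustration of how such residual stresses arise on cool-down from cure (a back-of-envelope estimate, not the laminate analysis used in the study), the transverse stress locked into a constrained ply scales with the mismatch between the ply and laminate expansion coefficients; all property values below are hypothetical.

```python
# Back-of-envelope estimate of the transverse residual stress locked into a
# constrained ply on cool-down; every value below is hypothetical.
E_t = 9.0e9        # ply transverse modulus, Pa
alpha_t = 25e-6    # ply transverse thermal expansion coefficient, 1/K
alpha_lam = 3e-6   # effective laminate expansion coefficient, 1/K
dT = -150.0        # temperature change from cure to room temperature, K

# The constrained ply is forced to follow the laminate strain rather than its own
# free thermal strain, so the mismatch strain is (alpha_lam - alpha_t) * dT.
sigma_residual = E_t * (alpha_lam - alpha_t) * dT
print(f"transverse residual stress ~ {sigma_residual / 1e6:.0f} MPa (tension)")
```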
NASA Astrophysics Data System (ADS)
Uysal, Ahmet; Zhou, Hua; Lee, Sang Soo; Fenter, Paul; Feng, Guang; Li, Song; Cummings, Peter; Fulvio, Pasquale; Dai, Sheng; McDonough, Jake; Gogotsi, Yury
2014-03-01
Electrical double layer capacitors (EDLCs) with room temperature ionic liquid (RTIL) electrolytes and carbon electrodes are promising candidates for energy storage devices with high power density and long cycle life. We studied the potential- and time-dependent changes in the electric double layer (EDL) structure of an imidazolium-based RTIL electrolyte at an epitaxial graphene (EG) surface. We used in situ x-ray reflectivity (XR) to determine the EDL structure at static potentials, during cyclic voltammetry (CV) and potential step measurements. The static potential structures were also investigated with fully atomistic molecular dynamics (MD) simulations. Combined XR and MD results show that the EDL structure has alternating anion/cation layers within the first nanometer of the interface. The dynamical response of the EDL to potential steps has a slow component (>10 s) and the RTIL structure shows hysteresis during CV scans. We propose a conceptual model that connects nanoscale interfacial structure to the macroscopic measurements. This material is based upon work supported as part of the Fluid Interface Reactions, Structures and Transport (FIRST) Center, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science (SC), Office of Basic Energy Sciences.
ERIC Educational Resources Information Center
Su, King-Dow
2008-01-01
This study evaluated the performance of university students who learned science texts by using information communication technologies (ICT), including animation, static figures, PowerPoint, and e-plus software. The characteristics of students and their achievements and attitudes toward 11 multimedia science courses were analyzed. The 11 samples…
Effective Use of Multimedia Presentations to Maximize Learning within High School Science Classrooms
ERIC Educational Resources Information Center
Rapp, Eric
2013-01-01
This research used an evidence-based experimental 2 x 2 factorial design with a General Linear Model with Repeated Measures Analysis of Covariance (RMANCOVA). For this analysis, time served as the within-subjects factor while treatment group (i.e., static and signaling, dynamic and signaling, static without signaling, and dynamic without signaling)…
NASA Astrophysics Data System (ADS)
Bicudo, P.; Cardoso, M.; Oliveira, O.; Silva, P. J.
2017-10-01
We revisit the static potential for the Q Q Q̄ Q̄ system using SU(3) lattice simulations, studying both the color singlets' ground state and first excited state. We consider geometries where the two static quarks and the two antiquarks are at the corners of rectangles of different sizes. We analyze the transition between a tetraquark system and a two-meson system with a two by two correlator matrix. We compare the potentials computed with quenched QCD and with dynamical quarks. We also compare our simulations with the results of previous studies and quantitatively analyze fits of our results with Ansätze inspired by the string flip-flop model and by its possible color excitations.
Improving agreement between static method and dynamic formula for driven cast-in-place piles.
DOT National Transportation Integrated Search
2013-06-01
This study focuses on comparing the capacities and lengths of piling necessary as determined with a static method and with a dynamic formula. Pile capacities and their required lengths are determined in two ways: 1) using a design and computed method, s...
NASA Astrophysics Data System (ADS)
Darema, F.
2016-12-01
InfoSymbiotics/DDDAS embodies the power of Dynamic Data Driven Applications Systems (DDDAS), a concept whereby an executing application model is dynamically integrated, in a feed-back loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Advanced capabilities can be created through such new computational approaches in modeling and simulations, and in instrumentation methods, and include: enhancing the accuracy of the application model; speeding up the computation to allow faster and more comprehensive models of a system, and creating decision support systems with the accuracy of full-scale simulations. In addition, the notion of controlling instrumentation processes by the executing application results in more efficient management of application data and addresses the challenge of how to architect and dynamically manage large sets of heterogeneous sensors and controllers, an advance over the static and ad-hoc ways of today; with DDDAS these sets of resources can be managed adaptively and in optimized ways. Large-Scale-Dynamic-Data encompasses the next wave of Big Data, namely dynamic data arising from ubiquitous sensing and control in engineered, natural, and societal systems, through multitudes of heterogeneous sensors and controllers instrumenting these systems, where the opportunities and challenges at these "large scales" relate not only to data size but also to the heterogeneity in data, data collection modalities, fidelities, and timescales, ranging from real-time data to archival data. In tandem with this important dimension of dynamic data, there is an extended view of Big Computing, which includes the collective computing by networked assemblies of multitudes of sensors and controllers, ranging from the high-end to the real-time, seamlessly integrated and unified, and comprising Large-Scale-Big-Computing. InfoSymbiotics/DDDAS engenders transformative impact in many application domains, ranging from the nano-scale to the terra-scale and to the extra-terra-scale. The talk will address opportunities for new capabilities together with corresponding research challenges, with illustrative examples from several application areas including environmental sciences, geosciences, and space sciences.
Sinha, Shriprakash
2016-12-01
Simulation studies in systems biology involving computational experiments dealing with Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition, who intend to work on the modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments that is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process of how (a) the collection and the transformation of the available biological information from literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) the hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help the readers get hands-on experience of a perturbation project. Description of the Matlab files is made available under the GNU GPL v3 license at the Google code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found at the latter website.
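The tutorial itself is built on Matlab and the Bayesian network toolbox; as a language-neutral illustration of what inference in a small static Bayesian network looks like, here is a pure-Python enumeration example with hypothetical nodes and probabilities (not the Wnt pathway model of the paper):

```python
from itertools import product

# Toy static Bayesian network:  Gene -> Pathway -> Marker
# P(Gene=1), P(Pathway | Gene), P(Marker | Pathway); all numbers are made up.
p_gene = {1: 0.3, 0: 0.7}
p_path = {1: {1: 0.9, 0: 0.2}, 0: {1: 0.1, 0: 0.8}}    # p_path[path][gene]
p_marker = {1: {1: 0.85, 0: 0.1}, 0: {1: 0.15, 0: 0.9}}  # p_marker[marker][path]

def joint(g, p, m):
    """Joint probability factorized along the network structure."""
    return p_gene[g] * p_path[p][g] * p_marker[m][p]

def posterior_pathway_given_marker(m_obs=1):
    """Exact inference by enumeration: P(Pathway=1 | Marker=m_obs)."""
    num = sum(joint(g, 1, m_obs) for g in (0, 1))
    den = sum(joint(g, p, m_obs) for g, p in product((0, 1), repeat=2))
    return num / den

print(posterior_pathway_given_marker(1))  # pathway belief after observing the marker
```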
Ground Contact Modeling for the Morpheus Test Vehicle Simulation
NASA Technical Reports Server (NTRS)
Cordova, Luis
2014-01-01
The Morpheus vertical test vehicle is an autonomous robotic lander being developed at Johnson Space Center (JSC) to test hazard detection technology. Because the initial ground contact simulation model was not very realistic, it was decided to improve the model without making it too computationally expensive. The first development cycle added capability to define vehicle attachment points (AP) and to keep track of their states in the lander reference frame (LFRAME). These states are used with a spring damper model to compute an AP contact force. The lateral force is then overwritten, if necessary, by the Coulomb static or kinetic friction force. The second development cycle added capability to use the PolySurface class as the contact surface. The class can load CAD data in STL (Stereo Lithography) format, and use the data to compute line of sight (LOS) intercepts. A polygon frame (PFRAME) is computed from the facet intercept normal and used to convert the AP state to PFRAME. Three flat plane tests validate the transitions from kinetic to static, static to kinetic, and vertical impact. The hazardous terrain test will be used to test for visual reasonableness. The improved model is numerically inexpensive, robust, and produces results that are reasonable.
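A minimal sketch of the attachment-point contact law described above: a spring-damper normal force combined with a Coulomb friction model that switches between static (stick) and kinetic (slip) regimes. The stiffness, damping, friction coefficients, and stick threshold are illustrative assumptions, not Morpheus simulation values.

```python
import numpy as np

def contact_force(penetration, pen_rate, v_tangential, f_lateral_demand,
                  k=5.0e4, c=2.0e3, mu_s=0.8, mu_k=0.6, v_stick=1e-3):
    """Spring-damper normal force with Coulomb static/kinetic friction.
    Returns (normal_force, tangential_force) at one attachment point."""
    if penetration <= 0.0:
        return 0.0, np.zeros(2)                       # not in contact
    fn = max(k * penetration + c * pen_rate, 0.0)     # spring-damper, no adhesion
    speed = np.linalg.norm(v_tangential)
    if speed < v_stick and np.linalg.norm(f_lateral_demand) <= mu_s * fn:
        ft = f_lateral_demand                         # static friction: stick
    else:
        direction = v_tangential / max(speed, 1e-9)
        ft = -mu_k * fn * direction                   # kinetic friction: slip
    return fn, ft

fn, ft = contact_force(penetration=0.002, pen_rate=0.05,
                       v_tangential=np.array([0.3, 0.0]),
                       f_lateral_demand=np.array([20.0, 0.0]))
print(fn, ft)
```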
Ground Contact Modeling for the Morpheus Test Vehicle Simulation
NASA Technical Reports Server (NTRS)
Cordova, Luis
2013-01-01
The Morpheus vertical test vehicle is an autonomous robotic lander being developed at Johnson Space Center (JSC) to test hazard detection technology. Because the initial ground contact simulation model was not very realistic, it was decided to improve the model without making it too computationally expensive. The first development cycle added capability to define vehicle attachment points (AP) and to keep track of their states in the lander reference frame (LFRAME). These states are used with a spring damper model to compute an AP contact force. The lateral force is then overwritten, if necessary, by the Coulomb static or kinetic friction force. The second development cycle added capability to use the PolySurface class as the contact surface. The class can load CAD data in STL (Stereo Lithography) format, and use the data to compute line of sight (LOS) intercepts. A polygon frame (PFRAME) is computed from the facet intercept normal and used to convert the AP state to PFRAME. Three flat plane tests validate the transitions from kinetic to static, static to kinetic, and vertical impact. The hazardous terrain test will be used to test for visual reasonableness. The improved model is numerically inexpensive, robust, and produces results that are reasonable.
Analysis of dynamics and fit of diving suits
NASA Astrophysics Data System (ADS)
Mahnic Naglic, M.; Petrak, S.; Gersak, J.; Rolich, T.
2017-10-01
This paper presents research on the dynamic behaviour and fit analysis of customised diving suits. Diving suit models are developed using the 3D flattening method, which enables the construction of a garment model directly on the 3D computer body model, the separation of discrete 3D surfaces, and their transformation into 2D cutting parts. 3D body scanning of male and female test subjects was performed for the analysis of body measurements in static and dynamic postures, and the processed body models were used for the construction and simulation of diving suit prototypes. All parameters necessary for 3D simulation were applied to the obtained cutting parts, together with parameter values for the mechanical properties of the neoprene material. The developed computer diving suit prototypes were used for stretch analysis on areas relevant for body dimensional changes according to dynamic anthropometrics. Garment pressure against the body in static and dynamic conditions was also analysed. The garment patterns for which the computer prototype verification was conducted were used for real prototype production. The real prototypes were also used for stretch and pressure analysis in static and dynamic conditions. Based on the obtained results, a correlation analysis between body changes in dynamic positions and the dynamic stress determined on the computer and real prototypes was performed.
Computer Skill Acquisition and Retention: The Effects of Computer-Aided Self-Explanation
ERIC Educational Resources Information Center
Chi, Tai-Yin
2016-01-01
This research presents an experimental study to determine to what extent computer skill learners can benefit from generating self-explanation with the aid of different computer-based visualization technologies. Self-explanation was stimulated with dynamic visualization (Screencast), static visualization (Screenshot), or verbal instructions only,…
An Interactive Computer-Based Training Program for Beginner Personal Computer Maintenance.
ERIC Educational Resources Information Center
Summers, Valerie Brooke
A computer-assisted instructional program, which was developed for teaching beginning computer maintenance to employees of Unisys, covered external hardware maintenance, proper diskette care, making software backups, and electro-static discharge prevention. The procedure used in developing the program was based upon the Dick and Carey (1985) model…
Hypersonic and Supersonic Static Aerodynamics of Mars Science Laboratory Entry Vehicle
NASA Technical Reports Server (NTRS)
Dyakonov, Artem A.; Schoenenberger, Mark; Vannorman, John W.
2012-01-01
This paper describes the analysis of the continuum static aerodynamics of the Mars Science Laboratory (MSL) entry vehicle (EV). The method is derived from earlier work for the Mars Exploration Rover (MER) and Mars Pathfinder (MPF), and appropriate additions are made in the areas where the physics differ from what the prior entry systems would encounter. These additions include considerations for the high angle of attack of the MSL EV, ablation of the heatshield during entry, the turbulent boundary layer, and other aspects relevant to the flight performance of MSL. Details of the work, the supporting data, and conclusions of the investigation are presented.
Tetraquark resonances computed with static lattice QCD potentials and scattering theory
NASA Astrophysics Data System (ADS)
Bicudo, Pedro; Cardoso, Marco; Peters, Antje; Pflaumer, Martin; Wagner, Marc
2018-03-01
We study tetraquark resonances with lattice QCD potentials computed for two static quarks and two dynamical quarks, the Born-Oppenheimer approximation and the emergent wave method of scattering theory. As a proof of concept we focus on systems with isospin I = 0, but consider different relative angular momenta l of the heavy b quarks. We compute the phase shifts and search for S and T matrix poles in the second Riemann sheet. We predict a new tetraquark resonance for l = 1, decaying into two B mesons, with quantum numbers I(JP) = 0(1-), mass m = 10576 +4/-4 MeV and decay width Γ = 112 +90/-103 MeV.
Hybrid massively parallel fast sweeping method for static Hamilton-Jacobi equations
NASA Astrophysics Data System (ADS)
Detrixhe, Miles; Gibou, Frédéric
2016-10-01
The fast sweeping method is a popular algorithm for solving a variety of static Hamilton-Jacobi equations. Fast sweeping algorithms for parallel computing have been developed, but are severely limited. In this work, we present a multilevel, hybrid parallel algorithm that combines the desirable traits of two distinct parallel methods. The fine and coarse grained components of the algorithm take advantage of heterogeneous computer architecture common in high performance computing facilities. We present the algorithm and demonstrate its effectiveness on a set of example problems including optimal control, dynamic games, and seismic wave propagation. We give results for convergence, parallel scaling, and show state-of-the-art speedup values for the fast sweeping method.
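For reference, a serial NumPy sketch of the basic fast sweeping update for the two-dimensional eikonal equation |∇T| = 1 (the hybrid fine/coarse parallel decomposition of the paper is not reproduced here); the grid size and source location are arbitrary.

```python
import numpy as np

def fast_sweep_eikonal(shape, sources, h=1.0, n_sweeps=4):
    """Solve |grad T| = 1 on a uniform grid by Gauss-Seidel sweeps in the
    four alternating orderings (the core of the fast sweeping method)."""
    T = np.full(shape, 1e10)
    for (i, j) in sources:
        T[i, j] = 0.0
    ny, nx = shape
    orders = [(range(ny), range(nx)), (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(n_sweeps):
        for rows, cols in orders:
            for i in rows:
                for j in cols:
                    a = min(T[i - 1, j] if i > 0 else 1e10,
                            T[i + 1, j] if i < ny - 1 else 1e10)
                    b = min(T[i, j - 1] if j > 0 else 1e10,
                            T[i, j + 1] if j < nx - 1 else 1e10)
                    if abs(a - b) >= h:               # causality: use one neighbor
                        t_new = min(a, b) + h
                    else:                             # use both neighbors
                        t_new = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    T[i, j] = min(T[i, j], t_new)
    return T

T = fast_sweep_eikonal((50, 50), sources=[(25, 25)])
print(T[25, 45])  # ~20, the grid distance from the source
```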
Computational Simulation of Composite Structural Fatigue
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)
2005-01-01
Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.
Computational Simulation of Composite Structural Fatigue
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
2004-01-01
Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.
On the use of interaction error potentials for adaptive brain computer interfaces.
Llera, A; van Gerven, M A J; Gómez, V; Jensen, O; Kappen, H J
2011-12-01
We propose an adaptive classification method for Brain Computer Interfaces (BCI) which uses Interaction Error Potentials (IErrPs) as a reinforcement signal and adapts the classifier parameters when an error is detected. We analyze the quality of the proposed approach in relation to the misclassification of the IErrPs. In addition, we compare static versus adaptive classification performance using artificial and MEG data. We show that the proposed adaptive framework significantly improves on the static classification methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
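In the spirit of the method, the sketch below shows a linear classifier whose weights are nudged only when an interaction error potential flags the last decision as wrong; the feature dimension, learning rate, and update rule are simplified assumptions rather than the authors' actual classifier.

```python
import numpy as np

class ErrPAdaptiveClassifier:
    """Linear BCI classifier that adapts only when an interaction error
    potential (IErrP) signals that the last decision was wrong."""
    def __init__(self, n_features, eta=0.05):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.eta = eta

    def predict(self, x):
        return 1 if x @ self.w + self.b > 0 else -1

    def feedback(self, x, y_pred, errp_detected):
        # If an IErrP is detected, treat the decision as wrong and move the
        # boundary toward the inferred correct class (perceptron-style step).
        if errp_detected:
            y_inferred = -y_pred
            self.w += self.eta * y_inferred * x
            self.b += self.eta * y_inferred

clf = ErrPAdaptiveClassifier(n_features=8)
x = np.random.randn(8)
y_hat = clf.predict(x)
clf.feedback(x, y_hat, errp_detected=True)   # adapt after a detected error
```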
PIP-II Cryogenic System and the Evolution of Superfluid Helium Cryogenic Plant Specifications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakravarty, Anindya; Rane, Tejas; Klebaner, Arkadiy
2017-01-01
The PIP-II cryogenic system consists of a Superfluid Helium Cryogenic Plant (SHCP) and a Cryogenic Distribution System (CDS) connecting the SHCP to the SC Linac (25 cryomodules). The static and dynamic heat loads for the SC Linac and the static load of the CDS are listed. A simulation study was carried out to compute the superfluid helium (SHe) flow requirements for each cryomodule, and a comparison between the flow requirements of the cryomodules for the CW and pulsed modes of operation is presented. From the computed heat load and pressure drop values, the basic SHCP specifications were evolved.
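For steady conditions, the flow-requirement computation referred to reduces to dividing each cryomodule's heat load by the usable enthalpy change of the helium stream; the numbers in the sketch below are purely illustrative, not PIP-II design values.

```python
# Steady-state helium mass flow needed to absorb a cryomodule heat load:
#   m_dot = Q / (h_out - h_in)
# All numbers are illustrative placeholders, not PIP-II values.
Q_static = 15.0     # W, assumed static heat load of one cryomodule
Q_dynamic = 25.0    # W, assumed dynamic (RF) load, e.g. CW operation
dh = 2.0e4          # J/kg, assumed usable enthalpy change of the helium stream

m_dot = (Q_static + Q_dynamic) / dh
print(f"required helium flow ~ {m_dot * 1e3:.1f} g/s per cryomodule")
```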
NASA Technical Reports Server (NTRS)
Fertis, D. G.; Simon, A. L.
1981-01-01
The requisite methodology to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine is developed. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the man and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with interactive graphical display for pre- and postprocessing capability.
Unsteady Aerodynamic Validation Experiences From the Aeroelastic Prediction Workshop
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Chwalowski, Pawel
2014-01-01
The AIAA Aeroelastic Prediction Workshop (AePW) was held in April 2012, bringing together communities of aeroelasticians, computational fluid dynamicists and experimentalists. The extended objective was to assess the state of the art in computational aeroelastic methods as practical tools for the prediction of static and dynamic aeroelastic phenomena. As a step in this process, workshop participants analyzed unsteady aerodynamic and weakly-coupled aeroelastic cases. Forced oscillation and unforced system experiments and computations have been compared for three configurations. This paper emphasizes interpretation of the experimental data, computational results and their comparisons from the perspective of validation of unsteady system predictions. The issues examined in detail are variability introduced by input choices for the computations, post-processing, and static aeroelastic modeling. The final issue addressed is interpreting unsteady information that is present in experimental data that is assumed to be steady, and the resulting consequences on the comparison data sets.
Profile of student critical thinking ability on static fluid concept
NASA Astrophysics Data System (ADS)
Sulasih; Suparmi, A.; Sarwanto
2017-11-01
Critical thinking ability is an important part of educational goals. It involves higher complex processes, such as analyzing, synthesizing and evaluating, drawing conclusions, and reflection. This study is aimed at determining the critical thinking ability of senior high school students in learning static fluids. This research uses the descriptive method, with instruments based on the indicators of critical thinking ability developed according to Ennis. The population of this research is the XIth grade science class of the Public Senior High School SMA N 1, Sambungmacan, Sragen, Central Java. The static fluid teaching material is delivered using the Problem Based Learning Model through class experiments. The results of this study show that the XIth grade science students on average have high critical thinking skills, particularly in the abilities of providing simple explanations, building basic skills, and providing advanced explanations, but not high enough in the abilities of drawing conclusions and in the strategy and tactics component of critical thinking for the static fluid teaching material. The average students' critical thinking ability score is 72.94, with 27.94% of students in the low category and 72.22% of students in the high category of critical thinking ability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feo, J.T.
1993-10-01
This report contains papers on: Programmability and performance issues; The case of an iterative partial differential equation solver; Implementing the kernel of the Australian Region Weather Prediction Model in Sisal; Even and quarter-even prime length symmetric FFTs and their Sisal implementations; Top-down thread generation for Sisal; Overlapping communications and computations on NUMA architectures; Compiling technique based on dataflow analysis for the functional programming language Valid; Copy elimination for true multidimensional arrays in Sisal 2.0; Increasing parallelism for an optimization that reduces copying in IF2 graphs; Caching in on Sisal; Cache performance of Sisal vs. FORTRAN; FFT algorithms on a shared-memory multiprocessor; A parallel implementation of nonnumeric search problems in Sisal; Computer vision algorithms in Sisal; Compilation of Sisal for a high-performance data driven vector processor; Sisal on distributed memory machines; A virtual shared addressing system for distributed memory Sisal; Developing a high-performance FFT algorithm in Sisal for a vector supercomputer; Implementation issues for IF2 on a static data-flow architecture; and Systematic control of parallelism in array-based data-flow computation. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Bennett, Robert M.
1990-01-01
The CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code, developed at the NASA Langley Research Center, is applied to the Active Flexible Wing (AFW) wind tunnel model for prediction of the model's transonic aeroelastic behavior. Static aeroelastic solutions using CAP-TSD are computed. Dynamic (flutter) analyses are then performed as perturbations about the static aeroelastic deformations of the AFW. The accuracy of the static aeroelastic procedure is investigated by comparing analytical results to those from previous AFW wind tunnel experiments. Dynamic results are presented in the form of root loci at different Mach numbers for a heavy gas and air. The resultant flutter boundaries for both gases are also presented. The effects of viscous damping and angle of attack on the flutter boundary in air are presented as well.
NASA Astrophysics Data System (ADS)
Chien, Yu-Ta; Chang, Chun-Yen
2012-02-01
This study developed three forms of computer-based multimedia, including Static Graphics (SG), Simple Learner-Pacing Animation (SLPA), and Full Learner-Pacing Animation (FLPA), to assist students in learning topographic measuring. The interactive design of FLPA allowed students to physically manipulate the virtual measuring mechanism, rather than passively observe dynamic or static images. The students were randomly assigned to different multimedia groups. The results of a one-way ANOVA analysis indicated that (1) there was a significant difference with a large effect size (f = .69) in mental effort ratings among the three groups, and the post-hoc test indicated that FLPA imposed less cognitive load on students than did SG (p = .007); (2) the differences in practical performance scores among the groups were statistically significant with a large effect size (f = .76), and the post-hoc test indicated that FLPA fostered better learning outcomes than both SLPA and SG (p = .004 and p = .05, respectively); (3) the difference in instructional efficiency, computed by the z-score combination of students' mental effort ratings and practical performance scores, among the three groups was statistically significant with a large effect size (f = .79), and the post-hoc test indicated that FLPA brought students higher instructional efficiency than both SLPA and SG (p = .01 and .005, respectively); (4) no significant effect was found in instructional time-spans between groups (p = .637). Overall, FLPA was recommended as the best multimedia form to facilitate topographic measurement learning. The implications of instructional multimedia design were discussed from the perspective of cognitive load theory.
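The "z-score combination" used for instructional efficiency is presumably the standard relative-efficiency measure of Paas and van Merrienboer, E = (z_performance - z_effort) / sqrt(2); a small sketch with hypothetical scores:

```python
import numpy as np

def instructional_efficiency(performance, mental_effort):
    """Relative instructional efficiency per learner:
    E = (z_performance - z_effort) / sqrt(2), with z-scores computed over the
    pooled sample (the usual Paas & van Merrienboer formulation)."""
    zp = (performance - np.mean(performance)) / np.std(performance, ddof=1)
    ze = (mental_effort - np.mean(mental_effort)) / np.std(mental_effort, ddof=1)
    return (zp - ze) / np.sqrt(2)

# Hypothetical scores for a few learners (not the study's data)
perf = np.array([70, 85, 90, 60, 75], dtype=float)
effort = np.array([6, 4, 3, 7, 5], dtype=float)
print(instructional_efficiency(perf, effort))
```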
Student Leadership in Small Group Science Inquiry
ERIC Educational Resources Information Center
Oliveira, Alandeom W.; Boz, Umit; Broadwell, George A.; Sadler, Troy D.
2014-01-01
Background: Science educators have sought to structure collaborative inquiry learning through the assignment of static group roles. This structural approach to student grouping oversimplifies the complexities of peer collaboration and overlooks the highly dynamic nature of group activity. Purpose: This study addresses this issue of…
Liu, Jun; Zhang, Liqun; Cao, Dapeng; Wang, Wenchuan
2009-12-28
Polymer nanocomposites (PNCs) often exhibit excellent mechanical, thermal, electrical and optical properties, because they combine the performances of both polymers and inorganic or organic nanoparticles. Recently, computer modeling and simulation are playing an important role in exploring the reinforcement mechanism of the PNCs and even the design of functional PNCs. This report provides an overview of the progress made in past decades in the investigation of the static, rheological and mechanical properties of polymer nanocomposites studied by computer modeling and simulation. Emphases are placed on exploring the mechanisms at the molecular level for the dispersion of nanoparticles in nanocomposites, the effects of nanoparticles on chain conformation and glass transition temperature (T(g)), as well as viscoelastic and mechanical properties. Finally, some future challenges and opportunities in computer modeling and simulation of PNCs are addressed.
14 CFR 29.725 - Limit drop test.
Code of Federal Regulations, 2011 CFR
2011-01-01
....), equal to the static reaction on the particular unit with the rotorcraft in the most critical attitude. A rational method may be used in computing a main gear static reaction, taking into consideration the moment arm between the main wheel reaction and the rotorcraft center of gravity. W = W_N for nose gear units...
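The "rational method" the rule refers to amounts to a moment balance about the nose gear contact point; the sketch below uses hypothetical weight and geometry, not values from any certification basis.

```python
# Static reactions of a tricycle rotorcraft gear from a simple moment balance.
# All dimensions and the weight are hypothetical, for illustration only.
W = 7000.0          # rotorcraft weight, lb
wheelbase = 300.0   # in, nose gear contact to main gear contact
a_cg = 250.0        # in, nose gear contact to center of gravity

W_main_total = W * a_cg / wheelbase   # moment balance about the nose gear contact
W_nose = W - W_main_total             # remainder carried by the nose gear
W_per_main = W_main_total / 2.0       # split between the two main gear units
print(W_nose, W_per_main)
```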
14 CFR 29.725 - Limit drop test.
Code of Federal Regulations, 2010 CFR
2010-01-01
....), equal to the static reaction on the particular unit with the rotorcraft in the most critical attitude. A rational method may be used in computing a main gear static reaction, taking into consideration the moment arm between the main wheel reaction and the rotorcraft center of gravity. W = W_N for nose gear units...
Visualizing risks in cancer communication: A systematic review of computer-supported visual aids.
Stellamanns, Jan; Ruetters, Dana; Dahal, Keshav; Schillmoeller, Zita; Huebner, Jutta
2017-08-01
Health websites are becoming important sources for cancer information. Lay users, patients and carers seek support for critical decisions, but they are prone to common biases when quantitative information is presented. Graphical representations of risk data can facilitate comprehension, and interactive visualizations are popular. This review summarizes the evidence on computer-supported graphs that present risk data and their effects on various measures. The systematic literature search was conducted in several databases, including MEDLINE, EMBASE and CINAHL. Only studies with a controlled design were included. Relevant publications were carefully selected and critically appraised by two reviewers. Thirteen studies were included. Ten studies evaluated static graphs and three dynamic formats. Most decision scenarios were hypothetical. Static graphs could improve accuracy, comprehension, and behavioural intention. But the results were heterogeneous and inconsistent among the studies. Dynamic formats were not superior or even impaired performance compared to static formats. Static graphs show promising but inconsistent results, while research on dynamic visualizations is scarce and must be interpreted cautiously due to methodical limitations. Well-designed and context-specific static graphs can support web-based cancer risk communication in particular populations. The application of dynamic formats cannot be recommended and needs further research. Copyright © 2017 Elsevier B.V. All rights reserved.
Structural integrity of a confinement vessel for testing nuclear fuels for space propulsion
NASA Astrophysics Data System (ADS)
Bergmann, V. L.
Nuclear propulsion systems for rockets could significantly reduce the travel time to distant destinations in space. However, long before such a concept can become reality, a significant effort must be invested in analysis and ground testing to guide the development of nuclear fuels. Any testing in support of development of nuclear fuels for space propulsion must be safely contained to prevent the release of radioactive materials. This paper describes analyses performed to assess the structural integrity of a test confinement vessel. The confinement structure, a stainless steel pressure vessel with bolted flanges, was designed for operating static pressures in accordance with the ASME Boiler and Pressure Vessel Code. In addition to the static operating pressures, the confinement barrier must withstand static overpressures from off-normal conditions without releasing radioactive material. Results from axisymmetric finite element analyses are used to evaluate the response of the confinement structure under design and accident conditions. For the static design conditions, the stresses computed from the ASME code are compared with the stresses computed by the finite element method.
Heredia, Alejandro; Colín-García, María; Puig, Teresa Pi I; Alba-Aldave, Leticia; Meléndez, Adriana; Cruz-Castañeda, Jorge A; Basiuk, Vladimir A; Ramos-Bernal, Sergio; Mendoza, Alicia Negrón
2017-12-01
Ionizing radiation may have played a relevant role in chemical reactions for prebiotic biomolecule formation on ancient Earth. Environmental conditions such as the presence of water and magnetic fields were possibly relevant in the formation of organic compounds such as amino acids. ATR-FTIR, Raman, EPR and X-ray spectroscopies provide valuable information about the molecular organization of different glycine polymorphs under static magnetic fields. γ-glycine polymorph formation increases in irradiated samples interacting with static magnetic fields. The increase in the γ-glycine polymorph agrees with the computer simulations. The AM1 semi-empirical simulations show a change in the catalyst behavior and dipole moment values in the interaction of α- and γ-glycine with the static magnetic field. The simulated crystal lattice energy in α-glycine is also affected by the free radicals under the magnetic field, which decreases its stability. Therefore, solid α- and γ-glycine containing free radicals under static magnetic fields might have affected the prebiotic scenario on ancient Earth by causing the oligomerization of glycine in prebiotic reactions. Copyright © 2017 Elsevier B.V. All rights reserved.
The 1991 International Aerospace and Ground Conference on Lightning and Static Electricity, volume 2
NASA Technical Reports Server (NTRS)
1991-01-01
The proceedings of the conference are reported. The conference focussed on lightning protection, detection, and forecasting. The conference was divided into 26 sessions based on research in lightning, static electricity, modeling, and mapping. These sessions spanned the spectrum from basic science to engineering, concentrating on lightning prediction and detection and on safety for ground facilities, aircraft, and aerospace vehicles.
The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy
ERIC Educational Resources Information Center
Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce
2013-01-01
There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional methods. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…
Hybrid massively parallel fast sweeping method for static Hamilton–Jacobi equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Detrixhe, Miles, E-mail: mdetrixhe@engineering.ucsb.edu; University of California Santa Barbara, Santa Barbara, CA, 93106; Gibou, Frédéric, E-mail: fgibou@engineering.ucsb.edu
The fast sweeping method is a popular algorithm for solving a variety of static Hamilton–Jacobi equations. Fast sweeping algorithms for parallel computing have been developed, but are severely limited. In this work, we present a multilevel, hybrid parallel algorithm that combines the desirable traits of two distinct parallel methods. The fine and coarse grained components of the algorithm take advantage of heterogeneous computer architecture common in high performance computing facilities. We present the algorithm and demonstrate its effectiveness on a set of example problems including optimal control, dynamic games, and seismic wave propagation. We give results for convergence, parallel scaling, and show state-of-the-art speedup values for the fast sweeping method.
The Role of Trust in Information Science and Technology.
ERIC Educational Resources Information Center
Marsh, Stephen; Dibben, Mark R.
2003-01-01
Discusses the notion of trust as it relates to information science and technology, specifically user interfaces, autonomous agents, and information systems. Highlights include theoretical meaning of trust; trust and levels of analysis, including organizational trust; electronic commerce, user interfaces, and static trust; dynamic trust; and trust…
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zubko, I. Yu., E-mail: zoubko@list.ru; Kochurov, V. I.
2015-10-27
For the aim of crystal temperature control, a computational-statistical approach to studying the thermo-mechanical properties of finite-sized crystals is presented. The approach is based on the combination of high-performance computational techniques and statistical analysis of the crystal response to external thermo-mechanical actions for specimens with a statistically small number of atoms (for instance, nanoparticles). The heat motion of atoms is imitated in the statics approach by including independent degrees of freedom for the atoms associated with their oscillations. We found that, under heating, the graphene material response is nonsymmetric.
Computational Investigation of the Aerodynamic Effects on Fluidic Thrust Vectoring
NASA Technical Reports Server (NTRS)
Deere, K. A.
2000-01-01
A computational investigation of the aerodynamic effects on fluidic thrust vectoring has been conducted. Three-dimensional simulations of a two-dimensional, convergent-divergent (2DCD) nozzle with fluidic injection for pitch vector control were run with the computational fluid dynamics code PAB using turbulence closure and linear Reynolds stress modeling. Simulations were computed with static freestream conditions (M=0.05) and at Mach numbers from M=0.3 to 1.2, with scheduled nozzle pressure ratios (from 3.6 to 7.2) and secondary to primary total pressure ratios of p(sub t,s)/p(sub t,p)=0.6 and 1.0. Results indicate that the freestream flow decreases vectoring performance and thrust efficiency compared with static (wind-off) conditions. The aerodynamic penalty to thrust vector angle ranged from 1.5 degrees at a nozzle pressure ratio of 6 with M=0.9 freestream conditions to 2.9 degrees at a nozzle pressure ratio of 5.2 with M=0.7 freestream conditions, compared to the same nozzle pressure ratios with static freestream conditions. The aerodynamic penalty to thrust ratio decreased from 4 percent to 0.8 percent as nozzle pressure ratio increased from 3.6 to 7.2. As expected, the freestream flow had little influence on discharge coefficient.
Assessment of the Reconstructed Aerodynamics of the Mars Science Laboratory Entry Vehicle
NASA Technical Reports Server (NTRS)
Schoenenberger, Mark; Van Norman, John W.; Dyakonov, Artem A.; Karlgaard, Christopher D.; Way, David W.; Kutty, Prasad
2013-01-01
On August 5, 2012, the Mars Science Laboratory entry vehicle successfully entered Mars atmosphere, flying a guided entry until parachute deploy. The Curiosity rover landed safely in Gale crater upon completion of the Entry Descent and Landing sequence. This paper compares the aerodynamics of the entry capsule extracted from onboard flight data, including Inertial Measurement Unit (IMU) accelerometer and rate gyro information, and heatshield surface pressure measurements. From the onboard data, static force and moment data has been extracted. This data is compared to preflight predictions. The information collected by MSL represents the most complete set of information collected during Mars entry to date. It allows the separation of aerodynamic performance from atmospheric conditions. The comparisons show the MSL aerodynamic characteristics have been identified and resolved to an accuracy better than the aerodynamic database uncertainties used in preflight simulations. A number of small anomalies have been identified and are discussed. This data will help revise aerodynamic databases for future missions and will guide computational fluid dynamics (CFD) development to improved prediction codes.
Influence of the Angle of Attack on the Aerothermodynamics of the Mars Science Laboratory
NASA Technical Reports Server (NTRS)
Dyakonov, Artem A.; Edquist, Karl T.; Schoenenberger, Mark
2006-01-01
An investigation of the effects of the incidence angle on the aerothermodynamic environments of the Mars Science Laboratory has been conducted. Flight conditions at peak heating, peak deceleration and chute deploy are selected, and the effects of the angle of attack on the aerodynamics and aerothermodynamics are analyzed. The investigation found that the static aerodynamics are well behaved within the considered range of incidence angles. Leeside laminar and turbulent computed heating rates decrease with incidence, despite the increase in the leeside running length. The stagnation point was found to stay on the conical flank at all angles of attack, and this is linked to the rapid flow expansion around the shoulder. The hypersonic lift-to-drag ratio is limited by the heating rates in the region of the windside shoulder. The effects of the high angle of incidence on the dynamic aerodynamics at low Mach numbers remain to be determined. The influence of the angle of attack on the smooth-wall transition parameter indicates that higher angle-of-attack flight may result in delayed turbulence onset; however, a coupled analysis involving flight trajectory simulation is necessary.
Two Eyes, 3D: A New Project to Study Stereoscopy in Astronomy Education
NASA Astrophysics Data System (ADS)
Price, Aaron; SubbaRao, M.; Wyatt, R.
2012-01-01
"Two Eyes, 3D" is a 3-year NSF funded research project to study the educational impacts of using stereoscopic representations in informal settings. The project funds two experimental studies. The first is focused on how children perceive various spatial qualities of scientific objects displayed in static 2D and 3D formats. The second is focused on how adults perceive various spatial qualities of scientific objects and processes displayed in 2D and 3D movie formats. As part of the project, two brief high-definition films about variable stars will be developed. Both studies will be mixed-method and look at prior spatial ability and other demographic variables as covariates. The project is run by the American Association of Variable Star Observers, Boston Museum of Science and the Adler Planetarium and Astronomy Museum with consulting from the California Academy of Sciences. Early pilot results will be presented. All films will be released into the public domain, as will the assessment software designed to run on tablet computers (iOS or Android).
NASA Astrophysics Data System (ADS)
Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.
2015-12-01
A Web-Based Computer System (RPM-WEBBSYS) has been developed for the application of the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery processes that occur during well completion. RPM-WEBBSYS has been programmed using advances in information technology to perform SFT computations more efficiently. RPM-WEBBSYS can be easily and rapidly executed by using any computing device (e.g., personal computers and portable computing devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where a good match between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical problems of under- and over-estimation of the SFT (exhibited by most of the existing analytical methods) were effectively corrected.
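In spirit, the RPM fits measured bottomhole temperatures against shut-in time with a rational function and takes the long-time asymptote as the static formation temperature. A generic curve-fit sketch follows; the functional form and the build-up data are illustrative assumptions, not the exact RPM formulation or data of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def rational_T(t, sft, a, b):
    """Rational-polynomial recovery model: T(t) -> sft as shut-in time t -> infinity."""
    return (sft * t + a) / (t + b)

# Hypothetical BHT build-up data: shut-in time (h) vs measured temperature (deg C)
t = np.array([2., 4., 6., 8., 12., 18., 24.])
bht = np.array([95., 105., 111., 115., 120., 124., 126.])

popt, _ = curve_fit(rational_T, t, bht, p0=[130.0, 200.0, 3.0])
print(f"estimated static formation temperature ~ {popt[0]:.1f} deg C")
```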
Pheromone Static Routing Strategy for Complex Networks
NASA Astrophysics Data System (ADS)
Hu, Mao-Bin; Henry, Y. K. Lau; Ling, Xiang; Jiang, Rui
2012-12-01
We adopt the concept of using pheromones to generate a set of static paths that can reach the performance of the global dynamic routing strategy [Phys. Rev. E 81 (2010) 016113]. The path generation method consists of two stages. In the first stage, pheromone is deposited on the nodes by packets forwarded according to the global dynamic routing strategy. In the second stage, pheromone static paths are generated according to the pheromone density. The output paths can greatly improve the overall capacity of traffic systems on different network structures, including scale-free networks, small-world networks and random graphs. Because the paths are static, the system needs far fewer computational resources than the global dynamic routing strategy.
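A sketch of the two-stage idea: packets routed by a stand-in for the dynamic strategy deposit pheromone on the nodes they visit, and static paths are then derived from the resulting pheromone densities. The networkx graph, the shortest-path proxy for the dynamic routing, and the pheromone-to-weight mapping are all illustrative assumptions.

```python
import random
import networkx as nx

G = nx.erdos_renyi_graph(50, 0.1, seed=1)
pheromone = {n: 1.0 for n in G.nodes}

# Stage 1: packets forwarded by a (stand-in) dynamic routing strategy deposit
# pheromone on every node they pass through.
for _ in range(500):
    s, t = random.sample(list(G.nodes), 2)
    try:
        path = nx.shortest_path(G, s, t)   # proxy for the global dynamic routing
    except nx.NetworkXNoPath:
        continue
    for n in path:
        pheromone[n] += 1.0

# Stage 2: generate static paths according to the pheromone density by making
# heavily used (high-pheromone) nodes cheap to traverse.
for u, v in G.edges:
    G[u][v]["w"] = 1.0 / (pheromone[u] + pheromone[v])

comp = max(nx.connected_components(G), key=len)
src, dst = min(comp), max(comp)
print(nx.shortest_path(G, src, dst, weight="w"))
```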
Adaptive function allocation reduces performance costs of static automation
NASA Technical Reports Server (NTRS)
Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian
1993-01-01
Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.
Measuring Cognitive Load in Test Items: Static Graphics versus Animated Graphics
ERIC Educational Resources Information Center
Dindar, M.; Kabakçi Yurdakul, I.; Inan Dönmez, F.
2015-01-01
The majority of multimedia learning studies focus on the use of graphics in learning process but very few of them examine the role of graphics in testing students' knowledge. This study investigates the use of static graphics versus animated graphics in a computer-based English achievement test from a cognitive load theory perspective. Three…
[Three-dimensional reconstruction of functional brain images].
Inoue, M; Shoji, K; Kojima, H; Hirano, S; Naito, Y; Honjo, I
1999-08-01
We consider PET (positron emission tomography) measurement with SPM (Statistical Parametric Mapping) analysis to be one of the most useful methods to identify activated areas of the brain involved in language processing. SPM is an effective analytical method that detects markedly activated areas over the whole brain. However, conventional presentations of these functional brain images, such as horizontal slices, three-directional projections, or brain surface coloring, make understanding and interpreting the positional relationships among various brain areas difficult. Therefore, we developed three-dimensionally reconstructed images from these functional brain images to improve their interpretation. The subjects were 12 normal volunteers. The following three types of images were constructed: 1) routine images by SPM, 2) three-dimensional static images, and 3) three-dimensional dynamic images, after PET images were analyzed by SPM during daily dialog listening. The creation of both the three-dimensional static and dynamic images employed the volume rendering method of VTK (The Visualization Toolkit). Since the functional brain images did not include original brain images, we synthesized SPM and MRI brain images by self-made C++ programs. The three-dimensional dynamic images were made by sequencing static images with available software. Images of both the three-dimensional static and dynamic types were processed by a personal computer system. Our newly created images showed clearer positional relationships among activated brain areas compared to the conventional method. To date, functional brain images have been employed in fields such as neurology or neurosurgery; however, these images may be useful even in the field of otorhinolaryngology, to assess hearing and speech. Exact three-dimensional images based on functional brain images are important for exact and intuitive interpretation, and may lead to new developments in brain science. Currently, the surface model is the most common method of three-dimensional display. However, the volume rendering method may be more effective for imaging regions such as the brain.
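The synthesis step described (combining the SPM activation map with the anatomical MRI before volume rendering) can be illustrated by a simple voxel-wise fusion; the array shapes, threshold, and coloring below are arbitrary assumptions, and the actual pipeline used VTK volume rendering and C++ programs.

```python
import numpy as np

def fuse_volumes(anatomy, t_map, t_threshold=3.0):
    """Fuse an anatomical MRI volume with an SPM-style t-map into an RGB volume:
    grayscale anatomy everywhere, supra-threshold activation tinted red."""
    gray = (anatomy - anatomy.min()) / (np.ptp(anatomy) + 1e-9)
    rgb = np.stack([gray, gray, gray], axis=-1)
    active = t_map > t_threshold
    rgb[active, 0] = 1.0          # boost the red channel where activation is significant
    rgb[active, 1] *= 0.3
    rgb[active, 2] *= 0.3
    return rgb

# Hypothetical 64^3 volumes standing in for the MRI and the SPM t-map
anatomy = np.random.rand(64, 64, 64)
t_map = np.zeros((64, 64, 64))
t_map[30:40, 30:40, 30:40] = 5.0
fused = fuse_volumes(anatomy, t_map)
print(fused.shape)   # (64, 64, 64, 3), ready to hand to a volume renderer
```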
Developing a Repertoire of Activities for Teaching Physical Science.
ERIC Educational Resources Information Center
Cain, Peggy W.
This activity manual is divided into 15 units which focus on: the nature of science; metric measurements; properties of matter; energy; atomic structure; chemical reactions; acids, bases, and salts; temperature and heat; radioactivity; mechanics; wave motion, sound, and light; static charges and current electricity; magnetism and electromagnetism;…
NASA Technical Reports Server (NTRS)
White, C. W.
1981-01-01
The computational efficiency of the impedance type loads prediction method was studied. Three goals were addressed: devise a method to make the impedance method operate more efficiently in the computer; assess the accuracy and convenience of the method for determining the effect of design changes; and investigate the use of the method to identify design changes for reduction of payload loads. The method is suitable for calculation of dynamic response in either the frequency or time domain. It is concluded that: the choice of an orthogonal coordinate system will allow the impedance method to operate more efficiently in the computer; the approximate mode impedance technique is adequate for determining the effect of design changes, and is applicable for both statically determinate and statically indeterminate payload attachments; and beneficial design changes to reduce payload loads can be identified by the combined application of impedance techniques and energy distribution review techniques.
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated for. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
NASA Astrophysics Data System (ADS)
Lai, Tianwei; Fu, Bao; Chen, Shuangtao; Zhang, Qiyong; Hou, Yu
2017-02-01
The EAST superconducting tokamak, an advanced steady-state plasma physics experimental device, has been built at the Institute of Plasma Physics, Chinese Academy of Sciences. All the toroidal field magnets and poloidal field magnets, made of NbTi/Cu cable-in-conduit conductor, are cooled with forced flow supercritical helium at 3.8 K. The cryogenic system of EAST consists of a 2 kW/4 K helium refrigerator and a helium distribution system for the cooling of coils, structures, thermal shields, bus-lines, etc. The high-speed turbo-expander is an important refrigerating component of the EAST cryogenic system. In the turbo-expander, the axial supporting technology is critical for the smooth operation of the rotor bearing system. In this paper, hydrostatic thrust bearings are designed based on the axial load of the turbo-expander. Thereafter, a computational fluid dynamics-based numerical model of the aerostatic thrust bearing is set up to evaluate the bearing performance. Tilting effect on the pressure distribution and bearing load is analyzed for the thrust bearing. Bearing load and stiffness are compared with different static supply pressures. The net force from the thrust bearings can be calculated for different combinations of bearing clearance and supply pressure.
High Temperature Composite Analyzer (HITCAN) demonstration manual, version 1.0
NASA Technical Reports Server (NTRS)
Singhal, S. N; Lackney, J. J.; Murthy, P. L. N.
1993-01-01
This manual comprises a variety of demonstration cases for the HITCAN (HIgh Temperature Composite ANalyzer) code. HITCAN is a general purpose computer program for predicting nonlinear global structural and local stress-strain response of arbitrarily oriented, multilayered high temperature metal matrix composite structures. HITCAN is written in FORTRAN 77 computer language and has been configured and executed on the NASA Lewis Research Center CRAY XMP and YMP computers. Detailed description of all program variables and terms used in this manual may be found in the User's Manual. The demonstration includes various cases to illustrate the features and analysis capabilities of the HITCAN computer code. These cases include: (1) static analysis, (2) nonlinear quasi-static (incremental) analysis, (3) modal analysis, (4) buckling analysis, (5) fiber degradation effects, (6) fabrication-induced stresses for a variety of structures; namely, beam, plate, ring, shell, and built-up structures. A brief discussion of each demonstration case with the associated input data file is provided. Sample results taken from the actual computer output are also included.
NASA Technical Reports Server (NTRS)
Tanner, John A.
1996-01-01
A computational procedure is presented for the solution of frictional contact problems for aircraft tires. A Space Shuttle nose-gear tire is modeled using a two-dimensional laminated anisotropic shell theory which includes the effects of variations in material and geometric parameters, transverse-shear deformation, and geometric nonlinearities. Contact conditions are incorporated into the formulation by using a perturbed Lagrangian approach with the fundamental unknowns consisting of the stress resultants, the generalized displacements, and the Lagrange multipliers associated with both contact and friction conditions. The contact-friction algorithm is based on a modified Coulomb friction law. A modified two-field, mixed-variational principle is used to obtain elemental arrays. This modification consists of augmenting the functional of that principle by two terms: the Lagrange multiplier vector associated with normal and tangential node contact-load intensities and a regularization term that is quadratic in the Lagrange multiplier vector. These capabilities and computational features are incorporated into an in-house computer code. Experimental measurements were taken to define the response of the Space Shuttle nose-gear tire to inflation-pressure loads and to inflation-pressure loads combined with normal static loads against a rigid flat plate. These experimental results describe the meridional growth of the tire cross section caused by inflation loading, the static load-deflection characteristics of the tire, the geometry of the tire footprint under static loading conditions, and the normal and tangential load-intensity distributions in the tire footprint for the various static vertical loading conditions. Numerical results were obtained for the Space Shuttle nose-gear tire subjected to inflation pressure loads and combined inflation pressure and contact loads against a rigid flat plate. The experimental measurements and the numerical results are compared.
Biologically inspired intelligent robots
NASA Astrophysics Data System (ADS)
Bar-Cohen, Yoseph; Breazeal, Cynthia
2003-07-01
Humans throughout history have always sought to mimic the appearance, mobility, functionality, intelligent operation, and thinking process of biological creatures. This field of biologically inspired technology, known by the moniker biomimetics, has evolved from making static copies of humans and animals in the form of statues to the emergence of robots that operate with realistic behavior. Imagine a person walking towards you when suddenly you notice something weird about him--he is not real but rather he is a robot. Your reaction would probably be "I can't believe it but this robot looks very real," just as you would react to an artificial flower that is a good imitation. You may even proceed to touch the robot to check if your assessment is correct but, as opposed to the flower case, the robot may be programmed to respond physically and verbally. This science fiction scenario could become a reality as the current trend in developing biologically inspired technologies continues. Technology evolution has led to such fields as artificial muscles, artificial intelligence, and artificial vision, as well as biomimetic capabilities in materials science, mechanics, electronics, computing science, information technology and many others. This paper will review the state of the art and challenges of biologically inspired technologies and the role that EAP is expected to play as the technology evolves.
Learning about static electricity and magnetism in a fourth-grade classroom
NASA Astrophysics Data System (ADS)
Henry, David Roy
Students begin to develop mental models to explain electrostatic and magnetic phenomena throughout childhood, middle childhood and high school, although these mental models are often incoherent and unscientific (Borges, Tenico, & Gilbert, 1998; Maloney, 1985). This is a case study of a classroom of grade four students and the mental models of magnetism and static electricity they used during a six-week science unit. The 22 students studied magnetism and static electricity using inquiry activities structured to create an environment where students would be likely to construct powerful scientific ideas (Goldberg & Bendall, 1995). Multiple data sources, including students' writing, student assessments, teacher interviews, student interviews, teacher journals, and classroom video and audio recordings, were used to uncover how fourth grade students made sense of static electricity and magnetism before, during, and after instruction. The data were analyzed using a social constructivist framework to determine if students were able to develop target scientific ideas about static electricity and magnetism. In general, students were found to have three core mental models prior to instruction: (1) Static electricity and magnetism are the same "substance"; (2) This substance exists on the surface of a magnet or a charged object and can be rubbed off; and (3) Opposite substances attract. During the activities, students had many opportunities to observe evidence that contradicted these core mental models. Using evidence from direct observations, the students practiced differentiating between evidence and ideas. Through group and class discussions, they developed evidence-based (scientific) ideas. Final assessments revealed that students were able to construct target ideas such as: (1) static electricity and magnetism are fundamentally different; (2) there are two kinds of static "charge;" (3) magnet-rubbed wires act like a magnet; and (4) opposite substances move toward each other, while like substances push away from each other. Some target ideas, such as "Magnetic materials are made up of magnetic domains that align to give an overall magnetic effect," were found to be difficult for students this age to develop. This case study will augment research about effective science teaching, teacher development and the support necessary for curriculum change.
Creating executable architectures using Visual Simulation Objects (VSO)
NASA Astrophysics Data System (ADS)
Woodring, John W.; Comiskey, John B.; Petrov, Orlin M.; Woodring, Brian L.
2005-05-01
Investigations have been performed to identify a methodology for creating executable models of architectures and simulations of architecture that lead to an understanding of their dynamic properties. Colored Petri Nets (CPNs) are used to describe architecture because of their strong mathematical foundations, the existence of techniques for their verification and graph theory's well-established history of success in modern science. CPNs have been extended to interoperate with legacy simulations via a High Level Architecture (HLA) compliant interface. It has also been demonstrated that an architecture created as a CPN can be integrated with Department of Defense Architecture Framework products to ensure consistency between static and dynamic descriptions. A computer-aided tool, Visual Simulation Objects (VSO), which aids analysts in specifying, composing and executing architectures, has been developed to verify the methodology and as a prototype commercial product.
Impact of mobility structure on optimization of small-world networks of mobile agents
NASA Astrophysics Data System (ADS)
Lee, Eun; Holme, Petter
2016-06-01
In ad hoc wireless networking, units are connected to each other rather than to a central, fixed infrastructure. Constructing and maintaining such networks creates several trade-off problems between robustness, communication speed, power consumption, etc., that bridge engineering, computer science and the physics of complex systems. In this work, we address the role of the mobility patterns of the agents in the optimal tuning of a small-world type network construction method. By this method, the network is updated periodically and held static between the updates. We investigate the optimal updating times for different scenarios of agent movement (modeling, for example, the fat-tailed trip distances and periodicities of human travel). We find that these mobility patterns affect the power consumption in non-trivial ways and discuss how these effects can best be handled.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
Static respiratory muscle work during immersion with positive and negative respiratory loading.
Taylor, N A; Morrison, J B
1999-10-01
Upright immersion imposes a pressure imbalance across the thorax. This study examined the effects of air-delivery pressure on inspiratory muscle work during upright immersion. Eight subjects performed respiratory pressure-volume relaxation maneuvers while seated in air (control) and during immersion. Hydrostatic, respiratory elastic (lung and chest wall), and resultant static respiratory muscle work components were computed. During immersion, the effects of four air-delivery pressures were evaluated: mouth pressure (uncompensated); the pressure at the lung centroid (PL,c); and at PL,c +/-0.98 kPa. When breathing at pressures less than the PL,c, subjects generally defended an expiratory reserve volume (ERV) greater than the immersed relaxation volume, minus residual volume, resulting in additional inspiratory muscle work. The resultant static inspiratory muscle work, computed over a 1-liter tidal volume above the ERV, increased from 0.23 J·l(-1), when subjects were breathing at PL,c, to 0.83 J·l(-1) at PL,c -0.98 kPa (P < 0.05), and to 1.79 J·l(-1) at mouth pressure (P < 0.05). Under the control state, and during the above experimental conditions, static expiratory work was minimal. When breathing at PL,c +0.98 kPa, subjects adopted an ERV less than the immersed relaxation volume, minus residual volume, resulting in 0.36 J·l(-1) of expiratory muscle work. Thus static inspiratory muscle work varied with respiratory loading, whereas PL,c air supply minimized this work during upright immersion, restoring lung-tissue, chest-wall, and static muscle work to levels obtained in the control state.
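As a rough illustration of the quantity being reported, static respiratory muscle work per litre is the integral of muscle pressure over volume. A minimal numerical version, with made-up pressure-volume samples rather than data from this study, might look like:

```python
# Minimal numerical version of the work integral W = integral of P dV over a
# 1-litre tidal volume above ERV; the pressure-volume samples are made-up
# numbers, not data from the study.  With pressure in kPa and volume in litres
# the trapezoidal sum is already in joules (1 kPa * 1 L = 1 J).
import numpy as np

volume_l = np.linspace(0.0, 1.0, 11)            # litres above ERV
muscle_pressure_kpa = 0.2 + 1.5 * volume_l      # hypothetical static muscle pressure

work_j_per_l = float(np.sum(0.5 * (muscle_pressure_kpa[1:] + muscle_pressure_kpa[:-1])
                            * np.diff(volume_l)))
print(f"static inspiratory muscle work: {work_j_per_l:.2f} J per litre")
```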
Hierarchial parallel computer architecture defined by computational multidisciplinary mechanics
NASA Technical Reports Server (NTRS)
Padovan, Joe; Gute, Doug; Johnson, Keith
1989-01-01
The goal is to develop an architecture for parallel processors enabling optimal handling of multi-disciplinary computation of fluid-solid simulations employing finite element and difference schemes. The goals, philosophical and modeling directions, static and dynamic poly trees, example problems, interpolative reduction, and the impact on solvers are shown in viewgraph form.
Computer program user's manual for advanced general aviation propeller study
NASA Technical Reports Server (NTRS)
Worobel, R.
1972-01-01
A user's manual is presented for a computer program for predicting the performance (static, flight, and reverse), noise, weight and cost of propellers for advanced general aviation aircraft of the 1980 time period. Complete listings of this computer program with detailed instructions and samples of input and output are included.
ERIC Educational Resources Information Center
Macek, Victor C.
The nine Reactor Statics Modules are designed to introduce students to the use of numerical methods and digital computers for calculation of neutron flux distributions in space and energy which are needed to calculate criticality, power distribution, and fuel burnup for both slow neutron and fast neutron fission reactors. The last module, RS-9,…
Cutting, James E
2002-01-01
Representing motion in a picture is a challenge to artists, scientists, and all other imagemakers. Moreover, it presents a problem that will not go away with electronic and digital media, because often the pedagogical purpose of the representation of motion is more important than the motion itself. All satisfactory solutions evoke motion-for example, dynamic balance (or broken symmetry), stroboscopic sequences, affine shear (or forward lean), and photographic blur-but they also typically sacrifice the accuracy of the motion represented, a solution often unsuitable for science. Vector representations superimposed on static images allow for accuracy, but are not applicable to all situations. Workable solutions are almost certainly case specific and subject to continual evolution through exploration by imagemakers.
In situ determination of the static inductance and resistance of a plasma focus capacitor bank.
Saw, S H; Lee, S; Roy, F; Chong, P L; Vengadeswaran, V; Sidik, A S M; Leong, Y W; Singh, A
2010-05-01
The static (unloaded) electrical parameters of a capacitor bank are of utmost importance for the purpose of modeling the system as a whole when the capacitor bank is discharged into its dynamic electromagnetic load. Using a physical short circuit across the electromagnetic load is usually technically difficult and is unnecessary. The discharge can be operated at the highest pressure permissible in order to minimize current sheet motion, thus simulating zero dynamic load, to enable bank parameters, static inductance L(0), and resistance r(0) to be obtained using lightly damped sinusoid equations given the bank capacitance C(0). However, for a plasma focus, even at the highest permissible pressure it is found that there is significant residual motion, so that the assumption of a zero dynamic load introduces unacceptable errors into the determination of the circuit parameters. To overcome this problem, the Lee model code is used to fit the computed current trace to the measured current waveform. Hence the dynamics is incorporated into the solution and the capacitor bank parameters are computed using the Lee model code, and more accurate static bank parameters are obtained.
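For reference, the conventional lightly damped sinusoid analysis mentioned above can be sketched as follows: given the bank capacitance C0, the measured ringing period and the ratio of successive same-sign current peaks fix L0 and r0. This is a generic illustration of that textbook estimate, not the Lee model code fit; all numerical values are assumptions.

```python
# Generic "lightly damped sinusoid" estimate of static bank parameters from a
# discharge waveform: given C0, the measured ringing period T and the ratio f
# of successive same-sign current peaks yield L0 and r0.  The numbers are
# illustrative assumptions, and this is the conventional estimate, not the
# Lee model code fit described in the abstract.
import math

C0 = 30e-6      # bank capacitance [F], assumed known
T = 10e-6       # measured period of the discharge current [s]
f = 0.7         # ratio of successive same-sign current peaks

omega = 2.0 * math.pi / T            # damped angular frequency
alpha = -math.log(f) / T             # damping coefficient, alpha = r0 / (2 L0)
omega0_sq = omega ** 2 + alpha ** 2  # undamped frequency squared, 1 / (L0 C0)

L0 = 1.0 / (C0 * omega0_sq)          # static inductance [H]
r0 = 2.0 * L0 * alpha                # static resistance [ohm]
print(f"L0 = {L0 * 1e9:.1f} nH, r0 = {r0 * 1e3:.2f} mohm")
```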
Lord, Louis-David; Stevner, Angus B.; Kringelbach, Morten L.
2017-01-01
To survive in an ever-changing environment, the brain must seamlessly integrate a rich stream of incoming information into coherent internal representations that can then be used to efficiently plan for action. The brain must, however, balance its ability to integrate information from various sources with a complementary capacity to segregate information into modules which perform specialized computations in local circuits. Importantly, evidence suggests that imbalances in the brain's ability to bind together and/or segregate information over both space and time is a common feature of several neuropsychiatric disorders. Most studies have, however, until recently strictly attempted to characterize the principles of integration and segregation in static (i.e. time-invariant) representations of human brain networks, hence disregarding the complex spatio-temporal nature of these processes. In the present Review, we describe how the emerging discipline of whole-brain computational connectomics may be used to study the causal mechanisms of the integration and segregation of information on behaviourally relevant timescales. We emphasize how novel methods from network science and whole-brain computational modelling can expand beyond traditional neuroimaging paradigms and help to uncover the neurobiological determinants of the abnormal integration and segregation of information in neuropsychiatric disorders. This article is part of the themed issue ‘Mathematical methods in medicine: neuroscience, cardiology and pathology’. PMID:28507228
NASA Technical Reports Server (NTRS)
Kleis, Stanley J.; Truong, Tuan; Goodwin, Thomas J.
2004-01-01
This report is a documentation of a fluid dynamic analysis of the proposed Automated Static Culture System (ASCS) cell module mixing protocol. The report consists of a review of some basic fluid dynamics principles appropriate for the mixing of a patch of high oxygen content media into the surrounding media which is initially depleted of oxygen, followed by a computational fluid dynamics (CFD) study of this process for the proposed protocol over a range of the governing parameters. The time histories of oxygen concentration distributions and mechanical shear levels generated are used to characterize the mixing process for different parameter values.
NASA Technical Reports Server (NTRS)
Park, Michael A.; Green, Lawrence L.; Montgomery, Raymond C.; Raney, David L.
1999-01-01
With the recent interest in novel control effectors there is a need to determine the stability and control derivatives of new aircraft configurations early in the design process. These derivatives are central to most control law design methods and would allow the determination of closed-loop control performance of the vehicle. Early determination of the static and dynamic behavior of an aircraft may permit significant improvement in configuration weight, cost, stealth, and performance through multidisciplinary design. The classical method of determining static stability and control derivatives - constructing and testing wind tunnel models - is expensive and requires a long lead time for the resultant data. Wind tunnel tests are also limited to the preselected control effectors of the model. To overcome these shortcomings, computational fluid dynamics (CFD) solvers are augmented via automatic differentiation, to directly calculate the stability and control derivatives. The CFD forces and moments are differentiated with respect to angle of attack, angle of sideslip, and aircraft shape parameters to form these derivatives. A subset of static stability and control derivatives of a tailless aircraft concept have been computed by two differentiated inviscid CFD codes and verified for accuracy with central finite-difference approximations and favorable comparisons to a simulation database.
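The verification step mentioned at the end, checking a differentiated-code derivative against a central finite-difference approximation, can be sketched as follows; cm_of_alpha is a hypothetical analytic stand-in for one CFD evaluation at a given angle of attack, not the differentiated flow solver itself.

```python
# Central finite-difference check of a stability derivative, mirroring the
# verification step in the abstract.  cm_of_alpha is a hypothetical stand-in
# for a CFD evaluation; its coefficients are arbitrary.
import math

def cm_of_alpha(alpha_rad):
    """Toy pitching-moment coefficient as a function of angle of attack."""
    return -0.05 - 1.2 * alpha_rad + 0.3 * alpha_rad ** 2

alpha = math.radians(4.0)
h = 1e-4

cm_alpha_exact = -1.2 + 0.6 * alpha   # what an exact (AD-style) derivative would return
cm_alpha_fd = (cm_of_alpha(alpha + h) - cm_of_alpha(alpha - h)) / (2.0 * h)

print(f"analytic derivative : {cm_alpha_exact:.6f} per rad")
print(f"central difference  : {cm_alpha_fd:.6f} per rad")
```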
NASA Astrophysics Data System (ADS)
Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut
2017-04-01
Scientific communities generate complex simulations through orchestration of semi-structured analysis pipelines which involve execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from model execution, this workflow also contains data format conversion tasks that link the ice flow and calving steps through sequential, nested and iterative stages. The management and monitoring of all the processing tasks, including data management and transfer, therefore becomes complex. From the implementation perspective, this workflow was initially developed as a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced to meet user requirements, the debugging and validation of results became cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above mentioned processes can be carried out in an efficient and usable manner. We decided to use the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
Static and Current Electricity.
ERIC Educational Resources Information Center
Schlenker, Richard M.; Murtha, Kathy T.
This is a copy of the script for the electrical relationships unit in an auto-tutorial physical science course for non-science majors, offered at the University of Maine at Orono. The unit includes 15 simple experiments designed to allow the student to discover various fundamental electrical relationships. The student has the option of reading the…
Static and dynamic stability analysis of the space shuttle vehicle-orbiter
NASA Technical Reports Server (NTRS)
Chyu, W. J.; Cavin, R. K.; Erickson, L. L.
1978-01-01
The longitudinal static and dynamic stability of a Space Shuttle Vehicle-Orbiter (SSV Orbiter) model is analyzed using the FLEXSTAB computer program. Nonlinear effects are accounted for by application of a correction technique in the FLEXSTAB system; the technique incorporates experimental force and pressure data into the linear aerodynamic theory. A flexible Orbiter model is treated in the static stability analysis for the flight conditions of Mach number 0.9 for rectilinear flight (1 g) and for a pull-up maneuver (2.5 g) at an altitude of 15.24 km. Static stability parameters and structural deformations of the Orbiter are calculated at trim conditions for the dynamic stability analysis, and the characteristics of damping in pitch are investigated for a Mach number range of 0.3 to 1.2. The calculated results for both the static and dynamic stabilities are compared with the available experimental data.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability in using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, are highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
Linear and quadratic static response functions and structure functions in Yukawa liquids.
Magyar, Péter; Donkó, Zoltán; Kalman, Gabor J; Golden, Kenneth I
2014-08-01
We compute linear and quadratic static density response functions of three-dimensional Yukawa liquids by applying an external perturbation potential in molecular dynamics simulations. The response functions are also obtained from the equilibrium fluctuations (static structure factors) in the system via the fluctuation-dissipation theorems. The good agreement of the quadratic response functions, obtained in the two different ways, confirms the quadratic fluctuation-dissipation theorem. We also find that the three-point structure function may be factorizable into two-point structure functions, leading to a cluster representation of the equilibrium triplet correlation function.
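As an illustration of the equilibrium-fluctuation route mentioned above, a static structure factor can be estimated directly from particle configurations via S(k) = <|sum_j exp(i k·r_j)|^2>/N. The sketch below uses a random placeholder configuration instead of actual Yukawa-liquid MD snapshots; box size, particle number and wave vectors are arbitrary assumptions.

```python
# Estimating a static structure factor from particle configurations,
# S(k) = <|sum_j exp(i k . r_j)|^2> / N.  The random configuration below is a
# placeholder for MD snapshots; a real calculation would average over many
# equilibrium configurations of the liquid.
import numpy as np

rng = np.random.default_rng(0)
L = 10.0                                        # cubic box edge (arbitrary units)
N = 500
positions = rng.uniform(0.0, L, size=(N, 3))    # placeholder for an MD snapshot

n_vals = np.arange(1, 11)
k_vals = 2.0 * np.pi * n_vals / L               # wave vectors compatible with the box

for k in k_vals:
    rho_k = np.exp(1j * k * positions[:, 0]).sum()   # collective density mode
    s_k = abs(rho_k) ** 2 / N
    print(f"k = {k:5.2f}  S(k) = {s_k:6.3f}")
```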
Effects of static tensile load on the thermal expansion of Gr/PI composite material
NASA Technical Reports Server (NTRS)
Farley, G. L.
1981-01-01
The effect of static tensile load on the thermal expansion of Gr/PI composite material was measured for seven different laminate configurations. A computer program was developed which implements laminate theory in a piecewise linear fashion to predict the coupled nonlinear thermomechanical behavior. Static tensile load significantly affected the thermal expansion characteristics of the laminates tested. This effect is attributed to a fiber instability micromechanical behavior of the constituent materials. Analytical results correlated reasonably well with free thermal expansion tests (no load applied to the specimen). However, correlation was poor for tests with an applied load.
Summary Report of the Orbital X-34 Wing Static Aeroelastic Study
NASA Technical Reports Server (NTRS)
Prabhu, Ramadas K.; Weilmuenster, K. J. (Technical Monitor)
2001-01-01
This report documents the results of a computational study conducted on the Orbital Sciences X-34 vehicle to compute its inviscid aerodynamic characteristics, taking into account the wing structural flexibility. This was a joint exercise between LaRC and SDRC of California. SDRC modeled the structural details of the wing and provided the structural deformation for a given pressure distribution on its surfaces. This study was done for a Mach number of 1.35 and an angle of attack of 9 deg.; the freestream dynamic pressure was assumed to be 607 lb/sq ft. Only the wing and the body were simulated in the CFD computations. Two wing configurations were examined. The first had the elevons in the undeflected position and the second had the elevons deflected 20 deg. up. The results indicated that, with the elevons undeflected, the wing twists by about 1.5 deg., reducing the angle of attack at the wing tip by 1.5 deg. The maximum vertical deflection of the wing is about 3.71 inches at the wing tip. For the wing with the undeflected elevons, the effect of this wing deformation is to reduce the normal force coefficient (C(sub N)) by 0.012 and introduce a nose-up pitching moment coefficient (C(sub m)) of 0.042.
Model implementation for dynamic computation of system cost
NASA Astrophysics Data System (ADS)
Levri, J.; Vaccari, D.
The Advanced Life Support (ALS) Program metric is the ratio of the equivalent system mass (ESM) of a mission based on International Space Station (ISS) technology to the ESM of that same mission based on ALS technology. ESM is a mission cost analog that converts the volume, power, cooling and crewtime requirements of a mission into mass units to compute an estimate of the life support system emplacement cost. Traditionally, ESM has been computed statically, using nominal values for system sizing. However, computation of ESM with static, nominal sizing estimates cannot capture the peak sizing requirements driven by system dynamics. In this paper, a dynamic model for a near-term Mars mission is described. The model is implemented in Matlab/Simulink for the purpose of dynamically computing ESM. This paper provides a general overview of the crew, food, biomass, waste, water and air blocks in the Simulink model. Dynamic simulations of the life support system track mass flow, volume and crewtime needs, as well as power and cooling requirement profiles. The mission's ESM is computed, based upon simulation responses. Ultimately, computed ESM values for various system architectures will feed into an optimization search (non-derivative) algorithm to predict parameter combinations that result in reduced objective function values.
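For contrast with the dynamic computation described above, a static ESM evaluation simply converts volume, power, cooling and crewtime requirements to mass units with equivalency factors and sums them with the hardware mass. The factors and inputs in the sketch below are illustrative assumptions, not ALS Program values.

```python
# Static ESM evaluation: convert volume, power, cooling and crewtime
# requirements to mass units with equivalency factors and add the hardware
# mass.  All factors and inputs are illustrative assumptions; the dynamic
# model in the paper instead tracks peak requirements over a simulated mission.
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_h,
                           v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.5):
    """Return ESM in kg-equivalents for one subsystem (illustrative factors)."""
    return (mass_kg
            + volume_m3 * v_eq      # kg per m^3 of pressurized volume
            + power_kw * p_eq       # kg per kW of power
            + cooling_kw * c_eq     # kg per kW of cooling
            + crewtime_h * ct_eq)   # kg per crew-hour over the mission

esm = equivalent_system_mass(mass_kg=1200.0, volume_m3=8.0,
                             power_kw=3.5, cooling_kw=3.5, crewtime_h=400.0)
print(f"ESM = {esm:.0f} kg-equivalent")
```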
Computer program determines gas flow rates in piping systems
NASA Technical Reports Server (NTRS)
Franke, R.
1966-01-01
Computer program calculates the steady state flow characteristics of an ideal compressible gas in a complex piping system. The program calculates the stagnation and total temperature, static and total pressure, loss factor, and forces on each element in the piping system.
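The static and total quantities such a program reports are related, for an ideal compressible gas, by the standard isentropic relations; a minimal sketch with assumed flow conditions follows (the Mach number and static state are illustrative values, not output of the program).

```python
# Isentropic ideal-gas relations linking static and total conditions; the Mach
# number and static state below are assumed values, not program output.
def total_from_static(p_static, t_static, mach, gamma=1.4):
    """Total pressure and temperature from static values (isentropic flow)."""
    factor = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    p_total = p_static * factor ** (gamma / (gamma - 1.0))
    t_total = t_static * factor
    return p_total, t_total

p0, t0 = total_from_static(p_static=101.3e3, t_static=288.15, mach=0.6)
print(f"total pressure {p0 / 1e3:.1f} kPa, total temperature {t0:.1f} K")
```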
An Algorithm for Converting Static Earth Sensor Measurements into Earth Observation Vectors
NASA Technical Reports Server (NTRS)
Harman, R.; Hashmall, Joseph A.; Sedlak, Joseph
2004-01-01
An algorithm has been developed that converts penetration angles reported by Static Earth Sensors (SESs) into Earth observation vectors. This algorithm allows compensation for variation in the horizon height including that caused by Earth oblateness. It also allows pitch and roll to be computed using any number (greater than 1) of simultaneous sensor penetration angles simplifying processing during periods of Sun and Moon interference. The algorithm computes body frame unit vectors through each SES cluster. It also computes GCI vectors from the spacecraft to the position on the Earth's limb where each cluster detects the Earth's limb. These body frame vectors are used as sensor observation vectors and the GCI vectors are used as reference vectors in an attitude solution. The attitude, with the unobservable yaw discarded, is iteratively refined to provide the Earth observation vector solution.
Static Schedulers for Embedded Real-Time Systems
1989-12-01
Because of the need for efficient scheduling algorithms in large-scale real-time systems, software engineers put a lot of effort into developing...provide static schedulers for the Embedded Real Time Systems with a single processor using the Ada programming language. The independent nonpreemptable...support the Computer Aided Rapid Prototyping for Embedded Real Time Systems so that we determine whether the system, as designed, meets the required
Static Extended Trailing Edge for Lift Enhancement: Experimental and Computational Studies
2007-06-01
3rd International Symposium on Integrating CFD and Experiments in Aerodynamics, 20-21 June 2007, U.S. Air Force Academy, CO, USA. ...is not significantly increased. Experiments and calculations are conducted to compare the aerodynamic characteristics of the extended trailing edge...basic configuration, has a good potential to improve the cruise flight efficiency. Key words: trailing edge, airfoil, wing, lift, drag, aerodynamics
A Planar Quasi-Static Constraint Mode Tire Model
2015-07-10
Ma, Rui; Ferris, John B.
The model strikes a balance between heuristic tire models (such as a linear point-follower) that lack the fidelity to make accurate chassis load predictions and computationally intensive models that...
Theoretical analysis of transcranial Hall-effect stimulation based on passive cable model
NASA Astrophysics Data System (ADS)
Yuan, Yi; Li, Xiao-Li
2015-12-01
Transcranial Hall-effect stimulation (THS) is a new stimulation method in which an ultrasonic wave in a static magnetic field generates an electric field in an area of interest such as in the brain to modulate neuronal activities. However, the biophysical basis of stimulating the neurons remains unknown. To address this problem, we perform a theoretical analysis based on a passive cable model to investigate the THS mechanism of neurons. Nerve tissues are conductive; an ultrasonic wave can move ions embedded in the tissue in a static magnetic field to generate an electric field (due to the Lorentz force). In this study, a simulation model for an ultrasonically induced electric field in a static magnetic field is derived. Then, based on the passive cable model, the analytical solution for the voltage distribution in a nerve tissue is determined. The simulation results show that THS can generate a voltage to stimulate neurons. Because the THS method possesses a higher spatial resolution and a deeper penetration depth, it shows promise as a tool for treating or rehabilitating neuropsychiatric disorders. Project supported by the National Natural Science Foundation of China (Grant Nos. 61273063 and 61503321), the China Postdoctoral Science Foundation (Grant No. 2013M540215), the Natural Science Foundation of Hebei Province, China (Grant No. F2014203161), and the Youth Research Program of Yanshan University, China (Grant No. 02000134).
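The Lorentz-force mechanism described above admits a simple order-of-magnitude estimate: a plane ultrasonic wave gives the tissue a particle velocity v = p/(rho c), and in a static field B the induced electric field is of order E = vB for perpendicular geometry. The values in the sketch below are generic assumptions, not parameters from this paper.

```python
# Order-of-magnitude estimate of the Lorentz-force mechanism: a plane
# ultrasonic wave gives tissue a particle velocity v = p / (rho * c), and in a
# static field B the induced electric field magnitude is about E = v * B for
# perpendicular geometry.  All values are generic assumptions.
p_acoustic = 1.0e6   # ultrasonic pressure amplitude [Pa]
rho = 1000.0         # tissue density [kg/m^3]
c_sound = 1540.0     # speed of sound in soft tissue [m/s]
B_static = 1.0       # static magnetic flux density [T]

v_particle = p_acoustic / (rho * c_sound)   # particle velocity amplitude [m/s]
E_induced = v_particle * B_static           # induced electric field amplitude [V/m]
print(f"particle velocity ~ {v_particle:.2f} m/s, induced field ~ {E_induced:.2f} V/m")
```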
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.
1995-12-31
In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a neutronics computation series to compare western and VNIIEF codes and assess whether VNIIEF codes are suitable for RBMK type reactor safety assessment computation. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS, EKRAN codes (improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo). Cell, polycell, burnup computation; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA). The computations were performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both being developed in collaboration with NIKIET. Formulation of the first problem maximally possibly agrees with one of NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system controls (CPS) movement in a core.
Single-Word Multiple-Bit Upsets in Static Random Access Devices
1998-01-15
Tethered variable gravity laboratory study: Low gravity process identification report
NASA Technical Reports Server (NTRS)
Briccarello, M.
1989-01-01
Experiments performable in the variable gravity environment, and the related compatible/beneficial residual accelerations, are described for both pure and applied research in the fields of Fluid Mechanics (static and dynamic), Materials Sciences (Crystal Growth, Metal and Alloy Solidification, Glasses, etc.), and Life Sciences, so as to assess the relevance of a variable G-level laboratory.
Alignment of dynamic networks.
Vijayan, V; Critchlow, D; Milenkovic, T
2017-07-15
Network alignment (NA) aims to find a node mapping that conserves similar regions between compared networks. NA is applicable to many fields, including computational biology, where NA can guide the transfer of biological knowledge from well- to poorly-studied species across aligned network regions. Existing NA methods can only align static networks. However, most complex real-world systems evolve over time and should thus be modeled as dynamic networks. We hypothesize that aligning dynamic network representations of evolving systems will produce superior alignments compared to aligning the systems' static network representations, as is currently done. For this purpose, we introduce the first ever dynamic NA method, DynaMAGNA++. This proof-of-concept dynamic NA method is an extension of a state-of-the-art static NA method, MAGNA++. Even though both MAGNA++ and DynaMAGNA++ optimize edge as well as node conservation across the aligned networks, MAGNA++ conserves static edges and similarity between static node neighborhoods, while DynaMAGNA++ conserves dynamic edges (events) and similarity between evolving node neighborhoods. For this purpose, we introduce the first ever measure of dynamic edge conservation and rely on our recent measure of dynamic node conservation. Importantly, the two dynamic conservation measures can be optimized with any state-of-the-art NA method and not just MAGNA++. We confirm our hypothesis that dynamic NA is superior to static NA, on synthetic and real-world networks, in computational biology and social domains. DynaMAGNA++ is parallelized and has a user-friendly graphical interface. Availability: http://nd.edu/∼cone/DynaMAGNA++/. Contact: tmilenko@nd.edu. Supplementary data are available at Bioinformatics online.
Elastic scattering of low-energy electrons by nitromethane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopes, A. R.; D'A Sanchez, S.; Bettega, M. H. F.
2011-06-15
In this work, we present integral, differential, and momentum transfer cross sections for elastic scattering of low-energy electrons by nitromethane, for energies up to 10 eV. We calculated the cross sections using the Schwinger multichannel method with pseudopotentials, in the static-exchange and in the static-exchange plus polarization approximations. The computed integral cross sections show a π* shape resonance at 0.70 eV in the static-exchange-polarization approximation, which is in reasonable agreement with experimental data. We also found a σ* shape resonance at 4.8 eV in the static-exchange-polarization approximation, which has not been previously characterized by the experiment. We also discuss how these resonances may play a role in the dissociation process of this molecule.
Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations
NASA Astrophysics Data System (ADS)
Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying
2010-09-01
Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach allows reduction in computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be similarly handled using appropriate penalty functions. We illustrate the proposed approach to minimize the expected execution cost and Conditional Value-at-Risk (CVaR).
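A minimal sketch of the kind of Monte Carlo objective described above: simulate execution-cost samples for a candidate strategy and evaluate the expected cost plus an empirical CVaR term. The cost model, CVaR confidence level and weight are illustrative assumptions, not the formulation used in the paper.

```python
# Monte Carlo evaluation of an execution-cost objective with a CVaR term.
# The cost model, CVaR confidence level and weight are illustrative
# assumptions only.
import numpy as np

rng = np.random.default_rng(1)

def simulate_costs(n_paths=20_000, impact=0.1, vol=0.5):
    """Hypothetical execution-cost samples for one candidate strategy."""
    return impact + vol * rng.standard_normal(n_paths)

def cvar(samples, alpha=0.95):
    """Empirical CVaR: mean of the worst (1 - alpha) fraction of costs."""
    var = np.quantile(samples, alpha)
    return samples[samples >= var].mean()

costs = simulate_costs()
objective = costs.mean() + 1.0 * cvar(costs)   # unit weight on CVaR is a choice
print(f"expected cost {costs.mean():.4f}, CVaR_95 {cvar(costs):.4f}, "
      f"objective {objective:.4f}")
```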
A daily living activity remote monitoring system for solitary elderly people.
Maki, Hiromichi; Ogawa, Hidekuni; Matsuoka, Shingo; Yonezawa, Yoshiharu; Caldwell, W Morton
2011-01-01
A daily living activity remote monitoring system has been developed for supporting solitary elderly people. The monitoring system consists of a tri-axis accelerometer, six low-power active filters, a low-power 8-bit microcontroller (MC), a 1 GB SD memory card (SDMC) and a 2.4 GHz low transmitting power mobile phone (PHS). The tri-axis accelerometer attached to the subject's chest can simultaneously measure dynamic and static acceleration forces produced by heart sound, respiration, posture and behavior. The heart rate, respiration rate, activity, posture and behavior are detected from the dynamic and static acceleration forces. These data are stored on the SDMC. The MC sends the data to the server computer every hour. The server computer stores the data and generates a graphical chart from them. When the caregiver calls the server computer from his/her mobile phone, the server computer sends the graphical chart via the PHS. The caregiver's mobile phone then displays the chart graphically on its screen.
A 4DCT imaging-based breathing lung model with relative hysteresis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyawaki, Shinjiro; Choi, Sanghun; Hoffman, Eric A.
To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models based on multiple images and a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points due to the differences in airflow distribution and airway geometry. - Highlights: • We developed a breathing human lung CFD model based on 4D-dynamic CT images. • The 4DCT-based breathing lung model is able to capture lung relative hysteresis. • A new boundary condition for lung model based on one static CT image was proposed. • The difference between lung models based on 4D and static CT images was quantified.
SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX-80
NASA Astrophysics Data System (ADS)
Kamat, Manohar P.; Watson, Brian C.
1992-11-01
The finite element method has proven to be an invaluable tool for analysis and design of complex, high performance systems, such as bladed-disk assemblies in aircraft turbofan engines. However, as the problem size increases, the computation time required by conventional computers can be prohibitively high. Parallel processing computers provide the means to overcome these computation time limits. This report summarizes the results of a research activity aimed at providing a finite element capability for analyzing turbomachinery bladed-disk assemblies in a vector/parallel processing environment. A special purpose code, named with the acronym SAPNEW, has been developed to perform static and eigen analysis of multi-degree-of-freedom blade models built up from flat thin shell elements. SAPNEW provides a stand-alone capability for static and eigen analysis on the Alliant FX/80, a parallel processing computer. A preprocessor, named with the acronym NTOS, has been developed to accept NASTRAN input decks and convert them to the SAPNEW format to make SAPNEW more readily usable by researchers at NASA Lewis Research Center.
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views are different on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computing programming, female students were not so involved in computing activities whereas male students were heavily involved. As for the opinions about successful computer science professionals, both female and male students emphasized hard working, detailed oriented approaches, and enjoying playing with computers. The myth of the geek as a typical profile of successful computer science students was not found to be true.
Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.
2006-01-01
The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, materials, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient and non-gradient based optimization algorithms. Results show significant improvements in model predictions after parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainties quantification are presented.
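The static part of such an update can be illustrated with a least-squares rescaling of a nominal stiffness against measured deflections; the single-degree-of-freedom model and numbers below are illustrative assumptions, not the Hexapod torus model or the optimization algorithms used in the paper.

```python
# Minimal sketch of a static stiffness update: measured static deflections
# under known loads are used to rescale a nominal stiffness by least squares.
# The single-DOF model, nominal stiffness and "measurements" are illustrative.
import numpy as np
from scipy.optimize import least_squares

k_nominal = 2.0e5                                        # nominal stiffness [N/m]
loads = np.array([100.0, 200.0, 300.0, 400.0])           # applied static loads [N]
measured_defl = loads / 1.6e5 + 1e-5 * np.random.default_rng(3).standard_normal(4)

def residuals(theta):
    """Difference between predicted and measured static deflections."""
    scale = theta[0]
    predicted = loads / (scale * k_nominal)
    return predicted - measured_defl

fit = least_squares(residuals, x0=[1.0])
print(f"updated stiffness scale: {fit.x[0]:.3f} "
      f"(updated k = {fit.x[0] * k_nominal:.3e} N/m)")
```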
User document for computer programs for ring-stiffened shells of revolution
NASA Technical Reports Server (NTRS)
Cohen, G. A.
1973-01-01
A user manual and related program documentation are presented for six compatible computer programs for structural analysis of axisymmetric shell structures. The programs apply to a common structural model but analyze different modes of structural response. In particular, they are: (1) Linear static response under asymmetric loads; (2) Buckling of linear states under asymmetric loads; (3) Nonlinear static response under axisymmetric loads; (4) Buckling of nonlinear states under axisymmetric loads; (5) Imperfection sensitivity of buckling modes under axisymmetric loads; and (6) Vibrations about nonlinear states under axisymmetric loads. These programs treat branched shells of revolution with an arbitrary arrangement of a large number of open branches but with at most one closed branch.
Computation of viscous transonic flow about a lifting airfoil
NASA Technical Reports Server (NTRS)
Walitt, L.; Liu, C. Y.
1976-01-01
The viscous transonic flow about a stationary body in free air was numerically investigated. The geometry chosen was a symmetric NACA 64A010 airfoil at a freestream Mach number of 0.8, a Reynolds number of 4 million based on chord, and angles of attack of 0 and 2 degrees. These conditions were such that, at 2 degrees incidence unsteady periodic motion was calculated along the aft portion of the airfoil and in its wake. Although no unsteady measurements were made for the NACA 64A010 airfoil at these flow conditions, interpolated steady measurements of lift, drag, and surface static pressures compared favorably with corresponding computed time-averaged lift, drag, and surface static pressures.
NASA Technical Reports Server (NTRS)
Camarda, C. J.; Adelman, H. M.
1984-01-01
The implementation of static and dynamic structural-sensitivity derivative calculations in a general purpose, finite-element computer program denoted the Engineering Analysis Language (EAL) System is described. Derivatives are calculated with respect to structural parameters, specifically, member sectional properties including thicknesses, cross-sectional areas, and moments of inertia. Derivatives are obtained for displacements, stresses, vibration frequencies and mode shapes, and buckling loads and mode shapes. Three methods for calculating derivatives are implemented (analytical, semianalytical, and finite differences), and comparisons of computer time and accuracy are made. Results are presented for four examples: a swept wing, a box beam, a stiffened cylinder with a cutout, and a space radiometer-antenna truss.
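As a hedged illustration of the finite-difference option mentioned above, the sketch below compares a central-difference sensitivity with the exact analytical derivative for a textbook cantilever tip deflection with respect to the section moment of inertia; the formula and numbers are generic, not taken from the EAL implementation.

```python
# Cantilever tip deflection delta = P*L**3 / (3*E*I); sensitivity with respect to I.
P, L, E, I = 1000.0, 2.0, 70e9, 8.0e-6   # load (N), length (m), modulus (Pa), inertia (m^4)

def deflection(inertia):
    return P * L**3 / (3.0 * E * inertia)

h = 1e-3 * I                              # perturbation as a fraction of the parameter
finite_difference = (deflection(I + h) - deflection(I - h)) / (2.0 * h)
analytical = -P * L**3 / (3.0 * E * I**2)
print(f"finite difference: {finite_difference:.6e}   analytical: {analytical:.6e}")
```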
Instability of a gravity gradient satellite due to thermal distortion
NASA Technical Reports Server (NTRS)
Goldman, R. L.
1975-01-01
A nonlinear analytical model and a corresponding computer program were developed to study the influence of solar heating on the anomalous low-frequency orbital instability of the Naval Research Laboratory's gravity gradient satellite 164. The model's formulation was based on a quasi-static approach in which deflections of the satellite's booms were determined in terms of thermally induced bending without consideration of boom vibration. Calculations, which were made for variations in absorptivity, sun angle, thermal lag, and hinge stiffness, demonstrated that, within the confines of a relatively narrow stability criterion, the quasi-static model of NRL 164 not only becomes unstable but, in a number of cases, responses were computed that closely resembled flight data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gartling, D.K.
The theoretical and numerical background for the finite element computer program, TORO II, is presented in detail. TORO II is designed for the multi-dimensional analysis of nonlinear, electromagnetic field problems described by the quasi-static form of Maxwell's equations. A general description of the boundary value problems treated by the program is presented. The finite element formulation and the associated numerical methods used in TORO II are also outlined. Instructions for the use of the code are documented in SAND96-0903; examples of problems analyzed with the code are also provided in the user's manual. 24 refs., 8 figs.
NASTRAN/FLEXSTAB procedure for static aeroelastic analysis
NASA Technical Reports Server (NTRS)
Schuster, L. S.
1984-01-01
Presented is a procedure for using the FLEXSTAB External Structural Influence Coefficients (ESIC) computer program to produce the structural data necessary for the FLEXSTAB Stability Derivatives and Static Stability (SD&SS) program. The SD&SS program computes trim state, stability derivatives, and pressure and deflection data for a flexible airplane having a plane of symmetry. The procedure used a NASTRAN finite-element structural model as the source of structural data in the form of flexibility matrices. Selection of a set of degrees of freedom, definition of structural nodes and panels, reordering and reformatting of the flexibility matrix, and redistribution of existing point mass data are among the topics discussed. Also discussed are boundary conditions and the NASTRAN substructuring technique.
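Selecting a set of degrees of freedom and reordering the flexibility matrix accordingly amounts to a symmetric row/column extraction; a minimal NumPy sketch (placeholder values, not FLEXSTAB's or NASTRAN's actual data formats) is:

```python
import numpy as np

# Stand-in for a full flexibility matrix from the structural model (symmetric, n x n).
n = 6
rng = np.random.default_rng(0)
A = rng.normal(size=(n, n))
flex_full = A @ A.T                              # symmetric positive definite placeholder

# Degrees of freedom retained for the aeroelastic model, in the order required downstream.
retained = [4, 1, 3]
flex_reduced = flex_full[np.ix_(retained, retained)]
print(flex_reduced.shape)                        # (3, 3), rows/columns in retained order
```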
A Reduced Dimension Static, Linearized Kalman Filter and Smoother
NASA Technical Reports Server (NTRS)
Fukumori, I.
1995-01-01
An approximate Kalman filter and smoother, based on approximations of the state estimation error covariance matrix, is described. Approximations include a reduction of the effective state dimension, use of a static asymptotic error limit, and a time-invariant linearization of the dynamic model for error integration. The approximations lead to dramatic computational savings in applying estimation theory to large complex systems. Examples of use come from TOPEX/POSEIDON.
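A minimal sketch of the static-gain idea, assuming a scalar time-invariant system rather than the ocean model used with TOPEX/POSEIDON: the Riccati recursion is iterated offline to its asymptotic limit, and filtering then uses the fixed gain, avoiding per-step covariance propagation.

```python
# Steady-state ("static") Kalman gain for a scalar random walk observed with noise.
A, H = 1.0, 1.0          # state transition and observation operators
Q, R = 0.01, 0.25        # process and measurement noise variances

P = 1.0
for _ in range(500):                              # iterate the Riccati recursion to a fixed point
    P_pred = A * P * A + Q
    K = P_pred * H / (H * P_pred * H + R)
    P = (1.0 - K * H) * P_pred

def filter_step(x_est, measurement):
    """One update using the precomputed asymptotic gain K."""
    x_pred = A * x_est
    return x_pred + K * (measurement - H * x_pred)

x = 0.0
for z in [0.2, 0.1, 0.35, 0.3]:
    x = filter_step(x, z)
print("static gain:", round(K, 4), "final estimate:", round(x, 4))
```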
Animation, audio, and spatial ability: Optimizing multimedia for scientific explanations
NASA Astrophysics Data System (ADS)
Koroghlanian, Carol May
This study investigated the effects of audio, animation, and spatial ability in a computer-based instructional program for biology. The program presented instructional material via text or audio with lean text and included eight instructional sequences presented either via static illustrations or animations. High school students enrolled in a biology course were blocked by spatial ability and randomly assigned to one of four treatments (Text-Static Illustration, Audio-Static Illustration, Text-Animation, Audio-Animation). The study examined the effects of instructional mode (Text vs. Audio), illustration mode (Static Illustration vs. Animation) and spatial ability (Low vs. High) on practice and posttest achievement, attitude and time. Results for practice achievement indicated that high spatial ability participants achieved more than low spatial ability participants. Similar results for posttest achievement and spatial ability were not found. Participants in the Static Illustration treatments achieved the same as participants in the Animation treatments on both the practice and posttest. Likewise, participants in the Text treatments achieved the same as participants in the Audio treatments on both the practice and posttest. In terms of attitude, participants responded favorably to the computer-based instructional program. They found the program interesting, felt the static illustrations or animations made the explanations easier to understand and concentrated on learning the material. Furthermore, participants in the Animation treatments felt the information was easier to understand than participants in the Static Illustration treatments. However, no difference for any attitude item was found for participants in the Text as compared to those in the Audio treatments. Significant differences were found by Spatial Ability for three attitude items concerning concentration and interest. In all three items, the low spatial ability participants responded more positively than high spatial ability participants. In addition, low spatial ability participants reported greater mental effort than high spatial ability participants. Findings for time-in-program and time-in-instruction indicated that participants in the Animation treatments took significantly more time than participants in the Static Illustration treatments. No time differences of any type were found for participants in the Text versus Audio treatments. Implications for the design of multimedia instruction and topics for future research are included in the discussion.
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
NASTRAN computer system level 12.1
NASA Technical Reports Server (NTRS)
Butler, T. G.
1971-01-01
Program uses finite element displacement method for solving linear response of large, three-dimensional structures subject to static, dynamic, thermal, and random loadings. Program adapts to computers of different manufacture, permits up-dating and extention, allows interchange of output and input information between users, and is extensively documented.
Visual Displays and Contextual Presentations in Computer-Based Instruction.
ERIC Educational Resources Information Center
Park, Ok-choon
1998-01-01
Investigates the effects of two instructional strategies, visual display (animation, and static graphics with and without motion cues) and contextual presentation, in the acquisition of electronic troubleshooting skills using computer-based instruction. Study concludes that use of visual displays and contextual presentation be based on the…
Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick
2018-01-01
When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
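A toy anchoring-and-adjustment forecast (not the authors' ADAM implementation) makes the trend-damping mechanism concrete: anchoring on the recent average and adjusting only partially toward the last observation underestimates a continuing upward trend. The window and adjustment weight below are arbitrary assumptions.

```python
import numpy as np

def anchored_forecast(series, window=5, adjustment=0.6):
    """Anchor on the mean of the last `window` values and adjust partway toward
    the most recent observation; partial adjustment produces trend damping."""
    anchor = np.mean(series[-window:])
    return anchor + adjustment * (series[-1] - anchor)

upward = np.array([10, 12, 14, 16, 18, 20], dtype=float)
print(anchored_forecast(upward))   # 18.4, below the trend continuation of 22
```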
Static telescope aberration measurement using lucky imaging techniques
NASA Astrophysics Data System (ADS)
López-Marrero, Marcos; Rodríguez-Ramos, Luis Fernando; Marichal-Hernández, José Gil; Rodríguez-Ramos, José Manuel
2012-07-01
A procedure has been developed to compute static aberrations once the telescope PSF has been measured with the lucky imaging technique, using a nearby star close to the object of interest as the point source to probe the optical system. This PSF is iteratively turned into a phase map at the pupil using the Gerchberg-Saxton algorithm and then converted to the appropriate actuation information for a deformable mirror having low actuator number but large stroke capability. The main advantage of this procedure is related with the capability of correcting static aberration at the specific pointing direction and without the need of a wavefront sensor.
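A minimal sketch of the Gerchberg-Saxton step described above, assuming a square sampling grid and a known aperture mask; the synthetic PSF below stands in for the lucky-imaging measurement, and the recovered phase map would then be mapped onto the deformable-mirror actuators.

```python
import numpy as np

def gerchberg_saxton(psf, aperture, iterations=100):
    """Estimate the pupil phase from a measured PSF (intensity) and a known pupil
    amplitude (aperture mask) by alternating amplitude constraints in both planes."""
    focal_amplitude = np.sqrt(psf)
    pupil = aperture.astype(complex)                            # start with a flat phase
    for _ in range(iterations):
        focal = np.fft.fft2(pupil)
        focal = focal_amplitude * np.exp(1j * np.angle(focal))  # impose measured PSF amplitude
        pupil = np.fft.ifft2(focal)
        pupil = aperture * np.exp(1j * np.angle(pupil))         # impose aperture amplitude
    return np.angle(pupil) * aperture                           # static aberration phase map

# Synthetic example: circular aperture and its diffraction-limited PSF.
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
aperture = (x**2 + y**2 <= (n // 4) ** 2).astype(float)
psf = np.abs(np.fft.fft2(aperture)) ** 2
phase_map = gerchberg_saxton(psf, aperture)
```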
Thermospheric temperature, density, and composition: New models
NASA Technical Reports Server (NTRS)
Jacchia, L. G.
1977-01-01
The models essentially consist of two parts: the basic static models, which give temperature and density profiles for the relevant atmospheric constituents for any specified exospheric temperature, and a set of formulae to compute the exospheric temperature and the expected deviations from the static models as a result of all the recognized types of thermospheric variation. For the basic static models, tables are given for heights from 90 to 2,500 km and for exospheric temperatures from 500 to 2600 K. In the formulae for the variations, an attempt has been made to represent the changes in composition observed by mass spectrometers on the OGO 6 and ESRO 4 satellites.
Error analysis and prevention of cosmic ion-induced soft errors in static CMOS RAMs
NASA Astrophysics Data System (ADS)
Diehl, S. E.; Ochoa, A., Jr.; Dressendorfer, P. V.; Koga, P.; Kolasinski, W. A.
1982-12-01
Cosmic ray interactions with memory cells are known to cause temporary, random, bit errors in some designs. The sensitivity of polysilicon gate CMOS static RAM designs to logic upset by impinging ions has been studied using computer simulations and experimental heavy ion bombardment. Results of the simulations are confirmed by experimental upset cross-section data. Analytical models have been extended to determine and evaluate design modifications which reduce memory cell sensitivity to cosmic ions. A simple design modification, the addition of decoupling resistance in the feedback path, is shown to produce static RAMs immune to cosmic ray-induced bit errors.
Mobility analysis, simulation, and scale model testing for the design of wheeled planetary rovers
NASA Technical Reports Server (NTRS)
Lindemann, Randel A.; Eisen, Howard J.
1993-01-01
The use of computer based techniques to model and simulate wheeled rovers on rough natural terrains is considered. Physical models of a prototype vehicle can be used to test the correlation of the simulations in scaled testing. The computer approaches include a quasi-static planar or two dimensional analysis and design tool based on the traction necessary for the vehicle to have imminent mobility. The computer program modeled a six by six wheel drive vehicle of original kinematic configuration, called the Rocker Bogie. The Rocker Bogie was optimized using the quasi-static software with respect to its articulation parameters prior to fabrication of a prototype. In another approach used, the dynamics of the Rocker Bogie vehicle in 3-D space was modeled on an engineering workstation using commercial software. The model included the complex and nonlinear interaction of the tire and terrain. The results of the investigation yielded numerical and graphical results of the rover traversing rough terrain on the earth, moon, and Mars. In addition, animations of the rover excursions were also generated. A prototype vehicle was then used in a series of testbed and field experiments. Correspondence was then established between the computer models and the physical model. The results indicated the utility of the quasi-static tool for configurational design, as well as the predictive ability of the 3-D simulation to model the dynamic behavior of the vehicle over short traverses.
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes related to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.
What I Taught My STEM Instructor about Teaching: What a Deaf Student Hears That Others Cannot
ERIC Educational Resources Information Center
Ross, Annemarie; Yerrick, Randy K.
2015-01-01
Overall, science teaching at the university level has remained in a relatively static state. There is much research and debate among university faculty regarding the most effective methods of teaching science. But it remains largely rhetoric. The traditional lecture model in STEM higher education is limping along in its march toward inclusion and…
Geoinquiries: Maps and Data for Everyone
ERIC Educational Resources Information Center
Baker, Thomas R.
2015-01-01
Ever want to take a quick, deep-dive into a map found in students' textbooks? Ever want to use a web-based map to bring that static, print map to life? Maybe the map would be better with interactive or near real-time data. This article discusses the new Earth Science GeoInquiries! Earth Science GeoInquiries from Esri are instructional resources…
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
The Use of Audio and Animation in Computer Based Instruction.
ERIC Educational Resources Information Center
Koroghlanian, Carol; Klein, James D.
This study investigated the effects of audio, animation, and spatial ability in a computer-based instructional program for biology. The program presented instructional material via text or audio with lean text and included eight instructional sequences presented either via static illustrations or animations. High school students enrolled in a…
Computer Power. Part 2: Electrical Power Problems and Their Amelioration.
ERIC Educational Resources Information Center
Price, Bennett J.
1989-01-01
Describes electrical power problems that affect computer users, including spikes, sags, outages, noise, frequency variations, and static electricity. Ways in which these problems may be diagnosed and cured are discussed. Sidebars consider transformers; power distribution units; surge currents/linear and non-linear loads; and sizing the power…
Learning about Locomotion Patterns from Visualizations: Effects of Presentation Format and Realism
ERIC Educational Resources Information Center
Imhof, Birgit; Scheiter, Katharina; Gerjets, Peter
2011-01-01
The rapid development of computer graphics technology has made possible an easy integration of dynamic visualizations into computer-based learning environments. This study examines the relative effectiveness of dynamic visualizations, compared either to sequentially or simultaneously presented static visualizations. Moreover, the degree of realism…
Tableau Economique: Teaching Economics with a Tablet Computer
ERIC Educational Resources Information Center
Scott, Robert H., III
2011-01-01
The typical method of instruction in economics is chalk and talk. Economics courses often require writing equations and drawing graphs and charts, which are all best done in freehand. Unlike static PowerPoint presentations, tablet computers create dynamic nonlinear presentations. Wireless technology allows professors to write on their tablets and…
Sawja: Static Analysis Workshop for Java
NASA Astrophysics Data System (ADS)
Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine
Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.
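Sawja itself is an OCaml library; purely to illustrate the kind of "complete program" computation it supports, the Python toy below finds the set of methods reachable from an entry point over a hypothetical call graph (the graph and method names are invented for the example).

```python
from collections import deque

# Toy call graph: method -> methods it may call.
call_graph = {
    "main": ["parse", "run"],
    "parse": ["tokenize"],
    "run": ["step", "log"],
    "step": ["log"],
    "tokenize": [],
    "log": [],
    "unused_helper": ["log"],        # never reached from main
}

def reachable_methods(entry):
    """Breadth-first closure over the call graph starting from the entry point."""
    seen, work = {entry}, deque([entry])
    while work:
        method = work.popleft()
        for callee in call_graph.get(method, []):
            if callee not in seen:
                seen.add(callee)
                work.append(callee)
    return seen

print(sorted(reachable_methods("main")))   # excludes 'unused_helper'
```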
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
Dynamic simulation of Static Var Compensators in distribution systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koessler, R.J.
1992-08-01
This paper is a system study guide for the correction of voltage dips due to large motor startups with Static Var Compensators (SVCs). The method utilizes time simulations, which are an important aid in equipment design and specification. The paper illustrates the process of setting up a computer model and performing time simulations. The study process is demonstrated through an example, the Shawnee feeder in the Niagara Mohawk Power Corporation service area.
NASA Technical Reports Server (NTRS)
Svalbonas, V.
1973-01-01
The theoretical analysis background for the STARS-2 (shell theory automated for rotational structures) program is presented. The theory involved in the axisymmetric nonlinear and unsymmetric linear static analyses, and in the stability and vibration (including critical rotation speed) analyses involving axisymmetric prestress, is discussed. The theory for nonlinear static, stability, and vibration analyses involving shells with unsymmetric loadings is also included.
Ambusam, S; Baharudin, O; Roslizawati, N; Leonard, J
2015-01-01
A document holder is used as a remedy to address occupational neck pain among computer users. An understanding of the effects of the document holder, along with other work-related risk factors present at computer workstations, requires attention. This article reviews comprehensive knowledge on the optimal location of the document holder in computer use and the associated work-related factors that may contribute to neck pain. A literature search was conducted over the past 14 years, based on articles published from January 1990 to January 2014 in both the Science Direct and PubMed databases. Medical Subject Headings (MeSH) keywords for the search were: neck muscle OR head posture OR muscle tension OR muscle activity OR work related disorders OR neck pain AND/OR document location OR document holder OR source document OR copy screen holder. A document holder placed lateral to the screen was the most preferred arrangement for reducing neck discomfort among occupational typists; a document placed flat on the work surface without a holder was the least preferred. Head posture deviation and muscle activity increase when the document is placed flat on the surface compared with placement on a document holder. Work-related factors such as static posture, repetitive movement, prolonged sitting, and awkward positions were risk factors for chronic neck pain. This review highlights the optimal location of the document holder for computer users to reduce neck pain. The importance of work-related risk factors for neck pain in occupational typists is also emphasized for clinical management.
Teaching citizen science skills online: Implications for invasive species training programs
Newman, G.; Crall, A.; Laituri, M.; Graham, J.; Stohlgren, T.; Moore, J.C.; Kodrich, K.; Holfelder, K.A.
2010-01-01
Citizen science programs are emerging as an efficient way to increase data collection and help monitor invasive species. Effective invasive species monitoring requires rigid data quality assurances if expensive control efforts are to be guided by volunteer data. To achieve data quality, effective online training is needed to improve field skills and reach large numbers of remote sentinel volunteers critical to early detection and rapid response. The authors evaluated the effectiveness of online static and multimedia tutorials to teach citizen science volunteers (n = 54) how to identify invasive plants; establish monitoring plots; measure percent cover; and use Global Positioning System (GPS) units. Participants trained using static and multimedia tutorials provided less (p < .001) correct species identifications (63% and 67%) than did professionals (83%) across all species, but they did not differ (p = .125) between each other. However, their ability to identify conspicuous species was comparable to that of professionals. The variability in percent plant cover estimates between static (±10%) and multimedia (±13%) participants did not differ (p = .86 and .08, respectively) from those of professionals (±9%). Trained volunteers struggled with plot setup and GPS skills. Overall, the online approach used did not influence conferred field skills and abilities. Traditional or multimedia online training augmented with more rigorous, repeated, and hands-on, in-person training in specialized skills required for more difficult tasks will likely improve volunteer abilities, data quality, and overall program effectiveness. © Taylor & Francis Group, LLC.
NASA Astrophysics Data System (ADS)
Dong, H.; Kun, Z.; Zhang, L.
2015-12-01
This magnetotelluric (MT) system includes static shift correction and 3D inversion. The correction method is based on a study of 3D forward-modeling data and field tests. The static shift can be detected by quantitative analysis of the apparent parameters (apparent resistivity and impedance phase) of MT in the high-frequency range, and the correction is completed during inversion. The method is an automatic, zero-cost computer processing technique that avoids additional field work and indoor processing, with good results shown in Figure 1a-e. Figure 1a shows a normal model (I) without any local heterogeneity. Figure 1b shows a static-shifted model (II) with two local heterogeneous bodies (10 and 1000 ohm.m). Figure 1c is the inversion result (A) for the synthetic data generated from model I. Figure 1d is the inversion result (B) for the static-shifted data generated from model II. Figure 1e is the inversion result (C) for the static-shifted data from model II, but with static shift correction. The results show that the correction method is useful. The 3D inversion algorithm is improved based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). For the algorithm, we added a frequency-based parallel structure, improved the computational efficiency, reduced the computer memory requirements, added topographic and marine factors, and added constraints from geology and geophysics. As a result, the 3D inversion can run even on a pad with high efficiency and accuracy. An application example, a theoretical assessment in oil and gas exploration, is shown in Figure 1f-i. The synthetic geophysical model consists of five layers (from top to bottom): shale, limestone, gas, oil, and groundwater and limestone overlying a basement rock. Figures 1f-g show the 3D model and the central profile. Figure 1h shows the central section of the 3D inversion; the results show a high degree of recovery of the synthetic model. Figure 1i shows that the seismic waveform reflects the interfaces of every layer overall, but the relative positions of the interfaces in two-way travel time vary, and the interface between limestone and oil at the sides of the section is not reflected. Thus 3D MT can compensate for deficiencies in the seismic results, such as false sync-phase axes and multiple waves.
Neuronify: An Educational Simulator for Neural Circuits.
Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Våvang Solbrå, Andreas; Tennøe, Simen; Hafreager, Anders; Malthe-Sørenssen, Anders; Fyhn, Marianne; Hafting, Torkel; Einevoll, Gaute T
2017-01-01
Educational software (apps) can improve science education by providing an interactive way of learning about complicated topics that are hard to explain with text and static illustrations. However, few educational apps are available for simulation of neural networks. Here, we describe an educational app, Neuronify, allowing the user to easily create and explore neural networks in a plug-and-play simulation environment. The user can pick network elements with adjustable parameters from a menu, i.e., synaptically connected neurons modelled as integrate-and-fire neurons and various stimulators (current sources, spike generators, visual, and touch) and recording devices (voltmeter, spike detector, and loudspeaker). We aim to provide a low entry point to simulation-based neuroscience by allowing students with no programming experience to create and simulate neural networks. To facilitate the use of Neuronify in teaching, a set of premade common network motifs is provided, performing functions such as input summation, gain control by inhibition, and detection of direction of stimulus movement. Neuronify is developed in C++ and QML using the cross-platform application framework Qt and runs on smart phones (Android, iOS) and tablet computers as well as personal computers (Windows, Mac, Linux).
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Chattopadhyaya, M; Murugan, N Arul; Rinkevicius, Zilvinas
2016-09-15
We study the linear and nonlinear optical properties of a well-known acid-base indicator, bromophenol blue (BPB), in aqueous solution by employing static and integrated approaches. In the static approach, optical properties have been calculated using time-dependent density functional theory (TD-DFT) on the fully relaxed geometries of the neutral and different unprotonated forms of BPB. Moreover, both closed and open forms of BPB were considered. In the integrated approach, the optical properties have been computed over many snapshots extracted from molecular dynamics simulation using a hybrid time-dependent density functional theory/molecular mechanics approach. The static approach suggests closed neutral ⇒ anionic interconversion as the dominant mechanism for the red shift in the absorption spectra of BPB due to a change from acidic to basic pH. It is found by employing an integrated approach that the two interconversions, namely open neutral ⇒ anionic and open neutral ⇒ dianionic, can contribute to the pH-dependent shift in the absorption spectra of BPB. Even though both static and integrated approaches reproduce the pH-dependent red shift in the absorption spectra of BPB, the latter one is suitable to determine both the spectra and spectral broadening. Finally, the computed static first hyperpolarizability for various protonated and deprotonated forms of BPB reveals that this molecule can be used as a nonlinear optical probe for pH sensing in addition to its highly exploited use as an optical probe.
Employing static excitation control and tie line reactance to stabilize wind turbine generators
NASA Technical Reports Server (NTRS)
Hwang, H. H.; Mozeico, H. V.; Guo, T.
1978-01-01
An analytical representation of a wind turbine generator is presented which employs blade pitch angle feedback control. A mathematical model was formulated. With the functioning MOD-0 wind turbine serving as a practical case study, results of computer simulations of the model as applied to the problem of dynamic stability at rated load are also presented. The effect of the tower shadow was included in the input to the system. Different configurations of the drive train, and optimal values of the tie line reactance were used in the simulations. Computer results revealed that a static excitation control system coupled with optimal values of the tie line reactance would effectively reduce oscillations of the power output, without the use of a slip clutch.
Color fields of the static pentaquark system computed in SU(3) lattice QCD
NASA Astrophysics Data System (ADS)
Cardoso, Nuno; Bicudo, Pedro
2013-02-01
We compute the color fields of SU(3) lattice QCD created by static pentaquark systems, in a 24³ × 48 lattice at β = 6.2 corresponding to a lattice spacing a = 0.07261(85) fm. We find that the pentaquark color fields are well described by a multi-Y-type shaped flux tube. The flux tube junction points are compatible with Fermat-Steiner points minimizing the total flux tube length. We also compare the pentaquark flux tube profile with the diquark-diantiquark central flux tube profile in the tetraquark and the quark-antiquark fundamental flux tube profile in the meson, and they match, thus showing that the pentaquark flux tubes are composed of fundamental flux tubes.
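The Fermat-Steiner criterion can be illustrated numerically for a single junction joining three fixed points (the pentaquark geometry has two junctions, so this is only the simplest case); the positions and the use of scipy below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Junction point minimizing the total flux-tube length to three fixed sources
# (the classical Fermat point of the triangle); coordinates are arbitrary.
sources = np.array([[0.0, 0.0], [4.0, 0.0], [1.0, 3.0]])

def total_length(p):
    return np.sum(np.linalg.norm(sources - p, axis=1))

junction = minimize(total_length, sources.mean(axis=0)).x
print("junction point:", junction, "total length:", total_length(junction))
```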
Vision-based calibration of parallax barrier displays
NASA Astrophysics Data System (ADS)
Ranieri, Nicola; Gross, Markus
2014-03-01
Static and dynamic parallax barrier displays have become very popular over the past few years. Especially for single-viewer applications like tablets, phones, and other hand-held devices, parallax barriers provide a convenient solution for rendering stereoscopic content. In our work we present a computer vision based calibration approach to relate the image layer and barrier layer of parallax barrier displays with unknown display geometry for static or dynamic viewer positions using homographies. We provide the math and methods to compose the required homographies on the fly and present a way to compute the barrier without the need of any iteration. Our GPU implementation is stable and general and can be used to reduce latency and increase the refresh rate of existing and upcoming barrier methods.
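A hedged sketch of the composition idea: if calibration yields homographies from a shared reference plane to the image layer and to the barrier layer, their composition maps image-layer pixels directly to barrier-layer pixels. The matrices below are hypothetical placeholders, not the output of the authors' vision-based calibration.

```python
import numpy as np

def image_to_barrier(h_ref_to_barrier, h_ref_to_image):
    """Compose homographies through the shared reference plane:
    x_barrier ~ H_rb * inv(H_ri) * x_image (homogeneous coordinates)."""
    return h_ref_to_barrier @ np.linalg.inv(h_ref_to_image)

def apply_homography(h, points):
    pts = np.hstack([points, np.ones((len(points), 1))])
    mapped = (h @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:3]            # dehomogenize

# Hypothetical homographies that a camera-based calibration step might produce.
H_ri = np.array([[1.02, 0.01, 5.0], [0.00, 0.98, -3.0], [0.0, 0.0, 1.0]])
H_rb = np.array([[0.99, 0.00, 7.5], [0.01, 1.01, -1.0], [0.0, 0.0, 1.0]])
image_pixels = np.array([[100.0, 200.0], [640.0, 360.0]])
print(apply_homography(image_to_barrier(H_rb, H_ri), image_pixels))
```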
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
Neural dynamics of motion perception: direction fields, apertures, and resonant grouping.
Grossberg, S; Mingolla, E
1993-03-01
A neural network model of global motion segmentation by visual cortex is described. Called the motion boundary contour system (BCS), the model clarifies how ambiguous local movements on a complex moving shape are actively reorganized into a coherent global motion signal. Unlike many previous researchers, we analyze how a coherent motion signal is imparted to all regions of a moving figure, not only to regions at which unambiguous motion signals exist. The model hereby suggests a solution to the global aperture problem. The motion BCS describes how preprocessing of motion signals by a motion oriented contrast (MOC) filter is joined to long-range cooperative grouping mechanisms in a motion cooperative-competitive (MOCC) loop to control phenomena such as motion capture. The motion BCS is computed in parallel with the static BCS of Grossberg and Mingolla (1985a, 1985b, 1987). Homologous properties of the motion BCS and the static BCS, specialized to process motion directions and static orientations, respectively, support a unified explanation of many data about static form perception and motion form perception that have heretofore been unexplained or treated separately. Predictions about microscopic computational differences of the parallel cortical streams V1-->MT and V1-->V2-->MT are made--notably, the magnocellular thick stripe and parvocellular interstripe streams. It is shown how the motion BCS can compute motion directions that may be synthesized from multiple orientations with opposite directions of contrast. Interactions of model simple cells, complex cells, hyper-complex cells, and bipole cells are described, with special emphasis given to new functional roles in direction disambiguation for endstopping at multiple processing stages and to the dynamic interplay of spatially short-range and long-range interactions.
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
Seol, Ye-In; Kim, Young-Kuk
2014-01-01
Power-aware scheduling reduces CPU energy consumption in hard real-time systems through dynamic voltage scaling (DVS). In this paper, we deal with the pinwheel task model, which is known as a static and predictable task model and can be applied to various embedded or ubiquitous systems. In the pinwheel task model, each task's priority is static and its execution sequence can be predetermined. There have been many static approaches to power-aware scheduling in the pinwheel task model. In this paper, however, we show that dynamic-priority power-aware scheduling results can also be applied to the pinwheel task model. This method is more effective than adopting the previous static-priority scheduling methods in reducing energy consumption and, because the system remains static, it is still tractable and applicable to small embedded or ubiquitous computing. We also introduce a novel power-aware scheduling algorithm that exploits all slacks under preemptive earliest-deadline-first scheduling, which is optimal in uniprocessor systems. The dynamic-priority method presented in this paper can be applied directly to static systems of the pinwheel task model. The simulation results show that the proposed algorithm, with an algorithmic complexity of O(n), reduces energy consumption by 10-80% over the existing algorithms.
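The slack-reclamation idea behind such schedulers can be sketched in a few lines; this is the generic stretch-to-fit rule, not the authors' O(n) algorithm, and the frequency bounds are arbitrary assumptions.

```python
def dvs_frequency(remaining_wcet, available_time, f_min=0.3, f_max=1.0):
    """Normalized CPU frequency that just stretches the remaining worst-case
    execution into the time available before the deadline (reclaimed slack is
    assumed to be folded into available_time)."""
    if available_time <= 0:
        return f_max
    f = remaining_wcet / available_time
    return min(f_max, max(f_min, f))

# Example: 2 ms of worst-case work left and 5 ms available after reclaiming
# slack from early-finishing jobs -> run at 40% speed.
print(dvs_frequency(2.0, 5.0))   # 0.4
```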
LaZerte, Stefanie E; Reudink, Matthew W; Otter, Ken A; Kusack, Jackson; Bailey, Jacob M; Woolverton, Austin; Paetkau, Mark; de Jong, Adriaan; Hill, David J
2017-10-01
Radio frequency identification (RFID) provides a simple and inexpensive approach for examining the movements of tagged animals, which can provide information on species behavior and ecology, such as habitat/resource use and social interactions. In addition, tracking animal movements is appealing to naturalists, citizen scientists, and the general public and thus represents a tool for public engagement in science and science education. Although a useful tool, the large amount of data collected using RFID may quickly become overwhelming. Here, we present an R package (feedr) we have developed for loading, transforming, and visualizing time-stamped, georeferenced data, such as RFID data collected from static logger stations. Using our package, data can be transformed from raw RFID data to visits, presence (regular detections by a logger over time), movements between loggers, displacements, and activity patterns. In addition, we provide several conversion functions to allow users to format data for use in functions from other complementary R packages. Data can also be visualized through static or interactive maps or as animations over time. To increase accessibility, data can be transformed and visualized either through R directly, or through the companion site: http://animalnexus.ca, an online, user-friendly, R-based Shiny Web application. This system can be used by professional and citizen scientists alike to view and study animal movements. We have designed this package to be flexible and to be able to handle data collected from other stationary sources (e.g., hair traps, static very high frequency (VHF) telemetry loggers, observations of marked individuals in colonies or staging sites), and we hope this framework will become a meeting point for science, education, and community awareness of the movements of animals. We aim to inspire citizen engagement while simultaneously enabling robust scientific analysis.
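feedr is an R package; purely as a language-neutral illustration of the raw-to-visits transformation it describes, the Python sketch below collapses consecutive reads of the same animal at the same logger into visits. The field layout and the 15-second gap are assumptions for the example, not the package's defaults.

```python
from datetime import datetime, timedelta

# Raw RFID reads: (timestamp, animal_id, logger_id), hypothetical values.
reads = [
    ("2017-05-01 06:00:01", "A1", "L3"),
    ("2017-05-01 06:00:04", "A1", "L3"),
    ("2017-05-01 06:00:09", "A1", "L3"),
    ("2017-05-01 06:12:30", "A1", "L5"),
]

def to_visits(raw_reads, gap=timedelta(seconds=15)):
    """Group consecutive reads of the same animal at the same logger into visits."""
    visits = []
    for text_time, animal, logger in raw_reads:
        ts = datetime.fromisoformat(text_time)
        last = visits[-1] if visits else None
        if last and last["animal"] == animal and last["logger"] == logger \
                and ts - last["end"] <= gap:
            last["end"] = ts                         # extend the current visit
        else:
            visits.append({"animal": animal, "logger": logger, "start": ts, "end": ts})
    return visits

for v in to_visits(reads):
    print(v["animal"], v["logger"], v["start"], "->", v["end"])
```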
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…
NASA Technical Reports Server (NTRS)
Mcardle, J. G.; Homyak, L.; Moore, A. S.
1979-01-01
The performance of a YF-102 turbofan engine was measured in an outdoor test stand with a bellmouth inlet and seven exhaust-system configurations. The configurations consisted of three separate-flow systems of various fan and core nozzle sizes and four confluent-flow systems of various nozzle sizes and shapes. A computer program provided good estimates of the engine performance and of thrust at maximum rating for each exhaust configuration. The internal performance of two different-shaped core nozzles for confluent-flow configurations was determined to be satisfactory. Pressure and temperature surveys were made with a traversing probe in the exhaust-nozzle flow for some confluent-flow configurations. The survey data at the mixing plane, plus the measured flow rates, were used to calculate the static-pressure variation along the exhaust nozzle length. The computed pressures compared well with experimental wall static-pressure data. External-flow surveys were made, for some confluent-flow configurations, with a large fixed rake at various locations in the exhaust plume.
Gravitational quasi-normal modes of static R 2 Anti-de Sitter black holes
NASA Astrophysics Data System (ADS)
Ma, Hong; Li, Jin
2017-06-01
Not Available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11205254, 11178018, 11375279, and 11605015), the Fundamental Research Funds for the Central Universities, China (Grant Nos. 106112016CDJXY300002 and CDJRC10300003), the Chinese State Scholarship Fund, FAPESP (Grant No. 2012/08934-0), and the Natural Science Foundation Project of CQ CSTC (Grant No. 2011BB0052).
ERIC Educational Resources Information Center
Tobin, Kenneth, Ed.; Elmesky, Rowhea, Ed.; Seiler, Gale, Ed.
2005-01-01
Many would argue that the state of urban science education has been static for the past several decades and that there is little to learn from it. Rather than accepting this deficit perspective, this book strives to recognize and understand the successes that exist there by systematically documenting seven years of research into issues salient to…
1990-08-29
Multiple lightning bolts struck the Technology Test Bed, formerly the S-IC Static Test Stand, at the Marshall Space Flight Center (MSFC) during a thunderstorm. This spectacular image of lightning was photographed by MSFC photographer Dennis Olive on August 29, 1990.
Infrared realization of dS2 in AdS2
NASA Astrophysics Data System (ADS)
Anninos, Dionysios; Hofman, Diego M.
2018-04-01
We describe a two-dimensional geometry that smoothly interpolates between an asymptotically AdS2 geometry and the static patch of dS2. We find this ‘centaur’ geometry to be a solution of dilaton gravity with a specific class of potentials for the dilaton. We interpret the centaur geometry as a thermal state in the putative quantum mechanics dual to the AdS2 evolved with the global Hamiltonian. We compute the thermodynamic properties and observe that the centaur state has finite entropy and positive specific heat. The static patch is the infrared part of the centaur geometry. We discuss boundary observables sensitive to the static patch region.
NASA Technical Reports Server (NTRS)
Whiffen, Gregory J.
2006-01-01
Mystic software is designed to compute, analyze, and visualize optimal high-fidelity, low-thrust trajectories. The software can be used to analyze interplanetary, planetocentric, and combination trajectories. Mystic also provides utilities to assist in the operation and navigation of low-thrust spacecraft. Mystic will be used to design and navigate NASA's Dawn Discovery mission to orbit the two largest asteroids. The underlying optimization algorithm used in the Mystic software is called Static/Dynamic Optimal Control (SDC). SDC is a nonlinear optimal control method designed to optimize both 'static variables' (parameters) and dynamic variables (functions of time) simultaneously. SDC is a general nonlinear optimal control algorithm based on Bellman's principle.
Theory for solubility in static systems
NASA Astrophysics Data System (ADS)
Gusev, Andrei A.; Suter, Ulrich W.
1991-06-01
A theory for the solubility of small particles in static structures has been developed. The distribution function of the solute in a frozen solid has been derived in analytical form for the quantum and the quasiclassical cases. The solubility at infinitesimal gas pressure (Henry's constant) as well as the pressure dependence of the solute concentration at elevated pressures has been found from the statistical equilibrium between the solute in the static matrix and the ideal-gas phase. The distribution function of a solute containing different particles has been evaluated in closed form. An application of the theory to the sorption of methane in the computed structures of glassy polycarbonate has resulted in a satisfactory agreement with experimental data.
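The paper's closed-form distribution functions are not reproduced here; for orientation only, a Henry's-law solubility over a frozen structure is commonly written in the quasiclassical limit as an insertion average of the solute-matrix interaction energy (a standard relation stated as an assumption about the form, not a quotation from this paper):

\[
S \;\propto\; \frac{1}{V}\int_{V} \exp\!\left[-\frac{E(\mathbf{r})}{k_{B}T}\right] d^{3}r ,
\]

where E(r) is the interaction energy of a solute particle inserted at position r in the static matrix of volume V at temperature T.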
Kinect-based sign language recognition of static and dynamic hand movements
NASA Astrophysics Data System (ADS)
Dalawis, Rando C.; Olayao, Kenneth Deniel R.; Ramos, Evan Geoffrey I.; Samonte, Mary Jane C.
2017-02-01
A different approach to sign language recognition of static and dynamic hand movements was developed in this study using a normalized correlation algorithm. The goal of this research was to translate fingerspelling sign language into text using MATLAB and the Microsoft Kinect. Digital input images captured by Kinect devices are matched against template samples stored in a database. This Human Computer Interaction (HCI) prototype was developed to help people with communication disabilities express their thoughts with ease. Frame segmentation and feature extraction were used to give meaning to the captured images. Sequential and random testing was used to test both static and dynamic fingerspelling gestures. The researchers discuss some factors they encountered that caused misclassification of signs.
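A minimal sketch of normalized correlation template matching of the kind described (zero-mean form); the 8x8 arrays and labels below are hypothetical stand-ins for Kinect-segmented hand images, not the study's data or MATLAB code.

```python
import numpy as np

def normalized_correlation(patch, template):
    """Zero-mean normalized correlation between a patch and a template of the
    same size; 1.0 indicates a perfect match."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def classify(patch, templates):
    """Return the sign label whose stored template correlates best with the patch."""
    scores = {label: normalized_correlation(patch, t) for label, t in templates.items()}
    return max(scores, key=scores.get)

rng = np.random.default_rng(1)
templates = {"A": rng.random((8, 8)), "B": rng.random((8, 8))}
captured = templates["A"] + 0.05 * rng.random((8, 8))   # noisy capture of sign "A"
print(classify(captured, templates))                     # "A"
```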
Converting Static Image Datasets to Spiking Neuromorphic Datasets Using Saccades.
Orchard, Garrick; Jayawant, Ajinkya; Cohen, Gregory K; Thakor, Nitish
2015-01-01
Creating datasets for Neuromorphic Vision is a challenging task. A lack of available recordings from Neuromorphic Vision sensors means that data must typically be recorded specifically for dataset creation rather than collecting and labeling existing data. The task is further complicated by a desire to simultaneously provide traditional frame-based recordings to allow for direct comparison with traditional Computer Vision algorithms. Here we propose a method for converting existing Computer Vision static image datasets into Neuromorphic Vision datasets using an actuated pan-tilt camera platform. Moving the sensor rather than the scene or image is a more biologically realistic approach to sensing and eliminates timing artifacts introduced by monitor updates when simulating motion on a computer monitor. We present conversion of two popular image datasets (MNIST and Caltech101) which have played important roles in the development of Computer Vision, and we provide performance metrics on these datasets using spike-based recognition algorithms. This work contributes datasets for future use in the field, as well as results from spike-based algorithms against which future works can compare. Furthermore, by converting datasets already popular in Computer Vision, we enable more direct comparison with frame-based approaches.
Climate Analytics as a Service. Chapter 11
NASA Technical Reports Server (NTRS)
Schnase, John L.
2016-01-01
Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.
78 FR 10180 - Annual Computational Science Symposium; Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...
NASA Astrophysics Data System (ADS)
Liu, J. X.; Deng, S. C.; Liang, N. G.
2008-02-01
Concrete is heterogeneous and usually described as a three-phase material in which matrix, aggregate, and interface are distinguished. To take this heterogeneity into consideration, the Generalized Beam (GB) lattice model is adopted; it is much more computationally efficient than the beam lattice model. Numerical procedures for both the quasi-static method and the dynamic method are developed to simulate fracture processes in uniaxial tensile tests conducted on a concrete panel. Cases with different loading rates are compared with the quasi-static case. It is found that the inertia effect due to the increasing load becomes less important and can be ignored as the loading rate decreases, but the inertia effect due to unstable crack propagation remains considerable no matter how low the loading rate is. Therefore, an unrealistic result will be obtained if a fracture process that includes unstable cracking is simulated by the quasi-static procedure.
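The difference between the two procedures can be made concrete with a generic sketch: the quasi-static route solves an equilibrium system at each load level, while the dynamic route keeps the inertia term and steps forward in time. This is a schematic for any assembled stiffness and mass matrix, not the authors' GB lattice code; the matrix names and the explicit central-difference scheme are illustrative choices.

    import numpy as np

    def quasi_static_step(K, f):
        # Quasi-static: inertia ignored, the equilibrium system K u = f is solved
        # at each load level.
        return np.linalg.solve(K, f)

    def dynamic_step(M, K, u, u_prev, f, dt):
        # Explicit central-difference step: the inertia term is retained, so rapid
        # (unstable) crack propagation produces transient effects that the
        # quasi-static procedure cannot represent.
        a = np.linalg.solve(M, f - K @ u)
        return 2.0 * u - u_prev + dt * dt * a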
McHugh, Stuart
1976-01-01
The material in this report is concerned with the effects of a vertically oriented rectangular dislocation loop on the tilts observed at the free surface of an elastic half-space. Part I examines the effect of a spatially variable static strike-slip distribution across the slip surface. The tilt components as a function of distance parallel, or perpendicular, to the strike of the slip surface are displayed for different slip-versus-distance profiles. Part II examines the effect of spatially and temporally variable slip distributions across the dislocation loop on the quasi-static tilts at the free surface of an elastic half space. The model discussed in part II may be used to generate theoretical tilt versus time curves produced by creep events.
NASA Technical Reports Server (NTRS)
Lopez, Armando E.; Buell, Donald A.; Tinling, Bruce E.
1959-01-01
Wind-tunnel measurements were made of the static and dynamic rotary stability derivatives of an airplane model having sweptback wing and tail surfaces. The Mach number range of the tests was from 0.23 to 0.94. The components of the model were tested in various combinations so that the separate contribution to the stability derivatives of the component parts and the interference effects could be determined. Estimates of the dynamic rotary derivatives based on some of the simpler existing procedures which utilize static force data were found to be in reasonable agreement with the experimental results at low angles of attack. The results of the static and dynamic measurements were used to compute the short-period oscillatory characteristics of an airplane geometrically similar to the test model. The results of these calculations are compared with military flying qualities requirements.
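One common way such static and dynamic rotary derivatives feed a short-period calculation is the classical approximation below; this is a textbook form given only to illustrate the dependence on the derivatives, not necessarily the exact procedure of the report:

    \omega_{n,sp} \approx \sqrt{\frac{Z_\alpha M_q}{u_0} - M_\alpha},
    \qquad
    2\,\zeta_{sp}\,\omega_{n,sp} \approx -\left(M_q + M_{\dot{\alpha}} + \frac{Z_\alpha}{u_0}\right),

where Z_alpha, M_alpha, M_q, and M_alpha-dot are dimensional stability derivatives and u_0 is the trim airspeed.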
Evaluation of conductive concrete for anti-static flooring applications
NASA Astrophysics Data System (ADS)
Yehia, Sherif; Qaddoumi, Nasser; Hassan, Mohamed; Swaked, Bassam
2015-04-01
Static electricity, the exchange of electrons and retention of charge between any two materials due to contact and separation, is affected by whether the materials are nonconductive or insulated from ground. Several work environments, such as the electronics industry, hospitals, offices, and computer rooms, require electro-static discharge (ESD) mitigation. Carpet tile, carpet broadloom, vinyl tile, vinyl sheet, epoxy, and rubber are examples of existing flooring systems on the market; however, each system has its advantages and limitations. Conductive concrete is a relatively new material technology developed to achieve high electrical conductivity and high mechanical strength, and it can be an economical alternative to these ESD flooring systems. In this paper, the effectiveness of conductive concrete as an anti-static flooring system was evaluated. The initial results indicated that the proposed conductive concrete flooring and grounding system met the acceptance criteria stated in ASTM F150.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review for the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)
1993-01-01
Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.
Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations
ERIC Educational Resources Information Center
Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa
2013-01-01
The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Bernier, David
2011-01-01
Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational Science plays a big role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate students and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one for a master's degree program and two for doctoral degree programs).
NASA Astrophysics Data System (ADS)
Koch, Melissa; Gorges, Torie
2016-10-01
Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
Steady-State Computation of Constant Rotational Rate Dynamic Stability Derivatives
NASA Technical Reports Server (NTRS)
Park, Michael A.; Green, Lawrence L.
2000-01-01
Dynamic stability derivatives are essential to predicting the open and closed loop performance, stability, and controllability of aircraft. Computational determination of constant-rate dynamic stability derivatives (derivatives of aircraft forces and moments with respect to constant rotational rates) is currently performed indirectly with finite differencing of multiple time-accurate computational fluid dynamics solutions. Typical time-accurate solutions require excessive amounts of computational time to complete. Formulating Navier-Stokes (N-S) equations in a rotating noninertial reference frame and applying an automatic differentiation tool to the modified code has the potential for directly computing these derivatives with a single, much faster steady-state calculation. The ability to rapidly determine static and dynamic stability derivatives by computational methods can benefit multidisciplinary design methodologies and reduce dependency on wind tunnel measurements. The CFL3D thin-layer N-S computational fluid dynamics code was modified for this study to allow calculations on complex three-dimensional configurations with constant rotation rate components in all three axes. These CFL3D modifications also have direct application to rotorcraft and turbomachinery analyses. The modified CFL3D steady-state calculation is a new capability that showed excellent agreement with results calculated by a similar formulation. The application of automatic differentiation to CFL3D allows the static stability and body-axis rate derivatives to be calculated quickly and exactly.
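The contrast between the indirect and direct routes to a rate derivative can be sketched with a toy model. The function below is a hypothetical stand-in for a steady rotating-frame evaluation of pitching-moment coefficient at constant pitch rate; the finite-difference helper mirrors the indirect approach, while the exact derivative stands in for what automatic differentiation of the modified solver would return from a single steady calculation.

    def toy_cm(q):
        # Hypothetical stand-in for a steady CFD evaluation of Cm at constant pitch rate q.
        return -0.8 * q + 0.05 * q * q

    def rate_derivative_fd(fn, q, dq=1.0e-3):
        # Indirect route: central finite difference over multiple (expensive) solutions.
        return (fn(q + dq) - fn(q - dq)) / (2.0 * dq)

    def rate_derivative_exact(q):
        # What exact differentiation (e.g. automatic differentiation of the solver)
        # would give for the toy model above, from a single evaluation.
        return -0.8 + 0.1 * q

    q0 = 0.02
    print(rate_derivative_fd(toy_cm, q0), rate_derivative_exact(q0))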
The super-Turing computational power of plastic recurrent neural networks.
Cabessa, Jérémie; Siegelmann, Hava T
2014-12-01
We study the computational capabilities of a biologically inspired neural model where the synaptic weights, the connectivity pattern, and the number of neurons can evolve over time rather than stay static. Our study focuses on the mere concept of plasticity of the model, so the nature of the updates is not constrained. In this context, we show that the so-called plastic recurrent neural networks (RNNs) are capable of precisely the same super-Turing computational power as static analog neural networks, irrespective of whether their synaptic weights are modeled by rational or real numbers, and moreover, irrespective of whether their patterns of plasticity are restricted to bi-valued updates or expressed by any other more general form of updating. Consequently, the incorporation of only bi-valued plastic capabilities in a basic model of RNNs suffices to break the Turing barrier and achieve the super-Turing level of computation. The consideration of more general mechanisms of architectural plasticity or of real synaptic weights does not further increase the capabilities of the networks. These results support the claim that the general mechanism of plasticity is crucially involved in the computational and dynamical capabilities of biological neural networks. They further show that the super-Turing level of computation reflects in a suitable way the capabilities of brain-like models of computation.
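A minimal sketch of what "plastic" means here: both the hidden state and the weight matrix evolve at every step, and the weight updates are restricted to a bi-valued (up/down) rule. The network size, update constant, and rule below are arbitrary illustrative choices, not the construction used in the proofs.

    import numpy as np

    def plastic_rnn_run(x_seq, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        W = 0.1 * rng.standard_normal((n_hidden, n_hidden))   # recurrent weights (plastic)
        W_in = 0.1 * rng.standard_normal(n_hidden)            # input weights (fixed here)
        h = np.zeros(n_hidden)
        for x in x_seq:
            h = np.tanh(W @ h + W_in * x)                      # state update
            W = W + 0.01 * np.sign(np.outer(h, h))             # bi-valued plastic update
        return h

    print(plastic_rnn_run([1.0, -0.5, 0.25]))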
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
Computational Science News | Computational Science | NREL
-Cooled High-Performance Computing Technology at the ESIF February 28, 2018 NREL Launches New Website for High-Performance Computing System Users The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC
1988-07-08
Marcus and C. Baczynski), Computer Science Press, Rockville, Maryland, 1986. 3. An Introduction to Pascal and Precalculus , Computer Science Press...Science Press, Rockville, Maryland, 1986. 35. An Introduction to Pascal and Precalculus , Computer Science Press, Rockville, Maryland, 1986. 36
Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…
Factors Influencing Exemplary Science Teachers' Levels of Computer Use
ERIC Educational Resources Information Center
Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen
2011-01-01
The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
The whole space three-dimensional magnetotelluric inversion algorithm with static shift correction
NASA Astrophysics Data System (ADS)
Zhang, K.
2016-12-01
Based on previous studies of static shift correction and 3D inversion algorithms, we improve the NLCG 3D inversion method and propose a new static shift correction method that works within the inversion. The static shift correction method is based on 3D theory and real data. The static shift can be detected by quantitative analysis of the apparent parameters (apparent resistivity and impedance phase) of MT in the high frequency range, and the correction is completed within the inversion. The method is an automatic, computer-based processing technique with no additional cost, and it avoids extra field work and indoor processing while giving good results. The 3D inversion algorithm is improved (Zhang et al., 2013) based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). For the algorithm, we added a parallel structure, improved the computational efficiency, reduced the computer memory required, and added topographic and marine factors, so the 3D inversion can run on an ordinary PC with high efficiency and accuracy. All MT data from surface stations, seabed stations, and underground stations can be used in the inversion algorithm. A verification and application example of the 3D inversion algorithm is shown in Figure 1. From the comparison in Figure 1, the inversion model reflects all the anomalous bodies and the terrain clearly regardless of the type of data used (impedance, tipper, or impedance and tipper), and the resolution of the bodies' boundaries can be improved by using tipper data. The algorithm is very effective for terrain inversion, so it is very useful for the study of the continental shelf with continuous exploration across land, marine, and underground settings. The three-dimensional electrical model of the ore zone reflects the basic information of strata, rock, and structure. Although it cannot indicate the ore body position directly, important clues are provided for prospecting work by the delineation of the diorite pluton uplift range. The test results show that high-quality data processing and an efficient inversion method for electromagnetic data are an important guarantee for porphyry ore exploration.
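The inversion loop underlying such an NLCG scheme can be sketched generically. The code below is a schematic Polak-Ribiere nonlinear conjugate gradient driver, not the authors' parallel implementation; the misfit and gradient callables (which would wrap the MT forward modelling and its adjoint), the crude line search, and all parameter values are assumptions.

    import numpy as np

    def crude_line_search(misfit, m, d, steps=(1e-3, 1e-2, 1e-1, 1.0)):
        # Very rough substitute for a proper line search: try a few step lengths.
        return min(steps, key=lambda a: misfit(m + a * d))

    def nlcg_invert(misfit, grad, m0, n_iter=50, tol=1.0e-8):
        # m: model vector (e.g. log-resistivities); misfit/grad: data misfit plus
        # regularization and its gradient.
        m = m0.copy()
        g = grad(m)
        d = -g
        for _ in range(n_iter):
            alpha = crude_line_search(misfit, m, d)
            m = m + alpha * d
            g_new = grad(m)
            if np.linalg.norm(g_new) < tol:
                break
            beta = max(0.0, g_new @ (g_new - g) / max(g @ g, 1e-30))  # Polak-Ribiere, restarted
            d = -g_new + beta * d
            g = g_new
        return m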
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new corner stone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have peta-flop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. 
Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.
European Science Notes Information Bulletin Reports on Current European/ Middle Eastern Science
1988-07-01
Recently concluded and planned experiments are described which are based on static Kerr effect dispersion spectroscopy, optical Kerr effect...studies, and both electric and magnetic resonance molecular beam spectroscopy. These advanced techniques are used to measure with great precision...simulation using three-dimensional singularities in a panel method...flight using temperature-sensitive films and infrared cameras placed on either side
NASA Technical Reports Server (NTRS)
Baumeister, K. J.; Horowitz, S. J.
1982-01-01
An iterative finite element integral technique is used to predict the sound field radiated from the JT15D turbofan inlet. The sound field is divided into two regions: the sound field within and near the inlet which is computed using the finite element method and the radiation field beyond the inlet which is calculated using an integral solution technique. The velocity potential formulation of the acoustic wave equation was employed in the program. For some single mode JT15D data, the theory and experiment are in good agreement for the far field radiation pattern as well as suppressor attenuation. Also, the computer program is used to simulate flight effects that cannot be performed on a ground static test stand.
Development of the Orion Crew Module Static Aerodynamic Database. Part 2; Supersonic/Subsonic
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Walker, Eric L.; Brauckmann, Gregory J.; Robinson, Phil
2011-01-01
This work describes the process of developing the nominal static aerodynamic coefficients and associated uncertainties for the Orion Crew Module for Mach 8 and below. The database was developed from wind tunnel test data and computational simulations of the smooth Crew Module geometry, with no asymmetries or protuberances. The database covers the full range of Reynolds numbers seen in both entry and ascent abort scenarios. The basic uncertainties were developed as functions of Mach number and total angle of attack from variations in the primary data as well as computations at lower Reynolds numbers, on the baseline geometry, and using different flow solvers. The resulting aerodynamic database represents the Crew Exploration Vehicle Aerosciences Project's best estimate of the nominal aerodynamics for the current Crew Module vehicle.
Explosion safety in industrial electrostatics
NASA Astrophysics Data System (ADS)
Szabó, S. V.; Kiss, I.; Berta, I.
2011-01-01
Complicated industrial systems are often endangered by electrostatic hazards, both atmospheric (the lightning phenomenon, addressed by primary and secondary lightning protection) and industrial (technological problems caused by static charging, together with fire and explosion hazards). According to the classical approach, protective methods have to be used in order to remove electrostatic charging and to avoid damage; however, no attempt is made to compute the risk before and after applying the protective method, and well-educated, practiced expertise is relied upon instead. The Budapest School of Electrostatics - in close cooperation with industrial partners - develops new, suitable solutions for probability-based decision support (Static Control Up-to-date Technology, SCOUT) using soft computing methods. This new approach can be used to assess and audit existing systems and - using the predictive power of the models - to design and plan activities in industrial electrostatics.
Pinkney, S; Fernie, G
2001-01-01
A three-dimensional (3D) lumped-parameter model of a powered wheelchair was created to aid the development of the Rocket prototype wheelchair and to help explore the effect of innovative design features on its stability. The model was developed using simulation software, specifically Working Model 3D. The accuracy of the model was determined by comparing both its static stability angles and dynamic behavior as it passed down a 4.8-cm (1.9") road curb at a heading of 45 degrees with the performance of the actual wheelchair. The model's predictions of the static stability angles in the forward, rearward, and lateral directions were within 9.3, 7.1, and 3.8% of the measured values, respectively. The average absolute error in the predicted position of the wheelchair as it moved down the curb was 2.2 cm/m (0.9" per 3'3") traveled. The accuracy was limited by the inability to model soft bodies, the inherent difficulties in modeling a statically indeterminate system, and the computing time. Nevertheless, it was found to be useful in investigating the effect of eight design alterations on the lateral stability of the wheelchair. Stability was quantified by determining the static lateral stability angles and the maximum height of a road curb over which the wheelchair could successfully drive on a diagonal heading. The model predicted that the stability was more dependent on the configuration of the suspension system than on the dimensions and weight distribution of the wheelchair. Furthermore, for the situations and design alterations studied, predicted improvements in static stability were not correlated with improvements in dynamic stability.
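For orientation, the purely geometric (rigid-body) estimate of a static stability angle, the tilt at which the gravity vector through the centre of mass crosses the wheel contact (tip) line, is simple to compute; the lumped-parameter model above goes further by including suspension compliance and dynamics, which this sketch ignores. The function and example numbers are illustrative only.

    import math

    def static_tip_angle_deg(cg_height, cg_to_tip_line):
        # cg_height: CG height above ground; cg_to_tip_line: horizontal distance from
        # the CG to the tip line, in the same units.
        return math.degrees(math.atan2(cg_to_tip_line, cg_height))

    # Example: CG 0.55 m above ground, 0.25 m inboard of the lateral wheel contact line.
    print(round(static_tip_angle_deg(0.55, 0.25), 1), "degrees")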
NASA Astrophysics Data System (ADS)
Ghamgosar, M.; Erarslan, N.
2016-03-01
The development of fracture process zones (FPZ) in Cracked Chevron Notched Brazilian Disc (CCNBD) monzonite and Brisbane tuff specimens was investigated to evaluate the mechanical behaviour of brittle rocks under static and various cyclic loadings. An FPZ is a region that involves different types of damage around the pre-existing and/or stress-induced crack tips in engineering materials. This highly damaged area includes micro- and meso-cracks, which emerge prior to the main fracture growth or extension and ultimately coalesce into macrofractures, leading to failure. The experiments and numerical simulations were designed for this study to investigate the following features of FPZ in rocks: (1) ligament connections and (2) microcracking and its coalescence in FPZ. A Computed Tomography (CT) scan technique was also used to investigate the FPZ behaviour in selected rock specimens. The CT scan results showed that the fracturing velocity is entirely dependent on the amount of fracture energy absorbed in the rock specimens due to the change of frequency and amplitude of the dynamic loading. The Extended Finite Element Method (XFEM) was used to compute the displacements, tensile stress distribution, and plastic energy dissipation around the propagating crack tip in the FPZ. One of the most important observations, the shape of the FPZ and its extension around the crack tip, was made using numerical and experimental results, which supported the CT scan results. When the static rupture and the cyclic rupture were compared, the main differences were twofold: (1) the number of fragments produced is much greater under cyclic loading than under static loading, and (2) intergranular cracks are formed due to particle breakage under cyclic loading, compared with smooth and bright cracks along cleavage planes under static loading.
Performance analysis of a large-grain dataflow scheduling paradigm
NASA Technical Reports Server (NTRS)
Young, Steven D.; Wills, Robert W.
1993-01-01
A paradigm for scheduling computations on a network of multiprocessors using large-grain data flow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself will be dynamic (i.e., determined at run time). Many applications characterized by static flow exist, and they include real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, using the macro-dataflow method allows the scheduling to be insulated from the application designer and enables the maximum utilization of available resources. Further, by allowing multitasking, processor utilization can approach 100 percent while maximum speedup is maintained. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.
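The run-time scheduling idea can be illustrated with a small list-scheduling sketch over a static flow graph: a task fires when all of its predecessors have completed, and ready tasks are dispatched to whichever processor frees up first. This is a generic illustration, not the simulated architecture of the paper; the graph, durations, and processor count are made up.

    import heapq

    def macro_dataflow_schedule(graph, durations, n_procs):
        # graph: task -> list of successor tasks (the static flow graph).
        preds = {t: 0 for t in graph}
        for t, succs in graph.items():
            for s in succs:
                preds[s] += 1
        ready = [t for t, n in preds.items() if n == 0]
        procs = [(0.0, p) for p in range(n_procs)]        # (time when free, processor id)
        heapq.heapify(procs)
        finish = {}
        while ready:
            task = ready.pop(0)
            free_at, p = heapq.heappop(procs)
            data_ready = max((finish[q] for q in graph if task in graph[q]), default=0.0)
            finish[task] = max(free_at, data_ready) + durations[task]
            heapq.heappush(procs, (finish[task], p))
            for s in graph[task]:
                preds[s] -= 1
                if preds[s] == 0:
                    ready.append(s)
        return finish

    # Example: diamond-shaped flow graph scheduled on two processors.
    g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
    print(macro_dataflow_schedule(g, {"a": 1, "b": 2, "c": 2, "d": 1}, n_procs=2))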
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zachary M. Prince; Jean C. Ragusa; Yaqi Wang
Because of the recent interest in reactor transient modeling and the restart of the Transient Reactor Test (TREAT) Facility, there has been a need for more efficient, robust methods in computation frameworks. This is the impetus for implementing the Improved Quasi-Static (IQS) method in the RATTLESNAKE/MOOSE framework. IQS has been implemented with CFEM diffusion by factorizing the flux into a time-dependent amplitude and a spatially dependent, weakly time-dependent shape. The shape evaluation is very similar to a flux diffusion solve and is computed at large (macro) time steps, while the amplitude evaluation is a PRKE solve, whose parameters depend on the shape, computed at small (micro) time steps. IQS has been tested with a custom one-dimensional example and the TWIGL ramp benchmark. These examples show it to be a viable and effective method for highly transient cases. More complex cases are intended to further test the method and its implementation.
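The two-timescale structure of IQS can be sketched as follows. The flux is factorized as phi(r, t) = p(t) * psi(r, t): the shape psi is recomputed with an expensive diffusion-like solve on coarse macro steps, while the amplitude p is advanced with cheap point-reactor kinetics (PRKE) solves on fine micro steps whose parameters are projected from the current shape. The callables and step sizes below are placeholders, not the RATTLESNAKE/MOOSE implementation.

    def iqs_transient(shape_solve, prke_step, project_params, t_end, dt_macro, n_micro):
        # shape_solve(t, amplitude) -> spatial shape; prke_step(p, params, dt) -> new p;
        # project_params(shape) -> PRKE coefficients (reactivity, generation time, ...).
        t, p = 0.0, 1.0
        psi = shape_solve(0.0, amplitude=p)
        while t < t_end:
            params = project_params(psi)                      # from the latest shape
            for _ in range(n_micro):
                p = prke_step(p, params, dt_macro / n_micro)  # micro (amplitude) steps
            t += dt_macro
            psi = shape_solve(t, amplitude=p)                 # macro (shape) step
        return p, psi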
Langley 14- by 22-foot subsonic tunnel test engineer's data acquisition and reduction manual
NASA Technical Reports Server (NTRS)
Quinto, P. Frank; Orie, Nettie M.
1994-01-01
The Langley 14- by 22-Foot Subsonic Tunnel is used to test a large variety of aircraft and nonaircraft models. To support these investigations, a data acquisition system has been developed that has both static and dynamic capabilities. The static data acquisition and reduction system is described; the hardware and software of this system are explained. The theory and equations used to reduce the data obtained in the wind tunnel are presented; the computer code is not included.
Fatigue criterion for the design of rotating shafts under combined stress
NASA Technical Reports Server (NTRS)
Loewenthal, S. H.
1977-01-01
A revised approach to the design of transmission shafting which considers the flexure fatigue characteristics of the shaft material under combined cyclic bending and static torsion stress is presented. A fatigue failure relation, corroborated by published combined stress test data, is presented which shows an elliptical variation of reversed bending endurance strength with static torsional stress. From this elliptical failure relation, a design formula for computing the diameter of rotating solid shafts under the most common conditions of loading is developed.
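In a standard textbook form (which is consistent with, though not necessarily identical in notation to, the report), an elliptical failure relation of this kind and the resulting shaft-sizing formula read:

    \left(\frac{\sigma_a}{\sigma_e}\right)^2 + \left(\frac{\tau}{\tau_y}\right)^2 = 1,
    \qquad
    d^3 = \frac{32\,\mathrm{FS}}{\pi}\sqrt{\left(\frac{M}{\sigma_e}\right)^2 + \frac{3}{4}\left(\frac{T}{\sigma_y}\right)^2},

where sigma_a is the reversed bending stress amplitude at failure, sigma_e the reversed bending endurance strength, tau the static torsional stress, tau_y the torsional yield strength, M and T the bending moment and static torque, sigma_y the tensile yield strength, FS the factor of safety, and d the shaft diameter.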
Brave New Media World: Science Communication Voyages through the Global Seas
NASA Astrophysics Data System (ADS)
Clark, C. L.; Reisewitz, A.
2010-12-01
By leveraging online tools, such as blogs, Twitter, Facebook, Google Earth, flickr, web-based discussion boards, and a bi-monthly electronic magazine for the non-scientist, Scripps Institution of Oceanography is taking science communications out of the static webpage to create interactive journeys that spark social dialogue and help raise awareness of science-based research on global marine environmental issues. Several new initiatives are being chronicled through popular blogs and expedition web sites as researchers share interesting scientific facts and unusual findings in near real-time.
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
Static Program Analysis for Reliable, Trusted Apps
2017-02-01
flexibility to system design. However, it is challenging for a static analysis to compute or verify properties about a system that uses implicit control...sources might affect the variable’s value. The type qualifier @Sink indicates where (information computed from) the value might be output. These...upper bound on the set of sensitive sources that were actually used to compute the value. If the type of x is qualified by @Source({INTERNET, LOCATION
Three-dimensional Computational Fluid Dynamics Investigation of a Spinning Helicopter Slung Load
NASA Technical Reports Server (NTRS)
Theorn, J. N.; Duque, E. P. N.; Cicolani, L.; Halsey, R.
2005-01-01
After performing steady-state Computational Fluid Dynamics (CFD) calculations using OVERFLOW to validate the CFD method against static wind-tunnel data of a box-shaped cargo container, the same setup was used to investigate unsteady flow with a moving body. Results were compared to flight test data previously collected in which the container is spinning.
ERIC Educational Resources Information Center
Kaplan, Danielle E.; Wu, Erin Chia-ling
2006-01-01
Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
WebGL for Rosetta Science Planning
NASA Astrophysics Data System (ADS)
Schmidt, Albrecht; Völk, Stefan; Grieger, Björn
2013-04-01
Rosetta is a mission of the European Space Agency (ESA) to rendezvous with comet Churyumov-Gerasimenko in 2014. The trajectory and operations of the mission are particularly complex, have many free parameters, and are novel to the community. To support science planning, communicate operational ideas, and disseminate operational scenarios to the scientific community, the science ground segment makes use of Web-based visualisation technologies. Using the recent standard WebGL, static pages of time-dependent three-dimensional views of the spacecraft and the fields of view of the instruments are generated directly from the operational files. These can then be viewed in modern Web browsers for understanding or verification, and be analysed and correlated with other studies. Variable timesteps make it possible to provide both overviews and detailed animated scenes. The technical challenges that are particular to Web-based environments include: (1) In traditional OpenGL, it is much easier to compute needed data on demand, since the visualisation runs natively on a usually quite powerful computer; in a WebGL application, requests for additional data have to be passed through a Web server, so they are more complex and also require a more complex infrastructure. (2) The volume of data that can be kept in a browser environment is limited and has to be transferred over often slow network links; thus, careful design and reduction of data are required. (3) Although browser support for WebGL has improved since the authors started using it, it is often not well supported on mobile and small devices. (4) Web browsers often support only limited end-user interactions with a mouse or keyboard. While some of these challenges can be expected to become less important as technological progress continues, others seem to be more inherent to the approach. On the positive side, the authors' experiences include: (1) a low threshold in the community to using the visualisations, (2) consequently, cooperative use of the products, and (3) good and still improving tool and library support.
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
Computer Science and the Liberal Arts
ERIC Educational Resources Information Center
Shannon, Christine
2010-01-01
Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…
Marrying Content and Process in Computer Science Education
ERIC Educational Resources Information Center
Zendler, A.; Spannagel, C.; Klaudt, D.
2011-01-01
Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…
ERIC Educational Resources Information Center
Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.
2016-01-01
Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…
Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University
ERIC Educational Resources Information Center
Plane, Jandelyn
2010-01-01
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…
Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.
ERIC Educational Resources Information Center
Turner, Judith Axler
1987-01-01
Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)
African-American males in computer science---Examining the pipeline for clogs
NASA Astrophysics Data System (ADS)
Stone, Daryl Bryant
The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for Computer Scientist/Technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities, there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to a 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. (4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominantly African-American elementary, middle, and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle, and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between each of the five groups of students. ANOVA analysis by question and by total self-efficacy score provided more results of statistical significance. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest that 'a clog' may exist at the middle school level and that students attending racially mixed schools were more confident in their computer, math, and science skills. African-American males admit to spending lots of time on social networking websites and emailing, but are 'dis-aware' of the skills and knowledge needed to study in the computing disciplines.
The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.
Gibbons-Hawking radiation of gravitons in the Poincaré and static patches of de Sitter spacetime
NASA Astrophysics Data System (ADS)
Bernar, Rafael P.; Crispino, Luís C. B.; Higuchi, Atsushi
2018-04-01
We discuss the quantization of linearized gravity in the background de Sitter spacetime using a gauge-invariant formalism to write the perturbed gravitational field in the static patch. This field is quantized after fixing the gauge completely. The response rate of this field to monochromatic multipole sources is then computed in the thermal equilibrium state with the well-known Gibbons-Hawking temperature. We compare this response rate with the one obtained in the Bunch-Davies-like vacuum state defined in the Poincaré patch. These response rates are found to be the same as expected. This agreement serves as a verification of the infrared finite graviton two-point function in the static patch of de Sitter spacetime found previously.
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
Computer Science and Telecommunications Board summary of activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumenthal, M.S.
1992-03-27
The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.
Hispanic women overcoming deterrents to computer science: A phenomenological study
NASA Astrophysics Data System (ADS)
Herling, Lourdes
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determined whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but also their persistence. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the Twenty-First Century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.
MOSHFIT: algorithms for occlusion-tolerant mean shape and rigid motion from 3D movement data.
Mitchelson, Joel R
2013-09-03
This work addresses the use of 3D point data to measure rigid motions, in the presence of occlusion and without reference to a prior model of relative point locations. This is a problem where cluster-based measurement techniques are used (e.g. for measuring limb movements) and no static calibration trial is available. The same problem arises when performing the task known as roving capture, in which a mobile 3D movement analysis system is moved through a volume with static markers in unknown locations and the ego-motion of the system is required in order to understand biomechanical activity in the environment. To provide a solution for both of these applications, the new concept of a visibility graph is introduced, and is combined with a generalised Procrustes method adapted from ones used by the biological shape statistics and computer graphics communities. Recent results on shape space manifolds are applied to show sufficient conditions for convergence to a unique solution. Algorithm source code is available and referenced here. Processing speed and rate of convergence are demonstrated using simulated data. Positional and angular accuracy are shown to be equivalent to approaches which require full calibration, to within a small fraction of input resolution. Typical processing times for sub-micron convergence are found to be fractions of a second, so the method is suitable for workflows where there may be time pressure, such as in sports science and clinical analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.
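The abstract does not reproduce the algorithm itself; as a rough illustration of the generalised Procrustes-style averaging it builds on, the sketch below iteratively aligns marker clusters to a running mean shape using a Kabsch rotation fit. It is a minimal sketch on synthetic data, with no occlusion handling and no visibility graph, so all names and parameters are illustrative only.

```python
import numpy as np

def align_rigid(source, target):
    """Best-fit rotation and translation mapping source onto target (Kabsch)."""
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    H = (source - mu_s).T @ (target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = mu_t - R @ mu_s
    return R, t

def procrustes_mean_shape(frames, n_iter=20):
    """Iteratively align all frames to the current mean shape and re-average."""
    mean = frames[0].copy()
    for _ in range(n_iter):
        aligned = []
        for pts in frames:
            R, t = align_rigid(pts, mean)
            aligned.append(pts @ R.T + t)
        mean = np.mean(aligned, axis=0)
        mean -= mean.mean(axis=0)        # keep the mean shape centred at the origin
    return mean

# Toy usage: noisy, rigidly displaced copies of a 4-marker cluster.
rng = np.random.default_rng(0)
base = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
shifts = np.array([[1.0, 2.0, 0.0], [0.0, -1.0, 3.0], [2.0, 0.0, 1.0]])
frames = [base + s + rng.normal(0, 0.01, base.shape) for s in shifts]
mean_shape = procrustes_mean_shape(frames)
```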
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
Opportunities for Computational Discovery in Basic Energy Sciences
NASA Astrophysics Data System (ADS)
Pederson, Mark
2011-03-01
An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software and applications. Most opportunities within the fields of condensed-matter physics, chemical physics and materials sciences are supported by the Office of Basic Energy Science (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light-harvesting and photovoltaics, solid-state lighting and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms and on the role of computation in accelerating the development of innovative technologies.
Research | Computational Science | NREL
NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. NREL's computational science capabilities enable high-impact research. Some recent examples…
Konrad, A; Stafilidis, S; Tilp, M
2017-10-01
The purpose of this study was to investigate the influence of a single static, ballistic, or proprioceptive neuromuscular facilitation (PNF) stretching exercise on the various muscle-tendon parameters of the lower leg and to detect possible differences in the effects between the methods. Volunteers (n = 122) were randomly divided into static, ballistic, and PNF stretching groups and a control group. Before and after the 4 × 30 s stretching intervention, we determined the maximum dorsiflexion range of motion (RoM) with the corresponding fascicle length and pennation angle of the gastrocnemius medialis. Passive resistive torque (PRT) and maximum voluntary contraction (MVC) were measured with a dynamometer. Observation of muscle-tendon junction (MTJ) displacement with ultrasound allowed us to determine the length changes in the tendon and muscle, respectively, and hence to calculate stiffness. Although RoM increased (static: +4.3%, ballistic: +4.5%, PNF: +3.5%), PRT (static: -11.4%, ballistic: -11.5%, PNF: -13.7%), muscle stiffness (static: -13.1%, ballistic: -20.3%, PNF: -20.2%), and muscle-tendon stiffness (static: -11.3%, ballistic: -10.5%, PNF: -13.7%) decreased significantly in all the stretching groups. Only in the PNF stretching group did the pennation angle in the stretched position (-4.2%) and plantar flexor MVC (-4.6%) decrease significantly. Multivariate analysis showed no clinically relevant difference between the stretching groups. The increase in RoM and the decrease in PRT and muscle-tendon stiffness could be explained by more compliant muscle tissue following a single static, ballistic, or PNF stretching exercise. © 2017 The Authors Scandinavian Journal of Medicine & Science In Sports Published by John Wiley & Sons Ltd.
Why do parallel cortical systems exist for the perception of static form and moving form?
Grossberg, S
1991-02-01
This article analyzes computational properties that clarify why the parallel cortical systems V1----V2, V1----MT, and V1----V2----MT exist for the perceptual processing of static visual forms and moving visual forms. The article describes a symmetry principle, called FM symmetry, that is predicted to govern the development of these parallel cortical systems by computing all possible ways of symmetrically gating sustained cells with transient cells and organizing these sustained-transient cells into opponent pairs of on-cells and off-cells whose output signals are insensitive to direction of contrast. This symmetric organization explains how the static form system (static BCS) generates emergent boundary segmentations whose outputs are insensitive to direction of contrast and insensitive to direction of motion, whereas the motion form system (motion BCS) generates emergent boundary segmentations whose outputs are insensitive to direction of contrast but sensitive to direction of motion. FM symmetry clarifies why the geometries of static and motion form perception differ--for example, why the opposite orientation of vertical is horizontal (90 degrees), but the opposite direction of up is down (180 degrees). Opposite orientations and directions are embedded in gated dipole opponent processes that are capable of antagonistic rebound. Negative afterimages, such as the MacKay and waterfall illusions, are hereby explained as are aftereffects of long-range apparent motion. These antagonistic rebounds help to control a dynamic balance between complementary perceptual states of resonance and reset. Resonance cooperatively links features into emergent boundary segmentations via positive feedback in a CC loop, and reset terminates a resonance when the image changes, thereby preventing massive smearing of percepts. These complementary preattentive states of resonance and reset are related to analogous states that govern attentive feature integration, learning, and memory search in adaptive resonance theory. The mechanism used in the V1----MT system to generate a wave of apparent motion between discrete flashes may also be used in other cortical systems to generate spatial shifts of attention. The theory suggests how the V1----V2----MT cortical stream helps to compute moving form in depth and how long-range apparent motion of illusory contours occurs. These results collectively argue against vision theories that espouse independent processing modules. Instead, specialized subsystems interact to overcome computational uncertainties and complementary deficiencies, to cooperatively bind features into context-sensitive resonances, and to realize symmetry principles that are predicted to govern the development of the visual cortex.
Response of multi-panel assembly to noise from a jet in forward motion
NASA Technical Reports Server (NTRS)
Bayliss, A.; Maestrello, L.; Mcgreevy, J. L.; Fenno, C. C., Jr.
1995-01-01
A model of the interaction of the noise from a spreading subsonic jet with a 4 panel assembly is studied numerically in two dimensions. The effect of forward motion of the jet is accounted for by considering a uniform flow field superimposed on a mean jet exit profile. The jet is initially excited by a pulse-like source inserted into the flow field. The pulse triggers instabilities associated with the inviscid instability of the jet shear layer. These instabilities generate sound which in turn serves to excite the panels. We compare the sound from the jet, the responses of the panels and the resulting acoustic radiation for the static jet and the jet in forward motion. The far field acoustic radiation, the panel response and sound radiated from the panels are all computed and compared to computations of a static jet. The results demonstrate that for a jet in forward motion there is a reduction in sound in downstream directions and an increase in sound in upstream directions in agreement with experiments. Furthermore, the panel response and radiation for a jet in forward motion exhibits a downstream attenuation as compared with the static case.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattarai, Bishnu P.; Gentle, Jake P.; Hill, Porter
Overhead transmission lines (TLs) are conventionally given seasonal ratings based on conservative environmental assumptions. Such an approach often results in underutilization of the line ampacity as the worst conditions prevail only for a short period over a year/season. We present dynamic line rating (DLR) as an enabling smart grid technology that adaptively computes ratings of TLs based on local weather conditions to utilize additional headroom of existing lines. In particular, a general line ampacity state solver utilizes measured weather data for computing the real-time thermal rating of the TLs. The performance of the presented method is demonstrated from a field study of DLR technology implementation on four TL segments at AltaLink, Canada. The performance is evaluated and quantified by comparing the existing static and proposed dynamic line ratings, and the potential benefits of DLR for enhanced transmission asset utilization. For the given line segments, the proposed DLR results in real-time ratings above the seasonal static ratings for most of the time; up to 95.1% of the time, with a mean increase of 72% over the static rating.
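As a rough illustration of how a real-time rating can be derived from local weather, the sketch below solves a simplified steady-state conductor heat balance for the allowable current. The film coefficient, conductor data, and temperature limit are placeholder values, not the IEEE 738/CIGRE constants or the solver used in the study.

```python
import math

# Illustrative, simplified steady-state heat balance for a bare overhead
# conductor:  I^2 * R(Tc) + q_solar = q_convection + q_radiation.
# All coefficients are placeholders chosen for readability.

def ampacity(t_ambient, wind_speed, solar_wm2,
             t_conductor_max=75.0, diameter=0.028, emissivity=0.8,
             absorptivity=0.8, r_ac_per_m=7.3e-5):
    """Return the current (A) that holds the conductor at its temperature limit."""
    sigma = 5.67e-8                                    # Stefan-Boltzmann constant
    t_c, t_a = t_conductor_max + 273.15, t_ambient + 273.15
    # Convective loss per metre: crude forced-convection film coefficient (placeholder).
    h = 4.0 + 4.0 * math.sqrt(max(wind_speed, 0.1))    # W/(m^2 K)
    q_conv = h * math.pi * diameter * (t_c - t_a)
    # Radiative loss per metre.
    q_rad = emissivity * sigma * math.pi * diameter * (t_c**4 - t_a**4)
    # Solar gain per metre (projected area = diameter).
    q_sun = absorptivity * solar_wm2 * diameter
    q_net = q_conv + q_rad - q_sun
    return math.sqrt(max(q_net, 0.0) / r_ac_per_m)

# A cool, windy hour supports far more current than a hot, still one.
print(ampacity(t_ambient=10.0, wind_speed=4.0, solar_wm2=200.0))
print(ampacity(t_ambient=35.0, wind_speed=0.3, solar_wm2=1000.0))
```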
ProteinAC: a frequency domain technique for analyzing protein dynamics
NASA Astrophysics Data System (ADS)
Bozkurt Varolgunes, Yasemin; Demir, Alper
2018-03-01
It is widely believed that the interactions of proteins with ligands and other proteins are determined by their dynamic characteristics as opposed to only static, time-invariant processes. We propose a novel computational technique, called ProteinAC (PAC), that can be used to analyze small scale functional protein motions as well as interactions with ligands directly in the frequency domain. PAC was inspired by a frequency domain analysis technique that is widely used in electronic circuit design, and can be applied to both coarse-grained and all-atom models. It can be considered as a generalization of previously proposed static perturbation-response methods, where the frequency of the perturbation becomes the key. We discuss the precise relationship of PAC to static perturbation-response schemes. We show that the frequency of the perturbation may be an important factor in protein dynamics. Perturbations at different frequencies may result in completely different response behavior while magnitude and direction are kept constant. Furthermore, we introduce several novel frequency dependent metrics that can be computed via PAC in order to characterize response behavior. We present results for the ferric binding protein that demonstrate the potential utility of the proposed techniques.
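The paper's own formulation is not given in the abstract; the sketch below only illustrates the general idea of a frequency-dependent perturbation response for a coarse-grained bead-spring network, assuming overdamped dynamics. The network, friction constant, and force pattern are invented for illustration.

```python
import numpy as np

# Overdamped bead-spring analogy: gamma * dx/dt = -K x + f(t).
# For a harmonic drive f(t) = f0 * exp(i*w*t), the steady-state response is
# x(w) = (K + i*w*gamma*I)^{-1} f0, so the response depends on drive frequency.

def frequency_response(K, f0, omegas, gamma=1.0):
    n = K.shape[0]
    responses = []
    for w in omegas:
        A = K + 1j * w * gamma * np.eye(n)
        responses.append(np.linalg.solve(A, f0))
    return np.array(responses)                 # shape (n_freq, n_dof)

# Toy 1D chain of 4 beads with unit springs; perturb the first bead.
K = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])
f0 = np.array([1.0, 0.0, 0.0, 0.0])
omegas = np.logspace(-2, 2, 5)
resp = frequency_response(K, f0, omegas)
print(np.abs(resp))   # response magnitude falls off as the drive frequency rises
```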
Statistical Learning of Origin-Specific Statically Optimal Individualized Treatment Rules
van der Laan, Mark J.; Petersen, Maya L.
2008-01-01
Consider a longitudinal observational or controlled study in which one collects chronological data over time on a random sample of subjects. The time-dependent process one observes on each subject contains time-dependent covariates, time-dependent treatment actions, and an outcome process or single final outcome of interest. A statically optimal individualized treatment rule (as introduced in van der Laan et al. (2005), Petersen et al. (2007)) is a treatment rule which at any point in time conditions on a user-supplied subset of the past, computes the future static treatment regimen that maximizes a (conditional) mean future outcome of interest, and applies the first treatment action of the latter regimen. In particular, Petersen et al. (2007) clarified that, in order to be statically optimal, an individualized treatment rule should not depend on the observed treatment mechanism. Petersen et al. (2007) further developed estimators of statically optimal individualized treatment rules based on a past capturing all confounding of past treatment history on outcome. In practice, however, one typically wishes to find individualized treatment rules responding to a user-supplied subset of the complete observed history, which may not be sufficient to capture all confounding. The current article provides an important advance on Petersen et al. (2007) by developing locally efficient double robust estimators of statically optimal individualized treatment rules responding to such a user-supplied subset of the past. However, failure to capture all confounding comes at a price; the static optimality of the resulting rules becomes origin-specific. We explain origin-specific static optimality, and discuss the practical importance of the proposed methodology. We further present the results of a data analysis in which we estimate a statically optimal rule for switching antiretroviral therapy among patients infected with resistant HIV virus. PMID:19122792
Design and Implementation of an MC68020-Based Educational Computer Board
1989-12-01
device and the other for a Macintosh personal computer. A stored program can be installed in 8K bytes of Programmable Read Only Memory (PROM) to initialize... MHz. It includes four Static Random Access Memory (SRAM) chips which provide a storage of 32K bytes. Two Programmable Array Logic (PAL) chips...
The relative effectiveness of computer-based and traditional resources for education in anatomy.
Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce
2013-01-01
There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional resources. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25-item short-answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, used as a covariate in the analysis. Data were analyzed with repeated measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%; effect sizes 1.19 and 1.29, respectively). There was no difference between the KV and VR groups. There was no difference between the groups on the functional questions (Model 28%; KV 23%; VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views. © 2013 American Association of Anatomists.
Efficient shortest-path-tree computation in network routing based on pulse-coupled neural networks.
Qu, Hong; Yi, Zhang; Yang, Simon X
2013-06-01
Shortest path tree (SPT) computation is a critical issue for routers using link-state routing protocols, such as the most commonly used open shortest path first and intermediate system to intermediate system. Each router needs to recompute a new SPT rooted from itself whenever a change happens in the link state. Most commercial routers do this computation by deleting the current SPT and building a new one using static algorithms such as the Dijkstra algorithm at the beginning. Such recomputation of an entire SPT is inefficient, which may consume a considerable amount of CPU time and result in a time delay in the network. Some dynamic updating methods using the information in the updated SPT have been proposed in recent years. However, there are still many limitations in those dynamic algorithms. In this paper, a new modified model of pulse-coupled neural networks (M-PCNNs) is proposed for the SPT computation. It is rigorously proved that the proposed model is capable of solving some optimization problems, such as the SPT. A static algorithm is proposed based on the M-PCNNs to compute the SPT efficiently for large-scale problems. In addition, a dynamic algorithm that makes use of the structure of the previously computed SPT is proposed, which significantly improves the efficiency of the algorithm. Simulation results demonstrate the effective and efficient performance of the proposed approach.
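For reference, the static baseline the abstract contrasts against, a full Dijkstra recomputation of the SPT rooted at the router, looks roughly like the sketch below (the topology and link costs are made up; the M-PCNN approach itself is not shown).

```python
import heapq

def shortest_path_tree(graph, root):
    """Dijkstra from `root`; the parent pointers encode the SPT.

    `graph` maps node -> list of (neighbour, link_cost). This is the full,
    from-scratch recomputation that dynamic SPT updates try to avoid.
    """
    dist = {root: 0}
    parent = {root: None}
    heap = [(0, root)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return parent, dist

# Small link-state topology; router A rebuilds its SPT after a link-state change.
topo = {"A": [("B", 1), ("C", 4)],
        "B": [("A", 1), ("C", 2), ("D", 5)],
        "C": [("A", 4), ("B", 2), ("D", 1)],
        "D": [("B", 5), ("C", 1)]}
parent, dist = shortest_path_tree(topo, "A")
print(parent, dist)
```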
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Girls Save the World through Computer Science
ERIC Educational Resources Information Center
Murakami, Christine
2011-01-01
It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…
ERIC Educational Resources Information Center
Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2015-01-01
The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…
Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study
ERIC Educational Resources Information Center
Herling, Lourdes
2011-01-01
The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…
ERIC Educational Resources Information Center
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-01-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…
Non-Determinism: An Abstract Concept in Computer Science Studies
ERIC Educational Resources Information Center
Armoni, Michal; Gal-Ezer, Judith
2007-01-01
Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…
An Investigation of Primary School Science Teachers' Use of Computer Applications
ERIC Educational Resources Information Center
Ocak, Mehmet Akif; Akdemir, Omur
2008-01-01
This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computers, their perceptions about integration of computer applications, and other factors that contributed to changes in their computer literacy are…
Methodical Approaches to Teaching of Computer Modeling in Computer Science Course
ERIC Educational Resources Information Center
Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina
2015-01-01
The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general educational and worldview functions of computer science call for additional research on the…
The Influence of Visual and Spatial Reasoning in Interpreting Simulated 3D Worlds.
ERIC Educational Resources Information Center
Lowrie, Tom
2002-01-01
Explores ways in which 6-year-old children make sense of screen-based images on the computer. Uses both static and relatively dynamic software programs in the investigation. Suggests that young children should be exposed to activities that establish explicit links between 2D and 3D objects away from the computer before attempting difficult links…
Climate Modeling Computing Needs Assessment
NASA Astrophysics Data System (ADS)
Petraska, K. E.; McCabe, J. D.
2011-12-01
This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.
Static Memory Deduplication for Performance Optimization in Cloud Computing.
Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan
2017-04-27
In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible.
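The SMD implementation itself is not given here; the sketch below only illustrates the underlying idea of content-based page sharing restricted to a read-only code segment, performed offline, with hypothetical data.

```python
import hashlib
from collections import defaultdict

PAGE_SIZE = 4096

def dedup_pages(segment: bytes):
    """Group identical pages of a (read-only) code segment by content hash.

    Offline analogue of page sharing: pages whose hashes collide are
    byte-compared and mapped to a single canonical copy.
    """
    pages = [segment[i:i + PAGE_SIZE] for i in range(0, len(segment), PAGE_SIZE)]
    buckets = defaultdict(list)
    for idx, page in enumerate(pages):
        buckets[hashlib.sha1(page).hexdigest()].append(idx)
    mapping = {}                                    # page index -> canonical page index
    for indices in buckets.values():
        canonical = indices[0]
        for idx in indices:
            if pages[idx] == pages[canonical]:      # guard against hash collisions
                mapping[idx] = canonical
    saved = (len(pages) - len(set(mapping.values()))) * PAGE_SIZE
    return mapping, saved

# Hypothetical segment: three identical pages plus one distinct page.
segment = b"\x90" * PAGE_SIZE * 3 + bytes(range(256)) * 16
mapping, saved_bytes = dedup_pages(segment)
print(saved_bytes)        # bytes reclaimable by sharing duplicate pages
```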
NASA Astrophysics Data System (ADS)
Erhard, Jannis; Bleiziffer, Patrick; Görling, Andreas
2016-09-01
A power series approximation for the correlation kernel of time-dependent density-functional theory is presented. Using this approximation in the adiabatic-connection fluctuation-dissipation (ACFD) theorem leads to a new family of Kohn-Sham methods. The new methods yield reaction energies and barriers of unprecedented accuracy and enable a treatment of static (strong) correlation with an accuracy of high-level multireference configuration interaction methods but are single-reference methods allowing for a black-box-like handling of static correlation. The new methods exhibit a better scaling of the computational effort with the system size than rivaling wave-function-based electronic structure methods. Moreover, the new methods do not suffer from the problem of singularities in response functions plaguing previous ACFD methods and therefore are applicable to any type of electronic system.
Ontology for customer centric digital services and analytics
NASA Astrophysics Data System (ADS)
Keat, Ng Wai; Shahrir, Mohammad Shazri
2017-11-01
In computer science research, ontologies are commonly utilised to create a unified abstraction across many rich and different fields. In this paper, we apply the concept to the customer-centric domain of digital services analytics and present an analytics solution ontology. The essence is based on a traditional Entity Relationship Diagram (ERD), which was then abstracted to cover wider areas of customer-centric digital services. The ontology we developed covers both static aspects (customer identifiers) and dynamic aspects (customers' temporal interactions). The structure of the customer-scape is modeled with classes that represent different types of customer touch points, ranging from purely digital ones to digital stamps that represent physical analogies. The dynamic aspects of customer-centric digital services are modeled with a set of classes whose importance is represented in different associations involving establishment and termination of the target interaction. The realized ontology can be used in the development of frameworks for customer-centric applications, and for the specification of a common data format used by cooperating digital service applications.
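A hypothetical rendering of the static/dynamic split described above is sketched below; the class and attribute names are invented for illustration and are not taken from the paper's ontology.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class TouchPoint:
    kind: str                 # e.g. "mobile_app", "web", "digital_stamp"
    identifier: str

@dataclass
class Interaction:
    touch_point: TouchPoint
    established: datetime
    terminated: Optional[datetime] = None   # open-ended until the interaction ends

@dataclass
class Customer:
    customer_id: str                                    # static aspect: stable identifier
    touch_points: List[TouchPoint] = field(default_factory=list)
    interactions: List[Interaction] = field(default_factory=list)  # dynamic aspect

c = Customer("cust-001")
tp = TouchPoint("mobile_app", "device-42")
c.touch_points.append(tp)
c.interactions.append(Interaction(tp, established=datetime(2017, 11, 1, 9, 0)))
```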
Swelling, Structure, and Phase Stability of Soft, Compressible Microgels
NASA Astrophysics Data System (ADS)
Denton, Alan R.; Urich, Matthew
Microgels are soft colloidal particles that swell when dispersed in a solvent. The equilibrium particle size is governed by a delicate balance of osmotic pressures, which can be tuned by varying single-particle properties and externally controlled conditions, such as temperature, pH, ionic strength, and concentration. Because of their tunable size and ability to encapsulate dye or drug molecules, microgels have practical relevance for biosensing, drug delivery, carbon capture, and filtration. Using Monte Carlo simulation, we model suspensions of microgels that interact via Hertzian elastic interparticle forces and can expand or contract via trial size changes governed by the Flory-Rehner free energy of cross-linked polymer gels. We analyze the influence of particle compressibility and size fluctuations on bulk structural and thermal properties by computing swelling ratios, radial distribution functions, static structure factors, osmotic pressures, and freezing densities. With increasing density, microgels progressively deswell and their intrinsic polydispersity broadens, while compressibility acts to forestall crystallization. This work was supported by the National Science Foundation under Grant No. DMR- 1106331.
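A minimal sketch of the Hertzian Monte Carlo ingredient described above is given below, assuming the standard Hertzian pair potential U(r) = eps * (1 - r/sigma)^(5/2) for r < sigma; the Flory-Rehner swelling moves are omitted, and all parameter values are invented for illustration.

```python
import numpy as np

def hertzian_energy(positions, sigma=1.0, eps=100.0, box=6.0):
    """Total pairwise Hertzian energy with minimum-image periodic boundaries."""
    E, n = 0.0, len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            d = positions[i] - positions[j]
            d -= box * np.round(d / box)
            r = np.linalg.norm(d)
            if r < sigma:
                E += eps * (1.0 - r / sigma) ** 2.5
    return E

def metropolis_sweep(positions, kT=1.0, step=0.1, box=6.0, rng=None):
    """One Metropolis sweep of single-particle displacement moves.

    Recomputing the full energy per move is O(N^2) and only acceptable for a sketch.
    """
    rng = rng if rng is not None else np.random.default_rng()
    for i in range(len(positions)):
        trial = positions.copy()
        trial[i] = (trial[i] + rng.uniform(-step, step, 3)) % box
        dE = hertzian_energy(trial, box=box) - hertzian_energy(positions, box=box)
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            positions = trial
    return positions

rng = np.random.default_rng(1)
pos = rng.uniform(0, 6.0, size=(20, 3))        # 20 soft particles in a periodic box
for _ in range(10):
    pos = metropolis_sweep(pos, rng=rng)
```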
Computer-aided design and computer science technology
NASA Technical Reports Server (NTRS)
Fulton, R. E.; Voigt, S. J.
1976-01-01
A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.
NASA Astrophysics Data System (ADS)
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-07-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.
ICASE Computer Science Program
NASA Technical Reports Server (NTRS)
1985-01-01
The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.
How static media is understood and used by high school science teachers
NASA Astrophysics Data System (ADS)
Hirata, Miguel
The purpose of the present study is to explore the role of static media in textbooks, as defined by Mayer (2001) in the form of printed images and text, and how these media are viewed and used by high school science teachers. Textbooks appeared in the United States in the late 1800s, and since then pictorial aids have been used extensively in them to support the teacher's work in the classroom (Giordano, 2003). According to Woodward, Elliott, and Nagel (1988/2013) the research on textbooks prior to the 1970s does not present relevant work related to the curricular role and the quality and instructional design of textbooks. Since then there has been abundant research, especially on the use of visual images in textbooks, which has been approached from: (a) the text/image ratio (Evans, Watson, & Willows, 1987; Levin & Mayer, 1993; Mayer, 1993; Woodward, 1993), and (b) the instructional effectiveness of images (Woodward, 1993). The theoretical framework for this study comes from multimedia learning (Mayer, 2001), information design (Pettersson, 2002), and visual literacy (Moore & Dwyer, 1994). Data were collected through in-depth interviews of three high school science teachers and the graphic analyses of three textbooks used by the interviewed teachers. The interview data were compared through an analytic model developed from the literature, and the graphic analyses were performed using Mayer's multimedia learning principles (Mayer, 2001) and the Graphic Analysis Protocol (GAP) (Slough & McTigue, 2013). The conclusions of this study are: (1) pictures are especially useful for teaching science because science is a difficult subject to teach; (2) due to this difficulty, pictures are very important for making the class dynamic and avoiding student distraction; (3) static and dynamic media when used together can be more effective; (4) specific types of graphics were found in the science textbooks used by the participants, in this case naturalistic drawings, stylized drawings, scale diagrams, flow chart - cycle, flow chart - sequence, and hybrids; no photographs were found; (5) graphics can be related not only to the general text but specifically to the captions; (6) the textbooks analyzed had a balanced proportion of text and graphics; and (7) to facilitate the text-graphics relationship the spatial contiguity of both elements is key to their semantic integration.
Hypersonic and Unsteady Flow Science Issues for Explosively Formed Penetrators
2006-08-01
under going real- time dynamic deformation. ACCOMPLISHMENTS/RESULTS • Completed initial assessment of flow chemistry • Completed initial stability... flow chemistry during rapid deformation •Cannot use static boundary conditions in CFD codes •Interfaces one approach to coupling with hydrocodes
Radiative-Transfer Modeling of Spectra of Densely Packed Particulate Media
NASA Astrophysics Data System (ADS)
Ito, G.; Mishchenko, M. I.; Glotch, T. D.
2017-12-01
Remote sensing measurements over a wide range of wavelengths from both ground- and space-based platforms have provided a wealth of data regarding the surfaces and atmospheres of various solar system bodies. With proper interpretations, important properties, such as composition and particle size, can be inferred. However, proper interpretation of such datasets can often be difficult, especially for densely packed particulate media with particle sizes on the order of the wavelength of light being used for remote sensing. Radiative transfer theory has often been applied to the study of densely packed particulate media like planetary regoliths and snow, but with difficulty, and here we continue to investigate radiative transfer modeling of spectra of densely packed media. We use the superposition T-matrix method to compute scattering properties of clusters of particles and capture the near-field effects important for dense packing. Then, the scattering parameters from the T-matrix computations are modified with the static structure factor correction, accounting for the dense packing of the clusters themselves. Using these corrected scattering parameters, reflectance (or emissivity via Kirchhoff's Law) is computed with the invariance imbedding solution of the radiative transfer equation. For this work we modeled the emissivity spectrum of the 3.3 µm particle size fraction of enstatite, representing some common mineralogical and particle size components of regoliths, in the mid-infrared wavelengths (5 - 50 µm). The modeled spectrum from the T-matrix method with static structure factor correction using moderate packing densities (filling factors of 0.1 - 0.2) produced better fits to the laboratory measurement of the corresponding spectrum than the spectrum modeled by the equivalent method without static structure factor correction. Future work will test the method of the superposition T-matrix and static structure factor correction combination for larger particle sizes and polydispersed clusters in search of the most effective modeling of spectra of densely packed particulate media.
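Schematically, the static structure factor correction reweights the single-scatterer angular pattern by S(q) at the corresponding momentum transfer before the radiative-transfer step. The sketch below assumes an isotropic single scatterer and a toy S(q) supplied by the user (in practice one might use, for example, a Percus-Yevick hard-sphere S(q) for the chosen filling factor); it is not the T-matrix/invariance-imbedding pipeline of the study.

```python
import numpy as np

def corrected_phase_function(theta_deg, p_single, wavelength, structure_factor):
    """Reweight a single-particle phase function by a static structure factor.

    p_single: phase function sampled on theta_deg (degrees).
    structure_factor: callable S(q) supplied by the caller (assumption).
    """
    theta = np.radians(theta_deg)
    q = 4.0 * np.pi / wavelength * np.sin(theta / 2.0)   # momentum transfer
    p_corr = p_single * structure_factor(q)
    # Renormalise so the corrected phase function integrates to 1 over 4*pi sr.
    norm = np.trapz(p_corr * np.sin(theta), theta) * 2.0 * np.pi
    return p_corr / norm

theta = np.linspace(0.0, 180.0, 181)
p_iso = np.full_like(theta, 1.0 / (4.0 * np.pi))         # isotropic single scatterer
S_toy = lambda q: 1.0 - 0.5 * np.exp(-(q * 1.0) ** 2)    # toy S(q) suppressing forward scattering
p = corrected_phase_function(theta, p_iso, wavelength=10.0, structure_factor=S_toy)
```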
Variation of Static-PPP Positioning Accuracy Using GPS-Single Frequency Observations (Aswan, Egypt)
NASA Astrophysics Data System (ADS)
Farah, Ashraf
2017-06-01
Precise Point Positioning (PPP) is a technique used for position computation with a high accuracy using only one GNSS receiver. It depends on highly accurate satellite position and clock data rather than broadcast ephemerides. PPP precision varies based on positioning technique (static or kinematic), observation type (single or dual frequency) and the duration of collected observations. PPP with dual-frequency receivers offers comparable accuracy to differential GPS. PPP with single-frequency receivers has many applications such as infrastructure, hydrography and precision agriculture. PPP using low-cost GPS single-frequency receivers is an area of great interest for millions of users in developing countries such as Egypt. This research presents a study of the variability of single-frequency static GPS-PPP precision based on different observation durations.
Static analysis of the hull plate using the finite element method
NASA Astrophysics Data System (ADS)
Ion, A.
2015-11-01
This paper aims to present the static analysis for two levels of a container ship's construction as follows: the first level is at the girder/hull plate and the second level addresses the entire strength hull of the vessel. This article will describe the work for the static analysis of a hull plate. We shall use the software package ANSYS Mechanical 14.5. The program is run on a computer with four Intel Xeon X5260 processors at 3.33 GHz and 32 GB of installed memory. In terms of software, the shared memory parallel version of ANSYS refers to running ANSYS across multiple cores on an SMP system. The distributed memory parallel version of ANSYS (Distributed ANSYS) refers to running ANSYS across multiple processors on SMP systems or DMP systems.
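The hull-plate model itself is far more complex, but the static workflow that ANSYS automates reduces to assembling and solving K u = f with the appropriate boundary conditions. A minimal one-dimensional example (an axially loaded bar, with invented dimensions and load) is sketched below.

```python
import numpy as np

def solve_axial_bar(n_elems=8, length=2.0, E=210e9, A=1e-4, tip_load=1e4):
    """Static FE solution of a clamped bar under an axial tip load."""
    n_nodes = n_elems + 1
    L_e = length / n_elems
    k_e = E * A / L_e * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness
    K = np.zeros((n_nodes, n_nodes))
    for e in range(n_elems):
        K[e:e + 2, e:e + 2] += k_e                             # global assembly
    f = np.zeros(n_nodes)
    f[-1] = tip_load                                           # load at the free end
    # Clamp node 0 (u = 0) and solve the reduced system.
    u = np.zeros(n_nodes)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
    return u

print(solve_axial_bar())   # analytic tip displacement = P*L/(E*A) ≈ 9.52e-4 m
```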
a Virtual Trip to the Schwarzschild-De Sitter Black Hole
NASA Astrophysics Data System (ADS)
Bakala, Pavel; Hledík, Stanislav; Stuchlík, Zdenĕk; Truparová, Kamila; Čermák, Petr
2008-09-01
We developed a realistic, fully general relativistic computer code for simulation of optical projection in a strong, spherically symmetric gravitational field. Standard theoretical analysis of optical projection for an observer in the vicinity of a Schwarzschild black hole is extended to black hole spacetimes with a repulsive cosmological constant, i.e., Schwarzschild-de Sitter (SdS) spacetimes. The influence of the cosmological constant is investigated for static observers and observers radially free-falling from the static radius. The simulation includes effects of gravitational lensing, multiple images, Doppler and gravitational frequency shift, as well as the amplification of intensity. The code generates images of the static observer's sky and movie simulations for radially free-falling observers. Techniques of parallel programming are applied to achieve high performance and fast runs of the simulation code.
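As a much-reduced illustration of the ray tracing involved, the sketch below integrates the photon orbit equation d^2u/dphi^2 = 3*M*u^2 - u (with u = 1/r, geometrised units) for a single light ray; it assumes the cosmological constant enters only through the effective impact parameter of null rays, and none of the frequency-shift or intensity effects are modelled.

```python
def rhs(u, du, M):
    """Right-hand side of the photon orbit equation as a first-order system."""
    return du, 3.0 * M * u**2 - u

def trace_photon(M=1.0, b=10.0, dphi=1e-3, n_steps=20000):
    """RK4 integration of a light ray arriving from far away with impact parameter b."""
    u, du = 1e-6, 1.0 / b
    radii = []
    for _ in range(n_steps):
        k1 = rhs(u, du, M)
        k2 = rhs(u + 0.5 * dphi * k1[0], du + 0.5 * dphi * k1[1], M)
        k3 = rhs(u + 0.5 * dphi * k2[0], du + 0.5 * dphi * k2[1], M)
        k4 = rhs(u + dphi * k3[0], du + dphi * k3[1], M)
        u  += dphi / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        du += dphi / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        if u <= 0.0 or u >= 0.5 / M:   # escaped to infinity or crossed the horizon
            break
        radii.append(1.0 / u)
    return radii

r_along_ray = trace_photon()           # radial coordinate sampled along the deflected ray
```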
Static sign language recognition using 1D descriptors and neural networks
NASA Astrophysics Data System (ADS)
Solís, José F.; Toxqui, Carina; Padilla, Alfonso; Santiago, César
2012-10-01
A framework for static sign language recognition using descriptors which represent 2D images as 1D data and artificial neural networks is presented in this work. The 1D descriptors were computed by two methods: the first consists of a correlation rotational operator, and the second is based on contour analysis of hand shape. One of the main problems in sign language recognition is segmentation; most papers report using a special color for gloves or background for hand shape analysis. In order to avoid the use of gloves or special clothing, a thermal imaging camera was used to capture images. Static signs for the digits 1 to 9 of American Sign Language were used; a multilayer perceptron reached 100% recognition with cross-validation.
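One common way to turn a 2D hand contour into a 1D descriptor of the kind described, a centroid-distance signature, is sketched below; the correlation rotational operator and thermal-image segmentation mentioned in the abstract are not reproduced, and the contour here is synthetic.

```python
import numpy as np

def centroid_distance_signature(contour, n_samples=64):
    """1D descriptor from a 2D contour: distance to the centroid,
    resampled to a fixed length and normalised for scale.

    `contour` is an (N, 2) array of boundary points (e.g. from a segmentation step).
    """
    contour = np.asarray(contour, dtype=float)
    centroid = contour.mean(axis=0)
    d = np.linalg.norm(contour - centroid, axis=1)
    idx = np.linspace(0, len(d) - 1, n_samples)
    sig = np.interp(idx, np.arange(len(d)), d)
    return sig / (sig.max() + 1e-12)            # scale invariance

# Toy contour: an ellipse standing in for a segmented hand silhouette.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
contour = np.c_[120 * np.cos(t), 80 * np.sin(t)]
x = centroid_distance_signature(contour)        # fixed-length input vector
```

The resulting fixed-length vector would then serve as the input layer of a multilayer perceptron classifier.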
Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State
ERIC Educational Resources Information Center
Lewis, Colleen Marie
2012-01-01
To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…
Ferrecchia, Christie E; Jensen, Kelly; Van Andel, Roger
2014-03-01
The relationship among ammonia levels, cage-changing frequency, and bedding types is an important and potentially controversial topic in the laboratory animal science community. Some bedding options may not provide sufficient urine absorption and bacterial regulation to minimize ammonia production during the interval between cage changes. High intracage ammonia levels can cause subclinical degeneration and inflammation of nasal passages, rhinitis and olfactory epithelial necrosis in exposed mice. Here we sought to compare the effects of 4 commonly used bedding substrates (1/4-in. irradiated corncob, reclaimed wood pulp, aspen wood chips, and recycled newspaper) on ammonia generation when housing female C57BL/6 mice in static and individually ventilated caging. Intracage ammonia levels were measured daily for 1 wk (static cage experiment) or 2 wk (IVC experiment). The results of this study suggest that the corncob, aspen wood chip, and recycled newspaper beddings that we tested are suitable for once-weekly cage changing for static cages and for changing every 2 wk for IVC. However, ammonia levels were not controlled appropriately in cages containing reclaimed wood pulp bedding, and pathologic changes occurred within 1 wk in the nares of mice housed on this bedding in static cages.
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing numbermore » of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities, will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.« less
Game On, Science - How Video Game Technology May Help Biologists Tackle Visualization Challenges
Da Silva, Franck; Empereur-mot, Charly; Chavent, Matthieu; Baaden, Marc
2013-01-01
The video games industry develops ever more advanced technologies to improve rendering, image quality, ergonomics and user experience of their creations providing very simple to use tools to design new games. In the molecular sciences, only a small number of experts with specialized know-how are able to design interactive visualization applications, typically static computer programs that cannot easily be modified. Are there lessons to be learned from video games? Could their technology help us explore new molecular graphics ideas and render graphics developments accessible to non-specialists? This approach points to an extension of open computer programs, not only providing access to the source code, but also delivering an easily modifiable and extensible scientific research tool. In this work, we will explore these questions using the Unity3D game engine to develop and prototype a biological network and molecular visualization application for subsequent use in research or education. We have compared several routines to represent spheres and links between them, using either built-in Unity3D features or our own implementation. These developments resulted in a stand-alone viewer capable of displaying molecular structures, surfaces, animated electrostatic field lines and biological networks with powerful, artistic and illustrative rendering methods. We consider this work as a proof of principle demonstrating that the functionalities of classical viewers and more advanced novel features could be implemented in substantially less time and with less development effort. Our prototype is easily modifiable and extensible and may serve others as starting point and platform for their developments. A webserver example, standalone versions for MacOS X, Linux and Windows, source code, screen shots, videos and documentation are available at the address: http://unitymol.sourceforge.net/. PMID:23483961
Game on, science - how video game technology may help biologists tackle visualization challenges.
Lv, Zhihan; Tek, Alex; Da Silva, Franck; Empereur-mot, Charly; Chavent, Matthieu; Baaden, Marc
2013-01-01
The video games industry develops ever more advanced technologies to improve rendering, image quality, ergonomics and user experience of their creations providing very simple to use tools to design new games. In the molecular sciences, only a small number of experts with specialized know-how are able to design interactive visualization applications, typically static computer programs that cannot easily be modified. Are there lessons to be learned from video games? Could their technology help us explore new molecular graphics ideas and render graphics developments accessible to non-specialists? This approach points to an extension of open computer programs, not only providing access to the source code, but also delivering an easily modifiable and extensible scientific research tool. In this work, we will explore these questions using the Unity3D game engine to develop and prototype a biological network and molecular visualization application for subsequent use in research or education. We have compared several routines to represent spheres and links between them, using either built-in Unity3D features or our own implementation. These developments resulted in a stand-alone viewer capable of displaying molecular structures, surfaces, animated electrostatic field lines and biological networks with powerful, artistic and illustrative rendering methods. We consider this work as a proof of principle demonstrating that the functionalities of classical viewers and more advanced novel features could be implemented in substantially less time and with less development effort. Our prototype is easily modifiable and extensible and may serve others as starting point and platform for their developments. A webserver example, standalone versions for MacOS X, Linux and Windows, source code, screen shots, videos and documentation are available at the address: http://unitymol.sourceforge.net/.
Regan, R. Steven; Markstrom, Steven L.; Hay, Lauren E.; Viger, Roland J.; Norton, Parker A.; Driscoll, Jessica M.; LaFontaine, Jacob H.
2018-01-08
This report documents several components of the U.S. Geological Survey National Hydrologic Model of the conterminous United States for use with the Precipitation-Runoff Modeling System (PRMS). It provides descriptions of the (1) National Hydrologic Model, (2) Geospatial Fabric for National Hydrologic Modeling, (3) PRMS hydrologic simulation code, (4) parameters and estimation methods used to compute spatially and temporally distributed default values as required by PRMS, (5) National Hydrologic Model Parameter Database, and (6) model extraction tool named Bandit. The National Hydrologic Model Parameter Database contains values for all PRMS parameters used in the National Hydrologic Model. The methods and national datasets used to estimate all the PRMS parameters are described. Some parameter values are derived from characteristics of topography, land cover, soils, geology, and hydrography using traditional Geographic Information System methods. Other parameters are set to long-established default values or to computed initial values. Additionally, methods (statistical, sensitivity, calibration, and algebraic) were developed to compute parameter values on the basis of a variety of nationally-consistent datasets. Values in the National Hydrologic Model Parameter Database can periodically be updated on the basis of new parameter estimation methods and as additional national datasets become available. A companion ScienceBase resource provides a set of static parameter values as well as images of spatially-distributed parameters associated with PRMS states and fluxes for each Hydrologic Response Unit across the conterminous United States.
Thimmaiah, Tim; Voje, William E; Carothers, James M
2015-01-01
With progress toward inexpensive, large-scale DNA assembly, the demand for simulation tools that allow the rapid construction of synthetic biological devices with predictable behaviors continues to increase. By combining engineered transcript components, such as ribosome binding sites, transcriptional terminators, ligand-binding aptamers, catalytic ribozymes, and aptamer-controlled ribozymes (aptazymes), gene expression in bacteria can be fine-tuned, with many corollaries and applications in yeast and mammalian cells. The successful design of genetic constructs that implement these kinds of RNA-based control mechanisms requires modeling and analyzing kinetically determined co-transcriptional folding pathways. Transcript design methods using stochastic kinetic folding simulations to search spacer sequence libraries for motifs enabling the assembly of RNA component parts into static ribozyme- and dynamic aptazyme-regulated expression devices with quantitatively predictable functions (rREDs and aREDs, respectively) have been described (Carothers et al., Science 334:1716-1719, 2011). Here, we provide a detailed practical procedure for computational transcript design by illustrating a high throughput, multiprocessor approach for evaluating spacer sequences and generating functional rREDs. This chapter is written as a tutorial, complete with pseudo-code and step-by-step instructions for setting up a computational cluster with an Amazon, Inc. web server and performing the large numbers of kinefold-based stochastic kinetic co-transcriptional folding simulations needed to design functional rREDs and aREDs. The method described here should be broadly applicable for designing and analyzing a variety of synthetic RNA parts, devices and transcripts.
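The cluster workflow above is essentially an embarrassingly parallel screen over a spacer library. A minimal Python sketch of that screening loop is given below; run_folding_simulation is a hypothetical stand-in for a kinefold-based stochastic folding run (mocked here with a random score), and the library size and spacer length are invented.

```python
# Hypothetical sketch of a high-throughput spacer screening loop.
# "run_folding_simulation" stands in for a kinefold-based stochastic
# co-transcriptional folding run; it is mocked so the script is self-contained.
import random
from multiprocessing import Pool

BASES = "ACGU"

def random_spacer(length=8):
    """Draw one candidate spacer sequence."""
    return "".join(random.choice(BASES) for _ in range(length))

def run_folding_simulation(spacer):
    """Placeholder for a stochastic kinetic folding simulation of the transcript
    built around `spacer`; returns the fraction of trajectories reaching the
    desired ribozyme fold (mocked with a deterministic pseudo-random score)."""
    random.seed(hash(spacer) % (2**32))
    return random.random()

def evaluate(spacer):
    return spacer, run_folding_simulation(spacer)

if __name__ == "__main__":
    library = [random_spacer() for _ in range(1000)]
    with Pool() as pool:                      # one worker per CPU, as on a cluster node
        scored = pool.map(evaluate, library)
    best = sorted(scored, key=lambda x: x[1], reverse=True)[:10]
    for seq, score in best:
        print(f"{seq}\t{score:.3f}")
```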
A Cognitive Model for Problem Solving in Computer Science
ERIC Educational Resources Information Center
Parham, Jennifer R.
2009-01-01
According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Defining Computational Thinking for Mathematics and Science Classrooms
ERIC Educational Resources Information Center
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-01-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…
Static structure of a pointed charged drop
NASA Astrophysics Data System (ADS)
Fernandez de La Mora, Juan
2017-11-01
The static equilibrium structure of an equipotential drop with two symmetric Taylor cones is computed by assigning a charge distribution along the z axis q (z) = ∑Bn (L2 -z2)n + 1 / 2 . Taylor's local equilibrium at the poles z = L , - L fixes two of the Bn coefficients as a function of the other, determined by minimizing stress imbalance. Just two optimally chosen terms in the Bn expansion yield imperceptible errors. Prior work has argued that an exploding drop initially carrying Rayleigh's charge qR is quasi static. Paradoxically, quasi-static predictions on the size of the progeny drops emitted during a Coulombic explosion disagree with observations. The static drop structure found here also models poorly a Coulomb explosion having an equatorial over polar length ratio (0.42) and the a drop charge exceeding those observed (0.28-0.36 and qR / 2). Our explanation for this paradox is that, while the duration tc of a Coulomb explosion is much larger than the charge relaxation time, the dynamic time scale for drop elongation is typically far longer than tc. Therefore, the pressure distribution within the exploding drop is not uniform. A similar analysis for a drop in an external field fits well the experimental shape.
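As a rough numerical illustration of the charge ansatz above, the sketch below evaluates q(z) for a two-term expansion and integrates it over the drop axis; the coefficients B_0, B_1 and the half-length L are made-up placeholders, and this is not the stress-minimization carried out in the paper.

```python
# Minimal numerical sketch of the axial charge ansatz q(z) = sum_n B_n (L^2 - z^2)^(n+1/2).
# Coefficients are illustrative placeholders, not the paper's optimized values.
import numpy as np
from scipy.integrate import quad

L = 1.0                      # half-length of the drop (poles at z = +/- L)
B = {0: 1.0, 1: -0.3}        # two-term expansion; hypothetical coefficients

def q(z):
    return sum(Bn * (L**2 - z**2) ** (n + 0.5) for n, Bn in B.items())

total_charge, _ = quad(q, -L, L)   # total charge carried by the axial distribution
print(f"q(0) = {q(0.0):.4f}, total charge = {total_charge:.4f}")
```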
Sela, Shai; van Es, Harold M; Moebius-Clune, Bianca N; Marjerison, Rebecca; Moebius-Clune, Daniel; Schindelbeck, Robert; Severson, Keith; Young, Eric
2017-03-01
Large temporal and spatial variability in soil nitrogen (N) availability leads many farmers across the United States to over-apply N fertilizers in maize (Zea mays L.) production environments, often resulting in large environmental N losses. Static Stanford-type N recommendation tools are typically promoted in the United States, but new dynamic model-based decision tools allow for highly adaptive N recommendations that account for specific production environments and conditions. This study compares the Corn N Calculator (CNC), a static N recommendation tool for New York, to Adapt-N, a dynamic simulation tool that combines soil, crop, and management information with real-time weather data to estimate optimum N application rates for maize. The efficiency of the two tools in predicting the Economically Optimum N Rate (EONR) is compared using field data from 14 multiple N-rate trials conducted in New York during the years 2011 through 2015. The CNC tool was used with both realistic grower-estimated potential yields and those extracted from the CNC default database, which were found to be unrealistically low when compared with field data. By accounting for weather and site-specific conditions, the Adapt-N tool was found to increase farmer profits and significantly improve the prediction of the EONR (RMSE = 34 kg ha⁻¹). Furthermore, using a dynamic instead of a static approach led to reduced N application rates, which in turn resulted in substantially lower simulated environmental N losses. This study shows that better N management through a dynamic decision tool such as Adapt-N can help reduce environmental impacts while sustaining farm economic viability. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
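The comparison metric quoted above is a root-mean-square error of recommended rates against the field-measured EONR. A minimal sketch with invented rate values:

```python
# Illustrative scoring of recommendation tools against the economically optimum N rate
# (EONR): root-mean-square error of recommended vs. measured rates. Values are invented.
import numpy as np

eonr       = np.array([120.,  95., 150., 110.])   # kg N/ha from multi-rate trials (hypothetical)
adapt_n    = np.array([128., 100., 140., 118.])   # dynamic-tool recommendations (hypothetical)
static_cnc = np.array([168., 160., 170., 165.])   # static-tool recommendations (hypothetical)

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

print("dynamic tool RMSE:", rmse(adapt_n, eonr), "kg/ha")
print("static tool RMSE: ", rmse(static_cnc, eonr), "kg/ha")
```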
Fate and Effect of Antibiotics in Beef and Dairy Manure during Static and Turned Composting.
Ray, Partha; Chen, Chaoqi; Knowlton, Katharine F; Pruden, Amy; Xia, Kang
2017-01-01
Manure composting has general benefits for production of soil amendment, but the effects of composting on antibiotic persistence and effects of antibiotics on the composting process are not well-characterized, especially for antibiotics commonly used in dairy cattle. This study provides a comprehensive, head-to-head, replicated comparison of the effect of static and turned composting on typical antibiotics used in beef and dairy cattle in their actual excreted form and corresponding influence on composting efficacy. Manure from steers (with or without chlortetracycline, sulfamethazine, and tylosin feeding) and dairy cows (with or without pirlimycin and cephapirin administration) was composted at small scale (wet mass: 20-22 kg) in triplicate under static and turned conditions adapted to represent US Food and Drug Administration guidelines. Thermophilic temperature (>55°C) was attained and maintained for 3 d in all composts, with no measurable effect of compost method on the pattern, rate, or extent of disappearance of the antibiotics examined, except tylosin. Disappearance of all antibiotics, except pirlimycin, followed bi-phasic first-order kinetics. However, individual antibiotics displayed different fate patterns in response to the treatments. Reduction in concentration of chlortetracycline (71-84%) and tetracycline (66-72%) was substantial, while near-complete removal of sulfamethazine (97-98%) and pirlimycin (100%) was achieved. Tylosin removal during composting was relatively poor. Both static and turned composting were generally effective for reducing most beef and dairy antibiotic residuals excreted in manure, with no apparent negative impact of antibiotics on the composting process, but with some antibiotics apparently more recalcitrant than others. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
NASA Center for Computational Sciences: History and Resources
NASA Technical Reports Server (NTRS)
2000-01-01
The Nasa Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.
Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.
Computers in Science: Thinking Outside the Discipline.
ERIC Educational Resources Information Center
Hamilton, Todd M.
2003-01-01
Describes the Computers in Science course which integrates computer-related techniques into the science disciplines of chemistry, physics, biology, and Earth science. Uses a team teaching approach and teaches students how to solve chemistry problems with spreadsheets, identify minerals with X-rays, and chemical and force analysis. (Contains 14…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting SUMMARY: As a result of the impact of the recent government shutdown, the... Committee for Computer and Information Science and Engineering meeting. The public notice for this committee...
Exemplary Science Teachers' Use of Technology
ERIC Educational Resources Information Center
Hakverdi-Can, Meral; Dana, Thomas M.
2012-01-01
The purpose of this study is to examine exemplary science teachers' level of computer use, their knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, how often they required their students to use those applications in or for their science class…
2017-12-27
were determined, and the effects of changes in loading rate and solution on this susceptibility were determined. Technical Approach: The technical ... approach followed in this completed work has been to conduct quasi-static fracture and fatigue experiments on 5XXX commercial aluminum alloys of interest ... "Metallic Materials Studied by Correlative Tomography", in 38th Riso International Symposium on Materials Science - IOP Conf. Series: Materials Science
1998-09-30
Dr. Jan Rogers, project scientist for the Electrostatic Levitator (ESL) at NASA's Marshall Space Flight Center (MSFC). The ESL uses static electricity to suspend an object (about 2-3 mm in diameter) inside a vacuum chamber while a laser heats the sample until it melts. This lets scientists record a wide range of physical properties without the sample contacting the container or any instruments, conditions that would alter the readings. The Electrostatic Levitator is one of several tools used in NASA's microgravity materials sciences program.
ERIC Educational Resources Information Center
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…
Task allocation in a distributed computing system
NASA Technical Reports Server (NTRS)
Seward, Walter D.
1987-01-01
A conceptual framework is examined for task allocation in distributed systems. Application and computing system parameters critical to task allocation decision processes are discussed. Task allocation techniques are addressed which focus on achieving a balance in the load distribution among the system's processors. Equalization of computing load among the processing elements is the goal. Examples of system performance are presented for specific applications. Both static and dynamic allocation of tasks are considered and system performance is evaluated using different task allocation methodologies.
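As a toy illustration of static allocation aimed at equalizing processor load, the sketch below uses a greedy longest-processing-time heuristic; the task costs and processor count are invented, and this is only one of many allocation strategies such a survey could cover.

```python
# Toy static task allocation for load balancing: a greedy longest-processing-time
# heuristic assigns each task to the currently least-loaded processor.
import heapq

def static_allocate(task_costs, n_processors):
    """Return per-processor task lists that roughly equalize total load."""
    heap = [(0.0, p, []) for p in range(n_processors)]   # (load, processor id, tasks)
    heapq.heapify(heap)
    for tid, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        load, pid, tasks = heapq.heappop(heap)            # least-loaded processor
        tasks.append(tid)
        heapq.heappush(heap, (load + cost, pid, tasks))
    return sorted(heap, key=lambda t: t[1])

for load, pid, tasks in static_allocate([5, 3, 8, 2, 7, 4], n_processors=3):
    print(f"processor {pid}: tasks {tasks}, load {load}")
```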
NASA Astrophysics Data System (ADS)
Ren, Peng; Guo, Zitao
The quasi-static and dynamic fracture initiation toughness of gy4 armour steel is investigated using three-point bend specimens. A modified split Hopkinson pressure bar (SHPB) apparatus with a digital image correlation (DIC) system is applied in the dynamic loading experiments. Full-field deformation measurements are obtained with DIC to elucidate the strain fields associated with the mechanical response. A series of experiments is conducted at strain rates ranging from 10^-3 s^-1 to 10^3 s^-1, and the effect of loading rate on the fracture initiation toughness is investigated. In particular, scanning electron microscope imaging is used to investigate the failure micromechanisms of the fracture surfaces. The fracture toughness of the gy4 armour steel is found to be sensitive to strain rate and is higher under dynamic loading than under quasi-static loading. This work is supported by the National Natural Science Foundation under Grant 51509115.
An Overview of NASA's Intelligent Systems Program
NASA Technical Reports Server (NTRS)
Cooke, Daniel E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA and the Computer Science Research community are poised to enter a critical era, an era in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place Computer Science in a relatively novel position. These forces impact how research is done, and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that result in greater stability in the market place. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.
A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.
ERIC Educational Resources Information Center
Deek, Fadi P.; Kimmel, Howard
2002-01-01
Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)
A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…
Making Advanced Computer Science Topics More Accessible through Interactive Technologies
ERIC Educational Resources Information Center
Shao, Kun; Maher, Peter
2012-01-01
Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…
ASCR Workshop on Quantum Computing for Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward
This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.
BIOCOMPUTATION: some history and prospects.
Cull, Paul
2013-06-01
At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A Case Study of the Introduction of Computer Science in NZ Schools
ERIC Educational Resources Information Center
Bell, Tim; Andreae, Peter; Robins, Anthony
2014-01-01
For many years computing in New Zealand schools was focused on teaching students how to use computers, and there was little opportunity for students to learn about programming and computer science as formal subjects. In this article we review a series of initiatives that occurred from 2007 to 2009 that led to programming and computer science being…
Research in Applied Mathematics, Fluid Mechanics and Computer Science
NASA Technical Reports Server (NTRS)
1999-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.
[Research activities in applied mathematics, fluid mechanics, and computer science
NASA Technical Reports Server (NTRS)
1995-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.
Activities of the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1985-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1985 through October 2, 1985 is summarized.
Combat Simulation Using Breach Computer Language
1979-09-01
simulation and weapon system analysis computer language. Two types of models were constructed: a stochastic duel and a dynamic engagement model. The ... duel model validates the BREACH approach by comparing results with mathematical solutions. The dynamic model shows the capability of the BREACH... Contents: BREACH (Background; The Language); Static Duel (Background and Methodology; Validation; Results); Tank Duel Simulation; Dynamic Assault Model
Evaluation of verification and testing tools for FORTRAN programs
NASA Technical Reports Server (NTRS)
Smith, K. A.
1980-01-01
Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and are recommended for use in testing FORTRAN programs.
Thermal stress analysis of reusable surface insulation for shuttle
NASA Technical Reports Server (NTRS)
Ojalvo, I. U.; Levy, A.; Austin, F.
1974-01-01
An iterative procedure for accurately determining tile stresses associated with static mechanical and thermally induced internal loads is presented. The necessary conditions for convergence of the method are derived. A user-oriented computer program based upon the present method of analysis was developed. The program is capable of analyzing multi-tiled panels and determining the associated stresses. Typical numerical results from this computer program are presented.
Study of magnetic resonance with parametric modulation in a potassium vapor cell
NASA Astrophysics Data System (ADS)
Zhang, Rui; Wang, Zhiguo; Peng, Xiang; Li, Wenhao; Li, Songjian; Guo, Hong; Cream Team
2017-04-01
A typical magnetic-resonance scheme employs a static bias magnetic field and an orthogonal driving magnetic field oscillating at the Larmor frequency, at which the atomic polarization precesses around the static magnetic field. We demonstrate in a potassium vapor cell the variations of the resonance condition and the spin precession dynamics resulting from the parametric modulation of the bias field, which are in good agreement with theoretical predictions from the Bloch equation. We show that a driving magnetic field whose frequency is detuned by different harmonics of the parametric modulation frequency can also lead to resonance. Also, a series of frequency sidebands centered at the driving frequency and spaced by the parametric modulation frequency can be observed in the precession of the atomic polarization. These effects could be used in different atomic magnetometry applications. This work is supported by the National Science Fund for Distinguished Young Scholars of China (Grant No. 61225003) and the National Natural Science Foundation of China (Grant Nos. 61531003 and 61571018).
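The resonance behaviour described above follows from the Bloch equation with a parametrically modulated bias field. The sketch below integrates that equation numerically for illustration only; the gyromagnetic ratio, field amplitudes, modulation parameters, and relaxation times are order-of-magnitude placeholders, not the experimental values.

```python
# Schematic integration of the Bloch equation dM/dt = gamma * M x B - relaxation,
# with a parametrically modulated bias field and an orthogonal drive near the
# Larmor frequency. All values are illustrative placeholders.
import numpy as np
from scipy.integrate import solve_ivp

gamma = 2 * np.pi * 7e9              # gyromagnetic ratio (rad s^-1 T^-1), K order of magnitude
B0    = 1e-6                         # static bias field (T)
m, f_mod = 0.2, 500.0                # parametric modulation depth and frequency (Hz)
B1 = 5e-9                            # drive amplitude (T)
f_drive = gamma * B0 / (2 * np.pi)   # drive at the unmodulated Larmor frequency
T1 = T2 = 5e-3                       # relaxation times (s)

def bloch(t, M):
    Bz = B0 * (1 + m * np.cos(2 * np.pi * f_mod * t))       # modulated bias field
    Bx = B1 * np.cos(2 * np.pi * f_drive * t)                # orthogonal drive
    B = np.array([Bx, 0.0, Bz])
    relax = np.array([M[0] / T2, M[1] / T2, (M[2] - 1.0) / T1])
    return gamma * np.cross(M, B) - relax

sol = solve_ivp(bloch, (0.0, 2e-3), [0.0, 0.0, 1.0], max_step=1e-6)
print("final transverse amplitude:", np.hypot(sol.y[0, -1], sol.y[1, -1]))
```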
A Quantitative Model for Assessing Visual Simulation Software Architecture
2011-09-01
ERIC Educational Resources Information Center
Wofford, Jennifer
2009-01-01
Computing is anticipated to have an increasingly expansive impact on the sciences overall, becoming the third, crucial component of a "golden triangle" that includes mathematics and experimental and theoretical science. However, even more so with computing than with math and science, we are not preparing our students for this new reality. It is…
Interactive Synthesis of Code Level Security Rules
2017-04-01
A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.
Approaching gender parity: Women in computer science at Afghanistan's Kabul University
NASA Astrophysics Data System (ADS)
Plane, Jandelyn
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder advancement to degree to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations while only graduating on average 25% women in undergraduate computer science programs. Representation of women in computer science in the US is 50% below the university rate, but at KU, it is 50% above the university rate. This mixed methods study of KU was conducted in the following three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth year CS students. The analysis of the data collected and its comparison to literature on university/department retention in Science, Technology, Engineering and Mathematics gender representation and on women's education in underdeveloped Islamic countries illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from countries often studied in available literature. Few Afghan students have computers in their home and few have training beyond secretarial applications before considering studying CS at university. University students in Afghanistan are selected based on placement exams and are then assigned to an area of study, and financially supported throughout their academic career, resulting in a low attrition rate from the program. Gender and STEM literature identifies parental encouragement, stereotypes and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.
Meijer, Tineke W H; de Geus-Oei, Lioe-Fee; Visser, Eric P; Oyen, Wim J G; Looijen-Salamon, Monika G; Visvikis, Dimitris; Verhagen, Ad F T M; Bussink, Johan; Vriens, Dennis
2017-05-01
Purpose: To assess whether dynamic fluorine 18 (¹⁸F) fluorodeoxyglucose (FDG) positron emission tomography (PET) has added value over static ¹⁸F-FDG PET for tumor delineation in non-small cell lung cancer (NSCLC) radiation therapy planning by using pathology volumes as the reference standard and to compare pharmacokinetic rate constants of ¹⁸F-FDG metabolism, including regional variation, between NSCLC histologic subtypes. Materials and Methods: The study was approved by the institutional review board. Patients gave written informed consent. In this prospective observational study, 1-hour dynamic ¹⁸F-FDG PET/computed tomographic examinations were performed in 35 patients (36 resectable NSCLCs) between 2009 and 2014. Static and parametric images of glucose metabolic rate were obtained to determine lesion volumes by using three delineation strategies. Pathology volume was calculated from three orthogonal dimensions (n = 32). Whole tumor and regional rate constants and blood volume fraction (VB) were computed by using compartment modeling. Results: Pathology volumes were larger than PET volumes (median difference, 8.7-25.2 cm³; Wilcoxon signed rank test, P < .001). Static fuzzy locally adaptive Bayesian (FLAB) volumes corresponded best with pathology volumes (intraclass correlation coefficient, 0.72; P < .001). Bland-Altman analyses showed the highest precision and accuracy for static FLAB volumes. Glucose metabolic rate and ¹⁸F-FDG phosphorylation rate were higher in squamous cell carcinoma (SCC) than in adenocarcinoma (AC), whereas VB was lower (Mann-Whitney U test or t test, P = .003, P = .036, and P = .019, respectively). Glucose metabolic rate, ¹⁸F-FDG phosphorylation rate, and VB were less heterogeneous in AC than in SCC (Friedman analysis of variance). Conclusion: Parametric images are not superior to static images for NSCLC delineation. FLAB-based segmentation on static ¹⁸F-FDG PET images is in best agreement with pathology volume and could be useful for NSCLC autocontouring. Differences in glycolytic rate and VB between SCC and AC are relevant for research in targeting agents and radiation therapy dose escalation. © RSNA, 2016 Online supplemental material is available for this article.
Science-Technology Coupling: The Case of Mathematical Logic and Computer Science.
ERIC Educational Resources Information Center
Wagner-Dobler, Roland
1997-01-01
In the history of science, there have often been periods of sudden rapprochements between pure science and technology-oriented branches of science. Mathematical logic as pure science and computer science as technology-oriented science have experienced such a rapprochement, which is studied in this article in a bibliometric manner. (Author)
Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna
2017-12-01
To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.
Recent Enhancements to the Development of CFD-Based Aeroelastic Reduced-Order Models
NASA Technical Reports Server (NTRS)
Silva, Walter A.
2007-01-01
Recent enhancements to the development of CFD-based unsteady aerodynamic and aeroelastic reduced-order models (ROMs) are presented. These enhancements include the simultaneous application of structural modes as CFD input, static aeroelastic analysis using a ROM, and matched-point solutions using a ROM. The simultaneous application of structural modes as CFD input enables the computation of the unsteady aerodynamic state-space matrices with a single CFD execution, independent of the number of structural modes. The responses obtained from a simultaneous excitation of the CFD-based unsteady aerodynamic system are processed using system identification techniques in order to generate an unsteady aerodynamic state-space ROM. Once the unsteady aerodynamic state-space ROM is generated, a method for computing the static aeroelastic response using this unsteady aerodynamic ROM and a state-space model of the structure is presented. Finally, a method is presented that enables the computation of matched-point solutions using a single ROM that is applicable over a range of dynamic pressures and velocities for a given Mach number. These enhancements represent a significant advancement of unsteady aerodynamic and aeroelastic ROM technology.
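Reduced to its simplest form, the identification step amounts to fitting a low-order input-output model to excitation/response data and recasting it in state-space form. The sketch below does this with an ARX least-squares fit on a toy second-order system; it is a schematic of the idea, not the aeroelastic ROM procedure itself, and all signals are synthetic.

```python
# Schematic system-identification step: fit a discrete ARX model to input/output
# samples by least squares, then write it in controllable-canonical state-space form.
# The "data" come from a toy 2nd-order system, not a CFD excitation run.
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(500)                      # excitation signal
y = np.zeros_like(u)
for k in range(2, len(u)):                        # toy "truth" system to identify
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + 0.2 * u[k-2]

# Regressors for y[k] ~ a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
a1, a2, b1, b2 = theta

# Equivalent discrete state-space (controllable canonical) realization of the fit
A = np.array([[a1, a2], [1.0, 0.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[b1, b2]])
print("identified coefficients:", np.round(theta, 3))
```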
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-04
... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting In accordance with Federal Advisory Committee Act (Pub. L. 92-463, as amended... Committee for Computer and Information Science and Engineering (1115). Date/Time: Oct 31, 2013: 12:30 p.m...
Activities of the Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1985-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1984 through March 31, 1985 is summarized.
[Research Conducted at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1997-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.
Activities of the Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 2, 1987 through March 31, 1988.
[Activities of Institute for Computer Applications in Science and Engineering (ICASE)
NASA Technical Reports Server (NTRS)
1999-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1999 through September 30, 1999.
Practical Measurement of Complexity In Dynamic Systems
2012-01-01
policies that produce highly complex behaviors, yet yield no benefit. [Jason B. Clark and David R. Jacques, Procedia Computer Science 8 (2012) 14-21, doi:10.1016/j.procs.2012.01.008]
The role of physicality in rich programming environments
NASA Astrophysics Data System (ADS)
Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin
2013-12-01
Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.
NASA Technical Reports Server (NTRS)
Brankovic, Andreja; Ryder, Robert C., Jr.; Hendricks, Robert C.; Liu, Nan-Suey; Gallagher, John R.; Shouse, Dale T.; Roquemore, W. Melvyn; Cooper, Clayton S.; Burrus, David L.; Hendricks, John A.
2002-01-01
The trapped vortex combustor (TVC) pioneered by Air Force Research Laboratories (AFRL) is under consideration as an alternative to conventional gas turbine combustors. The TVC has demonstrated excellent operational characteristics such as high combustion efficiency, low NO(x) emissions, effective flame stabilization, excellent high-altitude relight capability, and operation in the lean-burn or rich burn-quick quench-lean burn (RQL) modes of combustion. It also has excellent potential for lowering the engine combustor weight. This performance at low to moderate combustor Mach numbers has stimulated interest in its ability to operate at higher combustion Mach number, and for aerospace, this implies potentially higher flight Mach numbers. To this end, a lobed diffuser-mixer that enhances the fuel-air mixing in the TVC combustor core was designed and evaluated, with special attention paid to the potential shock system entering the combustor core. For the present investigation, the lobed diffuser-mixer combustor rig is in a full annular configuration featuring sixfold symmetry among the lobes, symmetry within each lobe, and plain parallel, symmetric incident flow. During hardware cold-flow testing, significant discrepancies were found between computed and measured values for the pitot-probe-averaged static pressure profiles at the lobe exit plane. Computational fluid dynamics (CFD) simulations were initiated to determine whether the static pressure probe was causing high local flow-field disturbances in the supersonic flow exiting the diffuser-mixer and whether shock wave impingement on the pitot probe tip, pressure ports, or surface was the cause of the discrepancies. Simulations were performed with and without the pitot probe present in the modeling. A comparison of static pressure profiles without the probe showed that static pressure was off by nearly a factor of 2 over much of the radial profile, even when taking into account potential axial displacement of the probe by up to 0.25 in. (0.64 cm). Including the pitot probe in the CFD modeling and data interpretation led to good agreement between measurement and prediction. Graphical inspection of the results showed that the shock waves impinging on the probe surface were highly nonuniform, with static pressure varying circumferentially among the pressure ports by over 10 percent in some cases. As part of the measurement methodology, such measurements should be routinely supplemented with CFD analyses that include the pitot probe as part of the flow-path geometry.
Path Not Found: Disparities in Access to Computer Science Courses in California High Schools
ERIC Educational Resources Information Center
Martin, Alexis; McAlear, Frieda; Scott, Allison
2015-01-01
"Path Not Found: Disparities in Access to Computer Science Courses in California High Schools" exposes one of the foundational causes of underrepresentation in computing: disparities in access to computer science courses in California's public high schools. This report provides new, detailed data on these disparities by student body…
ERIC Educational Resources Information Center
Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung
2010-01-01
Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements--is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…
EOS MLS Science Data Processing System: A Description of Architecture and Capabilities
NASA Technical Reports Server (NTRS)
Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.
2006-01-01
This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campione, Salvatore; Warne, Larry K.; Basilio, Lorena I.
In this paper we develop a fully-retarded, dipole approximation model to estimate the effective polarizabilities of a dimer made of dielectric resonators. They are computed from the polarizabilities of the two resonators composing the dimer. We analyze the situation of full-cubes as well as split-cubes, which have been shown to exhibit overlapping electric and magnetic resonances. We compare the effective dimer polarizabilities to ones retrieved via full-wave simulations as well as ones computed via a quasi-static, dipole approximation. We observe good agreement between the fully-retarded solution and the full-wave results, whereas the quasi-static approximation is less accurate for the problem at hand. The developed model can be used to predict the electric and magnetic resonances of a dimer under parallel or orthogonal (to the dimer axis) excitation. This is particularly helpful when interested in locating frequencies at which the dimer will emit directional radiation.
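For orientation, the non-retarded limit of such a dimer model is the classical coupled-dipole result, in which each resonator responds to the external field plus its neighbour's dipole field. The sketch below implements that quasi-static version (the simpler approximation the paper compares against), with an invented resonator size and spacing.

```python
# Quasi-static coupled-dipole sketch: effective per-resonator polarizability of a dimer,
# each dipole driven by the external field plus its neighbour's near field.
# This is the non-retarded limit only; values are illustrative.
import numpy as np

eps0 = 8.854e-12

def effective_polarizability(alpha, d, orientation="parallel"):
    """alpha: single-resonator polarizability (SI); d: center-to-center spacing (m).
    'parallel' = excitation along the dimer axis, 'orthogonal' = across it."""
    if orientation == "parallel":
        G = 2.0 / (4 * np.pi * eps0 * d**3)    # longitudinal dipole-dipole coupling
    else:
        G = -1.0 / (4 * np.pi * eps0 * d**3)   # transverse coupling
    return alpha / (1.0 - alpha * G)           # per-resonator value; dimer total is 2x

alpha_single = 4 * np.pi * eps0 * (0.5e-6) ** 3   # rough sphere-like estimate, a = 0.5 um
for o in ("parallel", "orthogonal"):
    print(o, effective_polarizability(alpha_single, d=1.5e-6, orientation=o))
```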
Characteristics of middle and upper tropospheric clouds as deduced from rawinsonde data
NASA Technical Reports Server (NTRS)
Starr, D. D. O.; Cox, S. K.
1982-01-01
The static environment of middle and upper tropospheric clouds is characterized. Computed relative humidity with respect to ice is used to diagnose the presence of a cloud layer. The deduced seasonal mean cloud cover estimates based on this technique are shown to be reasonable. The cases are stratified by season and pressure thickness, and the dry static stability, vertical wind speed shear, and Richardson number are computed for three layers for each case. Mean values for each parameter are presented for each stratification and layer. The relative frequency of occurrence of various structures is presented for each stratification. The observed values of each parameter and the observed structure of each parameter are quite variable. Structures corresponding to any of a number of different conceptual models may be found. Moist adiabatic conditions are not commonly observed, and the stratification based on thickness yields substantially different results for each group.
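Two of the sounding-derived quantities mentioned above can be sketched directly. The snippet below computes relative humidity with respect to ice from a standard Magnus-type saturation formula and a bulk Richardson number for a layer; the sample numbers are invented and the exact formulas used in the study may differ.

```python
# Sketch of sounding-derived diagnostics: RH with respect to ice and a bulk
# Richardson number (static stability over shear) for a layer. Sample values invented.
import numpy as np

def sat_vapor_pressure_ice(T_c):
    """Saturation vapor pressure over ice (hPa), Magnus-type formula; T in deg C."""
    return 6.112 * np.exp(22.46 * T_c / (272.62 + T_c))

def rh_ice(e_hpa, T_c):
    """Relative humidity with respect to ice (%) from vapor pressure and temperature."""
    return 100.0 * e_hpa / sat_vapor_pressure_ice(T_c)

def bulk_richardson(dtheta, theta_mean, du, dv, dz, g=9.81):
    """Bulk Richardson number for a layer of depth dz (K, K, m/s, m/s, m)."""
    shear2 = (du / dz) ** 2 + (dv / dz) ** 2
    return (g / theta_mean) * (dtheta / dz) / shear2

print("RH_ice:", round(rh_ice(e_hpa=0.10, T_c=-40.0), 1), "%")
print("Ri_bulk:", round(bulk_richardson(dtheta=3.0, theta_mean=320.0,
                                        du=5.0, dv=2.0, dz=1000.0), 2))
```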
The Effects of Different Representations on Static Structure Analysis of Computer Malware Signatures
Narayanan, Ajit; Chen, Yi; Pang, Shaoning; Tao, Ban
2013-01-01
The continuous growth of malware presents a problem for internet computing due to increasingly sophisticated techniques for disguising malicious code through mutation and the time required to identify signatures for use by antiviral software systems (AVS). Malware modelling has focused primarily on semantics due to the intended actions and behaviours of viral and worm code. The aim of this paper is to evaluate a static structure approach to malware modelling using the growing malware signature databases now available. We show that, if malware signatures are represented as artificial protein sequences, it is possible to apply standard sequence alignment techniques in bioinformatics to improve accuracy of distinguishing between worm and virus signatures. Moreover, aligned signature sequences can be mined through traditional data mining techniques to extract metasignatures that help to distinguish between viral and worm signatures. All bioinformatics and data mining analysis were performed on publicly available tools and Weka. PMID:23983644
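A toy version of the representation-and-alignment idea is sketched below: signature bytes are mapped onto a 20-letter amino-acid alphabet and two resulting "artificial protein" strings are compared with a simple Needleman-Wunsch global alignment score. The byte strings and scoring scheme are invented for illustration; the paper's analysis uses standard bioinformatics tools and Weka.

```python
# Toy illustration: represent signature bytes as an amino-acid string and score
# two signatures with a simple global alignment (Needleman-Wunsch, score only).
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def bytes_to_protein(data: bytes) -> str:
    return "".join(AMINO[b % len(AMINO)] for b in data)

def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    """Global alignment score via dynamic programming (two-row table)."""
    prev = [j * gap for j in range(len(b) + 1)]
    for i in range(1, len(a) + 1):
        cur = [i * gap]
        for j in range(1, len(b) + 1):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            cur.append(max(diag, prev[j] + gap, cur[j - 1] + gap))
        prev = cur
    return prev[-1]

sig_worm  = bytes_to_protein(bytes.fromhex("deadbeef00aa55cc"))   # invented signature
sig_virus = bytes_to_protein(bytes.fromhex("deadc0de00aa55ff"))   # invented signature
print(sig_worm, sig_virus, "alignment score:", nw_score(sig_worm, sig_virus))
```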
Computational approach for deriving cancer progression roadmaps from static sample data
Yao, Jin; Yang, Le; Chen, Runpu; Nowak, Norma J.
2017-01-01
As with any biological process, cancer development is inherently dynamic. While major efforts continue to catalog the genomic events associated with human cancer, it remains difficult to interpret and extrapolate the accumulating data to provide insights into the dynamic aspects of the disease. Here, we present a computational strategy that enables the construction of a cancer progression model using static tumor sample data. The developed approach overcame many technical limitations of existing methods. Application of the approach to breast cancer data revealed a linear, branching model with two distinct trajectories for malignant progression. The validity of the constructed model was demonstrated in 27 independent breast cancer data sets, and through visualization of the data in the context of disease progression we were able to identify a number of potentially key molecular events in the advance of breast cancer to malignancy. PMID:28108658
BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0
NASA Technical Reports Server (NTRS)
1991-01-01
The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.
SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX/80
NASA Astrophysics Data System (ADS)
Kamat, Manohar P.; Watson, Brian C.
1992-02-01
The results of a research activity aimed at providing a finite element capability for analyzing turbo-machinery bladed-disk assemblies in a vector/parallel processing environment are summarized. Analysis of aircraft turbofan engines is very computationally intensive. The performance limit of modern-day computers with a single processing unit was estimated at 3 billion floating-point operations per second (3 gigaflops). In view of this limit of a sequential unit, performance rates higher than 3 gigaflops can be achieved only through vectorization and/or parallelization as on the Alliant FX/80. Accordingly, the efforts of this critically needed research were geared towards developing and evaluating parallel finite element methods for static and vibration analysis. A special purpose code, named with the acronym SAPNEW, performs static and eigen analysis of multi-degree-of-freedom blade models built-up from flat thin shell elements.
NASA Technical Reports Server (NTRS)
Giles, G. L.; Wallas, M.
1981-01-01
User documentation is presented for a computer program which considers the nonlinear properties of the strain isolator pad (SIP) in the static stress analysis of the shuttle thermal protection system. This program is generalized to handle an arbitrary SIP footprint including cutouts for instrumentation and filler bar. Multiple SIP surfaces are defined to model tiles in unique locations such as leading edges, intersections, and penetrations. The nonlinearity of the SIP is characterized by experimental stress displacement data for both normal and shear behavior. Stresses in the SIP are calculated using a Newton iteration procedure to determine the six rigid body displacements of the tile which develop reaction forces in the SIP to equilibrate the externally applied loads. This user documentation gives an overview of the analysis capabilities, a detailed description of required input data and an example to illustrate use of the program.
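The equilibrium search described above reduces, in its simplest form, to a Newton iteration on the residual between the applied load and the nonlinear SIP reaction. The sketch below shows that iteration for a single (normal) degree of freedom with an invented stress-displacement table, rather than the program's full six rigid-body degrees of freedom and footprint integration.

```python
# Minimal sketch of the equilibrium iteration: find the tile displacement at which
# the nonlinear SIP reaction balances the applied load. One DOF and an invented
# stress-displacement curve stand in for the full six-DOF tile model.
import numpy as np

disp_data   = np.array([0.0, 0.05, 0.10, 0.20, 0.40])    # displacement (hypothetical units)
stress_data = np.array([0.0, 2.0,  5.0,  12.0, 30.0])    # SIP stress (hypothetical units)

def sip_reaction(d, area=36.0):
    """Reaction force from the SIP footprint at displacement d (table lookup)."""
    return area * np.interp(d, disp_data, stress_data)

def solve_equilibrium(f_ext, d0=0.01, tol=1e-8, h=1e-6):
    """Newton iteration on the residual R(d) = f_ext - sip_reaction(d)."""
    d = d0
    for _ in range(50):
        r = f_ext - sip_reaction(d)
        if abs(r) < tol:
            break
        drdd = -(sip_reaction(d + h) - sip_reaction(d - h)) / (2 * h)  # numeric Jacobian
        d -= r / drdd
    return d

print("equilibrium displacement:", round(solve_equilibrium(f_ext=300.0), 4))
```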
Solute segregation kinetics and dislocation depinning in a binary alloy
NASA Astrophysics Data System (ADS)
Dontsova, E.; Rottler, J.; Sinclair, C. W.
2015-06-01
Static strain aging, a phenomenon caused by diffusion of solute atoms to dislocations, is an important contributor to the strength of substitutional alloys. Accurate modeling of this complex process requires both atomic spatial resolution and diffusional time scales, which is very challenging to achieve with commonly used atomistic computational methods. In this paper, we use the recently developed "diffusive molecular dynamics" (DMD) method that is capable of describing the kinetics of the solute segregation process at the atomic level while operating on diffusive time scales in a computationally efficient way. We study static strain aging in the Al-Mg system and calculate the depinning shear stress between edge and screw dislocations and their solute atmospheres formed for various waiting times with different solute content and for a range of temperatures. A simple phenomenological model is also proposed that describes the observed behavior of the critical shear stress as a function of segregation level.
A k-Vector Approach to Sampling, Interpolation, and Approximation
NASA Astrophysics Data System (ADS)
Mortari, Daniele; Rogers, Jonathan
2013-12-01
The k-vector search technique is a method designed to perform extremely fast range searching of large databases at computational cost independent of the size of the database. k-vector search algorithms have historically found application in satellite star-tracker navigation systems which index very large star catalogues repeatedly in the process of attitude estimation. Recently, the k-vector search algorithm has been applied to numerous other problem areas including non-uniform random variate sampling, interpolation of 1-D or 2-D tables, nonlinear function inversion, and solution of systems of nonlinear equations. This paper presents algorithms in which the k-vector search technique is used to solve each of these problems in a computationally-efficient manner. In instances where these tasks must be performed repeatedly on a static (or nearly-static) data set, the proposed k-vector-based algorithms offer an extremely fast solution technique that outperforms standard methods.
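A compact sketch of the k-vector idea is given below: a precomputed count of elements lying below each rung of a straight line through the sorted data lets a range query jump to a small candidate slice without a per-query binary search. The bin count and data are illustrative, and this is only a schematic of the published technique.

```python
# Compact k-vector-style range search sketch: precompute, for equally spaced rungs
# spanning the sorted data, how many elements lie below each rung; queries then
# index directly into a small candidate slice and trim it locally.
import numpy as np

class KVector:
    def __init__(self, values, n_bins=None):
        self.s = np.sort(np.asarray(values, dtype=float))
        self.n_bins = n_bins or len(self.s)
        self.lo, self.hi = self.s[0], self.s[-1]
        rungs = np.linspace(self.lo, self.hi, self.n_bins + 1)
        self.k_lo = np.searchsorted(self.s, rungs, side="left")    # elements < rung
        self.k_hi = np.searchsorted(self.s, rungs, side="right")   # elements <= rung

    def range_search(self, ya, yb):
        """Return all stored values in [ya, yb]."""
        scale = self.n_bins / (self.hi - self.lo)
        ja = min(max(int(np.floor((ya - self.lo) * scale)), 0), self.n_bins)
        jb = min(max(int(np.ceil((yb - self.lo) * scale)), 0), self.n_bins)
        cand = self.s[self.k_lo[ja]:self.k_hi[jb]]      # small superset of the answer
        return cand[(cand >= ya) & (cand <= yb)]        # local trim, no global search

rng = np.random.default_rng(1)
kv = KVector(rng.uniform(0, 100, 10_000))
hits = kv.range_search(42.0, 42.5)
print(len(hits), "values in range, e.g.", hits[:3])
```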
Recent advances in the modelling of crack growth under fatigue loading conditions
NASA Technical Reports Server (NTRS)
Dekoning, A. U.; Tenhoeve, H. J.; Henriksen, T. K.
1994-01-01
Fatigue crack growth associated with cyclic (secondary) plastic flow near a crack front is modelled using an incremental formulation. A new description of threshold behaviour under small load cycles is included. Quasi-static crack extension under high load excursions is described using an incremental formulation of the R-curve (crack growth resistance) concept. The integration of the equations is discussed. For constant-amplitude load cycles the results will be compared with existing crack growth laws. It will be shown that the model also properly describes interaction effects of fatigue crack growth and quasi-static crack extension. To evaluate its more general applicability, the model is included in the NASGRO computer code for damage tolerance analysis. For this purpose the NASGRO program was provided with the CORPUS and the STRIP-YIELD models for computation of the crack opening load levels. The implementation is discussed and recent results of the verification are presented.
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; Bittker, David A.
1993-01-01
A general chemical kinetics and sensitivity analysis code for complex, homogeneous, gas-phase reactions is described. The main features of the code, LSENS, are its flexibility, efficiency and convenience in treating many different chemical reaction models. The models include a static system; steady, one-dimensional, inviscid flow; shock-initiated reaction; and a perfectly stirred reactor. In addition, equilibrium computations can be performed for several assigned states. An implicit numerical integration method, which works efficiently for the extremes of very fast and very slow reaction, is used for solving the 'stiff' differential equation systems that arise in chemical kinetics. For static reactions, sensitivity coefficients of all dependent variables and their temporal derivatives with respect to the initial values of dependent variables and/or the rate coefficient parameters can be computed. This paper presents descriptions of the code and its usage, and includes several illustrative example problems.
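The two capabilities highlighted above (stiff integration and sensitivity to rate parameters) can be illustrated on a toy A -> B -> C mechanism. The sketch below is not LSENS; it uses an off-the-shelf BDF integrator and brute-force finite-difference sensitivities with invented rate constants.

```python
# Toy analogue of a stiff static-system kinetics run plus a sensitivity estimate:
# integrate A -> B -> C with a BDF method and differentiate the final state with
# respect to the first rate constant by central finite differences.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k1, k2):
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

def final_state(k1, k2=1e4, y0=(1.0, 0.0, 0.0), t_end=1.0):
    sol = solve_ivp(rhs, (0.0, t_end), y0, args=(k1, k2),
                    method="BDF", rtol=1e-8, atol=1e-12)   # stiff-capable integrator
    return sol.y[:, -1]

k1 = 0.5
base = final_state(k1)
dk = 1e-6 * k1
sens = (final_state(k1 + dk) - final_state(k1 - dk)) / (2 * dk)   # d(state)/d(k1)
print("final [A, B, C]:", np.round(base, 6))
print("sensitivity to k1:", np.round(sens, 4))
```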
Visualizing Parallel Computer System Performance
NASA Technical Reports Server (NTRS)
Malony, Allen D.; Reed, Daniel A.
1988-01-01
Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels, but also both static and dynamic characterizations. Static or average behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false-color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.
Analytic double product integrals for all-frequency relighting.
Wang, Rui; Pan, Minghao; Chen, Weifeng; Ren, Zhong; Zhou, Kun; Hua, Wei; Bao, Hujun
2013-07-01
This paper presents a new technique for real-time relighting of static scenes with all-frequency shadows from complex lighting and highly specular reflections from spatially varying BRDFs. The key idea is to depict the boundaries of visible regions using piecewise linear functions, and convert the shading computation into double product integrals—the integral of the product of lighting and BRDF on visible regions. By representing lighting and BRDF with spherical Gaussians and approximating their product using Legendre polynomials locally in visible regions, we show that such double product integrals can be evaluated in an analytic form. Given the precomputed visibility, our technique computes the visibility boundaries on the fly at each shading point, and performs the analytic integral to evaluate the shading color. The result is a real-time all-frequency relighting technique for static scenes with dynamic, spatially varying BRDFs, which can generate more accurate shadows than the state-of-the-art real-time PRT methods.
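As a rough, hedged illustration of the core idea only (a 1-D analogue, not the paper's spherical-Gaussian formulation or its visibility machinery), the snippet below expands two smooth factors in Legendre polynomials and evaluates the integral of their product analytically from the coefficients via orthogonality. The function shapes and polynomial degree are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import legendre as L
from scipy.integrate import quad

# Two smooth factors standing in for lighting and BRDF on a visible interval,
# remapped here to x in [-1, 1].
light = lambda x: np.exp(-4.0 * (x - 0.3) ** 2)     # a Gaussian-like lobe
brdf  = lambda x: 0.5 + 0.5 * np.cos(np.pi * x)     # a broad reflectance lobe

deg = 12
x = np.linspace(-1.0, 1.0, 400)
c = L.legfit(x, light(x), deg)   # Legendre coefficients of the lighting factor
d = L.legfit(x, brdf(x), deg)    # Legendre coefficients of the BRDF factor

# Orthogonality: integral_{-1}^{1} P_m(x) P_n(x) dx = 2/(2n+1) * delta_mn, so the
# product integral reduces to a weighted dot product of the two coefficient sets.
n = np.arange(deg + 1)
analytic = np.sum(c * d * 2.0 / (2.0 * n + 1.0))

# Cross-check against brute-force quadrature; the values should agree to a few decimals.
numeric, _ = quad(lambda t: light(t) * brdf(t), -1.0, 1.0)
print(analytic, numeric)
```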
NASA Astrophysics Data System (ADS)
Thackeray, Lynn Roy
The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in the study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens pursue careers in technology, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it. Recent studies have shown no significant differences in math, technical and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences, seem to affect women's decisions in choosing an area of study and career choices. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided insight into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying barriers and challenges that are faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruiting, retention and support for women in computer science departments of U.S. colleges and universities, and thereby increase the number of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.
NASA Astrophysics Data System (ADS)
Fruman, Mark D.; Remmler, Sebastian; Achatz, Ulrich; Hickel, Stefan
2014-10-01
A systematic approach to the direct numerical simulation (DNS) of breaking upper mesospheric inertia-gravity waves of amplitude close to or above the threshold for static instability is presented. Normal mode or singular vector analysis applied in a frame of reference moving with the phase velocity of the wave (in which the wave is a steady solution) is used to determine the most likely scale and structure of the primary instability and to initialize nonlinear "2.5-D" simulations (with three-dimensional velocity and vorticity fields but depending only on two spatial coordinates). Singular vector analysis is then applied to the time-dependent 2.5-D solution to predict the transition of the breaking event to three-dimensional turbulence and to initialize three-dimensional DNS. The careful choice of the computational domain and the relatively low Reynolds numbers, on the order of 25,000, relevant to breaking waves in the upper mesosphere, make the three-dimensional DNS tractable with present-day computing clusters. Three test cases are presented: a statically unstable low-frequency inertia-gravity wave, a statically and dynamically stable inertia-gravity wave, and a statically unstable high-frequency gravity wave. The three-dimensional DNS are compared to ensembles of 2.5-D simulations. In general, the decay of the wave and the generation of turbulence are faster in three dimensions, but the results are otherwise qualitatively and quantitatively similar, suggesting that results of 2.5-D simulations are meaningful if the domain and initial condition are chosen properly.
J.A. Schumpeter and T.B. Veblen on economic evolution: the dichotomy between statics and dynamics
Schütz, Marlies; Rainer, Andreas
2016-01-01
At present, the discussion on the dichotomy between statics and dynamics is resolved by concentrating on its mathematical meaning. Yet, a simple formalisation masks the underlying methodological discussion. Overcoming this limitation, the paper discusses Schumpeter's and Veblen's viewpoint on dynamic economic systems as systems generating change from within. It contributes to an understanding of their ideas of how economics could become an evolutionary science and of their contributions to elaborating an evolutionary economics. It confronts Schumpeter's with Veblen's perspective on evolutionary economics and provides insight into their evolutionary economic theorising by discussing their ideas on the evolution of capitalism. PMID:28057981
Linear static structural and vibration analysis on high-performance computers
NASA Technical Reports Server (NTRS)
Baddourah, M. A.; Storaasli, O. O.; Bostic, S. W.
1993-01-01
Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High-Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations, and calculation of the eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e. models for High-Speed Civil Transport). The goal of this research is to develop a new, efficient technique which extends structural analysis to SHPC and makes large-scale structural analyses tractable.
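The paper's algorithms target massively parallel machines and large aerospace models; purely as a hedged, serial toy illustration of the same three kernels (matrix assembly, static solution, and eigenvalue extraction), the sketch below analyzes a 1-D clamped bar with two-node finite elements. All dimensions and material properties are invented for the example.

```python
import numpy as np
from scipy.linalg import solve, eigh

# 1-D bar: n_el two-node elements, axial stiffness EA, mass per length rho_A, length Ltot.
n_el, EA, rho_A, Ltot = 50, 1.0e7, 10.0, 2.0
Le = Ltot / n_el
n_dof = n_el + 1

# --- assembly of system stiffness and consistent mass matrices ---
K = np.zeros((n_dof, n_dof))
M = np.zeros((n_dof, n_dof))
ke = EA / Le * np.array([[1.0, -1.0], [-1.0, 1.0]])
me = rho_A * Le / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
for e in range(n_el):
    dofs = [e, e + 1]
    K[np.ix_(dofs, dofs)] += ke
    M[np.ix_(dofs, dofs)] += me

# Clamp the first node (fixed end) by removing its row and column.
free = np.arange(1, n_dof)
Kf, Mf = K[np.ix_(free, free)], M[np.ix_(free, free)]

# --- static analysis: axial tip load ---
f = np.zeros(n_dof - 1)
f[-1] = 1000.0                       # force at the free end
u = solve(Kf, f)
print("tip displacement:", u[-1])    # should match F*Ltot/EA = 2e-4

# --- vibration analysis: generalized eigenproblem K x = w^2 M x ---
w2, modes = eigh(Kf, Mf)
freqs_hz = np.sqrt(w2[:3]) / (2.0 * np.pi)
print("first three natural frequencies [Hz]:", freqs_hz)
```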
77 FR 38630 - Open Internet Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-28
... Computer Science and Co-Founder of the Berkman Center for Internet and Society, Harvard University, is... of Technology Computer Science and Artificial Intelligence Laboratory, is appointed vice-chairperson... Jennifer Rexford, Professor of Computer Science, Princeton University Dennis Roberson, Vice Provost...
Research in progress at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1987-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
Enabling Earth Science Through Cloud Computing
NASA Technical Reports Server (NTRS)
Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian
2012-01-01
Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.
Teaching and Learning Methodologies Supported by ICT Applied in Computer Science
ERIC Educational Resources Information Center
Capacho, Jose
2016-01-01
The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-03
... Engineering; Notice of Meeting In accordance with the Federal Advisory Committee Act (Pub. L. 92- 463, as... Computer and Information Science and Engineering (1115). Date and Time: November 1, 2011 from 12 p.m.-5:30... Computer and Information Science and Engineering, National Science Foundation, 4201 Wilson Blvd., Suite...
Computer Science in High School Graduation Requirements. ECS Education Trends (Updated)
ERIC Educational Resources Information Center
Zinth, Jennifer
2016-01-01
Allowing high school students to fulfill a math or science high school graduation requirement via a computer science credit may encourage more students to pursue computer science coursework. This Education Trends report is an update to the original report released in April 2015 and explores state policies that allow or require districts to apply…
ERIC Educational Resources Information Center
Korenic, Eileen
1988-01-01
Describes a series of activities and demonstrations involving the science of soap bubbles. Starts with a recipe for bubble solution and gives instructions for several activities on topics such as density, interference colors, optics, static electricity, and galaxy formation. Contains some background information to help explain some of the effects.…
Characteristics of the Navy Laboratory Warfare Center Technical Workforce
2013-09-29
Mathematics and Information Science (M&IS) occupational series: Actuarial Science (1510), Computer Science (1550), Gen. Math & Statistics (1501), Mathematics (1520), Operations... Related occupations: Network Systems & Data Communication Analysts, Actuaries, Mathematicians, Operations Research Analysts, Statisticians. Social Science (SS)... The workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information
Electron transfer beyond the static picture: A TDDFT/TD-ZINDO study of a pentacene dimer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reslan, Randa; Lopata, Kenneth; Arntsen, Christopher
2012-12-14
We use time-dependent density functional theory and time-dependent ZINDO (a semi-empirical method) to study transfer of an extra electron between a pair of pentacene molecules. A measure of the electronic transfer integral is computed in a dynamic picture via the vertical excitation energy from a delocalized anionic ground state. With increasing dimer separation, this dynamical measurement of charge transfer is shown to be significantly larger than the commonly used static approximation (i.e., LUMO+1–LUMO of the neutral dimer, or HOMO–LUMO of the charged dimer), up to an order of magnitude higher at 6 Å. These results offer a word of caution for calculations involving large separations, as in organic photovoltaics, where care must be taken when using a static picture to model charge transfer.
Human-display interactions: Context-specific biases
NASA Technical Reports Server (NTRS)
Kaiser, Mary Kister; Proffitt, Dennis R.
1987-01-01
Recent developments in computer engineering have greatly enhanced the capabilities of display technology. As displays are no longer limited to simple alphanumeric output, they can present a wide variety of graphic information, using either static or dynamic presentation modes. At the same time that interface designers exploit the increased capabilities of these displays, they must be aware of the inherent limitations of these displays. Generally, these limitations can be divided into those that reflect limitations of the medium (e.g., reducing three-dimensional representations onto a two-dimensional projection) and those reflecting the perceptual and conceptual biases of the operator. The advantages and limitations of static and dynamic graphic displays are considered. Rather than entering into the discussion of whether dynamic or static displays are superior, the paper explores general advantages and limitations that are contextually specific to each type of display.
Elastic collisions of low-energy electrons with SiY4 (Y = Cl, Br, I) molecules
NASA Astrophysics Data System (ADS)
Bettega, M. H. F.
2011-11-01
We employed the Schwinger multichannel method to compute elastic integral, differential, and momentum transfer cross sections for low-energy electron collisions with SiY4 (Y = Cl, Br, I) molecules. The calculations were carried out in the static-exchange and static-exchange plus polarization approximations for energies up to 10 eV. The elastic integral cross section for SiCl4 and SiBr4, computed in the static-exchange plus polarization approximation, shows two shape resonances belonging to the T2 and E symmetries of the Td group, and for SiI4 shows one shape resonance belonging to the E symmetry of the Td group. The present results agree well in shape with experimental total cross sections. The positions of the resonances observed in the calculated integral cross sections are also in agreement with the experimental positions. We have found the presence of a virtual state for SiCl4 and a Ramsauer-Townsend minimum for SiI4 at 0.5 eV. The present results show that the proper inclusion of polarization effects is crucial in order to correctly describe the resonance spectra of these molecules and also to identify a Ramsauer-Townsend minimum for SiI4 and a virtual state for SiCl4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important consideration has been given to the simplification of closed-form solutions and to user-friendly operation. The main emphasis of this research is the development of a methodology which is implemented in a computer program capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, an on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
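SML itself is implemented in Mathematica; purely as an illustration of the same workflow, here is a minimal Python/SymPy sketch for a hypothetical planar 2-DOF arm that derives the forward kinematics symbolically, trig-simplifies it, and forms the static joint-torque model from the Jacobian. Every symbol and name in it is invented for the example, not taken from SML or FARS.

```python
import sympy as sp

# Joint angles and link lengths of a hypothetical planar 2-DOF arm.
q1, q2, l1, l2 = sp.symbols('q1 q2 l1 l2', real=True)
Fx, Fy = sp.symbols('Fx Fy', real=True)   # external force applied at the end effector

# Forward kinematics: end-effector position, trigonometrically reduced.
x = l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2)
y = l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)
p = sp.Matrix([sp.trigsimp(x), sp.trigsimp(y)])

# Jacobian of the position with respect to the joint angles.
J = p.jacobian(sp.Matrix([q1, q2]))

# Static model: joint torques that balance the external tip force, tau = J^T F.
tau = sp.simplify(J.T * sp.Matrix([Fx, Fy]))
sp.pprint(tau)
```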
ERIC Educational Resources Information Center
Falkner, Katrina; Vivian, Rebecca
2015-01-01
To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…
ERIC Educational Resources Information Center
Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay
2007-01-01
This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…
Prospective Students' Reactions to the Presentation of the Computer Science Major
ERIC Educational Resources Information Center
Weaver, Daniel Scott
2010-01-01
The number of students enrolling in Computer Science in colleges and universities has declined since its peak in the early 2000s. Some point to contributing factors suggesting that prospective students fear a lack of employment opportunities if they study computing in college. However, the lack of understanding of what Computer Science is and…
Scientific Application Requirements for Leadership Computing at the Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahern, Sean; Alam, Sadaf R; Fahey, Mark R
2007-12-01
The Department of Energy's Leadership Computing Facility, located at Oak Ridge National Laboratory's National Center for Computational Sciences, recently polled scientific teams that had large allocations at the center in 2007, asking them to identify computational science requirements for future exascale systems (capable of an exaflop, or 10^18 floating point operations per second). These requirements are necessarily speculative, since an exascale system will not be realized until the 2015-2020 timeframe, and are expressed where possible relative to a recent petascale requirements analysis of similar science applications [1]. Our initial findings, which beg further data collection, validation, and analysis, did in fact align with many of our expectations and existing petascale requirements, yet they also contained some surprises, complete with new challenges and opportunities. First and foremost, the breadth and depth of science prospects and benefits on an exascale computing system are striking. Without a doubt, they justify a large investment, even with its inherent risks. The possibilities for return on investment (by any measure) are too large to let us ignore this opportunity. The software opportunities and challenges are enormous. In fact, as one notable computational scientist put it, the scale of questions being asked at the exascale is tremendous and the hardware has gotten way ahead of the software. We are in grave danger of failing because of a software crisis unless concerted investments and coordinating activities are undertaken to reduce and close this hardware-software gap over the next decade. Key to success will be a rigorous requirement for natural mapping of algorithms to hardware in a way that complements (rather than competes with) compilers and runtime systems. The level of abstraction must be raised, and more attention must be paid to functionalities and capabilities that incorporate intent into data structures, are aware of memory hierarchy, possess fault tolerance, exploit asynchronism, and are power-consumption aware. On the other hand, we must also provide application scientists with the ability to develop software without having to become experts in the computer science components. Numerical algorithms are scattered broadly across science domains, with no one particular algorithm being ubiquitous and no one algorithm going unused. Structured grids and dense linear algebra continue to dominate, but other algorithm categories will become more common. A significant increase is projected for Monte Carlo algorithms, unstructured grids, sparse linear algebra, and particle methods, and a relative decrease is foreseen in fast Fourier transforms. These projections reflect the expectation of much higher architecture concurrency and the resulting need for very high scalability. The new algorithm categories that application scientists expect to be increasingly important in the next decade include adaptive mesh refinement, implicit nonlinear systems, data assimilation, agent-based methods, parameter continuation, and optimization. The attributes of leadership computing systems expected to increase most in priority over the next decade are (in order of importance) interconnect bandwidth, memory bandwidth, mean time to interrupt, memory latency, and interconnect latency. The attributes expected to decrease most in relative priority are disk latency, archival storage capacity, disk bandwidth, wide area network bandwidth, and local storage capacity.
These choices by application developers reflect the expected needs of applications or the expected reality of available hardware. One interpretation is that the increasing priorities reflect the desire to increase computational efficiency to take advantage of increasing peak flops [floating point operations per second], while the decreasing priorities reflect the expectation that computational efficiency will not increase. Per-core requirements appear to be relatively static, while aggregate requirements will grow with the system. This projection is consistent with a relatively small increase in performance per core with a dramatic increase in the number of cores. Leadership system software must face and overcome issues that will undoubtedly be exacerbated at the exascale. The operating system (OS) must be as unobtrusive as possible and possess more stability, reliability, and fault tolerance during application execution. As applications will be more likely at the exascale to experience loss of resources during an execution, the OS must mitigate such a loss with a range of responses. New fault tolerance paradigms must be developed and integrated into applications. Just as application input and output must not be an afterthought in hardware design, job management, too, must not be an afterthought in system software design. Efficient scheduling of those resources will be a major obstacle faced by leadership computing centers at the exascale.
Design and evaluation of a computer tutorial on electric fields
NASA Astrophysics Data System (ADS)
Morse, Jeanne Jackson
Research has shown that students do not fully understand electric fields and their interactions with charged particles after completing traditional classroom instruction. The purpose of this project was to develop a computer tutorial to remediate some of these difficulties. Research on the effectiveness of computer-delivered instructional materials showed that students would learn better from media incorporating user-controlled interactive graphics. Two versions of the tutorial were tested. One version used interactive graphics and the other used static graphics. The two versions of the tutorial were otherwise identical. This project was done in four phases. Phases I and II were used to refine the topics covered in the tutorial and to test the usability of the tutorial. The final version of the tutorial was tested in Phases III and IV. The tutorial was tested using a pretest-posttest design with a control group. Both tests were administered in an interview setting. The tutorial using interactive graphics was more effective at remediating students' difficulties than the tutorial using static graphics for students in Phase III (p = 0.001). In Phase IV students who viewed the tutorial with static graphics did better than those viewing interactive graphics. The sample size in Phase IV was too small for this to be a statistically meaningful result. Some student reasoning errors were noted during the interviews. These include difficulty with the vector representation of electric fields, treating electric charge as if it were mass, using faulty algebraic reasoning to answer questions involving ratios and proportions, and using Coulomb's law in situations in which it is not appropriate.
PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah
2009-12-01
In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken.
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.
Using the Tower of Hanoi puzzle to infuse your mathematics classroom with computer science concepts
NASA Astrophysics Data System (ADS)
Marzocchi, Alison S.
2016-07-01
This article suggests that logic puzzles, such as the well-known Tower of Hanoi puzzle, can be used to introduce computer science concepts to mathematics students of all ages. Mathematics teachers introduce their students to computer science concepts that are enacted spontaneously and subconsciously throughout the solution to the Tower of Hanoi puzzle. These concepts include, but are not limited to, conditionals, iteration, and recursion. Lessons, such as the one proposed in this article, are easily implementable in mathematics classrooms and extracurricular programmes as they are good candidates for 'drop in' lessons that do not need to fit into any particular place in the typical curriculum sequence. As an example for readers, the author describes how she used the puzzle in her own Number Sense and Logic course during the federally funded Upward Bound Math/Science summer programme for college-intending low-income high school students. The article explains each computer science term with real-life and mathematical examples, applies each term to the Tower of Hanoi puzzle solution, and describes how students connected the terms to their own solutions of the puzzle. It is timely and important to expose mathematics students to computer science concepts. Given the rate at which technology is currently advancing, and our increased dependence on technology in our daily lives, it has become more important than ever for children to be exposed to computer science. Yet, despite the importance of exposing today's children to computer science, many children are not given adequate opportunity to learn computer science in schools. In the United States, for example, most students finish high school without ever taking a computing course. Mathematics lessons, such as the one described in this article, can help to make computer science more accessible to students who may have otherwise had little opportunity to be introduced to these increasingly important concepts.
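A minimal Python rendering of the recursion the article describes is sketched below (the peg names and disk count are illustrative): the conditional is the base case, the two recursive calls mirror the students' strategy of clearing and restacking the smaller tower, and iterating over the returned moves shows how many steps the solution takes.

```python
def hanoi(n, source, target, spare):
    """Return the sequence of moves that transfers n disks from source to target."""
    if n == 0:                       # conditional: the base case ends the recursion
        return []
    return (hanoi(n - 1, source, spare, target)      # recursion: clear the top n-1 disks
            + [(source, target)]                     # move the largest disk
            + hanoi(n - 1, spare, target, source))   # recursion: restack the n-1 disks

moves = hanoi(4, 'A', 'C', 'B')
for i, (frm, to) in enumerate(moves, start=1):       # iteration over the solution
    print(f"move {i}: {frm} -> {to}")
print(len(moves), "moves in total")                  # 2**n - 1 = 15 moves for n = 4
```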
NASA Astrophysics Data System (ADS)
Priest, Richard Harding
A significant percentage of high school science teachers are not using computers to teach their students or prepare them for standardized testing. A survey of high school science teachers was conducted to determine how they are having students use computers in the classroom, why science teachers are not using computers in the classroom, which variables were relevant to their not using computers, and what the effects of standardized testing are on the use of technology in the high school science classroom. A self-administered questionnaire was developed to measure these aspects of computer integration and demographic information. A follow-up telephone interview survey of a portion of the original sample was conducted to clarify questions, correct misunderstandings, and draw out more holistic descriptions from the subjects. The primary method used to analyze the quantitative data was frequency distributions. Multiple regression analysis was used to investigate the relationships between the barriers and facilitators and the dimensions of instructional use, frequency, and importance of the use of computers. All high school science teachers in a large urban/suburban school district were sent surveys. A response rate of 58% resulted from two mailings of the survey. It was found that contributing factors to science teachers' nonuse of computers were a shortage of up-to-date computers in their classrooms and other educational commitments and duties that do not leave them enough time to prepare lessons that include technology. While a high percentage of science teachers thought their school and district administrations were supportive of technology, they also believed more in-service technology training and follow-up activities to support that training are needed and more software needs to be created. The majority of the science teachers do not use the computer to help students prepare for standardized tests because they believe they can prepare students more efficiently without a computer. Nearly half of the teachers, however, gave lack of time to prepare instructional materials and lack of a means to project a computer image to the whole class as reasons they do not use computers. A significant percentage thought science standardized testing was having a negative effect on computer use.
Educational NASA Computational and Scientific Studies (enCOMPASS)
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2013-01-01
Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazine. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches used and often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
ERIC Educational Resources Information Center
School Science Review, 1982
1982-01-01
Demonstrations, procedures, games, teaching suggestions and information on a variety of physics topics are presented, including hydraulic rams, units and formulae, static electric motors, a computer graphics program, diffraction, adaptation of a basic meter, photoelasticity, photo-diodes, radioactive decay, and analog-digital conversions. (DC)
NASA Astrophysics Data System (ADS)
Ogden, Fred L.; Hawkins, Richard Pete; Walter, M. Todd; Goodrich, David C.
2017-07-01
Bartlett et al. (2016) performed a re-interpretation and modification of the space-time lumped USDA NRCS (formerly SCS) Curve Number (CN) method to extend its applicability to forested watersheds. We believe that the well-documented limitations of the CN method severely constrain the applicability of the modifications proposed by Bartlett et al. (2016). This forward-looking comment urges the research communities in hydrologic science and engineering to consider the CN method as a stepping stone that has outlived its usefulness in research. The CN method fills a narrow niche in certain settings as a parsimonious empirical equation for estimating runoff from a given amount of rainfall; it originated as a static functional form fitted to rainfall-runoff data sets. Sixty-five years of use and multiple reinterpretations have not resulted in improved hydrological predictability using the method. We suggest that the research community should move forward by (1) identifying appropriate dynamic hydrological model formulations for different hydro-geographic settings, (2) specifying needed model capabilities for solving different classes of problems (e.g., flooding, erosion/sedimentation, nutrient transport, water management, etc.) in different hydro-geographic settings, and (3) expanding data collection and research programs to help ameliorate the so-called "overparameterization" problem in contemporary modeling. Many decades of advances in geo-spatial data and processing, computation, and understanding are being squandered on continued focus on the static CN regression method. It is time to truly "move beyond" the Curve Number method.
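For reference, the static functional form referred to above is the standard SCS/NRCS runoff equation (U.S. customary units, stated here with the common initial-abstraction assumption):

```latex
Q =
\begin{cases}
  \dfrac{(P - I_a)^2}{P - I_a + S}, & P > I_a,\\[2ex]
  0, & P \le I_a,
\end{cases}
\qquad
I_a = 0.2\,S,
\qquad
S = \frac{1000}{\mathrm{CN}} - 10,
```

where Q is the direct runoff depth (in), P is the rainfall depth (in), I_a is the initial abstraction, S is the potential maximum retention, and CN is the curve number.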
Creating Science Simulations through Computational Thinking Patterns
ERIC Educational Resources Information Center
Basawapatna, Ashok Ram
2012-01-01
Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…
77 FR 65417 - Proposal Review Panel for Computing Communication Foundations; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-26
...: To assess the progress of the EIC Award, ``Collaborative Research: Computational Behavioral Science... NATIONAL SCIENCE FOUNDATION Proposal Review Panel for Computing Communication Foundations; Notice... National Science Foundation announces the following meeting: Name: Site Visit, Proposal Panel Review for...
Updated Chemical Kinetics and Sensitivity Analysis Code
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan
2005-01-01
An updated version of the General Chemical Kinetics and Sensitivity Analysis (LSENS) computer code has become available. A prior version of LSENS was described in "Program Helps to Determine Chemical-Reaction Mechanisms" (LEW-15758), NASA Tech Briefs, Vol. 19, No. 5 (May 1995), page 66. To recapitulate: LSENS solves complex, homogeneous, gas-phase, chemical-kinetics problems (e.g., combustion of fuels) that are represented by sets of many coupled, nonlinear, first-order ordinary differential equations. LSENS has been designed for flexibility, convenience, and computational efficiency. The present version of LSENS incorporates mathematical models for (1) a static system; (2) steady, one-dimensional inviscid flow; (3) reaction behind an incident shock wave, including boundary layer correction; (4) a perfectly stirred reactor; and (5) a perfectly stirred reactor followed by a plug-flow reactor. In addition, LSENS can compute equilibrium properties for the following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. For static and one-dimensional-flow problems, including those behind an incident shock wave and following a perfectly stirred reactor calculation, LSENS can compute sensitivity coefficients of dependent variables and their derivatives, with respect to the initial values of dependent variables and/or the rate-coefficient parameters of the chemical reactions.
Vauhkonen, P J; Vauhkonen, M; Kaipio, J P
2000-02-01
In electrical impedance tomography (EIT), an approximation for the internal resistivity distribution is computed based on the knowledge of the injected currents and measured voltages on the surface of the body. The currents spread out in three dimensions and therefore off-plane structures have a significant effect on the reconstructed images. A question arises: how far from the current carrying electrodes should the discretized model of the object be extended? If the model is truncated too near the electrodes, errors are produced in the reconstructed images. On the other hand if the model is extended very far from the electrodes the computational time may become too long in practice. In this paper the model truncation problem is studied with the extended finite element method. Forward solutions obtained using so-called infinite elements, long finite elements and separable long finite elements are compared to the correct solution. The effects of the truncation of the computational domain on the reconstructed images are also discussed and results from the three-dimensional (3D) sensitivity analysis are given. We show that if the finite element method with ordinary elements is used in static 3D EIT, the dimension of the problem can become fairly large if the errors associated with the domain truncation are to be avoided.
NASA Astrophysics Data System (ADS)
Wright, D. J.
2013-12-01
In the early 1990s the author came of age as the technology driving the geographic information system or GIS was beginning to successfully 'handle' geospatial data at a range of scales and formats, and a wide array of information technology products emerged from an expanding GIS industry. However, that small community struggled to reflect the diverse research efforts at play in understanding the deeper issues surrounding geospatial data, and the impediments to the effective use of those data. It was from this need that geographic information science or GIScience arose, to ensure in part that GIS did not fall into the trap of being a technology in search of applications, a one-time, one-off, non-intellectual 'bag of tricks' with no substantive theory underpinning it, and suitable only for a static period of time (e.g., Goodchild, 1992). The community has since debated the issue of "tool versus science," which has also played a role in defining GIS as an actual profession. In turn, GIS has contributed to "methodological versus substantive" questions in science, leading to understandings of how the Earth works versus how the Earth should look. In the author's experience, the multidimensional structuring and scaling of data, with integrative and innovative approaches to analyzing, modeling, and developing extensive spatial data from selected places on land and at sea, have revealed how theory and application are in no way mutually exclusive, and it may often be application that advances theory, rather than vice versa. Increasingly, both the system and science of geographic information have welcomed strong collaborations among computer scientists, information scientists, and domain scientists to solve complex scientific questions. As such, they have paralleled the emergence and acceptance of "data science." And now that we are squarely in an era of regional- to global-scale observation and simulation of the Earth, producing data that are too big, move too fast, and do not fit the structures and processing capacity of conventional database systems, the author reflects on the potential of the GIS/GIScience world to contribute to the training and professional advancement of data science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Hack, James; Riley, Katherine
The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments not only in the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.
ERIC Educational Resources Information Center
Grandell, Linda
2005-01-01
Computer science is becoming increasingly important in our society. Meta skills, such as problem solving and logical and algorithmic thinking, are emphasized in every field, not only in the natural sciences. Still, largely due to gaps in tuition, common misunderstandings exist about the true nature of computer science. These are especially…
Non-parallel processing: Gendered attrition in academic computer science
NASA Astrophysics Data System (ADS)
Cohoon, Joanne Louise Mcgrath
2000-10-01
This dissertation addresses the issue of disproportionate female attrition from computer science as an instance of gender segregation in higher education. By adopting a theoretical framework from organizational sociology, it demonstrates that the characteristics and processes of computer science departments strongly influence female retention. The empirical data identify conditions under which women are retained in the computer science major at rates comparable to men. The research for this dissertation began with interviews of students, faculty, and chairpersons from five computer science departments. These exploratory interviews led to a survey of faculty and chairpersons at computer science and biology departments in Virginia. The data from these surveys are used in comparisons of the computer science and biology disciplines, and for statistical analyses that identify which departmental characteristics promote equal attrition for male and female undergraduates in computer science. This three-pronged methodological approach of interviews, discipline comparisons, and statistical analyses shows that departmental variation in gendered attrition rates can be explained largely by access to opportunity, relative numbers, and other characteristics of the learning environment. Using these concepts, this research identifies a set of departmental factors that affect the differential attrition of women from CS departments. These factors are: (1) The gender composition of enrolled students and faculty; (2) Faculty turnover; (3) Institutional support for the department; (4) Preferential attitudes toward female students; (5) Mentoring and supervising by faculty; (6) The local job market, starting salaries, and competitiveness of graduates; (7) Emphasis on teaching; and (8) Joint efforts for student success. This work contributes to our understanding of the gender segregation process in higher education. In addition, it contributes information that can lead to effective solutions for an economically significant issue in modern American society: gender equality in computer science.
Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee
NASA Technical Reports Server (NTRS)
Gallagher, D. L. (Editor)
1993-01-01
The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.
NASA Astrophysics Data System (ADS)
Falkner, Katrina; Vivian, Rebecca
2015-10-01
To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.
Ambient belonging: how stereotypical cues impact gender participation in computer science.
Cheryan, Sapna; Plaut, Victoria C; Davies, Paul G; Steele, Claude M
2009-12-01
People can make decisions to join a group based solely on exposure to that group's physical environment. Four studies demonstrate that the gender difference in interest in computer science is influenced by exposure to environments associated with computer scientists. In Study 1, simply changing the objects in a computer science classroom from those considered stereotypical of computer science (e.g., Star Trek poster, video games) to objects not considered stereotypical of computer science (e.g., nature poster, phone books) was sufficient to boost female undergraduates' interest in computer science to the level of their male peers. Further investigation revealed that the stereotypical objects broadcast a masculine stereotype that discouraged women's sense of ambient belonging and subsequent interest in the environment (Studies 2, 3, and 4) but had no similar effect on men (Studies 3, 4). This masculine stereotype prevented women's interest from developing even in environments entirely populated by other women (Study 2). Objects can thus come to broadcast stereotypes of a group, which in turn can deter people who do not identify with these stereotypes from joining that group.
Static Extended Trailing Edge for Lift Enhancement: Experimental and Computational Studies
NASA Technical Reports Server (NTRS)
Liu, Tianshu; Montefort; Liou, William W.; Pantula, Srinivasa R.; Shams, Qamar A.
2007-01-01
A static extended trailing edge attached to a NACA0012 airfoil section is studied for achieving lift enhancement at a small drag penalty. It is indicated that the thin extended trailing edge can enhance the lift while the zero-lift drag is not significantly increased. Experiments and calculations are conducted to compare the aerodynamic characteristics of the extended trailing edge with those of Gurney flap and conventional flap. The extended trailing edge, as a simple mechanical device added on a wing without altering the basic configuration, has a good potential to improve the cruise flight efficiency.
Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane
NASA Technical Reports Server (NTRS)
Gera, Joseph; Bosworth, John T.
1987-01-01
Novel flight test and analysis techniques in the flight dynamics and handling qualities area are described. These techniques were utilized at NASA Ames-Dryden during the initial flight envelope clearance of the X-29A aircraft. It is shown that the open-loop frequency response of an aircraft with highly relaxed static stability can be successfully computed on the ground from telemetry data. Postflight closed-loop frequency response data were obtained from pilot-generated frequency sweeps and it is found that the current handling quality requirements for high-maneuverability aircraft are generally applicable to the X-29A.
Correlation of AH-1G airframe test data with a NASTRAN mathematical model
NASA Technical Reports Server (NTRS)
Cronkhite, J. D.; Berry, V. L.
1976-01-01
Test data was provided for evaluating a mathematical vibration model of the Bell AH-1G helicopter airframe. The math model was developed and analyzed using the NASTRAN structural analysis computer program. Data from static and dynamic tests were used for comparison with the math model. Static tests of the fuselage and tailboom were conducted to verify the stiffness representation of the NASTRAN model. Dynamic test data were obtained from shake tests of the airframe and were used to evaluate the NASTRAN model for representing the low frequency (below 30 Hz) vibration response of the airframe.
NASA Astrophysics Data System (ADS)
Kayser, Lyle D.
1986-07-01
Wind tunnel test results on a typical projectile shape with small nose bluntness are reported. Flat and hemispherical nose tip results are shown in addition to sharp nose tip results. The effects of nose bluntness on static stability are shown to be negligible at both Mach 0.91 and 3.02. The effects of nose bluntness on Magnus force and Magnus moment were not large, but of sufficient magnitude to indicate that such bluntness should not be neglected in a numerical flow field computation.
Low-speed Aerodynamic Investigations of a Hybrid Wing Body Configuration
NASA Technical Reports Server (NTRS)
Vicroy, Dan D.; Gatlin, Gregory M.; Jenkins, Luther N.; Murphy, Patrick C.; Carter, Melissa B.
2014-01-01
Two low-speed static wind tunnel tests and a water tunnel static and dynamic forced-motion test have been conducted on a hybrid wing-body (HWB) twinjet configuration. These tests, in addition to computational fluid dynamics (CFD) analysis, have provided a comprehensive dataset of the low-speed aerodynamic characteristics of this nonproprietary configuration. In addition to force and moment measurements, the tests included surface pressures, flow visualization, and off-body particle image velocimetry measurements. This paper will summarize the results of these tests and highlight the data that is available for code comparison or additional analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frazin, Richard A., E-mail: rfrazin@umich.edu
2013-04-10
Heretofore, the literature on exoplanet detection with coronagraphic telescope systems has paid little attention to the information content of short exposures and methods of utilizing the measurements of adaptive optics wavefront sensors. This paper provides a framework for the incorporation of the wavefront sensor measurements in the context of observing modes in which the science camera takes millisecond exposures. In this formulation, the wavefront sensor measurements provide a means to jointly estimate the static speckle and the planetary signal. The ability to estimate planetary intensities in as little as a few seconds has the potential to greatly improve the efficiency of exoplanet search surveys. For simplicity, the mathematical development assumes a simple optical system with an idealized Lyot coronagraph. Unlike currently used methods, in which increasing the observation time beyond a certain threshold is useless, this method produces estimates whose error covariances decrease more quickly than inversely proportional to the observation time. This is due to the fact that the estimates of the quasi-static aberrations are informed by a new random (but approximately known) wavefront every millisecond. The method can be extended to include angular (due to diurnal field rotation) and spectral diversity. Numerical experiments are performed with wavefront data from the AEOS Adaptive Optics System sensing at 850 nm. These experiments assume a science camera wavelength λ of 1.1 μm, that the measured wavefronts are exact, and a Gaussian approximation of shot noise. The effects of detector read-out noise and other issues are left to future investigations. A number of static aberrations are introduced, including one with a spatial frequency exactly corresponding to the planet location, which was at a distance of ≈3λ/D from the star. Using only 4 s of simulated observation time, a planetary intensity of ≈1 photon ms^-1, and a stellar intensity of ≈10^5 photons ms^-1 (contrast ratio 10^5), the short-exposure estimation method recovers the amplitudes of the static aberrations with 1% accuracy, and the planet brightness with 20% accuracy.
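The joint-estimation idea can be pictured with a toy linear-Gaussian analogue: each short frame supplies a different, known regressor matrix for a few static-speckle modes plus a fixed planetary column, and stacking the per-frame normal equations separates the two contributions. Everything below (dimensions, noise level, regressor construction) is an illustrative assumption; it is not the paper's coronagraphic measurement model and does not reproduce the covariance-scaling result.

```python
# Toy analogue of joint estimation from many millisecond frames.
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_frames = 64, 4000
# true parameters: three static-speckle mode amplitudes plus one planet intensity
x_true = np.concatenate([0.5 * rng.standard_normal(3), [1.0]])

def frame_design():
    # per-frame, known "wavefront" interaction with the static modes (changes
    # every frame) plus a fixed planetary point-spread column
    speckle_cols = rng.standard_normal((n_pix, 3))
    planet_col = np.exp(-0.5 * (np.arange(n_pix) - 40) ** 2 / 4.0)[:, None]
    return np.hstack([speckle_cols, planet_col])

AtA = np.zeros((4, 4))
Atb = np.zeros(4)
for _ in range(n_frames):
    A = frame_design()
    y = A @ x_true + 0.3 * rng.standard_normal(n_pix)  # Gaussian stand-in for shot noise
    AtA += A.T @ A                                      # accumulate normal equations
    Atb += A.T @ y
x_hat = np.linalg.solve(AtA, Atb)
print("planet intensity estimate:", x_hat[-1])
```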
ERIC Educational Resources Information Center
Huss, Jeanine; Baker, Cheryl
2010-01-01
Agriculture can play a key role in fostering scientific literacy because it brings important plant and ecosystem concepts into the classroom. Plus, agriculture, like science, is not static and includes much trial and error, investigation, and innovation. With help from community experts at the U.S. Department of Agriculture-Agricultural Research…
Serials Management by Microcomputer: The Potential of DBMS.
ERIC Educational Resources Information Center
Vogel, J. Thomas; Burns, Lynn W.
1984-01-01
Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programing with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…
Technology-Based Content through Virtual and Physical Modeling: A National Research Study
ERIC Educational Resources Information Center
Ernst, Jeremy V.; Clark, Aaron C.
2009-01-01
Visualization is becoming more prevalent as an application in science, engineering, and technology related professions. The analysis of static and dynamic graphical visualization provides data solutions and understandings that go beyond traditional forms of communication. The study of technology-based content and the application of conceptual…
Moire strain analysis of paper
R. E. Rowlands; P. K. Beasley; D. E. Gunderson
1983-01-01
Efficient use of paper products involves using modern aspects of materials science and engineering mechanics. This implies the ability to determine simultaneously different components of strain at multiple locations and under static or dynamic conditions. Although measuring strains in paper has been a topic of interest for over 40 years, present capability remains...
ERIC Educational Resources Information Center
Cooke, Nancy J.; Gorman, Jamie C.; Myers, Christopher W.; Duran, Jasmine L.
2013-01-01
Cognition in work teams has been predominantly understood and explained in terms of shared cognition with a focus on the similarity of static knowledge structures across individual team members. Inspired by the current zeitgeist in cognitive science, as well as by empirical data and pragmatic concerns, we offer an alternative theory of team…
ERIC Educational Resources Information Center
Buczynski, James Andrew
2005-01-01
Developing a library collection to support the curriculum of Canada's largest computer studies school has debunked many myths about collecting computer science and technology information resources. Computer science students are among the heaviest print book and e-book users in the library. Circulation statistics indicate that the demand for print…
Snatching Defeat from the Jaws of Victory: When Good Projects Go Bad. Girls and Computer Science.
ERIC Educational Resources Information Center
Sanders, Jo
In week-long sessions in the summers of 1997, 1998, and 1999, the 6APT (Summer Institute in Computer Science for Advanced Placement Teachers) project taught 240 high school teachers of Advanced Placement Computer Science (APCS) about gender equity in computing. Teachers were then followed through 2000. Results indicated that while teachers did…
77 FR 12823 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-02
... Exascale ARRA projects--Magellan final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR..., Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the...
75 FR 18407 - Investing in Innovation Fund
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... include computer science rather than science. To correct this error, the Department makes the following..., in footnote number eight, in line six, ``including science'' is replaced with ``including computer... obtain this document in an accessible format (e.g., Braille, large print, audiotape, or computer diskette...
Innovative Science Experiments Using Phoenix
ERIC Educational Resources Information Center
Kumar, B. P. Ajith; Satyanarayana, V. V. V.; Singh, Kundan; Singh, Parmanand
2009-01-01
A simple, flexible and very low cost hardware plus software framework for developing computer-interfaced science experiments is presented. It can be used for developing computer-interfaced science experiments without getting into the details of electronics or computer programming. For developing experiments this is a middle path between…
The Metamorphosis of an Introduction to Computer Science.
ERIC Educational Resources Information Center
Ben-Jacob, Marion G.
1997-01-01
Introductory courses in computer science at colleges and universities have undergone significant changes in 20 years. This article provides an overview of the history of introductory computer science (FORTRAN, ANSI flowchart symbols, BASIC, data processing concepts, and PASCAL) and its future (robotics and C++). (PEN)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Koushik; Jawulski, Konrad; Pastorczak, Ewa
A perfect-pairing generalized valence bond (GVB) approximation is known to be one of the simplest approximations, which allows one to capture the essence of static correlation in molecular systems. In spite of its attractive feature of being relatively computationally efficient, this approximation misses a large portion of dynamic correlation and does not offer sufficient accuracy to be generally useful for studying electronic structure of molecules. We propose to correct the GVB model and alleviate some of its deficiencies by amending it with the correlation energy correction derived from the recently formulated extended random phase approximation (ERPA). On the examples of systems of diverse electronic structures, we show that the resulting ERPA-GVB method greatly improves upon the GVB model. ERPA-GVB recovers most of the electron correlation and it yields energy barrier heights of excellent accuracy. Thanks to a balanced treatment of static and dynamic correlation, ERPA-GVB stays reliable when one moves from systems dominated by dynamic electron correlation to those for which the static correlation comes into play.
NASA Technical Reports Server (NTRS)
Ting, Eric; Nguyen, Nhan; Trinh, Khanh
2014-01-01
This paper presents a static aeroelastic model and longitudinal trim model for the analysis of a flexible wing transport aircraft. The static aeroelastic model is built using a structural model based on finite-element modeling and coupled to an aerodynamic model that uses a vortex-lattice solution. An automatic geometry generation tool is used to close the loop between the structural and aerodynamic models. The aeroelastic model is extended for the development of a three-degree-of-freedom longitudinal trim model for an aircraft with flexible wings. The resulting flexible aircraft longitudinal trim model is used to simultaneously compute the static aeroelastic shape for the aircraft model and the longitudinal state inputs to maintain an aircraft trim state. The framework is applied to an aircraft model based on the NASA Generic Transport Model (GTM) with wing structures allowed to flexibly deform, referred to as the Elastically Shaped Aircraft Concept (ESAC). The ESAC wing mass and stiffness properties are based on baseline "stiff" values representative of current-generation transport aircraft.
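The coupling described above can be pictured as a fixed-point iteration between a linear structural solve and an aerodynamic load evaluation, with the trim variables updated alongside the shape. The sketch below is a minimal illustration of that loop, assuming a linear stiffness matrix K, black-box `aero_loads` and `trim_residual` callables, and a crude Newton-like trim update; none of the names or numerical choices come from the paper.

```python
# Minimal static aeroelastic coupling and trim sketch (illustrative only).
import numpy as np

def numerical_jacobian(g, x, h=1e-6):
    """Finite-difference Jacobian of a small residual function g at x."""
    g0 = np.array(g(x))
    J = np.zeros((len(g0), len(x)))
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += h
        J[:, i] = (np.array(g(xp)) - g0) / h
    return J

def aeroelastic_trim(K, aero_loads, trim_residual, q0, alpha0, de0,
                     tol=1e-6, max_iter=50):
    """Iterate structural deflection q, angle of attack alpha, and elevator de.

    K             : structural stiffness matrix (n x n), e.g. from a finite-element model
    aero_loads    : callable (q, alpha, de) -> nodal force vector (n,), e.g. a
                    vortex-lattice solution evaluated on the deformed geometry
    trim_residual : callable (q, alpha, de) -> (lift_error, moment_error)
    """
    q, alpha, de = q0.copy(), alpha0, de0
    for _ in range(max_iter):
        f = aero_loads(q, alpha, de)               # loads on the current deformed shape
        q_new = np.linalg.solve(K, f)              # static structural response: K q = f
        r = np.array(trim_residual(q_new, alpha, de))
        J = numerical_jacobian(lambda x: trim_residual(q_new, x[0], x[1]),
                               np.array([alpha, de]))
        alpha, de = np.array([alpha, de]) - np.linalg.solve(J, r)  # Newton-like trim update
        if np.linalg.norm(q_new - q) < tol and np.linalg.norm(r) < tol:
            return q_new, alpha, de                # converged shape and trim state
        q = q_new
    raise RuntimeError("aeroelastic trim did not converge")
```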
NASA Technical Reports Server (NTRS)
Montgomery, Raymond C.; Scott, Michael A.; Weston, Robert P.
1998-01-01
This paper represents an initial study on the use of quasi-static shape change devices in aircraft maneuvering. The macroscopic effects and requirements for these devices in flight control are the focus of this study. Groups of devices are postulated to replace the conventional leading-edge flap (LEF) and the all-moving wing tip (AMT) on the tailless LMTAS-ICE (Lockheed Martin Tactical Aircraft Systems - Innovative Control Effectors) configuration. The maximum quasi-static shape changes are 13.8% and 7.7% of the wing section thickness for the LEF and AMT replacement devices, respectively. A Computational Fluid Dynamics (CFD) panel code is used to determine the control effectiveness of groups of these devices. A preliminary design of a wings-leveler autopilot is presented. An initial evaluation at Mach 0.6 and 15,000 ft altitude is made through batch simulation. Results show that small-disturbance stability is achieved; however, an increase in maximum distortion is needed to statically offset five degrees of sideslip. This applies only to the specific device groups studied, encouraging future research on optimal device placement.
Dickinson, Christopher A.; Zelinsky, Gregory J.
2013-01-01
Two experiments are reported that further explore the processes underlying dynamic search. In Experiment 1, observers’ oculomotor behavior was monitored while they searched for a randomly oriented T among oriented L distractors under static and dynamic viewing conditions. Despite similar search slopes, eye movements were less frequent and more spatially constrained under dynamic viewing relative to static, with misses also increasing more with target eccentricity in the dynamic condition. These patterns suggest that dynamic search involves a form of sit-and-wait strategy in which search is restricted to a small group of items surrounding fixation. To evaluate this interpretation, we developed a computational model of a sit-and-wait process hypothesized to underlie dynamic search. In Experiment 2 we tested this model by varying fixation position in the display and found that display positions optimized for a sit-and-wait strategy resulted in higher d′ values relative to a less optimal location. We conclude that different strategies, and therefore underlying processes, are used to search static and dynamic displays. PMID:23372555
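The sit-and-wait strategy the authors describe can be illustrated with a small Monte Carlo sketch in which the observer holds fixation and only detects the target when it drifts within a fixed functional viewing radius. All parameters below (display size, item speed, radius, fixation position) are made-up illustrative values, not the experimental ones.

```python
# Toy Monte Carlo sketch of a sit-and-wait account of dynamic search.
import numpy as np

rng = np.random.default_rng(0)

def sit_and_wait_trial(n_items=16, radius=2.0, speed=1.0,
                       display=20.0, duration=10.0, dt=0.05,
                       fixation=(10.0, 10.0)):
    """Return True if the target ever passes within `radius` of fixation."""
    pos = rng.uniform(0, display, size=(n_items, 2))          # item positions (deg)
    heading = rng.uniform(0, 2 * np.pi, size=n_items)          # random drift directions
    vel = speed * np.column_stack([np.cos(heading), np.sin(heading)])
    target = 0                                                  # item 0 is the target
    fx = np.asarray(fixation)
    for _ in range(int(duration / dt)):
        pos = (pos + vel * dt) % display                        # items wrap around the display
        if np.linalg.norm(pos[target] - fx) < radius:
            return True                                         # target inspected: hit
    return False                                                # target never came close: miss

# Hit rate depends on where fixation sits relative to item traffic, mirroring
# the display-position manipulation of Experiment 2.
hits = np.mean([sit_and_wait_trial() for _ in range(500)])
print(f"hit rate at central fixation: {hits:.2f}")
```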
ERIC Educational Resources Information Center
Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri
2017-01-01
Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…
After-Hours Science: Microchips and Onion Dip.
ERIC Educational Resources Information Center
Brugger, Steve
1984-01-01
Computer programs were developed for a science center nutrition exhibit. The exhibit was recognized by the National Science Teachers Association Search for Excellence in Science Education as an outstanding science program. The computer programs (Apple II) and their use in the exhibit are described. (BC)
Protection against static electricity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shteiner, A.L.; Minaev, G.S.; Shatkov, O.P.
1978-01-01
Coke industry shops process electrifiable, highly inflammable and explosive substances (benzene, toluene, xylenes, sulfur, coal dust, and coke-oven gas). The electrification of those materials creates a danger of buildup of static electricity charges in them and on the surface of objects interacting with them, followed by an electrical discharge which may cause explosion, fire, or disruption of the technological process. Some of the regulations for protection against static electricity do not reflect modern methods of static electricity control. The regulations are not always observed by workers in the plant services. The main means of protection used to remove static electricity charges is grounding. In many cases it completely drains the charge from the surface of the electrifiable bodies. However, in the processing of compounds with a high specific volumetric electrical resistance, grounding is insufficient, since it does not drain the charge from the interior of the substance.
Neural Correlates of Racial Ingroup Bias in Observing Computer-Animated Social Encounters.
Katsumi, Yuta; Dolcos, Sanda
2017-01-01
Despite evidence for the role of group membership in the neural correlates of social cognition, the mechanisms associated with processing non-verbal behaviors displayed by racially ingroup vs. outgroup members remain unclear. Here, 20 Caucasian participants underwent fMRI recording while observing social encounters with ingroup and outgroup characters displaying dynamic and static non-verbal behaviors. Dynamic behaviors included approach and avoidance behaviors, preceded or not by a handshake; both dynamic and static behaviors were followed by participants' ratings. Behaviorally, participants showed bias toward their ingroup members, demonstrated by faster/slower reaction times for evaluating ingroup static/approach behaviors, respectively. At the neural level, despite overall similar responses in the action observation network to ingroup and outgroup encounters, the medial prefrontal cortex showed dissociable activation, possibly reflecting spontaneous processing of ingroup static behaviors and positive evaluations of ingroup approach behaviors. The anterior cingulate and superior frontal cortices also showed sensitivity to race, reflected in coordinated and reduced activation for observing ingroup static behaviors. Finally, the posterior superior temporal sulcus showed uniquely increased activity to observing ingroup handshakes. These findings shed light on the mechanisms of racial ingroup bias in observing social encounters, and have implications for understanding factors related to successful interactions with individuals from diverse backgrounds.
3-D scapular kinematics during arm elevation: effect of motion velocity.
Fayad, F; Hoffmann, G; Hanneton, S; Yazbeck, C; Lefevre-Colau, M M; Poiraudeau, S; Revel, M; Roby-Brami, A
2006-11-01
No three-dimensional (3-D) data exist on the influence of motion velocity on scapular kinematics. The effect of arm elevation velocity has been studied only in a two-dimensional setting. Thirty healthy subjects performed dominant (right) arm elevation in two planes, sagittal and frontal, and at slow and fast self-selected arm speeds. Scapular orientation and humeral elevation were measured at a 30 Hz recording frequency with a 6-degree-of-freedom electromagnetic system (Polhemus Fastrak). Motion was computed according to the International Society of Biomechanics standards. Scapular orientation was also determined with the arm held in different static positions. We obtained a full 3-D kinematic description of the scapula, which achieves a reliable, complex 3-D motion during humeral elevation and lowering. The maximal sagittal arm elevation showed a characteristic "M"-shaped protraction/retraction curve. Scapular rotations did not differ significantly between slow and fast movements. Moreover, protraction/retraction and tilt angular values did not differ significantly between static and dynamic tasks. However, scapular lateral rotation values differed between static and dynamic measurements during sagittal and frontal arm elevation. Lateral scapular rotation appears to be less in static than in dynamic measurement, particularly in the sagittal plane. Interpolation of statically recorded positions of the bones cannot reflect the kinematics of the scapula.
Confirmation of quasi-static approximation in SAR evaluation for a wireless power transfer system.
Hirata, Akimasa; Ito, Fumihiro; Laakso, Ilkka
2013-09-07
The present study discusses the applicability of the magneto-quasi-static approximation to the calculation of the specific absorption rate (SAR) in a cylindrical model for a wireless power transfer system. Resonant coils with different parameters were considered in the 10 MHz band. A two-step quasi-static method comprising the method of moments and the scalar-potential finite-difference method is applied, which can consider the effects of electric and magnetic fields on the induced SAR separately. From our computational results, the SARs obtained from our quasi-static method are found to be in good agreement with full-wave analysis for different positions of the cylindrical model relative to the wireless power transfer system, confirming the applicability of the quasi-static approximation in the 10 MHz band. The SAR induced by the external electric field is found to be marginal compared to that induced by the magnetic field. Thus, dosimetry for the external magnetic field, which may be marginally perturbed by the presence of biological tissue, is confirmed to be essential for SAR compliance in the 10 MHz band or lower. This confirmation also suggests that the current in the coil, rather than the transferred power, is essential for SAR compliance.
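For reference, the local SAR that such dosimetry evaluates is defined from the induced internal electric field; a minimal sketch of that standard definition follows, with placeholder tissue values (the conductivity and density are illustrative, not taken from the study).

```python
# Local SAR from the induced internal electric field: SAR = sigma * |E|^2 / rho,
# with sigma the tissue conductivity (S/m), rho the mass density (kg/m^3), and
# E the rms field strength (V/m) inside the tissue.
def local_sar(e_field_rms, sigma, rho):
    """Return local SAR in W/kg."""
    return sigma * e_field_rms ** 2 / rho

# Example with placeholder values: |E| = 10 V/m rms, sigma = 0.5 S/m,
# rho = 1000 kg/m^3 -> 0.05 W/kg
print(local_sar(10.0, 0.5, 1000.0))
```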
NASA Technical Reports Server (NTRS)
Harvie, E.; Filla, O.; Baker, D.
1993-01-01
Analysis performed in the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) measures error in the static Earth sensor onboard the National Oceanic and Atmospheric Administration (NOAA)-10 spacecraft using flight data. Errors are computed as the difference between Earth sensor pitch and roll angle telemetry and reference pitch and roll attitude histories propagated by gyros. The flight data error determination illustrates the effect on horizon sensing of systemic variation in the Earth infrared (IR) horizon radiance with latitude and season, as well as the effect of anomalies in the global IR radiance. Results of the analysis provide a comparison between static Earth sensor flight performance and that of scanning Earth sensors studied previously in the GSFC/FDD. The results also provide a baseline for evaluating various models of the static Earth sensor. Representative days from the NOAA-10 mission indicate the extent of uniformity and consistency over time of the global IR horizon. A unique aspect of the NOAA-10 analysis is the correlation of flight data errors with independent radiometric measurements of stratospheric temperature. The determination of the NOAA-10 static Earth sensor error contributes to realistic performance expectations for missions to be equipped with similar sensors.
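The error-determination idea reduces to differencing the sensor telemetry against a gyro-propagated reference attitude history; the sketch below shows that comparison under simplifying assumptions (per-axis small-angle integration, synchronized samples), with array names that are stand-ins rather than the FDD's actual processing.

```python
# Sketch of static Earth-sensor error determination from flight data.
import numpy as np

def propagate_reference(att0, gyro_rates, dt):
    """Propagate a [pitch, roll] reference history from an initial attitude and
    body rates (rad/s), using simple first-order, per-axis integration."""
    ref = np.empty((len(gyro_rates) + 1, 2))
    ref[0] = att0
    for k, w in enumerate(gyro_rates):
        ref[k + 1] = ref[k] + w * dt          # small-angle integration step
    return ref

def sensor_error(telemetry, att0, gyro_rates, dt):
    """Earth-sensor error: pitch/roll telemetry minus gyro-propagated reference."""
    ref = propagate_reference(att0, gyro_rates, dt)[: len(telemetry)]
    return telemetry - ref                     # columns: pitch error, roll error
```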
An assessment of the real-time application capabilities of the SIFT computer system
NASA Technical Reports Server (NTRS)
Butler, R. W.
1982-01-01
The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed-static-stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research, it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.
Backtracking and Re-execution in the Automatic Debugging of Parallelized Programs
NASA Technical Reports Server (NTRS)
Matthews, Gregory; Hood, Robert; Johnson, Stephen; Leggett, Peter; Biegel, Bryan (Technical Monitor)
2002-01-01
In this work we describe a new approach using relative debugging to find differences in computation between a serial program and a parallel version of that program. We use a combination of re-execution and backtracking to find the first difference in computation that may ultimately lead to an incorrect value that the user has indicated. In our prototype implementation, we use static analysis information from a parallelization tool to perform the backtracking as well as the mapping required between serial and parallel computations.
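As a rough illustration of the comparison step, the sketch below walks a tool-provided mapping between serial and parallel execution traces and reports the first mapped value that diverges, which is the point from which backtracking and re-execution would proceed. The trace and mapping data structures are hypothetical stand-ins, not the prototype's actual interfaces.

```python
# Relative-debugging comparison sketch: find the earliest differing computation.
def first_divergence(serial_trace, parallel_trace, mapping, tol=1e-9):
    """serial_trace / parallel_trace: dicts of (statement_id, iteration) -> value.
    mapping: list of (serial_key, parallel_key) pairs in serial execution order,
    as produced by the parallelization tool's static analysis."""
    for serial_key, parallel_key in mapping:
        s = serial_trace[serial_key]
        p = parallel_trace[parallel_key]
        if abs(s - p) > tol:
            return serial_key, parallel_key, s, p   # earliest differing mapped value
    return None                                      # traces agree on all mapped points
```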