Scientific Assistant Virtual Laboratory (SAVL)
NASA Astrophysics Data System (ADS)
Alaghband, Gita; Fardi, Hamid; Gnabasik, David
2007-03-01
The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL: * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations. * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software. * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network. We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.
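The abstract notes that the Master Question Network couples simulation objects to questions through polymorphic interfaces and lambda expressions. The original implementation is in C#; the sketch below only illustrates the general pattern in Python, with an invented FallingBall object and a single hypothetical question, and is not SAVL's actual code.

```python
from abc import ABC, abstractmethod

class SimulationObject(ABC):
    """Polymorphic interface for objects in the virtual laboratory."""
    @abstractmethod
    def observe(self, quantity: str) -> float:
        ...

class FallingBall(SimulationObject):
    def __init__(self, height: float, time: float):
        self.height, self.time = height, time

    def observe(self, quantity: str) -> float:
        # Report simple kinematic quantities for the experiment.
        if quantity == "velocity":
            return 9.81 * self.time
        if quantity == "height":
            return self.height - 0.5 * 9.81 * self.time**2
        raise KeyError(quantity)

# A tiny "question network": each question is paired with a lambda that
# checks a student's answer against the simulated object.
questions = {
    "What is the ball's speed after 2 s?":
        lambda obj, answer: abs(answer - obj.observe("velocity")) < 0.5,
}

ball = FallingBall(height=100.0, time=2.0)
for text, check in questions.items():
    print(text, "correct" if check(ball, 19.6) else "incorrect")
```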
ERIC Educational Resources Information Center
Zhang, Jianwei; Chen, Qi; Sun, Yanquing; Reid, David J.
2004-01-01
Learning support studies involving simulation-based scientific discovery learning have tended to adopt an ad hoc strategies-oriented approach in which the support strategies are typically pre-specified according to learners' difficulties in particular activities. This article proposes a more integrated approach, a triple scheme for learning…
Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review
NASA Technical Reports Server (NTRS)
Antonsson, Erik; Gombosi, Tamas
2005-01-01
Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. Modeling and simulation environments and infrastructure.
The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation
NASA Astrophysics Data System (ADS)
Goulet, C.; Silva, F.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.
2015-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, seismogram ground motion amplitude calculations, and goodness-of-fit measurements. These modules are integrated into a software system that provides user-defined, repeatable calculation of ground motion seismograms, using multiple alternative ground motion simulation methods, and software utilities that can generate plots, charts, and maps. The BBP has been developed over the last five years in a collaborative scientific, engineering, and software development project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The SCEC BBP software released in 2015 can be compiled and run on recent Linux systems with GNU compilers. It includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, updated ground motion simulation methods, and a simplified command line user interface.
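As a rough illustration of the kind of goodness-of-fit measure mentioned above, the sketch below computes natural-log residuals between observed and simulated response-spectrum amplitudes at a handful of periods. The numbers are made up for illustration, and the metric is a simplified stand-in for the BBP's actual goodness-of-fit modules.

```python
import numpy as np

# Hypothetical response-spectrum amplitudes (g) at a set of periods for one
# station: one array of observed values and one of simulated values.
periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])      # seconds
observed = np.array([0.42, 0.55, 0.38, 0.21, 0.09, 0.03])
simulated = np.array([0.40, 0.60, 0.33, 0.24, 0.10, 0.04])

# A common goodness-of-fit measure is the natural-log residual
# ln(observed / simulated); 0 means a perfect match, positive values mean
# the simulation under-predicts the observation.
residuals = np.log(observed / simulated)

for T, r in zip(periods, residuals):
    print(f"T = {T:4.1f} s   ln(obs/sim) = {r:+.3f}")
print(f"mean bias = {residuals.mean():+.3f}, std = {residuals.std(ddof=1):.3f}")
```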
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver
2009-11-20
Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The MATLAB-based analysis framework and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges, this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
Arab, Abeer; Alatassi, Abdulaleem; Alattas, Elias; Alzoraigi, Usamah; AlZaher, Zaki; Ahmad, Abdulaziz; Albabtain, Hesham; Boker, Abdulaziz
2017-01-01
The educational programs in the Saudi Commission for Health Specialties are developing rapidly on the technical front. Such development is witnessed particularly in scientific areas related to what is commonly known as evidence-based medicine. This review highlights the critical need for, and importance of, integrating simulation into anesthesia training and assessment. Furthermore, it describes the current utilization of simulation in the anesthesia and critical care assessment process. PMID:28442961
Airborne simulation of Shuttle/Spacelab management and operation
NASA Technical Reports Server (NTRS)
Mulholland, D. R.; Neel, C. B.
1976-01-01
The ASSESS (Airborne Science/Spacelab Experiments System Simulation) program is discussed. A simulated Spacelab operation was carried out aboard the CV-990 airborne laboratory at Ames Research Center. A scientific payload was selected to conduct studies in upper atmospheric physics and infrared astronomy with principal investigators from France, the Netherlands, England and the U.S. Two experiment operators (EOs) from the U.S. and two from Europe were trained to function as proxies for the principal investigators in operating, maintaining, and repairing the scientific instruments. The simulated mission, in which the EOs and a Mission Manager were confined to the aircraft and living quarters for a 1-week period while making scientific observations during nightly flights, provided experience in the overall management of a complex international payload, experiment preparation, testing, and integration, the training and selection of proxy operators, and data handling.
Ginzburg, Samara B; Brenner, Judith; Cassara, Michael; Kwiatkowski, Thomas; Willey, Joanne M
2017-01-01
There has been a call for increased integration of basic and clinical sciences during preclinical years of undergraduate medical education. Despite the recognition that clinical simulation is an effective pedagogical tool, little has been reported on its use to demonstrate the relevance of basic science principles to the practice of clinical medicine. We hypothesized that simulation with an integrated science and clinical debrief used with early learners would illustrate the importance of basic science principles in clinical diagnosis and management of patients. Small groups of first- and second-year medical students were engaged in a high-fidelity simulation followed by a comprehensive debrief facilitated by a basic scientist and clinician. Surveys including anchored and open-ended questions were distributed at the conclusion of each experience. The majority of the students agreed that simulation followed by an integrated debrief illustrated the clinical relevance of basic sciences (mean ± standard deviation: 93.8% ± 2.9% of first-year medical students; 96.7% ± 3.5% of second-year medical students) and its importance in patient care (92.8% of first-year medical students; 90.4% of second-year medical students). In a thematic analysis of open-ended responses, students felt that these experiences provided opportunities for direct application of scientific knowledge to diagnosis and treatment, improving student knowledge, simulating real-world experience, and developing clinical reasoning, all of which specifically helped them understand the clinical relevance of basic sciences. Small-group simulation followed by a debrief that integrates basic and clinical sciences is an effective means of demonstrating the relationship between scientific fundamentals and patient care for early learners. As more medical schools embrace integrated curricula and seek opportunities for integration, our model is a novel approach that can be utilized.
Improving the trust in results of numerical simulations and scientific data analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cappello, Franck; Constantinescu, Emil; Hovland, Paul
This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results' integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general approaches to address it. This paper does not focus on the trust that the execution will actually complete. The product of simulation or of data analytics executions is the final element of a potentially long chain of transformations, where each stage has the potential to introduce harmful corruptions. These corruptions may produce results that deviate from the user-expected accuracy without notifying the user of this deviation. There are many potential sources of corruption before and during the execution; consequently, in this white paper we do not focus on the protection of the end result after the execution.
MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.
2016-01-01
MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.
To simulate or not to simulate: what are the questions?
Dudai, Yadin; Evers, Kathinka
2014-10-22
Simulation is a powerful method in science and engineering. However, simulation is an umbrella term, and its meaning and goals differ among disciplines. Rapid advances in neuroscience and computing draw increasing attention to large-scale brain simulations. What is the meaning of simulation, and what should the method expect to achieve? We discuss the concept of simulation from an integrated scientific and philosophical vantage point and pinpoint selected issues that are specific to brain simulation.
Assess 2: Spacelab simulation. Executive summary
NASA Technical Reports Server (NTRS)
1977-01-01
An Airborne Science/Spacelab Experiments System Simulation (ASSESS II) mission was conducted with the CV-990 airborne laboratory in May 1977. The project studied the full range of Spacelab-type activities, including management interactions; experiment selection and funding; hardware development; payload integration and checkout; mission specialist and payload specialist selection and training; mission control center and payload operations control center arrangements and interactions; real-time interaction during flight between principal investigators and the flight crew; and retrieval of scientific flight data. ESA established an integration and coordination center for the ESA portion of the payload, as planned for Spacelab. A strongly realistic Spacelab mission was conducted on the CV-990 aircraft. U.S. and ESA scientific experiments were integrated into a payload and flown over a 10-day period, with the payload flight crew fully confined to represent a Spacelab mission. Specific conclusions for Spacelab planning are presented along with a brief explanation of each.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation
NASA Astrophysics Data System (ADS)
Silva, F.; Goulet, C. A.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.
2016-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, several ground motion intensity measure calculations, and various ground motion goodness-of-fit tools. These modules are integrated into a software system that provides user-defined, repeatable calculation of ground-motion seismograms, using multiple alternative ground motion simulation methods, and software utilities to generate tables, plots, and maps. The BBP has been developed over the last five years in a collaborative project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The SCEC BBP software released in 2016 can be compiled and run on recent Linux and Mac OS X systems with GNU compilers. It includes five simulation methods, seven simulation regions covering California, Japan, and Eastern North America, and the ability to compare simulation results against empirical ground motion models (aka GMPEs). The latest version includes updated ground motion simulation methods, a suite of new validation metrics and a simplified command line user interface.
Simulation of wave interactions with MHD
NASA Astrophysics Data System (ADS)
Batchelor, D.; Alba, C.; Bateman, G.; Bernholdt, D.; Berry, L.; Bonoli, P.; Bramley, R.; Breslau, J.; Chance, M.; Chen, J.; Choi, M.; Elwasif, W.; Fu, G.; Harvey, R.; Jaeger, E.; Jardin, S.; Jenkins, T.; Keyes, D.; Klasky, S.; Kruger, S.; Ku, L.; Lynch, V.; McCune, D.; Ramos, J.; Schissel, D.; Schnack, D.; Wright, J.
2008-07-01
The broad scientific objectives of the SWIM (Simulation of Wave Interaction with MHD) project are twofold: (1) improve our understanding of interactions that both radio frequency (RF) wave and particle sources have on extended-MHD phenomena, and to substantially improve our capability for predicting and optimizing the performance of burning plasmas in devices such as ITER; and (2) develop an integrated computational system for treating multiphysics phenomena with the required flexibility and extensibility to serve as a prototype for the Fusion Simulation Project. The Integrated Plasma Simulator (IPS) has been implemented. Presented here are initial physics results on RF effects on MHD instabilities in tokamaks as well as simulation results for tokamak discharge evolution using the IPS.
A Collaborative Extensible User Environment for Simulation and Knowledge Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Lansing, Carina S.; Porter, Ellen A.
2015-06-01
In scientific simulation, scientists use measured data to create numerical models, execute simulations and analyze results from advanced simulators executing on high performance computing platforms. This process usually requires a team of scientists collaborating on data collection, model creation and analysis, and on authorship of publications and data. This paper shows that scientific teams can benefit from a user environment called Akuna that permits subsurface scientists in disparate locations to collaborate on numerical modeling and analysis projects. The Akuna user environment is built on the Velo framework that provides both a rich client environment for conducting and analyzing simulations and a Web environment for data sharing and annotation. Akuna is an extensible toolset that integrates with Velo, and is designed to support any type of simulator. This is achieved through data-driven user interface generation, use of a customizable knowledge management platform, and an extensible framework for simulation execution, monitoring and analysis. This paper describes how the customized Velo content management system and the Akuna toolset are used to integrate and enhance an effective collaborative research and application environment. The extensible architecture of Akuna is also described, and its use is demonstrated through the creation and execution of a 3D subsurface simulation.
THE VIRTUAL INSTRUMENT: SUPPORT FOR GRID-ENABLED MCELL SIMULATIONS
Casanova, Henri; Berman, Francine; Bartol, Thomas; Gokcay, Erhan; Sejnowski, Terry; Birnbaum, Adam; Dongarra, Jack; Miller, Michelle; Ellisman, Mark; Faerman, Marcio; Obertelli, Graziano; Wolski, Rich; Pomerantz, Stuart; Stiles, Joel
2010-01-01
Ensembles of widely distributed, heterogeneous resources, or Grids, have emerged as popular platforms for large-scale scientific applications. In this paper we present the Virtual Instrument project, which provides an integrated application execution environment that enables end-users to run and interact with running scientific simulations on Grids. This work is performed in the specific context of MCell, a computational biology application. While MCell provides the basis for running simulations, its capabilities are currently limited in terms of scale, ease-of-use, and interactivity. These limitations preclude usage scenarios that are critical for scientific advances. Our goal is to create a scientific “Virtual Instrument” from MCell by allowing its users to transparently access Grid resources while being able to steer running simulations. In this paper, we motivate the Virtual Instrument project and discuss a number of relevant issues and accomplishments in the area of Grid software development and application scheduling. We then describe our software design and report on the current implementation. We verify and evaluate our design via experiments with MCell on a real-world Grid testbed. PMID:20689618
MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation
Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.; ...
2016-01-01
We present MADNESS (multiresolution adaptive numerical environment for scientific simulation), a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.
Integrated simulations for fusion research in the 2030's time frame (white paper outline)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.
This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.
NASA/ESACV-990 spacelab simulation. Appendix B: Experiment development and performance
NASA Technical Reports Server (NTRS)
Reller, J. O., Jr.; Neel, C. B.; Haughney, L. C.
1976-01-01
Eight experiments flown on the CV-990 airborne laboratory during the NASA/ESA joint Spacelab simulation mission are described in terms of their physical arrangement in the aircraft, their scientific objectives, developmental considerations dictated by mission requirements, checkout, integration into the aircraft, and the inflight operation and performance of the experiments.
Electromagnetic Fields Exposure Limits
2018-01-01
analysis, synthesis, integration and validation of knowledge derived through the scientific method. In NATO, S&T is addressed using different...Panel • NMSG NATO Modelling and Simulation Group • SAS System Analysis and Studies Panel • SCI Systems Concepts and Integration Panel • SET... integrity or morphology. They later also failed to find a lack of direct DNA damage in human blood (strand breaks, alkali-labile sites, and incomplete
Instructional Simulation Integrates Research, Education, and Practice.
Teasdale, Thomas A; Mapes, Sheryl A; Henley, Omolara; Lindsey, Jeanene; Dillard, Della
2016-01-01
Instructional simulation is widely used in clinical education. Examples include the use of inanimate models meant to imitate humans, standardized patients who are actors portraying patients with certain conditions, and role-play where learners experience the disease through props and circumstances. These modalities are briefly described, and then case examples are provided of simulation curricula in use that integrate research findings and clinical practice expertise to guide development and implementation steps. The cases illustrate how formative and summative feedback from two legs of the "three-legged stool" can be potent integrating forces in development of simulation curricula. In these examples, the educational outputs benefit from purposeful inclusion of research and practice inputs. Costs are outlined for instructor and learner time commitments, space considerations, and expendables. The authors' data and experience suggest that instructional simulation that is supported by a solid scientific base and clinical expertise is appreciated by teachers and learners.
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
NASA Astrophysics Data System (ADS)
Pallant, Amy; Lee, Hee-Sun
2015-04-01
Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.
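For readers unfamiliar with the statistical tests named in the abstract, the following sketch shows how a chi-square test of independence and a one-way ANOVA can be run with SciPy. The contingency counts and Likert ratings are fabricated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical contingency table (counts are illustrative only):
# rows = explanation type (model-based, knowledge-based),
# columns = claim consistent / inconsistent with scientific consensus.
table = np.array([[310, 90],
                  [140, 110]])

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")

# One-way ANOVA comparing uncertainty ratings (1-5 Likert) across the three
# model tasks; the rating arrays here are likewise fabricated examples.
task1 = [4, 3, 5, 4, 4, 3]
task2 = [3, 3, 4, 2, 3, 4]
task3 = [2, 3, 2, 3, 2, 2]
f, p_anova = stats.f_oneway(task1, task2, task3)
print(f"ANOVA: F = {f:.2f}, p = {p_anova:.4f}")
```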
ERIC Educational Resources Information Center
Ray, Darrell L.
2013-01-01
Students often enter biology programs deficient in the math and computational skills that would enhance their attainment of a deeper understanding of the discipline. To address some of these concerns, I developed a series of spreadsheet simulation exercises that focus on some of the mathematical foundations of scientific inquiry and the benefits…
CVT/GPL phase 3 integrated testing
NASA Technical Reports Server (NTRS)
Shurney, R. E.; Cantrell, E.; Maybee, G.; Schmitt, S.
1975-01-01
The hardware for 20 candidate shuttle program life sciences experiments was installed in the GPL and experiments were conducted during a 5-day simulated mission. The experiments involved humans, primates, rats, chickens, and marigold plants. All experiments were completed to the satisfaction of the experimenters. In addition to the scientific data gathered for each experiment, information was obtained concerning experiment hardware design and integration, experiment procedures, GPL support systems, and test operations. The results of the integrated tests are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant simplifies and democratizes advanced modeling and simulation in the nuclear energy industry, supporting a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits and creating a graphical end-to-end umbrella that guides end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters), and the HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental results show that this newly automated and integrated framework can be used to give advance near real-time warning of dust storms, for both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
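A client invoking one step of such a WPS service chain might look like the sketch below, which assumes the OWSLib Python client; the endpoint URL, process identifier, and input names are hypothetical, and the exact OWSLib call signatures can vary between versions.

```python
from owslib.wps import WebProcessingService, monitorExecution

# Hypothetical WPS endpoint and process identifier; the paper's framework
# chains several such services (SBDART, WRF, HYSPLIT) with BPEL4WS.
wps = WebProcessingService("http://example.org/wps")

execution = wps.execute(
    "DustStormDetection",                          # hypothetical process id
    inputs=[("startDate", "2012-04-26"),
            ("endDate", "2012-04-28"),
            ("region", "EastAsia")],
    output="aotDistribution",                      # hypothetical output id
)
monitorExecution(execution)                        # poll until the job finishes
print(execution.status)
```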
GillesPy: A Python Package for Stochastic Model Building and Simulation.
Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R
2016-09-01
GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
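A minimal model-construction example in the style described by the abstract might look like the following; it assumes the gillespy2-flavored interface, and class or argument names may differ slightly between package versions.

```python
import gillespy2

class Dimerization(gillespy2.Model):
    """Toy stochastic model: a monomer M dimerizes into D."""
    def __init__(self):
        super().__init__(name="Dimerization")
        k = gillespy2.Parameter(name="k", expression=0.005)
        M = gillespy2.Species(name="M", initial_value=100)
        D = gillespy2.Species(name="D", initial_value=0)
        r = gillespy2.Reaction(name="dimerize",
                               reactants={M: 2}, products={D: 1}, rate=k)
        self.add_parameter(k)
        self.add_species([M, D])
        self.add_reaction(r)

model = Dimerization()
results = model.run(number_of_trajectories=5)   # SSA trajectories
```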
GillesPy: A Python Package for Stochastic Model Building and Simulation
Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.
2017-01-01
GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community. PMID:28630888
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
Protein Simulation Data in the Relational Model.
Simms, Andrew M; Daggett, Valerie
2012-10-01
High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.
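To make the dimensional (star-schema) idea concrete, here is a minimal sketch using SQLite from Python; the table and column names are illustrative only and do not reproduce the warehouse's actual SQL Server schema.

```python
import sqlite3

# Minimal star-schema sketch: one fact table of per-frame measurements keyed
# to dimension tables describing the simulation and the structure.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_simulation (
    sim_id      INTEGER PRIMARY KEY,
    protein     TEXT,
    temperature REAL            -- Kelvin
);
CREATE TABLE dim_residue (
    residue_id  INTEGER PRIMARY KEY,
    name        TEXT,
    position    INTEGER
);
CREATE TABLE fact_frame_measure (
    sim_id      INTEGER REFERENCES dim_simulation(sim_id),
    residue_id  INTEGER REFERENCES dim_residue(residue_id),
    frame       INTEGER,        -- simulation time step
    sasa        REAL,           -- solvent-accessible surface area
    rmsd        REAL
);
""")

# Analysis then becomes a join plus aggregation over the fact table.
rows = conn.execute("""
    SELECT s.protein, r.name, AVG(f.rmsd)
    FROM fact_frame_measure f
    JOIN dim_simulation s USING (sim_id)
    JOIN dim_residue r USING (residue_id)
    GROUP BY s.protein, r.name;
""").fetchall()
print(rows)
```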
Protein Simulation Data in the Relational Model
Simms, Andrew M.; Daggett, Valerie
2011-01-01
High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost—significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646
ERIC Educational Resources Information Center
Parks, Melissa
2014-01-01
Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions…
The Virtual Brain: a simulator of primate brain network dynamics.
Sanz Leon, Paula; Knock, Stuart A; Woodman, M Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor
2013-01-01
We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications.
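The scripting interface described above lets simulations be configured directly from Python. Below is a minimal sketch in the style of TVB's demo scripts; it uses default datasets and parameter values, and exact class names or signatures may differ between TVB releases.

```python
from tvb.simulator.lab import *  # models, connectivity, coupling, integrators, monitors

# Configure a whole-brain simulation with TVB's default structural connectivity.
sim = simulator.Simulator(
    model=models.Generic2dOscillator(),                 # local neural mass model
    connectivity=connectivity.Connectivity.from_file(), # default connectome
    coupling=coupling.Linear(),
    integrator=integrators.HeunDeterministic(dt=0.1),
    monitors=(monitors.TemporalAverage(period=1.0),),
).configure()

# Run 1 second of simulated activity and collect the monitored time series.
(time, data), = sim.run(simulation_length=1000.0)
print(time.shape, data.shape)
```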
The Virtual Brain: a simulator of primate brain network dynamics
Sanz Leon, Paula; Knock, Stuart A.; Woodman, M. Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R.; Jirsa, Viktor
2013-01-01
We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications. PMID:23781198
1979-10-01
prescribed as well as alternative personnel and equipment configurations. This user's guide is a companion to ARI Technical Report 413 (Volume IV).
NASA Astrophysics Data System (ADS)
Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark
2014-07-01
With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.
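The core of such a pipeline, reduced to its simplest form, convolves each spectral slice of the input cube with a wavelength-dependent PSF and adds a noise term. The sketch below is a toy illustration with placeholder numbers and a made-up PSF model; it is not the HARMONI simulator itself.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Toy input: a random data-cube (wavelength, y, x) standing in for a
# high-resolution astrophysical scene.
n_wave, ny, nx = 200, 64, 64
cube = np.random.rand(n_wave, ny, nx)
wavelengths = np.linspace(0.8, 2.4, n_wave)        # microns

def psf_fwhm(wavelength_um):
    """Placeholder wavelength dependence of the AO PSF width, in pixels."""
    return 3.0 / wavelength_um

observed = np.empty_like(cube)
for i, lam in enumerate(wavelengths):
    sigma = psf_fwhm(lam) / 2.355                  # FWHM -> Gaussian sigma
    observed[i] = gaussian_filter(cube[i], sigma)  # blur one spectral slice
observed += np.random.normal(0.0, 0.01, observed.shape)  # crude noise term
```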
Hosting and publishing astronomical data in SQL databases
NASA Astrophysics Data System (ADS)
Galkin, Anastasia; Klar, Jochen; Riebe, Kristin; Matokevic, Gal; Enke, Harry
2017-04-01
In astronomy, terabytes and petabytes of data are produced by ground instruments, satellite missions and simulations. At the Leibniz-Institute for Astrophysics Potsdam (AIP) we host and publish terabytes of cosmological simulation and observational data. The public archive at AIP has now reached a size of 60 TB and continues to grow, and it has helped to produce numerous scientific papers. The web framework Daiquiri offers a dedicated web interface for each of the hosted scientific databases. Scientists all around the world run SQL queries which include specific astrophysical functions and get their desired data in reasonable time. Daiquiri supports the scientific projects by offering a number of administration tools such as database and user management, contact messages to the staff, and support for organizing meetings and workshops. The web pages can be customized, and the WordPress integration supports the participating scientists in maintaining the documentation and the projects' news sections.
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Flexible workflow sharing and execution services for e-scientists
NASA Astrophysics Data System (ADS)
Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely
2013-04-01
The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides based on the platform to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: A database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: A desktop environment that provides access capabilities similar to those of the SHIWA Portal, but runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal. Other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. The Portal, via third-party workflow engines, provides support for the most widely used academic workflow engines and can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.
The LSST Scheduler from design to construction
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Reuter, Michael A.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS) that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and internal conditions of the observatory. The design of the LSST Scheduler started early in the project, supported by Model-Based Systems Engineering, detailed prototyping, and scientific validation of the required survey capabilities. In order to build such a critical component, an agile development path in incremental releases is presented, integrated with the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that can also run 2000 times faster than real time in simulation mode for survey studies and scientific validation during commissioning and operations.
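Conceptually, cost-function-driven target selection reduces to scoring every candidate field against current telemetry and picking the minimum-cost one. The toy sketch below illustrates the idea with three hypothetical fields and invented weights; it is not the LSST Scheduler's actual cost function.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    slew_time: float      # seconds needed to reach the field
    airmass: float        # current airmass of the field
    science_value: float  # contribution to survey goals

def cost(t: Target) -> float:
    # Lower is better: penalize long slews and high airmass,
    # reward scientifically valuable fields (weights are arbitrary).
    return 0.5 * t.slew_time + 20.0 * (t.airmass - 1.0) - 10.0 * t.science_value

candidates = [
    Target("field_a", slew_time=12.0, airmass=1.1, science_value=3.0),
    Target("field_b", slew_time=4.0,  airmass=1.6, science_value=2.5),
    Target("field_c", slew_time=30.0, airmass=1.0, science_value=4.0),
]
next_target = min(candidates, key=cost)
print(next_target.name)
```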
OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments
NASA Astrophysics Data System (ADS)
Rebuffi, Luca; Sanchez del Rio, Manuel
2017-08-01
The evolution of the hardware platforms, the modernization of the software tools, the access of a large number of young people to the codes, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture not only yields an intuitive and easy-to-use graphical interface, but also provides the flexibility and speed needed for interactive simulations, making it possible to change configurations and quickly compare multiple beamline setups. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray Optics (e.g. ray tracing and wave optics packages). It provides a language to make them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language, because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and communication mechanisms.
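The data-exchange idea described above can be sketched generically in Python; the classes below are invented for illustration and are not the actual OASYS/Orange widget API.

```python
# Generic sketch of widgets that communicate by sending and receiving
# encapsulated data objects. Class and attribute names are assumptions.
class Beam:
    """Encapsulated data object passed between simulation widgets."""
    def __init__(self, rays):
        self.rays = rays

class SourceWidget:
    def run(self):
        return Beam(rays=[{"x": 0.0, "xp": 1e-6}, {"x": 1e-4, "xp": -2e-6}])

class MirrorWidget:
    def run(self, beam):
        # Toy "optics": flip the divergence of every ray and pass the beam on.
        return Beam(rays=[{"x": r["x"], "xp": -r["xp"]} for r in beam.rays])

beam = MirrorWidget().run(SourceWidget().run())
print(len(beam.rays), "rays after the mirror")
```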
NASA Astrophysics Data System (ADS)
Turinsky, Paul J.; Martin, William R.
2017-04-01
In this special issue of the Journal of Computational Physics, the research and development completed at the time of manuscript submission by the Consortium for Advanced Simulation of Light Water Reactors (CASL) is presented. CASL is the first of several Energy Innovation Hubs that have been created by the Department of Energy. The Hubs are modeled after the strong scientific management characteristics of the Manhattan Project and AT&T Bell Laboratories, and function as integrated research centers that combine basic and applied research with engineering to accelerate scientific discovery that addresses critical energy issues. The lifetime of a Hub is expected to be five or ten years, depending upon performance, with CASL being granted a ten-year lifetime.
The Use of a Computer Simulation to Promote Scientific Conceptions of Moon Phases
ERIC Educational Resources Information Center
Bell, Randy L.; Trundle, Kathy Cabe
2008-01-01
This study described the conceptual understandings of 50 early childhood (Pre-K-3) preservice teachers about standards-based lunar concepts before and after inquiry-based instruction utilizing educational technology. The instructional intervention integrated the planetarium software "Starry Night Backyard[TM]" with instruction on moon phases from…
Decision makers often need assistance in understanding the dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate the potential costs and benefits of intervention options. The US...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.
2014-01-28
Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data-intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is hereby not only a question of ease of use, but supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data-intensive workflow system and RHIPE, the R-for-Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data-intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3-year study of building energy demands for the US Eastern Interconnect domain, which they are now planning to extend to predict the demand for the complete century. The initial study raised their data demands from a few GB to 400 GB for the 3-year study, with tens of TB expected for the full century.
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
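A rough, language-shifted sketch of the cooperative-multitasking control flow described above, using Python generators in place of Henson's coroutines and position-independent executables; it mirrors only the hand-off pattern, not the implementation.

```python
# Conceptual sketch: the simulation yields control to the analysis after each
# step, so the data is analyzed while it is still in memory.
def simulation(n_steps):
    state = 0.0
    for step in range(n_steps):
        state += 1.0                      # stand-in for a time step
        yield step, state                 # hand control to the analysis

def analysis(sim):
    for step, state in sim:               # resumes the simulation on demand
        print(f"step {step}: analyzed state {state:.1f} while still in memory")

analysis(simulation(n_steps=3))
```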
NASA Astrophysics Data System (ADS)
Straub, K. H.; Kesgin, B.
2012-12-01
During the fall 2012 semester, students in two introductory courses at Susquehanna University - EENV:101 Environmental Science and POLI:131 World Affairs - will participate together in an online international relations simulation called Statecraft (www.statecraftsim.com). In this strategy game, students are divided into teams representing independent countries, and choose their government type (democracy, constitutional monarchy, communist totalitarian, or military dictatorship) and two country attributes (industrial, green, militaristic, pacifist, or scientific), which determine a set of rules by which that country must abide. Countries interact over issues such as resource distribution, war, pollution, immigration, and global climate change, and must also keep domestic political unrest to a minimum in order to succeed in the game. This simulation has typically been run in political science courses, as the goal is to allow students to experience the balancing act necessary to maintain control of global and domestic issues in a dynamic, diverse world. This semester, environmental science students will be integrated into the simulation, both as environmental advisers to each country and as independent actors representing groups such as Greenpeace, ExxonMobil, and UNEP. The goal in integrating the two courses in the simulation is for the students in each course to gain both 1) content knowledge of certain fundamental material in the other course, and 2) a more thorough, applied understanding of the integrated nature of the two subjects. Students will gain an appreciation for the multiple tradeoffs that decision-makers must face in the real world (economy, resources, pollution, health, defense, etc.). Environmental science students will link these concepts to the traditional course material through a "systems thinking" approach to sustainability. Political science students will face the challenges of global climate change and gain an understanding of the nature of scientific research and uncertainty on this topic. One of the global issues that students must face in the simulation is the melting of "Ice Mountain," which threatens to flood coastal cities before the end of the game; only through cooperative action can the "Globe of Frost" be built to potentially stop the melting. In addition, the game fundamentally integrates tradeoffs between resources, pollution, immigration, education, health, defense, and other sustainability-related subjects throughout. Pre- and post-course surveys will include both environmental science/sustainability and political science concepts that may not be explicitly taught in both courses, but that students should have a greater awareness of through their interaction in the Statecraft simulation. Student attitudes toward integration of the course material will also be assessed.
Science in support of the Deepwater Horizon response
Lubchenco, Jane; McNutt, Marcia K.; Dreyfus, Gabrielle; Murawski, Steven A.; Kennedy, David M.; Anastas, Paul T.; Chu, Steven; Hunter, Tom
2012-01-01
This introduction to the Special Feature presents the context for science during the Deepwater Horizon oil spill response, summarizes how scientific knowledge was integrated across disciplines and statutory responsibilities, identifies areas where scientific information was accurate and where it was not, and considers lessons learned and recommendations for future research and response. Scientific information was integrated within and across federal and state agencies, with input from nongovernmental scientists, across a diverse portfolio of needs—stopping the flow of oil, estimating the amount of oil, capturing and recovering the oil, tracking and forecasting surface oil, protecting coastal and oceanic wildlife and habitat, managing fisheries, and protecting the safety of seafood. Disciplines involved included atmospheric, oceanographic, biogeochemical, ecological, health, biological, and chemical sciences, physics, geology, and mechanical and chemical engineering. Platforms ranged from satellites and planes to ships, buoys, gliders, and remotely operated vehicles to laboratories and computer simulations. The unprecedented response effort depended directly on intense and extensive scientific and engineering data, information, and advice. Many valuable lessons were learned that should be applied to future events.
Center for Integrated Nanotechnologies 2011 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, Antonya
We are pleased to share with you this 2011 edition of the Annual Report from the Center for Integrated Nanotechnologies (CINT) and the growing excitement we feel around cementing our brand as a leader in integration nanoscience. This can be seen most readily in the momentum we have achieved in our signature Integration Focus Activities (IFAs). These efforts unite our scientists across our four scientific Thrust areas with our users to concentrate research on larger-scale nanoscience integration challenges for specific classes of nanomaterials, systems, and phenomena. All three of our current IFAs (p. 10) now have a full head of steam, and nearly 30% of our current user projects map in some meaningful way to one of these IFAs. As part of our redoubled effort to increase our industrial user base, we are also looking to leverage these IFAs to build a stronger link to and spur recruitment within our industrial user community. We believe that the IFAs are a natural community-building tool with an intrinsic value proposition for industry; an R&D pipeline that can lead to more mature, more commercially well-positioned technologies. Finally, as nanoscience and nanotechnology are maturing, we as a research community are beginning to see our efforts extend in many exciting new directions. Our focus on nanoscience integration positions us very well to capitalize on new opportunities including the emerging Mesoscale Initiative within the DOE Office of Science. Many aspects of mesoscale science are embodied in the integration of nanoscale building blocks. We are equally proud of our continuing strong performance in support of our user program. We have fully transitioned to our new user proposal database providing enhanced convenience and flexibility for proposal submission and review. In our two regular proposal calls this year we received a total of 225 proposals, an increase of 10% over our 2010 performance. Our official user count for the period remains at approximately 350 and continues to reflect full engagement of our scientific staff. We are also seeing a steady increase in our industrial user base, with the number of industrial proposals (including Rapid Access proposals) doubling in 2011. We attribute this in part to our outreach efforts, including our focused industrial session in each of our past two annual User Conferences. The Center for Integrated Nanotechnologies (CINT) is a Department of Energy/Office of Science Nanoscale Science Research Center (NSRC) operating as a national user facility devoted to establishing the scientific principles that govern the design, performance, and integration of nanoscale materials. Jointly operated by Los Alamos and Sandia National Laboratories, CINT explores the continuum from scientific discovery to use-inspired research, with a focus on the integration of nanoscale materials and structures to achieve new properties and performance and their incorporation into the micro- and macro worlds. Through its Core Facility at Sandia National Laboratories and its Gateway Facility at Los Alamos National Laboratory, CINT provides open access to tools and expertise needed to explore the continuum from scientific discovery to the integration of nanostructures into the micro- and macro worlds.
In its overall operations, CINT strives to achieve the following goals common to all Nanoscale Science Research Centers: (1) Conduct forefront research in nanoscale science; (2) Operate as a user facility for scientific research; (3) Provide user access to the relevant BES-supported expertise and capabilities at the host national laboratory; and (4) Leverage other relevant national laboratory capabilities to enhance scientific opportunities for the nanoscience user community. These additional goals are specific to the unique CINT mission: (5) Establish and lead a scientific community dedicated to solving nanoscale science integration challenges; and (6) Create a single user facility program that combines expertise and facilities at both Los Alamos and Sandia National Laboratories. The CINT user program provides the international scientific community with open access to world-class scientific staff and state-of-the-art facilities for theory and simulation, nanomaterials synthesis and characterization, and unique capabilities for nanoscale materials integration, from the level of nanoscale synthesis to the fabrication of micro- and macroscale structures and devices. The staff of CINT includes laboratory scientists, postdocs and technical support staff who are leaders in the nanoscience research programs in CINT scientific thrust areas: (1) Nanoscale Electronics and Mechanics, (2) Nanophotonics and Optical Nanomaterials, (3) Soft, Biological and Composite Nanomaterials, and (4) Theory and Simulation of Nanoscale Phenomena.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel
Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.
An Open Simulation System Model for Scientific Applications
NASA Technical Reports Server (NTRS)
Williams, Anthony D.
1995-01-01
A model for a generic and open environment for running multi-code or multi-application simulations - called the Open Simulation System Model (OSSM) - is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion System Simulation (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype, and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified which may aid in the design and implementation of the system.
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.
Integrating Numerical Computation into the Modeling Instruction Curriculum
ERIC Educational Resources Information Center
Caballero, Marcos D.; Burk, John B.; Aiken, John M.; Thoms, Brian D.; Douglas, Scott S.; Scanlon, Erin M.; Schatz, Michael F.
2014-01-01
Numerical computation (the use of a computer to solve, simulate, or visualize a physical problem) has fundamentally changed the way scientific research is done. Systems that are too difficult to solve in closed form are probed using computation. Experiments that are impossible to perform in the laboratory are studied numerically. Consequently, in…
Management and assimilation of diverse, distributed watershed datasets
NASA Astrophysics Data System (ADS)
Varadharajan, C.; Faybishenko, B.; Versteeg, R.; Agarwal, D.; Hubbard, S. S.; Hendrix, V.
2016-12-01
The U.S. Department of Energy's (DOE) Watershed Function Scientific Focus Area (SFA) seeks to determine how perturbations to mountainous watersheds (e.g., floods, drought, early snowmelt) impact the downstream delivery of water, nutrients, carbon, and metals over seasonal to decadal timescales. We are building a software platform that enables integration of diverse and disparate field, laboratory, and simulation datasets, of various types including hydrological, geological, meteorological, geophysical, geochemical, ecological and genomic datasets across a range of spatial and temporal scales within the Rifle floodplain and the East River watershed, Colorado. We are using agile data management and assimilation approaches to enable web-based integration of heterogeneous, multi-scale data. Sensor-based observations of water level, vadose zone and groundwater temperature, water quality, and meteorology, as well as biogeochemical analyses of soil and groundwater samples, have been curated and archived in federated databases. Quality Assurance and Quality Control (QA/QC) are performed on priority datasets needed for on-going scientific analyses, and hydrological and geochemical modeling. Automated QA/QC methods are used to identify and flag issues in the datasets. Data integration is achieved via a brokering service that dynamically integrates data from distributed databases via web services, based on user queries. The integrated results are presented to users in a portal that enables intuitive search, interactive visualization and download of integrated datasets. The concepts, approaches and codes being used are shared across various data science components of various large DOE-funded projects such as the Watershed Function SFA, Next Generation Ecosystem Experiment (NGEE) Tropics, Ameriflux/FLUXNET, and Advanced Simulation Capability for Environmental Management (ASCEM), and together contribute towards DOE's cyberinfrastructure for data management and model-data integration.
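As a hedged example of the automated QA/QC flagging mentioned above, the sketch below applies simple range checks to invented sensor columns; the variable names and bounds are assumptions, not the SFA's actual rules.

```python
# Sketch: flag out-of-range values in curated sensor observations.
import pandas as pd

bounds = {"water_temp_C": (-2.0, 40.0), "groundwater_level_m": (0.0, 30.0)}  # assumed bounds

def flag_out_of_range(df):
    """Add a boolean *_flag column for every bounded variable present."""
    for column, (low, high) in bounds.items():
        if column in df:
            df[column + "_flag"] = ~df[column].between(low, high)
    return df

observations = pd.DataFrame({
    "water_temp_C": [4.2, 55.0, 7.8],           # 55.0 should be flagged
    "groundwater_level_m": [12.1, 11.9, -3.0],   # -3.0 should be flagged
})
print(flag_out_of_range(observations))
```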
NASA Astrophysics Data System (ADS)
Lim, D. S. S.; Abercromby, A.; Beaton, K.; Brady, A. L.; Cardman, Z.; Chappell, S.; Cockell, C. S.; Cohen, B. A.; Cohen, T.; Deans, M.; Deliz, I.; Downs, M.; Elphic, R. C.; Hamilton, J. C.; Heldmann, J.; Hillenius, S.; Hoffman, J.; Hughes, S. S.; Kobs-Nawotniak, S. E.; Lees, D. S.; Marquez, J.; Miller, M.; Milovsoroff, C.; Payler, S.; Sehlke, A.; Squyres, S. W.
2016-12-01
Analogs are destinations on Earth that allow researchers to approximate operational and/or physical conditions on other planetary bodies and within deep space. Over the past decade, our team has been conducting geobiological field science studies under simulated deep space and Mars mission conditions. Each of these missions integrates scientific and operational research with the goal of identifying concepts of operations (ConOps) and capabilities that will enable and enhance scientific return during human and human-robotic missions to the Moon, into deep space and on Mars. Working under these simulated mission conditions presents a number of unique challenges that are not encountered during typical scientific field expeditions. However, there are significant benefits to this working model from the perspective of the human space flight and scientific operations research community. Specifically, by applying human (and human-robotic) mission architectures to real field science endeavors, we create a unique operational litmus test for those ConOps and capabilities that have otherwise been vetted under circumstances that did not necessarily demand scientific data return meeting the rigors of peer-review standards. The presentation will give an overview of our team's recent analog research, with a focus on the scientific operations research. The intent is to encourage collaborative dialog with a broader set of analog research community members with an eye towards future scientific field endeavors that will have a significant impact on how we design human and human-robotic missions to the Moon, into deep space and to Mars.
NASA Shines a Spotlight on a Webb Telescope Test
2013-12-11
Dressed in a clean room suit, NASA photographer Desiree Stover shines a light on the Space Environment Simulator's Integration Frame inside the thermal vacuum chamber at NASA's Goddard Space Flight Center in Greenbelt, Md. Shortly after, the chamber was closed up and engineers used this frame to enclose and help cryogenic (cold) test the heart of the James Webb Space Telescope, the Integrated Science Instrument Module. Credit: NASA/Goddard/Chris Gunn NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.
Astrophysical Computation in Research, the Classroom and Beyond
NASA Astrophysics Data System (ADS)
Frank, Adam
2009-03-01
In this talk I review progress in the use of simulations as a tool for astronomical research, for education and public outreach. The talk will include the basic elements of numerical simulations as well as advances in algorithms which have led to recent dramatic progress such as the use of Adaptive Mesh Refinement methods. The scientific focus of the talk will be star formation jets and outflows while the educational emphasis will be on the use of advanced platforms for simulation based learning in lecture and integrated homework. Learning modules for science outreach websites such as DISCOVER magazine will also be highlighted.
Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin
2016-05-13
This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.
Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason P.; Modenese, Luca; Reinbolt, Jeffrey A.; Thelen, Darryl G.; Umberger, Brian R.
2016-01-01
Objective The overall goal of this document is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. Methods As part of a special issue on model sharing and reproducibility in IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: A. Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and A. Schmitz and D. Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. Results There was general agreement between simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers' feedback; changes that may otherwise have been missed if explicit model sharing and simulation reproducibility analysis were not conducted in the review process. An increased burden on the authors and the reviewers, to facilitate model sharing and to repeat simulations, was noted. Conclusion When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Significance Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models. PMID:28072567
Kraemer Diaz, Anne E.; Spears Johnson, Chaya R.; Arcury, Thomas A.
2013-01-01
Community-based participatory research (CBPR) has become essential in health disparities and environmental justice research; however, the scientific integrity of CBPR projects has become a concern. Some concerns, such as the lack of appropriate research training and the lack of access to resources and finances, have been discussed as possibly limiting the scientific integrity of a project. Prior to understanding what threatens scientific integrity in CBPR, it is vital to understand what scientific integrity means for the professional and community investigators who are involved in CBPR. This analysis explores the interpretation of scientific integrity in CBPR among 74 professional and community research team members from 25 CBPR projects in nine states in the southeastern United States in 2012. It describes the basic definition for scientific integrity and then explores variations in the interpretation of scientific integrity in CBPR. Variations in the interpretations were associated with team member identity as professional or community investigators. Professional investigators understood scientific integrity in CBPR as either conceptually or logistically flexible, as challenging to balance with community needs, or no different than traditional scientific integrity. Community investigators interpreted other factors as important in scientific integrity, such as trust, accountability, and overall benefit to the community. This research demonstrates that the variations in the interpretation of scientific integrity in CBPR call for a new definition of scientific integrity in CBPR that takes into account the understanding and needs of all investigators. PMID:24161098
NASA Astrophysics Data System (ADS)
Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.
2016-02-01
Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve simulation performance as more model functionalities are added, and to provide a scientific basis for implementation in integrated river basin management.
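For reference, the two skill scores quoted above (the correlation coefficient and the Nash-Sutcliffe efficiency) can be computed as in the short sketch below; the observed and simulated series here are invented.

```python
# Sketch of the evaluation metrics for a simulated versus observed daily series.
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [1.2, 3.4, 2.8, 5.1, 4.0]   # invented observed runoff values
sim = [1.0, 3.1, 3.0, 4.8, 4.4]   # invented simulated runoff values
r = np.corrcoef(obs, sim)[0, 1]
print(f"correlation = {r:.2f}, NSE = {nash_sutcliffe(obs, sim):.2f}")
```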
NASA Astrophysics Data System (ADS)
Werkheiser, W. H.
2016-12-01
10 Years of Scientific Integrity Policy at the U.S. Geological Survey The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.
A new paradigm for reproducing and analyzing N-body simulations of planetary systems
NASA Astrophysics Data System (ADS)
Rein, Hanno; Tamayo, Daniel
2017-05-01
The reproducibility of experiments is one of the main principles of the scientific method. However, numerical N-body experiments, especially those of planetary systems, are currently not reproducible. In the most optimistic scenario, they can only be replicated in an approximate or statistical sense. Even if authors share their full source code and initial conditions, differences in compilers, libraries, operating systems or hardware often lead to qualitatively different results. We provide a new set of easy-to-use, open-source tools that address the above issues, allowing for exact (bit-by-bit) reproducibility of N-body experiments. In addition to generating completely reproducible integrations, we show that our framework also offers novel and innovative ways to analyse these simulations. As an example, we present a high-accuracy integration of the Solar system spanning 10 Gyr, requiring several weeks to run on a modern CPU. In our framework, we can not only easily access simulation data at predefined intervals for which we save snapshots, but at any time during the integration. We achieve this by integrating an on-demand reconstructed simulation forward in time from the nearest snapshot. This allows us to extract arbitrary quantities at any point in the saved simulation exactly (bit-by-bit), and within seconds rather than weeks. We believe that the tools we present in this paper offer a new paradigm for how N-body simulations are run, analysed and shared across the community.
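A generic sketch of the snapshot-and-reintegrate idea described above, using a toy deterministic integrator rather than the authors' actual N-body tools: the full state is saved at intervals, and the state at any step is recovered by re-running the same deterministic steps from the nearest earlier snapshot.

```python
# Toy deterministic "simulation" (a harmonic oscillator) used to illustrate
# reconstructing any intermediate state from sparse snapshots.
def step(state):
    x, v = state
    dt = 0.1
    v -= x * dt            # kick
    x += v * dt            # drift
    return (x, v)

def run(n_steps, snapshot_every=100):
    state, snapshots = (1.0, 0.0), {0: (1.0, 0.0)}
    for i in range(1, n_steps + 1):
        state = step(state)
        if i % snapshot_every == 0:
            snapshots[i] = state
    return snapshots

def state_at(target_step, snapshots):
    start = max(i for i in snapshots if i <= target_step)
    state = snapshots[start]
    for _ in range(target_step - start):   # deterministic replay from snapshot
        state = step(state)
    return state

snapshots = run(1000)
print(state_at(437, snapshots))   # reconstructed without storing every step
```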
Data-Intensive Scientific Management, Analysis and Visualization
NASA Astrophysics Data System (ADS)
Goranova, Mariana; Shishedjiev, Bogdan; Georgieva, Juliana
2012-11-01
The proposed integrated system provides a suite of services for data-intensive sciences that enables scientists to describe, manage, analyze and visualize data from experiments and numerical simulations in distributed and heterogeneous environment. This paper describes the advisor and the converter services and presents an example from the monitoring of the slant column content of atmospheric minor gases.
Telescience - Concepts And Contributions To The Extreme Ultraviolet Explorer Mission
NASA Astrophysics Data System (ADS)
Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.
1987-10-01
A goal of the telescience concept is to allow scientists to use remotely located instruments as they would in their laboratory. Another goal is to increase reliability and scientific return of these instruments. In this paper we discuss the role of transparent software tools in development, integration, and postlaunch environments to achieve hands-on access to the instrument. The use of transparent tools helps to reduce the parallel development of capability and to assure that valuable pre-launch experience is not lost in the operations phase. We also discuss the use of simulation as a rapid prototyping technique. Rapid prototyping provides a cost-effective means of using an iterative approach to instrument design. By allowing inexpensive production of testbeds, scientists can quickly tune the instrument to produce the desired scientific data. Using portions of the Extreme Ultraviolet Explorer (EUVE) system, we examine some of the results of preliminary tests in the use of simulation and transparent tools. Additionally, we discuss our efforts to upgrade our software "EUVE electronics" simulator to emulate a full instrument, and give the pros and cons of the simulation facilities we have developed.
Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project
NASA Astrophysics Data System (ADS)
Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo
2017-04-01
The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al., 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a certain number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on master and main development branches. The use of the CMake configuration tool and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of the installation and allows us to enhance portability across different compilers and operating system platforms. The package was also complemented by several software tools which provide web-based visualization of results based on R plugins, in particular the "shiny" (Chang et al., 2016), "geotopbricks" and "geotopOptim2" (Cordano et al., 2016) packages, which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.
Efficient Integration of Coupled Electrical-Chemical Systems in Multiscale Neuronal Simulations
Brocke, Ekaterina; Bhalla, Upinder S.; Djurfeldt, Mikael; Hellgren Kotaleski, Jeanette; Hanke, Michael
2016-01-01
Multiscale modeling and simulations in neuroscience is gaining scientific attention due to its growing importance and unexplored capabilities. For instance, it can help to acquire better understanding of biological phenomena that have important features at multiple scales of time and space. This includes synaptic plasticity, memory formation and modulation, homeostasis. There are several ways to organize multiscale simulations depending on the scientific problem and the system to be modeled. One of the possibilities is to simulate different components of a multiscale system simultaneously and exchange data when required. The latter may become a challenging task for several reasons. First, the components of a multiscale system usually span different spatial and temporal scales, such that rigorous analysis of possible coupling solutions is required. Then, the components can be defined by different mathematical formalisms. For certain classes of problems a number of coupling mechanisms have been proposed and successfully used. However, a strict mathematical theory is missing in many cases. Recent work in the field has not so far investigated artifacts that may arise during coupled integration of different approximation methods. Moreover, in neuroscience, the coupling of widely used numerical fixed step size solvers may lead to unexpected inefficiency. In this paper we address the question of possible numerical artifacts that can arise during the integration of a coupled system. We develop an efficient strategy to couple the components comprising a multiscale test problem in neuroscience. We introduce an efficient coupling method based on the second-order backward differentiation formula (BDF2) numerical approximation. The method uses an adaptive step size integration with an error estimation proposed by Skelboe (2000). The method shows a significant advantage over conventional fixed step size solvers used in neuroscience for similar problems. We explore different coupling strategies that define the organization of computations between system components. We study the importance of an appropriate approximation of exchanged variables during the simulation. The analysis shows a substantial impact of these aspects on the solution accuracy in the application to our multiscale neuroscientific test problem. We believe that the ideas presented in the paper may essentially contribute to the development of a robust and efficient framework for multiscale brain modeling and simulations in neuroscience. PMID:27672364
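The coupling method above is built on the second-order backward differentiation formula; as a hedged illustration of that formula alone (not the adaptive, error-controlled coupling scheme of the paper), here is a fixed-step BDF2 integrator for the scalar linear test problem y' = lam*y.

```python
# Minimal fixed-step BDF2 sketch; adaptive step-size control and error
# estimation (as used in the paper) are deliberately omitted.
import math

def bdf2_linear(lam, y0, h, n_steps):
    # Bootstrap the two-step method with one backward-Euler step.
    y_prev, y = y0, y0 / (1.0 - h * lam)
    for _ in range(n_steps - 1):
        # BDF2: y_{n+1} = (4*y_n - y_{n-1}) / (3 - 2*h*lam) for f(y) = lam*y.
        y_prev, y = y, (4.0 * y - y_prev) / (3.0 - 2.0 * h * lam)
    return y

lam, y0, h, n = -2.0, 1.0, 0.01, 100
approx = bdf2_linear(lam, y0, h, n)
exact = y0 * math.exp(lam * h * n)
print(f"BDF2: {approx:.6f}  exact: {exact:.6f}")
```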
NASA/ESA CV-990 spacelab simulation
NASA Technical Reports Server (NTRS)
Reller, J. O., Jr.
1976-01-01
Simplified techniques were applied to conduct an extensive spacelab simulation using the airborne laboratory. The scientific payload was selected to perform studies in upper atmospheric physics and infrared astronomy. The mission was successful and provided extensive data relevant to spacelab objectives on overall management of a complex international payload; experiment preparation, testing, and integration; training for proxy operation in space; data handling; multiexperimenter use of common experimenter facilities (telescopes); multiexperiment operation by experiment operators; selection criteria for spacelab experiment operators; and schedule requirements to prepare for such a spacelab mission.
A Process for Comparing Dynamics of Distributed Space Systems Simulations
NASA Technical Reports Server (NTRS)
Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.
2009-01-01
The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
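One simple comparison of the kind described above is a position-difference time history between two simulations of the same trajectory. The sketch below uses invented stand-in trajectories, not data from the actual comparison tests.

```python
# Sketch: magnitude of the position difference between two trajectories
# sampled at the same epochs.
import numpy as np

t = np.linspace(0.0, 3600.0, 7)                                # seconds
traj_a = np.column_stack([np.cos(t / 600.0), np.sin(t / 600.0), 0 * t])
traj_b = traj_a + 1e-4 * np.column_stack([t, 0 * t, 0 * t])    # slow drift in x

position_error = np.linalg.norm(traj_a - traj_b, axis=1)
for ti, err in zip(t, position_error):
    print(f"t = {ti:7.1f} s   |dr| = {err:.3e}")
```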
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Appropriate types of thinning and surface fuel treatments are clearly useful in reducing surface and crown fire hazards under a wide range of fuels and topographic situations. This paper provides well-established scientific principles and simulation tools that can be used to adjust fuel treatments to attain specific risk levels.
NASA Astrophysics Data System (ADS)
Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.
2013-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
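A hedged sketch of the wrapper idea described above: a thin adapter that writes a platform-level station list in whatever format a given module expects. The file formats and names here are hypothetical, not the Broadband Platform's actual conventions.

```python
# Sketch: one platform-level station list, two module-specific file formats.
def write_station_file(stations, path, fmt):
    with open(path, "w") as out:
        for name, lat, lon in stations:
            if fmt == "lon_lat_name":        # format assumed by module A
                out.write(f"{lon:.4f} {lat:.4f} {name}\n")
            else:                            # CSV format assumed by module B
                out.write(f"{name},{lat:.4f},{lon:.4f}\n")

stations = [("STA1", 34.05, -118.25), ("STA2", 34.20, -118.40)]
write_station_file(stations, "stations_module_a.txt", "lon_lat_name")
write_station_file(stations, "stations_module_b.csv", "csv")
```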
Bungener, Martine; Hadchouel, Michelle
2012-09-01
Fraud is only one part of research misconduct. Very few French research institutions have an office dedicated to scientific integrity and to the prevention of misconduct. The Institut national de la santé et de la recherche médicale (Inserm) has created a "Scientific Integrity delegation". Scientific integrity is an international concern, and it is closely linked to the organisation, management and evaluation of all research activities. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high-performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offer functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and performance monitoring, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability into a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
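In the spirit of the automatic virtual-cluster creation described above, the following sketch launches a small group of EC2 nodes with boto3 and waits for them to come up. It is not the SCC toolset itself; the AMI ID, key pair name, instance type, and node count are placeholders.

    # Sketch of programmatically launching a small virtual cluster on EC2.
    # The AMI ID, key name, and instance type are placeholders, not real values.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # scientific virtual machine image (placeholder)
        InstanceType="c5.xlarge",
        MinCount=4,                        # four worker nodes for a parallel job
        MaxCount=4,
        KeyName="my-keypair",              # placeholder key pair
    )

    instance_ids = [i["InstanceId"] for i in response["Instances"]]
    print("Launched virtual cluster nodes:", instance_ids)

    # Wait until all nodes are running before configuring MPI and shared storage.
    waiter = ec2.get_waiter("instance_running")
    waiter.wait(InstanceIds=instance_ids)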
Open source software integrated into data services of Japanese planetary explorations
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.
2015-12-01
Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also offering rich web applications for ease of access, built with modern web technologies on top of open source software. This presentation showcases the use of open source software across our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); the open source MapServer is adopted as the WMS server. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data; its main purpose is public outreach, and it is developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. Flow is itself open source software developed by JAXA/ISAS and released under the BSD 3-Clause License; the SPICE Toolkit is essential to compile it. The SPICE Toolkit is also open source software developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
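To make the WMS/MapServer pattern mentioned above concrete, here is a minimal sketch of fetching a map image from a WMS endpoint with Python's requests library, which is roughly what KADIAS does through OpenLayers in the browser. The endpoint URL and layer name are hypothetical, not the actual DARTS services.

    # Hedged sketch: request a PNG map from a WMS server (e.g., one backed by
    # MapServer). The URL and layer name below are illustrative placeholders.
    import requests

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "kaguya_dem",          # hypothetical layer name
        "SRS": "EPSG:4326",
        "BBOX": "-180,-90,180,90",
        "WIDTH": 1024,
        "HEIGHT": 512,
        "FORMAT": "image/png",
    }

    r = requests.get("https://example.org/cgi-bin/mapserv", params=params, timeout=30)
    r.raise_for_status()
    with open("moon_map.png", "wb") as f:
        f.write(r.content)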
Working Group on Virtual Data Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, D. N.; Palanisamy, G.; van Dam, K. K.
2016-02-04
This report is the outcome of a workshop commissioned by the U.S. Department of Energy’s (DOE) Climate and Environmental Sciences Division (CESD) to examine current and future data infrastructure requirements foundational for achieving CESD scientific mission goals in advancing a robust, predictive understanding of Earth’s climate and environmental systems. Over the past several years, data volumes in CESD disciplines have risen sharply to unprecedented levels (tens of petabytes). Moreover, the complexity and diversity of this research data, including simulations, observations, and reanalysis, have grown significantly, posing new challenges for data capture, storage, verification, analysis, and integration. With the trends of increased data volume (in the hundreds of petabytes), more complex analysis processes, and growing cross-disciplinary collaborations, it is timely to investigate whether the CESD community has the computational and data support needed to fully realize the scientific potential of its data collections. In recognition of the challenges, a partnership is forming across CESD and among national and international agencies to examine the viability of creating an integrated, collaborative data infrastructure: a Virtual Laboratory. The overarching goal of this report is to identify the community’s key data technology requirements and high-priority development needs for sustaining and growing its scientific discovery potential. The report also aims to map these requirements to existing solutions and to identify gaps in current services, tools, and infrastructure that will need to be addressed in the short, medium, and long term to advance scientific progress.
Data And Informatics Working Group On Virtual Data Integration Workshop Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, D. N.; Palanisamy, G.; Dam, K. K.
2015-10-13
This report is the outcome of a workshop that was commissioned by the Department of Energy’s Climate and Environmental Sciences Division (CESD) to examine current and future data infrastructure requirements that would be foundational to achieving CESD’s scientific mission goals. Over the past several years, data volumes in CESD disciplines have risen sharply to unprecedented levels (tens of petabytes). So too have the complexity and diversity of the research data (simulation, observation, and reanalysis) needing to be captured, stored, verified, analyzed, and integrated. With the trends of increased data volume (in the hundreds of petabytes), more complex analysis processes, and growing cross-disciplinary collaborations, it is timely to investigate whether the CESD community has the right computational and data support to realize the full scientific potential of its data collections. In recognition of the challenges, a partnership is forming across CESD and with national and international agencies to investigate the viability of creating an integrated, collaborative data infrastructure: a virtual laboratory. The overarching goal of this report is to identify the community’s key data technology requirements and high-priority development needs for sustaining and growing their scientific discovery potential. The report also aims to map these requirements to existing solutions and to identify gaps in current services, tools, and infrastructure that will need to be addressed in the short, medium, and long term so as not to impede scientific progress.
iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems
NASA Astrophysics Data System (ADS)
Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.
2017-11-01
iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.
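The following is a toy illustration of the kind of automatic model calibration described above: adjusting the parameters of a forward model so that its output matches observations by nonlinear least squares. The forward model here is a simple stand-in, not TOUGH2, and the workflow is a simplified sketch rather than iTOUGH2's actual estimation machinery.

    # Toy parameter-estimation sketch using scipy's nonlinear least squares.
    import numpy as np
    from scipy.optimize import least_squares

    t_obs = np.linspace(0.0, 10.0, 20)
    true_params = (2.0, 0.3)

    def forward_model(params, t):
        amplitude, decay = params
        return amplitude * np.exp(-decay * t)

    # Synthetic "observations": the true model plus noise.
    rng = np.random.default_rng(0)
    obs = forward_model(true_params, t_obs) + rng.normal(0.0, 0.05, t_obs.size)

    def residuals(params):
        return forward_model(params, t_obs) - obs

    fit = least_squares(residuals, x0=[1.0, 1.0])
    print("Estimated parameters:", fit.x)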
Kraemer Diaz, Anne E.; Spears Johnson, Chaya R.; Arcury, Thomas A.
2015-01-01
Scientific integrity is necessary for strong science; yet many variables can influence scientific integrity. In traditional research, some common threats are the pressure to publish, competition for funds, and career advancement. Community-based participatory research (CBPR) provides a different context for scientific integrity with additional and unique concerns. Understanding the perceptions that promote or discourage scientific integrity in CBPR as identified by professional and community investigators is essential to promoting the value of CBPR. This analysis explores the perceptions that facilitate scientific integrity in CBPR, as well as the barriers, among a sample of 74 professional and community CBPR investigators from 25 CBPR projects in nine states in the southeastern United States in 2012. There were variations in perceptions associated with team member identity as professional or community investigators. Perceptions identified to promote and discourage scientific integrity in CBPR by professional and community investigators were external pressures, community participation, funding, quality control and supervision, communication, training, and character and trust. Some perceptions, such as communication and training, promoted scientific integrity, whereas other perceptions, such as a lack of funds and a lack of trust, could discourage scientific integrity. These results demonstrate that one of the most important perceptions in maintaining scientific integrity in CBPR is active community participation, which enables a co-responsibility by scientists and community members to provide oversight for scientific integrity. Credible CBPR science is crucial to empower vulnerable communities to be heard by those in positions of power and policy making. PMID:25588933
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei
Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes; with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and to manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).
An Array Library for Microsoft SQL Server with Astrophysical Applications
NASA Astrophysics Data System (ADS)
Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.
2012-09-01
Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that go beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management, but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. The library is also designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on the fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: the Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
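A purely hypothetical sketch of how a client might exercise such an in-database array extension from Python is shown below. The SQL function names (dbo.MakeArray, dbo.MatrixInverse) are invented for illustration and are not part of SQL Server, nor necessarily of the Array Library described above; only the pyodbc calls themselves are standard.

    # Hypothetical client-side usage of an in-database array extension.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=Indra;"
        "Trusted_Connection=yes;"
    )
    cursor = conn.cursor()

    # Invert a 2x2 matrix entirely inside the server process, without pulling
    # the underlying array back to the client first. Function names are invented.
    cursor.execute(
        "SELECT dbo.MatrixInverse(dbo.MakeArray(2, 2, ?))",
        (b"\x00" * 32,),
    )
    print("Result blob:", cursor.fetchone()[0])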
ERIC Educational Resources Information Center
Rodrigues, João P. G. L. M.; Melquiond, Adrien S. J.; Bonvin, Alexandre M. J. J.
2016-01-01
Molecular modelling and simulations are nowadays an integral part of research in areas ranging from physics to chemistry to structural biology, as well as pharmaceutical drug design. This popularity is due to the development of high-performance hardware and of accurate and efficient molecular mechanics algorithms by the scientific community. These…
Facilitating hydrological data analysis workflows in R: the RHydro package
NASA Astrophysics Data System (ADS)
Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik
2015-04-01
The advent of new technologies such as web services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows that integrate more, and potentially more heterogeneous, data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk of errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, an HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges of the RHydro package, including integration with big data technologies, web technologies, and emerging data models in hydrology.
ERIC Educational Resources Information Center
Kraemer Diaz, Anne E.; Spears Johnson, Chaya R.; Arcury, Thomas A.
2015-01-01
Scientific integrity is necessary for strong science; yet many variables can influence scientific integrity. In traditional research, some common threats are the pressure to publish, competition for funds, and career advancement. Community-based participatory research (CBPR) provides a different context for scientific integrity with additional and…
Kretser, Alison; Murphy, Delia; Dwyer, Johanna
2017-01-01
Scientific integrity is at the forefront of the scientific research enterprise. This paper provides an overview of key existing efforts on scientific integrity by federal agencies, foundations, nonprofit organizations, professional societies, and academia from 1989 to April 2016. It serves as a resource for the scientific community on scientific integrity work and helps to identify areas in which more action is needed. Overall, there is tremendous activity in this area and there are clear linkages among the efforts of the five sectors. All the same, scientific integrity needs to remain visible in the scientific community and evolve along with new research paradigms. High priority in instilling these values falls upon all stakeholders. PMID:27748637
The integrity of science - lost in translation?
Kaiser, Matthias
2014-04-01
This paper presents some selected issues currently discussed about the integrity of science, and it argues that there exist serious challenges to integrity in the various sciences. Due to the involved conceptual complexities, even core definitions of scientific integrity have been disputed, and core cases of scientific misconduct influenced the public discussion about them. It is claimed that ethics and law may not always go well together in matters of scientific integrity. Explanations of the causes of scientific misconduct vary, and defining good scientific practices is not a straightforward task. Even though the efficacy of ethics courses to improve scientific integrity can be doubted, and universities probably need to come up with more innovative formats to improve ethics in scientific training, ethics talk may be the only practical remedy. Copyright © 2014. Published by Elsevier Ltd.
Scientific Integrity Policy Creation and Implementation.
NASA Astrophysics Data System (ADS)
Koizumi, K.
2017-12-01
Ensuring the integrity of science was a priority for the Obama Administration. In March 2009, President Obama issued a Presidential Memorandum that recognized the need for the public to be able to trust the science and scientific process informing public policy decisions. In 2010, the White House Office of Science and Technology Policy (OSTP) issued a Memorandum providing guidelines for Federal departments and agencies to follow in developing scientific integrity policies. This Memorandum describes minimum standards for: (1) strengthening the foundations of scientific integrity in government, including by shielding scientific data and analysis from inappropriate political influence; (2) improving public communication about science and technology by promoting openness and transparency; (3) enhancing the ability of Federal Advisory Committees to provide independent scientific advice; and (4) supporting the professional development of government scientists and engineers. The Memorandum called upon the heads of departments and agencies to develop scientific integrity policies that meet these requirements. At the end of the Obama Administration, 24 Federal departments and agencies had developed and implemented scientific integrity policies consistent with the OSTP guidelines. This year, there are significant questions about the Trump Administration's commitment to these scientific integrity policies, and there is interest in Congress in codifying these policies in law. The session will provide an update on the status of agency scientific integrity policies and legislation.
A model of "integrated scientific method" and its application for the analysis of instruction
NASA Astrophysics Data System (ADS)
Rusbult, Craig Francis
A model of 'integrated scientific method' (ISM) was constructed as a framework for describing the process of science in terms of activities (formulating a research problem, and inventing and evaluating actions--such as selecting and inventing theories, evaluating theories, designing experiments, and doing experiments--intended to solve the problem) and evaluation criteria (empirical, conceptual, and cultural-personal). Instead of trying to define the scientific method, ISM is intended to serve as a flexible framework that--by varying the characteristics of its components, their integrated relationships, and their relative importance--can be used to describe a variety of scientific methods, and a variety of perspectives about what constitutes an accurate portrayal of scientific methods. This framework is outlined visually and verbally, followed by an elaboration of the framework and my own views about science, and an evaluation of whether ISM can serve as a relatively neutral framework for describing a wide range of science practices and science interpretations. ISM was used to analyze an innovative, guided-inquiry classroom (taught by Susan Johnson, using Genetics Construction Kit software) in which students do simulated scientific research by solving classical genetics problems that require effect-to-cause reasoning and theory revision. The immediate goal of the analysis was to examine the 'science experiences' of students and to determine how the 'structure of instruction' provides opportunities for these experiences. Another goal was to test and improve the descriptive and analytical utility of ISM. In developing ISM, a major objective was to make ISM educationally useful. A concluding discussion includes controversies about "the nature of science" and how to teach it, how instruction can expand opportunities for student experience, and how goal-oriented intentional learning (using ISM) might improve the learning, retention, and transfer of thinking skills. Potential educational applications of ISM could involve its use for instructional analysis or design, or for teaching students in the classroom; or ISM and IDM (a closely related, generalized 'integrated design method') could play valuable roles in a 'wide spiral' curriculum designed for the coordinated teaching of thinking skills, including creativity and critical thinking, across a wide range of subjects.
Optical Alignment of the JWST ISIM to the OTE Simulator (OSIM): Current Concept and Design Studies
NASA Technical Reports Server (NTRS)
Frey, Bradley J.; Davila, Pamela S.; Marsh, James M.; Ohl, Raymond G.; Sullivan, Joseph
2007-01-01
The James Webb Space Telescope's (JWST) Integrated Science Instrument Module (ISIM) is the scientific payload of the observatory and contains four science instruments. During alignment and test of the integrated ISIM (i.e. ISIM + science instruments) at NASA's Goddard Space Flight Center (GSFC), the Optical telescope element SIMulator (OSIM) will be used to optically stimulate the science instruments to verify their operation and performance. In this paper we present the design of two cryogenic alignment fixtures that will be used to determine and verify the proper alignment of OSIM to ISIM during testing at GSFC. These fixtures, the Master Alignment Target Fixture (MATF) and the ISIM Alignment Target Fixture (IATF), will provide continuous, 6 degree of freedom feedback to OSIM during initial ambient alignment as well as during cryogenic vacuum testing.
Integrating Computational Science Tools into a Thermodynamics Course
NASA Astrophysics Data System (ADS)
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; Gerber, Richard
The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs. To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, (5) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.
Defending the scientific integrity of conservation-policy processes.
Carroll, Carlos; Hartl, Brett; Goldman, Gretchen T; Rohlf, Daniel J; Treves, Adrian; Kerr, Jeremy T; Ritchie, Euan G; Kingsford, Richard T; Gibbs, Katherine E; Maron, Martine; Watson, James E M
2017-10-01
Government agencies faced with politically controversial decisions often discount or ignore scientific information, whether from agency staff or nongovernmental scientists. Recent developments in scientific integrity (the ability to perform, use, communicate, and publish science free from censorship or political interference) in Canada, Australia, and the United States demonstrate a similar trajectory. A perceived increase in scientific-integrity abuses provokes concerted pressure by the scientific community, leading to efforts to improve scientific-integrity protections under a new administration. However, protections are often inconsistently applied and are at risk of reversal under administrations publicly hostile to evidence-based policy. We compared recent challenges to scientific integrity to determine what aspects of scientific input into conservation policy are most at risk of political distortion and what can be done to strengthen safeguards against such abuses. To ensure the integrity of outbound communications from government scientists to the public, we suggest governments strengthen scientific integrity policies, include scientists' right to speak freely in collective-bargaining agreements, guarantee public access to scientific information, and strengthen agency culture supporting scientific integrity. To ensure the transparency and integrity with which information from nongovernmental scientists (e.g., submitted comments or formal policy reviews) informs the policy process, we suggest governments broaden the scope of independent reviews, ensure greater diversity of expert input and transparency regarding conflicts of interest, require a substantive response to input from agencies, and engage proactively with scientific societies. For their part, scientists and scientific societies have a responsibility to engage with the public to affirm that science is a crucial resource for developing evidence-based policy and regulations in the public interest. © 2017 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
NASA Astrophysics Data System (ADS)
Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.
2014-08-01
Integrated water system modeling is a reasonable approach to provide scientific understanding of, and possible solutions to, the severe water crises faced around the world and to promote the implementation of integrated river basin management. Such a modeling practice has become more feasible nowadays thanks to better computing facilities and available data sources. In this study, a process-oriented water system model (HEXM) is developed by integrating multiple water-related processes including hydrology, biogeochemistry, environment and ecology, as well as the interference of human activities. The model was tested in the Shaying River Catchment, the largest, most highly regulated and most heavily polluted tributary of the Huai River Basin in China. The results show that HEXM is well integrated, with good performance on the key water-related components in complex catchments. The simulated daily runoff series at all the regulated and less-regulated stations match observations, especially for the high and low flow events. The average values of the correlation coefficient and coefficient of efficiency are 0.81 and 0.63, respectively. The dynamics of observed daily ammonia-nitrogen (NH4N) concentration, an important index of water environmental quality in China, are well captured, with an average correlation coefficient of 0.66. Furthermore, the spatial patterns of nonpoint source pollutant load and grain yield are also simulated properly, and the outputs agree well with statistics at the city scale. Our model shows clearly superior performance in both calibration and validation in comparison with the widely used SWAT model. This model is expected to give a strong reference for water system modeling in complex basins, and to provide the scientific foundation for the implementation of integrated river basin management as well as technical guidance for the reasonable regulation of dams and sluices and for environmental improvement in river basins.
Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)
NASA Astrophysics Data System (ADS)
Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David
2018-01-01
Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST, and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether in emission, scattering, or transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools, and inter-model comparisons, but it will also include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or "checking in" new model simulations that are in accordance with community-derived standards. Additionally, the results of inter-model comparisons will be used to produce open-source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties for the climates of various planetary targets.
Science for the Public Good: Tackling scientific integrity in the federal government
NASA Astrophysics Data System (ADS)
Goldman, G. T.; Halpern, M.; Johnson, C.
2016-12-01
From hydraulic fracturing to climate change to seismic risk, government science and scientists are integral to public decision making in the geosciences. Following calls for increased scientific integrity across the government, policies have been put in place in recent years to promote transparency and the appropriate use of science in government decision making. But how effective have these initiatives been? With the development of scientific integrity policies, new transparency measures, and other efforts in recent years, are we seeing improvements in how federal agencies use science? And importantly, can these safeguards prevent potential future breaches of scientific integrity and the misuse of science for political gain? A review of recent progress and problems around government scientific integrity, including case studies, policy assessments, and surveys of federal scientists, can shed light on how far we have come and what areas still need improvement to ensure that government scientific integrity is preserved in the future.
Modeling and analysis of hybrid pixel detector deficiencies for scientific applications
NASA Astrophysics Data System (ADS)
Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman
2015-08-01
Semiconductor hybrid pixel detectors often consist of a pixelated sensor layer bump-bonded to a matching pixelated readout integrated circuit (ROIC). The sensor can range from high-resistivity Si to III-V materials, whereas a Si CMOS process is typically used to manufacture the ROIC. Independent device physics and electronic design automation (EDA) tools, with significantly different solvers, are used to determine sensor characteristics and to verify functional performance of ROICs, respectively. Some physics solvers provide the capability of transferring data to the EDA tool. However, single-pixel transient simulations are either not feasible due to convergence difficulties or are prohibitively long. A simplified sensor model, which includes a current pulse in parallel with the detector equivalent capacitor, is often used; even then, SPICE-type top-level (entire array) simulations range from days to weeks. In order to analyze detector deficiencies for a particular scientific application, accurately defined transient behavioral models of all the functional blocks are required. Furthermore, various simulations of the entire array, such as transient, noise, Monte Carlo, inter-pixel effects, etc., need to be performed within a reasonable time frame without trading off accuracy. The sensor and the analog front-end can be modeled using a real-number modeling language, as complex mathematical functions, or detailed data can be saved to text files for further top-level digital simulations. Parasitically aware digital timing is extracted in a standard delay format (sdf) from the pixel digital back-end layout as well as from the periphery of the ROIC. For any given input, detector-level worst-case and best-case simulations are performed using a Verilog simulation environment to determine the output. Each top-level transient simulation takes no more than 10-15 minutes. The impact of changing key parameters, such as the sensor's Poissonian shot noise, the analog front-end bandwidth, or jitter due to clock distribution, can be accurately analyzed to determine ROIC architectural viability and bottlenecks. Hence the impact of the detector parameters on the scientific application can be studied.
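A toy numerical illustration of the simplified sensor model mentioned above is given below: a current pulse in parallel with the detector capacitance feeding an idealized charge-sensitive front end with a single decay time constant. This is a conceptual stand-in written in Python, not the actual real-number or Verilog models of the paper, and all component values are illustrative.

    # Conceptual sketch: triangular sensor current pulse integrated by an
    # ideal charge-sensitive front end with one RC discharge path.
    import numpy as np

    dt = 1e-9                      # 1 ns time step
    t = np.arange(0, 2e-6, dt)     # 2 us simulation window
    q_total = 1.6e-15              # ~10 ke- of collected charge (illustrative)
    t_rise = 20e-9

    # Triangular current pulse delivering q_total over 2*t_rise.
    i_sensor = np.where(t < 2 * t_rise,
                        (q_total / t_rise) * (1 - np.abs(t - t_rise) / t_rise),
                        0.0)

    c_f = 5e-15                    # feedback capacitance (illustrative)
    tau = 200e-9                   # discharge time constant (illustrative)

    v_out = np.zeros_like(t)
    for k in range(1, t.size):
        dv = (i_sensor[k] / c_f - v_out[k - 1] / tau) * dt
        v_out[k] = v_out[k - 1] + dv

    print("Peak front-end output: %.3f mV" % (1e3 * v_out.max()))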
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security
NASA Astrophysics Data System (ADS)
Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver
This paper outlines the scientific goals, ongoing work and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
Timing generator of scientific grade CCD camera and its implementation based on FPGA technology
NASA Astrophysics Data System (ADS)
Si, Guoliang; Li, Yunfei; Guo, Yongfei
2010-10-01
The functions of the timing generator of a scientific-grade CCD camera are briefly presented: it generates the various pulse sequences for the TDI-CCD, the video processor and the imaging data output, acting as the synchronous coordinator of timing in the CCD imaging unit. The IL-E2 TDI-CCD sensor produced by DALSA Co., Ltd. is used in the camera. The driving schedules of the IL-E2 TDI-CCD sensor have been examined in detail, and the timing generator has been designed accordingly. An FPGA is chosen as the hardware design platform, and the schedule generator is described in VHDL. The designed generator has successfully passed functional simulation with EDA software and has been fitted into an XC2VP20-FF1152 (an FPGA product made by XILINX). The experiments indicate that the new method improves the level of integration of the system; high reliability, stability and low power consumption of the scientific-grade CCD camera system are achieved, while the design and experiment cycle is sharply shortened.
Water resources planning based on complex system dynamics: A case study of Tianjin city
NASA Astrophysics Data System (ADS)
Zhang, X. H.; Zhang, H. W.; Chen, B.; Chen, G. Q.; Zhao, X. H.
2008-12-01
A complex system dynamics (SD) model focusing on water resources, termed TianjinSD, is developed for the integrated and scientific management of the water resources of Tianjin. The model contains the information feedback that governs interactions in the system and is capable of synthesizing component-level knowledge into system behavior simulation at an integrated level, thus producing reasonable predictive results for policy-making on water resources allocation and management. For Tianjin, interactions among 96 components over 12 years are explored and four planning alternatives are considered, one of which is based on the conventional mode assuming that the existing pattern of human activities will prevail, while the others are alternative planning designs based on the interaction of local authorities and planning researchers. The optimal mode is then identified by comparing the simulation results across scenarios to evaluate different decisions and their dynamic consequences.
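To illustrate the feedback structure that a system dynamics water model of this kind captures, here is a minimal stock-and-flow sketch in Python: a water stock driven by supply, with demand that grows with population but is curtailed as the stock runs low. It is a conceptual toy, not the TianjinSD model, and every parameter value is illustrative.

    # Minimal stock-and-flow system dynamics sketch (Euler integration).
    dt = 0.1                 # years
    years = 12
    steps = int(years / dt)

    stock = 1.0e9            # m^3 of available water (illustrative)
    population = 1.0e7
    supply = 1.2e8           # m^3 per year
    per_capita_demand = 15.0 # m^3 per person per year

    for _ in range(steps):
        demand = population * per_capita_demand
        # Feedback: scarcity suppresses actual withdrawals.
        scarcity = min(1.0, stock / 5.0e8)
        withdrawal = demand * scarcity
        stock += (supply - withdrawal) * dt
        population *= (1.0 + 0.01 * dt)   # 1% annual growth

    print(f"Water stock after {years} years: {stock:.3e} m^3")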
NASA Astrophysics Data System (ADS)
Liu, Y.; Tao, F.; Luo, Y.; Ma, J.
2013-12-01
Appropriate irrigation and nitrogen fertilization, along with suitable crop management strategies, are essential prerequisites for optimum yields in agricultural systems. This research attempts to provide a scientific basis for sustainable agricultural production management for the North China Plain and other semi-arid regions. Based on a series of 72 treatments over 2003-2008, an optimized water and nitrogen scheme for the winter wheat/summer maize cropping system was developed. Integrated systems incorporating 120 mm of water with 80 kg N ha-1 of N fertilizer were used to simulate winter wheat yields in Hebei, and 120 mm of water with 120 kg N ha-1 were used to simulate winter wheat yields in Shandong and Henan provinces in 2000-2007. Similarly, integrated treatments of 40 kg N ha-1 of N fertilizer were used to simulate summer maize yields in Hebei, and 80 kg N ha-1 was used to simulate summer maize yields in Shandong and Henan provinces in 2000-2007. Under the optimized scheme, 341.74 × 10^7 mm ha-1 of water and 575.79 × 10^4 Mg of urea fertilizer could be saved per year under the wheat/maize rotation system. Despite slight drops in the yields of wheat and maize in some areas, the water and fertilizer savings have tremendous long-term eco-environmental benefits.
Rothman, Jason S.; Silver, R. Angus
2018-01-01
Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
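For readers unfamiliar with the class of single-compartment models NeuroMatic supports, the sketch below simulates a leaky integrate-and-fire neuron. NeuroMatic itself runs within Igor Pro; this Python snippet is only a conceptual analogue, and the parameter values are generic textbook-style choices rather than NeuroMatic defaults.

    # Minimal leaky integrate-and-fire simulation (conceptual illustration).
    import numpy as np

    dt = 0.1e-3                  # 0.1 ms
    t = np.arange(0.0, 0.5, dt)  # 500 ms
    tau_m, r_m = 20e-3, 10e6     # membrane time constant (s), resistance (ohm)
    v_rest, v_thresh, v_reset = -70e-3, -50e-3, -65e-3
    i_inj = 2.5e-9               # 2.5 nA step current

    v = np.full_like(t, v_rest)
    spike_times = []
    for k in range(1, t.size):
        dv = (-(v[k - 1] - v_rest) + r_m * i_inj) / tau_m * dt
        v[k] = v[k - 1] + dv
        if v[k] >= v_thresh:
            spike_times.append(t[k])
            v[k] = v_reset

    print(f"{len(spike_times)} spikes in {t[-1]*1e3:.0f} ms")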
Progress in and prospects for fluvial flood modelling.
Wheater, H S
2002-07-15
Recent floods in the UK have raised public and political awareness of flood risk. There is an increasing recognition that flood management and land-use planning are linked, and that decision-support modelling tools are required to address issues of climate and land-use change for integrated catchment management. In this paper, the scientific context for fluvial flood modelling is discussed, current modelling capability is considered and research challenges are identified. Priorities include (i) appropriate representation of spatial precipitation, including scenarios of climate change; (ii) development of a national capability for continuous hydrological simulation of ungauged catchments; (iii) improved scientific understanding of impacts of agricultural land-use and land-management change, and the development of new modelling approaches to represent those impacts; (iv) improved representation of urban flooding, at both local and catchment scale; (v) appropriate parametrizations for hydraulic simulation of in-channel and flood-plain flows, assimilating available ground observations and remotely sensed data; and (vi) a flexible decision-support modelling framework, incorporating developments in computing, data availability, data assimilation and uncertainty analysis.
Scientific integrity - Recent Department of the Interior policies, codes, and their implementation
Thornhill, Alan D.; Coleman, Richard; Gunderson, Linda C.
2017-01-01
Established on January 28, 2011, the Department of the Interior's (DOI's) Scientific and Scholarly Integrity Policy was the first federal agency policy to respond to the Presidential Memorandum on Scientific Integrity (March 9, 2009) and guidance issued by the Office of Science and Technology Policy Memorandum on Scientific Integrity (December 17, 2010). The increasingly important role of science in DOI decision making and heightened awareness of science integrity issues across the science enterprise provided the impetus for making this policy a priority and for incorporating it into the DOI Departmental Manual (Part 305: Chapter 3). This paper discusses the history of scientific integrity in the DOI, the key provisions of the first Department-wide policy and its implementation, and the subsequent revision of the policy. During the first four years of implementing the scientific integrity policy, the Department received 35 formal complaints. As of March 31, 2015, only two formal scientific integrity complaints had resulted in "warranted" determinations, while the other complaints were closed and dismissed as "not warranted." Based on the experience of the first three years of implementation (2011-2014), the Department policy was revised on December 16, 2014.
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-09-01
Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may otherwise render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression - on the fly; this was demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
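The following sketch illustrates the statistical-learning step described above in its simplest single-fidelity form: Gaussian process regression used to fill in missing values of a field from a handful of surviving samples. The paper itself uses multi-level GP regression across fidelities; the data, kernel choice, and sample indices here are purely illustrative.

    # Sketch: reconstruct a "lost" fine-grid field from sparse samples with GP regression.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    x_fine = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
    truth = np.sin(2 * np.pi * x_fine).ravel()

    # Pretend a processor failure left only a few surviving samples.
    idx = np.array([5, 40, 90, 150, 190])
    x_obs, y_obs = x_fine[idx], truth[idx]

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
    gp.fit(x_obs, y_obs)
    y_fill, y_std = gp.predict(x_fine, return_std=True)

    print("Max reconstruction error:", np.abs(y_fill - truth).max())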
NASA Astrophysics Data System (ADS)
Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.
2014-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
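To give a flavor of the goodness-of-fit comparisons mentioned above, the sketch below computes a simple period-by-period log residual between observed and simulated response-spectral values and summarizes bias and scatter. This is a simplified illustration of a common ground-motion comparison idea, not the Broadband Platform's actual goodness-of-fit implementation, and the spectral values are made up.

    # Simplified spectral goodness-of-fit sketch: ln(observed/simulated) per period.
    import numpy as np

    periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])       # seconds
    sa_obs = np.array([0.45, 0.60, 0.38, 0.20, 0.09, 0.03])   # observed Sa (g), illustrative
    sa_sim = np.array([0.40, 0.66, 0.35, 0.22, 0.10, 0.025])  # simulated Sa (g), illustrative

    residuals = np.log(sa_obs / sa_sim)
    bias = residuals.mean()
    sigma = residuals.std(ddof=1)

    for T, r in zip(periods, residuals):
        print(f"T = {T:4.1f} s   ln(obs/sim) = {r:+.3f}")
    print(f"Overall bias = {bias:+.3f}, sigma = {sigma:.3f}")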
Rodrigues, João P G L M; Melquiond, Adrien S J; Bonvin, Alexandre M J J
2016-01-01
Molecular modelling and simulations are nowadays an integral part of research in areas ranging from physics to chemistry to structural biology, as well as pharmaceutical drug design. This popularity is due to the development of high-performance hardware and of accurate and efficient molecular mechanics algorithms by the scientific community. These improvements are also benefitting scientific education. Molecular simulations, their underlying theory, and their applications are particularly difficult to grasp for undergraduate students. Having hands-on experience with the methods contributes to a better understanding and consolidation of the concepts taught during the lectures. To this end, we have created a computer practical class, which has been running for the past five years, composed of several sessions in which students characterize the conformational landscape of small peptides using molecular dynamics simulations in order to gain insight into their binding to protein receptors. In this report, we detail the ingredients and recipe necessary to establish and carry out this practical, as well as some of the questions posed to the students and their expected results. Further, we cite some examples of the students' written reports, provide statistics, and share their feedback on the structure and execution of the sessions. These sessions were implemented alongside a theoretical molecular modelling course but have also been used successfully as a standalone tutorial during specialized workshops. The availability of the material on our web page also facilitates this integration and dissemination and lends strength to the case for open-source science and education. © 2016 The International Union of Biochemistry and Molecular Biology.
Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle
2009-10-19
Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.
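A toy sketch of the first two pipeline stages described above, selecting high-momentum particles in each timestep and linking the selections across timesteps by particle ID, is given below; the threshold, data layout, and intersection rule are illustrative assumptions, not the paper's algorithms.

import numpy as np

def select_beam_candidates(px, threshold):
    # Indices of particles whose longitudinal momentum exceeds the cut.
    return np.nonzero(px > threshold)[0]

def track_across_timesteps(steps, threshold):
    # steps: list of dicts with 'ids' and 'px' arrays, one per timestep.
    # Returns the particle IDs selected in every timestep (a persistent bunch).
    persistent = None
    for step in steps:
        sel = set(step["ids"][select_beam_candidates(step["px"], threshold)])
        persistent = sel if persistent is None else persistent & sel
    return persistent or set()

# Example with two timesteps of synthetic particles; only particle 2 stays hot.
steps = [
    {"ids": np.array([1, 2, 3, 4]), "px": np.array([0.1, 5.0, 7.0, 0.2])},
    {"ids": np.array([1, 2, 3, 4]), "px": np.array([0.2, 6.0, 0.1, 8.0])},
]
print(track_across_timesteps(steps, threshold=1.0))   # {2}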
Torregrosa, Alicia; Casazza, Michael L.; Caldwell, Margaret R.; Mathiasmeier, Teresa A.; Morgan, Peter M.; Overton, Cory T.
2010-01-01
Integration of scientific data and adaptive management techniques is critical to the success of species conservation; however, there are uncertainties about effective methods of knowledge exchange between scientists and decisionmakers. The conservation planning and implementation process for Greater Sage-grouse (Centrocercus urophasianus) in the Mono Basin, Calif. region, was used as a case study to observe the exchange of scientific information among stakeholders with differing perspectives: resource manager, scientist, public official, rancher, and others. The collaborative development of a risk-simulation model was explored as a tool to transfer knowledge between stakeholders and inform conservation planning and management decisions. Observations compiled using a transdisciplinary approach were used to compare the exchange of information during the collaborative model development with more traditional interactions such as scientist-led presentations at stakeholder meetings. Lack of congruence around knowledge needs and prioritization led to insufficient commitment to fully implement the risk-simulation model. Ethnographic analysis of the case study suggests that further application of epistemic community theory, which posits a strong boundary condition on knowledge transfer, could help support application of risk-simulation models in conservation-planning efforts within similarly complex social and bureaucratic landscapes.
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, raising two key questions: how to view different planning simulation results instantly and feed them back to interactively assist planning work, and how to ensure that the planning simulation results are scientifically sound and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to close the gap between the ability to construct computer simulation models that aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, which emphasizes the quantitative process, effectively compensated for the SA method, which emphasizes the qualitative process; this realized an organic combination of qualitative and quantitative land-use planning work and provided a new idea and method for land-use planning and the sustainable management of land resources.
Modelling the role of forests on water provision services: a hydro-economic valuation approach
NASA Astrophysics Data System (ADS)
Beguería, S.; Campos, P.
2015-12-01
Hydro-economic models that integrate the ecological, hydrological, infrastructure, economic and social aspects into a coherent, scientifically informed framework are preferred tools for supporting decision making in the context of integrated water resources management. We present a case study of the water regulation and provision services of forests in the Andalusia region of Spain. Our model computes the physical water flows and conducts an economic environmental income and asset valuation of forest surface and underground water yield. Based on available hydrologic and economic data, we develop a comprehensive water account for all forest lands at the regional scale. This forest water environmental valuation is integrated within a much larger project aiming to provide a robust and easily replicable accounting tool for evaluating yearly the total income and capital of forests, encompassing all measurable sources of private and public income (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). We also drive our simulation with future socio-economic scenarios to quantify the physical and economic effects of expected trends or of simulated public and private policies on future water resources. Only a comprehensive integrated tool can serve as a basis for the development of integrated policies, such as those internationally agreed and recommended for the management of water resources.
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, the Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and the UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and the Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.
Large-Scale NASA Science Applications on the Columbia Supercluster
NASA Technical Reports Server (NTRS)
Brooks, Walter
2005-01-01
Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.
SpikingLab: modelling agents controlled by Spiking Neural Networks in Netlogo.
Jimenez-Romero, Cristian; Johnson, Jeffrey
2017-01-01
The scientific interest attracted by Spiking Neural Networks (SNN) has led to the development of tools for the simulation and study of neuronal dynamics, ranging from phenomenological models to the more sophisticated and biologically accurate Hodgkin-and-Huxley-based and multi-compartmental models. However, despite the multiple features offered by neural modelling tools, their integration with environments for the simulation of robots and agents can be challenging and time-consuming. The implementation of artificial neural circuits to control robots generally involves the following tasks: (1) understanding the simulation tools, (2) creating the neural circuit in the neural simulator, (3) linking the simulated neural circuit with the environment of the agent and (4) programming the appropriate interface in the robot or agent to use the neural controller. The accomplishment of the above-mentioned tasks can be challenging, especially for undergraduate students or novice researchers. This paper presents an alternative tool which facilitates the simulation of simple SNN circuits using the multi-agent simulation and programming environment NetLogo (educational software that simplifies the study of and experimentation with complex systems). The engine proposed and implemented in NetLogo for the simulation of a functional model of SNN is a simplification of integrate-and-fire (I&F) models. The characteristics of the engine (including neuronal dynamics, STDP learning and synaptic delay) are demonstrated through the implementation of an agent representing an artificial insect controlled by a simple neural circuit. The setup of the experiment and its outcomes are described in this work.
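For readers unfamiliar with integrate-and-fire models, the sketch below shows a minimal leaky integrate-and-fire neuron, written in Python rather than NetLogo for brevity; the parameter values are illustrative assumptions, not SpikingLab defaults, and the SpikingLab engine additionally implements STDP learning and synaptic delays.

import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    # Integrate dv/dt = (v_rest - v)/tau + I(t); emit a spike and reset at threshold.
    v = v_rest
    spike_times = []
    for t, i_t in enumerate(input_current):
        v += dt * ((v_rest - v) / tau + i_t)
        if v >= v_threshold:
            spike_times.append(t)
            v = v_reset
    return spike_times

# A constant drive above rheobase produces a regular spike train.
print(simulate_lif(np.full(200, 0.08)))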
Ethics and Scientific Integrity in Public Health, Epidemiological and Clinical Research
Coughlin, Steven S.; Barker, Amyre; Dawson, Angus
2012-01-01
The ethics and scientific integrity of biomedical and public health research require that researchers behave in appropriate ways. However, this requires more than following published research guidelines that seek to prevent scientific misconduct relating to serious deviations from widely accepted scientific norms for proposing, conducting, and reporting research (e.g., fabrication or falsification of research data or failures to report potential conflicts of interest). In this paper we argue for a broader account of scientific integrity, one consistent with that defended by the United States Institute of Medicine, involving a commitment to intellectual honesty and personal responsibility for one's actions as a researcher and to practices consistent with the responsible conduct of research and the protection of research participants. Maintaining high standards of ethical and scientific integrity helps to maintain public trust in the research enterprise. An increasing number of authors have pointed to the importance of mentoring and education in the responsible conduct of science in preventing transgressions of scientific integrity. Just as in clinical research and biomedicine, epidemiologists and other public health researchers have the responsibility to exhibit and foster the very highest standards of scientific integrity. PMID:24532867
Models and Simulations as a Service: Exploring the Use of Galaxy for Delivering Computational Models
Walker, Mark A.; Madduri, Ravi; Rodriguez, Alex; Greenstein, Joseph L.; Winslow, Raimond L.
2016-01-01
We describe the ways in which Galaxy, a web-based reproducible research platform, can be used for web-based sharing of complex computational models. Galaxy allows users to seamlessly customize and run simulations on cloud computing resources, a concept we refer to as Models and Simulations as a Service (MaSS). To illustrate this application of Galaxy, we have developed a tool suite for simulating a high spatial-resolution model of the cardiac Ca2+ spark that requires supercomputing resources for execution. We also present tools for simulating models encoded in the SBML and CellML model description languages, thus demonstrating how Galaxy's reproducible research features can be leveraged by existing technologies. Finally, we demonstrate how the Galaxy workflow editor can be used to compose integrative models from constituent submodules. To our knowledge, this work represents a novel approach to making computational simulations more accessible to the broader scientific community. PMID:26958881
The X-IFU end-to-end simulations performed for the TES array optimization exercise
NASA Astrophysics Data System (ADS)
Peille, Philippe; Wilms, J.; Brand, T.; Cobo, B.; Ceballos, M. T.; Dauser, T.; Smith, S. J.; Barret, D.; den Herder, J. W.; Piro, L.; Barcons, X.; Pointecouteau, E.; Bandler, S.; den Hartog, R.; de Plaa, J.
2015-09-01
The focal plane assembly of the Athena X-ray Integral Field Unit (X-IFU) includes as its baseline an array of ~4000 single-size calorimeters based on Transition Edge Sensors (TES). Other sensor array configurations could, however, be considered, combining TES of different properties (e.g. size). In an attempt to improve the X-IFU performance in terms of field of view, count rate performance, and even spectral resolution, two alternative TES array configurations to the baseline have been simulated, each combining a small and a large pixel array. With the X-IFU end-to-end simulator, a sub-sample of the Athena core science goals, selected by the X-IFU science team as potentially driving the optimal TES array configuration, has been simulated so that the results can be scientifically assessed and compared. In this contribution, we will describe the simulation set-up for the various array configurations and highlight some of the results of the simulated test cases.
NASA Technical Reports Server (NTRS)
Wood, G. M.; Rayborn, G. H.; Ioup, J. W.; Ioup, G. E.; Upchurch, B. T.; Howard, S. J.
1981-01-01
Mathematical deconvolution of digitized analog signals from scientific measuring instruments is shown to be a means of extracting important information which is otherwise hidden due to time-constant and other broadening or distortion effects caused by the experiment. Three different approaches to deconvolution and their subsequent application to recorded data from three analytical instruments are considered. To demonstrate the efficacy of deconvolution, the use of these approaches to solve the convolution integral for the gas chromatograph, magnetic mass spectrometer, and the time-of-flight mass spectrometer are described. Other possible applications of these types of numerical treatment of data to yield superior results from analog signals of the physical parameters normally measured in aerospace simulation facilities are suggested and briefly discussed.
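The abstract above does not reproduce the three deconvolution approaches it compares, but the basic idea of undoing instrument broadening can be shown in a short sketch. The example below recovers two overlapping peaks from a Gaussian-broadened trace by regularized Fourier-domain division; the kernel shape, peak positions, and regularization constant are assumptions chosen only to make the example self-contained, not the methods evaluated in the report.

import numpy as np

def deconvolve_fft(signal, kernel, eps=1e-3):
    # Recover x from y = x (*) kernel (circular convolution) by regularized
    # Fourier-domain division (a Wiener-style filter).
    Y = np.fft.rfft(signal)
    H = np.fft.rfft(kernel, n=len(signal))
    X = Y * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.fft.irfft(X, n=len(signal))

# Toy usage: two narrow chromatographic peaks broadened by a Gaussian response.
n = 512
t = np.arange(n)
truth = np.exp(-0.5 * ((t - 180) / 3.0) ** 2) + np.exp(-0.5 * ((t - 200) / 3.0) ** 2)
kernel = np.exp(-0.5 * (np.minimum(t, n - t) / 12.0) ** 2)   # Gaussian centred at index 0
kernel /= kernel.sum()
blurred = np.fft.irfft(np.fft.rfft(truth) * np.fft.rfft(kernel), n=n)
recovered = deconvolve_fft(blurred, kernel)                   # the two peaks separate again

The regularization term eps limits noise amplification at frequencies where the instrument response is weak, which is the central practical difficulty in all deconvolution schemes.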
Coordination Procedures between the Scientific Integrity Official and the Office of Inspector General regarding Scientific Misconduct Allegations written March 30, 2015 by the Office of the Science Advisor
2016-12-01
...collaborative effort is addressed by six Technical Panels, which manage a wide range of scientific research activities, and a Group specialising in modelling and... • HFM Human Factors and Medicine Panel • IST Information Systems Technology Panel • NMSG NATO Modelling and Simulation Group • SAS System Analysis and Studies Panel • SCI Systems Concepts and Integration Panel • SET Sensors and Electronics Technology Panel. These Panels and Group are the...
Ethics Workshop Sheds Light on Gray Areas
NASA Astrophysics Data System (ADS)
Townsend, Randy; Williams, Billy
2014-02-01
AGU's Scientific Integrity and Professional Ethics Workshop at the 2013 Fall Meeting, held on 9 December, highlighted the courageous conversations necessary to navigate through questions of scientific integrity and professional ethics. Participants debated real-world scenarios surrounding authorship, data management, plagiarism, and conflicts of interest. These discussions emphasized the importance of preserving scientific integrity and the responsibility of each member to uphold the standards of scientific conduct.
Serious games experiment toward agent-based simulation
Wein, Anne; Labiosa, William
2013-01-01
We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information within challenging environmental and ecosystem-management contexts and to provide an interactive way of learning about the complexity of interactions between people and natural systems. Further progress on the use of pedagogical games to fulfill the USGS mission will require collaboration among scientists, game developers, educators, and stakeholders. We conclude that as the USGS positions itself to communicate and convey the results of multiple science strategies, including natural-resource security and sustainability, pedagogical game development and agent-based modeling offer a means to (1) establish interdisciplinary and collaborative teams with a focused integrated outcome; (2) contribute to the modeling of interaction, feedback, and adaptation of ecosystems; and (3) enable social learning through a broadly appealing and increasingly sophisticated medium.
EPA scientific integrity policy draft
NASA Astrophysics Data System (ADS)
Showstack, Randy
2011-08-01
The U.S. Environmental Protection Agency (EPA) issued its draft scientific integrity policy on 5 August. The draft policy addresses scientific ethical standards, communications with the public, the use of advisory committees and peer review, and professional development. The draft policy was developed by an ad hoc group of EPA senior staff and scientists in response to a December 2010 memorandum on scientific integrity from the White House Office of Science and Technology Policy. The agency is accepting public comments on the draft through 6 September; comments should be sent to osa.staff@epa.gov. For more information, see http://www.epa.gov/stpc/pdfs/draft-scientific-integrity-policy-aug2011.pdf.
Federal Agency Scientific Integrity Policies and the Legal Landscape
NASA Astrophysics Data System (ADS)
Kurtz, L.
2017-12-01
Federal agencies have worked to develop scientific integrity policies to promote the use of scientific and technical information in policymaking, reduce special-interest influences, and increase transparency. Following recent allegations of agency misconduct, these policies are now more important than ever. In addition to setting standards, scientific integrity policies also provide avenues for whistleblowers to complain about perceived violations. While these policies have their shortcomings (which may differ by agency), they are also one of the better available options for upholding principles of scientific integrity within the federal government. A legal perspective will be offered on what sorts of issues might rise to the threshold to make an official complaint, and the process of actually making a complaint. Other legal avenues for complaining about scientific integrity violations will also be discussed, such as complaints filed with the U.S. Office of Special Counsel or an agency's Office of Inspector General, and bringing the matter to federal court.
Cosmological N-body Simulation
NASA Astrophysics Data System (ADS)
Lake, George
1994-05-01
The ``N'' in N-body calculations has doubled every year for the last two decades. To continue this trend, the UW N-body group is working on algorithms for the fast evaluation of gravitational forces on parallel computers and establishing rigorous standards for the computations. In these algorithms, the computational cost per time step is ~10^3 pairwise forces per particle. A new adaptive time integrator enables us to perform high quality integrations that are fully temporally and spatially adaptive. SPH--smoothed particle hydrodynamics will be added to simulate the effects of dissipating gas and magnetic fields. The importance of these calculations is two-fold. First, they determine the nonlinear consequences of theories for the structure of the Universe. Second, they are essential for the interpretation of observations. Every galaxy has six coordinates of velocity and position. Observations determine two sky coordinates and a line of sight velocity that bundles universal expansion (distance) together with a random velocity created by the mass distribution. Simulations are needed to determine the underlying structure and masses. The importance of simulations has moved from ex post facto explanation to an integral part of planning large observational programs. I will show why high quality simulations with ``large N'' are essential to accomplish our scientific goals. This year, our simulations have N >~ 10^7. This is sufficient to tackle some niche problems, but well short of our 5 year goal--simulating The Sloan Digital Sky Survey using a few Billion particles (a Teraflop-year simulation). Extrapolating past trends, we would have to ``wait'' 7 years for this hundred-fold improvement. Like past gains, significant changes in the computational methods are required for these advances. I will describe new algorithms, algorithmic hacks and a dedicated computer to perform Billion particle simulations. Finally, I will describe research that can be enabled by Petaflop computers. This research is supported by the NASA HPCC/ESS program.
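For readers unfamiliar with the pairwise-force cost quoted above, the sketch below shows a direct-summation gravitational force evaluation with a kick-drift-kick (leapfrog) update: the O(N^2) brute-force baseline that tree codes and fast algorithms reduce to roughly 10^3 interactions per particle. Units (G = 1), the softening length, and the particle setup are illustrative assumptions.

import numpy as np

def accelerations(pos, mass, softening=1e-2):
    # Direct O(N^2) gravitational accelerations with Plummer softening (G = 1).
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]            # r_j - r_i
    dist2 = np.sum(diff ** 2, axis=-1) + softening ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                                   # no self-force
    return np.sum(diff * (mass[np.newaxis, :, None] * inv_r3[:, :, None]), axis=1)

def leapfrog_step(pos, vel, mass, dt):
    acc = accelerations(pos, mass)
    vel_half = vel + 0.5 * dt * acc                                  # kick
    pos_new = pos + dt * vel_half                                    # drift
    vel_new = vel_half + 0.5 * dt * accelerations(pos_new, mass)     # kick
    return pos_new, vel_new

rng = np.random.default_rng(0)
pos, vel = rng.standard_normal((2, 256, 3))
mass = np.full(256, 1.0 / 256)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=1e-2)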
Magnetosphere Modeling: From Cartoons to Simulations
NASA Astrophysics Data System (ADS)
Gombosi, T. I.
2017-12-01
Over the last half a century, physics-based global computer simulations have become a bridge between experiment and basic theory, and now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to become possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations, and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems, to global MHD to MHD-PIC, and discuss the role of state-of-the-art models in forecasting space weather.
HEP - A semaphore-synchronized multiprocessor with central control. [Heterogeneous Element Processor
NASA Technical Reports Server (NTRS)
Gilliland, M. C.; Smith, B. J.; Calvert, W.
1976-01-01
The paper describes the design concept of the Heterogeneous Element Processor (HEP), a system tailored to the special needs of scientific simulation. In order to achieve high-speed computation required by simulation, HEP features a hierarchy of processes executing in parallel on a number of processors, with synchronization being largely accomplished by hardware. A full-empty-reserve scheme of synchronization is realized by zero-one-valued hardware semaphores. A typical system has, besides the control computer and the scheduler, an algebraic module, a memory module, a first-in first-out (FIFO) module, an integrator module, and an I/O module. The architecture of the scheduler and the algebraic module is examined in detail.
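The full-empty synchronization described above can be illustrated, in spirit only, with a software analogue: a memory cell whose readers block until it is full and whose writers block until it is empty. The sketch below uses a Python condition variable as a stand-in for HEP's zero-one-valued hardware semaphores; the class and method names are illustrative assumptions, not the HEP instruction set, and the reserve state is omitted for brevity.

import threading

class FullEmptyCell:
    def __init__(self):
        self._value = None
        self._full = False
        self._cond = threading.Condition()

    def write(self, value):               # blocks while the cell is still full
        with self._cond:
            self._cond.wait_for(lambda: not self._full)
            self._value, self._full = value, True
            self._cond.notify_all()

    def read(self):                       # blocks until the cell becomes full
        with self._cond:
            self._cond.wait_for(lambda: self._full)
            value, self._full = self._value, False
            self._cond.notify_all()
            return value

# Producer (main thread) and consumer synchronize purely through the cell state.
cell = FullEmptyCell()
consumer = threading.Thread(target=lambda: print(sum(cell.read() for _ in range(3))))
consumer.start()
for x in (1, 2, 3):
    cell.write(x)
consumer.join()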
Role of High-End Computing in Meeting NASA's Science and Engineering Challenges
NASA Technical Reports Server (NTRS)
Biswas, Rupak
2006-01-01
High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62 teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.
NAS technical summaries. Numerical aerodynamic simulation program, March 1992 - February 1993
NASA Technical Reports Server (NTRS)
1994-01-01
NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefitting other supercomputer centers in government and industry. The 1992-93 operational year concluded with 399 high-speed processor projects and 91 parallel projects representing NASA, the Department of Defense, other government agencies, private industry, and universities. This document provides a glimpse at some of the significant scientific results for the year.
77 FR 21158 - VA Directive 0005 on Scientific Integrity: Availability for Review and Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-09
... the Director, Office of Science and Technology Policy's Memorandum of December 17, 2010, on scientific integrity. It addresses how VA ensures quality science in its methods, review, policy application, and...: Background The Presidential Memorandum on Scientific Integrity and the Office of Science and Technology...
The roles of integration in molecular systems biology.
O'Malley, Maureen A; Soyer, Orkun S
2012-03-01
A common way to think about scientific practice involves classifying it as hypothesis- or data-driven. We argue that although such distinctions might illuminate scientific practice very generally, they are not sufficient to understand the day-to-day dynamics of scientific activity and the development of programmes of research. One aspect of everyday scientific practice that is beginning to gain more attention is integration. This paper outlines what is meant by this term and how it has been discussed from scientific and philosophical points of view. We focus on methodological, data and explanatory integration, and show how they are connected. Then, using some examples from molecular systems biology, we will show how integration works in a range of inquiries to generate surprising insights and even new fields of research. From these examples we try to gain a broader perspective on integration in relation to the contexts of inquiry in which it is implemented. In today's environment of data-intensive large-scale science, integration has become both a practical and normative requirement with corresponding implications for meta-methodological accounts of scientific practice. We conclude with a discussion of why an understanding of integration and its dynamics is useful for philosophy of science and scientific practice in general. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel
2016-08-01
In recent years, interactive computer simulations have been progressively integrated into the teaching of the sciences and have contributed significant improvements to the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving teaching materials and assess their effectiveness in improving students' ability to solve problems in university-level physics. Firstly, we analyze the effect of using simulation-based materials on the development of students' skills in employing procedures that are typical of the scientific method of problem-solving. We found that a significant percentage of the experimental students used expert-type scientific procedures such as qualitative analysis of the problem, making hypotheses, and analysis of results. At the end of the course, only a minority of the students persisted with habits based solely on mathematical equations. Secondly, we compare the problem-solving effectiveness of the experimental group students with that of students taught conventionally. We found that the implementation of the problem-solving strategy improved the experimental students' results in obtaining academically correct solutions to standard textbook problems. Thirdly, we explore students' satisfaction with the simulation-based problem-solving teaching materials and found that the majority appear to be satisfied with the proposed methodology and adopted a favorable attitude toward learning problem-solving. The research was carried out among first-year Engineering Degree students.
Molecular dynamics simulations through GPU video games technologies
Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia
2016-01-01
Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the key techniques used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method to simulate the physical motions of atoms and molecules under certain conditions. It has become a strategic technique and now plays a key role in many areas of the exact sciences, such as chemistry, biology, physics and medicine. Due to their complexity, MD calculations can require enormous amounts of computer memory and time, and their execution has therefore been a major challenge. Despite the huge computational cost, molecular dynamics has traditionally been implemented on computers built around a central processing unit (CPU). Graphics processing unit (GPU) computing technology was first designed to improve video games by rapidly creating and displaying images in a frame buffer for output to a screen. The hybrid GPU-CPU implementation, combined with parallel computing, is a novel technology for performing a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations, including MD simulations. Herein, we describe these methodologies, developed initially for video games, and how they are now applied in MD simulations. PMID:27525251
THE TOPIC OF RESEARCH INTEGRITY IN LATIN AMERICA
Rodríguez, Eduardo; Lolas, Fernando
2012-01-01
This article describes the experience of trainees in the biomedical and psychosocial research ethics program of the Interdisciplinary Center for Studies on Bioethics (CIEB) of the University of Chile with the topic of research integrity in Latin America. The following problems are covered: the integrity of publications, the reporting of scientific research misconduct, definitions of research integrity, the functioning of scientific ethical review committees, the monitoring of international multi-centric clinical trials, and norms for scientific integrity and ethical oversight. PMID:22679532
On (scientific) integrity: conceptual clarification.
Patrão Neves, Maria do Céu
2018-06-01
The notion of "integrity" is currently quite common and broadly recognized as complex, mostly due to its recurring and diverse application in various distinct domains such as the physical, psychic or moral, the personal or professional, that of the human being or of the totality of beings. Nevertheless, its adjectivation imprints a specific meaning, as happens in the case of "scientific integrity". This concept has been defined mostly via negativa, by pointing out what goes against integrity, that is, through the identification of its infringements, which has also not facilitated the elaboration of an overarching and consensual code of scientific integrity. In this context, it is deemed necessary to clarify the notion of "integrity", first etymologically, recovering the original meaning of the term, and then conceptually, through the identification of the various meanings with which the term can legitimately be used, particularly in the domain of scientific research and innovation. These two steps are fundamental and indispensable for a forthcoming attempt at systematizing the requirements of "scientific integrity".
NASA Technical Reports Server (NTRS)
Goodrich, Charles C.
1993-01-01
The goal of this project is to investigate the use of visualization software based on the visual programming and data-flow paradigms to meet the needs of the SPOF and through it the International Solar Terrestrial Physics (ISTP) science community. Specific needs we address include science planning, data interpretation, comparisons of data with simulation and model results, and data acquisition. Our accomplishments during the twelve month grant period are discussed below.
Fifth Conference on Artificial Intelligence for Space Applications
NASA Technical Reports Server (NTRS)
Odell, Steve L. (Compiler)
1990-01-01
The Fifth Conference on Artificial Intelligence for Space Applications brings together diverse technical and scientific work in order to help those who employ AI methods in space applications to identify common goals and to address issues of general interest in the AI community. Topics include the following: automation for Space Station; intelligent control, testing, and fault diagnosis; robotics and vision; planning and scheduling; simulation, modeling, and tutoring; development tools and automatic programming; knowledge representation and acquisition; and knowledge base/data base integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Andrew; Di Vittorio, Alan; Collins, William
The integrated Earth system model (iESM) has been developed as a new tool for projecting the joint human/climate system. The iESM is based upon coupling an integrated assessment model (IAM) and an Earth system model (ESM) into a common modeling infrastructure. IAMs are the primary tool for describing the human-Earth system, including the sources of global greenhouse gases (GHGs) and short-lived species (SLS), land use and land cover change (LULCC), and other resource-related drivers of anthropogenic climate change. ESMs are the primary scientific tools for examining the physical, chemical, and biogeochemical impacts of human-induced changes to the climate system. The iESM project integrates the economic and human-dimension modeling of an IAM and a fully coupled ESM within a single simulation system while maintaining the separability of each model if needed. Both IAM and ESM codes are developed and used by large communities and have been extensively applied in recent national and international climate assessments. By introducing heretofore-omitted feedbacks between natural and societal drivers, we can improve scientific understanding of the human-Earth system dynamics. Potential applications include studies of the interactions and feedbacks leading to the timing, scale, and geographic distribution of emissions trajectories and other human influences, corresponding climate effects, and the subsequent impacts of a changing climate on human and natural systems.
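To make the IAM/ESM coupling structure described above concrete, the toy loop below shows the feedback cycle: the assessment model chooses emissions given the simulated climate, and the Earth system model advances the climate given those emissions. The class names, the single warming variable, and the response coefficients are hypothetical stand-ins for the real component interfaces, which exchange far richer fields (emissions, land use, downscaled climate, and more).

def run_coupled(iam, esm, periods):
    # One-way per period: IAM sees last period's climate, ESM sees this period's forcing.
    climate_state = esm.initial_state()
    for period in range(periods):
        human_forcing = iam.step(period, climate_state)
        climate_state = esm.step(period, climate_state, human_forcing)
    return climate_state

class ToyIAM:
    def step(self, period, climate_state):
        # Emissions decline as simulated warming rises (a toy mitigation response).
        return max(0.0, 10.0 - 2.0 * climate_state["warming"])

class ToyESM:
    def initial_state(self):
        return {"warming": 0.0}

    def step(self, period, climate_state, emissions):
        # Warming accumulates in proportion to emissions (a toy climate response).
        return {"warming": climate_state["warming"] + 0.01 * emissions}

print(run_coupled(ToyIAM(), ToyESM(), periods=100))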
Building a federated data infrastructure for integrating the European Supersites
NASA Astrophysics Data System (ADS)
Freda, Carmela; Cocco, Massimo; Puglisi, Giuseppe; Borgstrom, Sven; Vogfjord, Kristin; Sigmundsson, Freysteinn; Ergintav, Semih; Meral Ozel, Nurcan; Consortium, Epos
2017-04-01
The integration of satellite and in-situ Earth observations fostered by the GEO Geohazards Supersites and National Laboratories (GSNL) initiative is aimed at providing access to spaceborne and in-situ geoscience data for selected sites prone to earthquakes, volcanic eruptions and/or other environmental hazards. The initiative was launched with the "Frascati declaration" at the conclusion of the 3rd International Geohazards workshop of the Group on Earth Observations (GEO) held in November 2007 in Frascati, Italy. The development of the GSNL and the integration of in-situ and space Earth observations require the implementation of in-situ e-infrastructures and services for scientific users and other stakeholders. The European Commission has funded three projects to support the development of the European supersites: FUTUREVOLC for the Icelandic volcanoes, MED-SUV for Mt. Etna and Campi Flegrei/Vesuvius (Italy), and MARSITE for the Marmara Sea near-fault observatory (Turkey). Because the establishment of a network of supersites in Europe will, among other advantages, facilitate the link with the Global Earth Observation System of Systems (GEOSS), EPOS (the European Plate Observing System) has supported these initiatives by integrating the observing systems and infrastructures developed in these three projects into its implementation plan, which aims to integrate existing and new research infrastructures for solid Earth sciences. In this contribution we will present the EPOS federated approach and the key actions needed to: i) develop sustainable long-term Earth observation strategies preceding and following earthquakes and volcanic eruptions; ii) develop the innovative integrated e-infrastructure component necessary to create an effective service for users; iii) promote the strategic and outreach actions needed to meet specific user needs; iv) develop expertise in the use and interpretation of Supersites data in order to promote capacity building and the timely transfer of scientific knowledge. All of this will facilitate new scientific discoveries through the availability of unprecedented data sets and will increase resilience and preparedness in society. Making observations of the processes controlling natural phenomena readily available, and promoting their comparison with numerical simulations and their interpretation through theoretical analyses, will foster scientific excellence in solid Earth research. The EPOS federated approach may be considered a model for other regions of the world and could therefore contribute to developing the supersite initiative globally.
ASCR/HEP Exascale Requirements Review Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; et al.
2016-03-30
This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June, 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude -- and in some cases greater -- than that available currently. 2) The growth rate of data produced by simulations is overwhelming the current ability, of both facilities and researchers, to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.
Hypothesis testing of scientific Monte Carlo calculations.
Wallerberger, Markus; Gull, Emanuel
2017-11-01
The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
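A minimal example of the statistical testing idea argued for above (not the authors' test suite): estimate a quantity with a known exact value by Monte Carlo and apply a two-sided z-test to the sample mean, flagging deviations larger than the statistical error. The integrand, sample size, and significance level are assumptions chosen for illustration.

import numpy as np
from math import erf, sqrt

def mc_z_test(samples, exact, alpha=0.01):
    # Two-sided z-test of H0: E[sample] == exact, using the sample standard error.
    n = len(samples)
    mean = np.mean(samples)
    stderr = np.std(samples, ddof=1) / sqrt(n)
    z = (mean - exact) / stderr
    p_value = 1.0 - erf(abs(z) / sqrt(2.0))       # two-sided normal tail probability
    return p_value >= alpha, float(z), float(p_value)

# Estimate the integral of x^2 on [0, 1] (exact value 1/3) and test the estimator.
rng = np.random.default_rng(42)
samples = rng.uniform(0.0, 1.0, 100_000) ** 2
print(mc_z_test(samples, exact=1.0 / 3.0))        # (True, small z, large p) if healthy

Run repeatedly over code revisions, such a test catches biases introduced by bugs while tolerating ordinary statistical fluctuation, which is exactly the property deterministic assertion-style tests lack for stochastic codes.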
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Zhenhuan; Boyuka, David; Zou, X
The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their run-time environments. The growing gap is exacerbated by exploratory, data-intensive analytics, such as querying simulation data with multivariate, spatio-temporal constraints, which induce heterogeneous access patterns that stress the performance of the underlying storage system. Previous work addresses data layout and indexing techniques to improve query performance for a single access pattern, which is not sufficient for complex analytics jobs. We present PARLO, a parallel run-time layout optimization framework, to achieve multi-level data layout optimization for scientific applications at run-time, before data is written to storage. The layout schemes optimize for heterogeneous access patterns with user-specified priorities. PARLO is integrated with ADIOS, a high-performance parallel I/O middleware for large-scale HPC applications, to achieve user-transparent, light-weight layout optimization for scientific datasets. It offers simple XML-based configuration for users to achieve flexible layout optimization without the need to modify or recompile application codes. Experiments show that PARLO improves performance by 2 to 26 times for queries with heterogeneous access patterns compared to state-of-the-art scientific database management systems. Compared to traditional post-processing approaches, its underlying run-time layout optimization achieves a 56% savings in processing time and a reduction in storage overhead of up to 50%. PARLO also exhibits a low run-time resource requirement, while limiting the performance impact on running applications to a reasonable level.
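PARLO's actual optimization and its ADIOS configuration are not reproduced here; the toy sketch below only illustrates the underlying idea of choosing a chunked layout to serve several access patterns with user-assigned priorities. The candidate chunk shapes, cost model, and weights are hypothetical and chosen purely for illustration.

from math import ceil

def chunks_touched(chunk, region):
    # Number of chunks a rectangular query region overlaps (aligned, 2D case).
    return ceil(region[0] / chunk[0]) * ceil(region[1] / chunk[1])

def choose_layout(candidates, patterns):
    # patterns: list of (region, priority); pick the chunk shape with the
    # lowest priority-weighted total number of chunks read.
    return min(candidates,
               key=lambda c: sum(w * chunks_touched(c, r) for r, w in patterns))

# Example: full-row reads dominate (weight 3) over small spatial windows (weight 1).
candidates = [(1, 1024), (32, 32), (1024, 1)]
patterns = [((1, 1024), 3.0), ((64, 64), 1.0)]
print(choose_layout(candidates, patterns))   # row-oriented chunking wins here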
Simulation Data as Data Streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdulla, G; Arrighi, W; Critchlow, T
2003-11-18
Computational or scientific simulations are increasingly being applied to solve a variety of scientific problems. Domains such as astrophysics, engineering, chemistry, biology, and environmental studies are benefiting from this important capability. Simulations, however, produce enormous amounts of data that need to be analyzed and understood. In this overview paper, we describe scientific simulation data, its characteristics, and the way scientists generate and use the data. We then compare and contrast simulation data to data streams. Finally, we describe our approach to analyzing simulation data, present the AQSim (Ad-hoc Queries for Simulation data) system, and discuss some of the challenges that result from handling this kind of data.
Challenges in studying the effects of scientific societies on research integrity.
Levine, Felice J; Iutcovich, Joyce M
2003-04-01
Beyond impressionistic observations, little is known about the role and influence of scientific societies on research conduct. Acknowledging that the influence of scientific societies is not easily disentangled from other factors that shape norms and practices, this article addresses how best to study the promotion of research integrity generally as well as the role and impact of scientific societies as part of that process. In setting forth the parameters of a research agenda, the article addresses four issues: (1) how to conceptualize research on scientific societies and research integrity; (2) challenges and complexities in undertaking basic research; (3) strategies for undertaking basic research that is attentive to individual, situational, organizational, and environmental levels of analysis; and (4) the need for evaluation research as integral to programmatic change and to assessment of the impact of activities by scientific societies.
The Need for Analogue Missions in Scientific Human and Robotic Planetary Exploration
NASA Technical Reports Server (NTRS)
Snook, K. J.; Mendell, W. W.
2004-01-01
With the increasing challenges of planetary missions, and especially with the prospect of human exploration of the Moon and Mars, the need for Earth-based mission simulations has never been greater. The current focus on science as a major driver for planetary exploration introduces new constraints in mission design, planning, operations, and technology development. Analogue missions can be designed to address critical new integration issues arising from the new science-driven exploration paradigm. This next step builds on existing field studies and technology development at analogue sites, providing engineering, programmatic, and scientific lessons learned in relatively low-cost and low-risk environments. One of the most important outstanding questions in planetary exploration is how to optimize the human and robotic interaction to achieve maximum science return with minimum cost and risk. To answer this question, researchers are faced with the task of defining scientific return and devising ways of measuring the benefit of scientific planetary exploration to humanity. Earth-based and space-based analogue missions are uniquely suited to answer this question. Moreover, they represent the only means for integrating science operations, mission operations, crew training, technology development, psychology and human factors, and all other mission elements prior to final mission design and launch. Eventually, success in future planetary exploration will depend on our ability to prepare adequately for missions, requiring improved quality and quantity of analogue activities. This effort demands more than simply developing new technologies needed for future missions and increasing our scientific understanding of our destinations. It requires a systematic approach to the identification and evaluation of the categories of analogue activities. This paper presents one possible approach to the classification and design of analogue missions based on their degree of fidelity in ten key areas. Various case studies are discussed to illustrate the approach.
Consciousness and the Invention of Morel
Perogamvros, Lampros
2013-01-01
A scientific study of consciousness should take into consideration both objective and subjective measures of conscious experiences. To date, very few studies have tried to integrate third-person data, or data about the neurophysiological correlates of conscious states, with first-person data, or data about subjective experience. Inspired by Morel's invention (Casares, 1940), a literary machine capable of reproducing sensory-dependent external reality, this article suggests that the combination of virtual reality techniques and brain-reading technologies, that is, decoding of conscious states from brain activity alone, can offer this integration. It is also proposed that the multimodal, simulating, and integrative capacities of the dreaming brain render it an "endogenous" Morel's machine, which can potentially be used in studying consciousness, but not always in a reliable way. Both the literary machine and dreaming could contribute to a better understanding of conscious states. PMID:23467765
Scientific integrity memorandum
NASA Astrophysics Data System (ADS)
Showstack, Randy
2009-03-01
U.S. President Barack Obama signed a presidential memorandum on 9 March to help restore scientific integrity in government decision making. The memorandum directs the White House Office of Science and Technology Policy to develop a strategy within 120 days that ensures that "the selection of scientists and technology professionals for science and technology positions in the executive branch is based on those individuals' scientific and technological knowledge, credentials, and experience; agencies make available to the public the scientific or technological findings or conclusions considered or relied upon in policy decisions; agencies use scientific and technological information that has been subject to well-established scientific processes such as peer review; and agencies have appropriate rules and procedures to ensure the integrity of the scientific process within the agency, including whistleblower protection."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Underwood, Keith D; Ulmer, Craig D.; Thompson, David
Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to deliver order-of-magnitude performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system-level reliability when using FPGA devices. Acknowledgment: Arun Rodrigues provided valuable support and assistance in the use of the Structural Simulation Toolkit within an FPGA context. Curtis Janssen and Steve Plimpton provided valuable insights into the workings of two Sandia applications (MPQC and LAMMPS, respectively).
NASA Astrophysics Data System (ADS)
Kaplinger, Brian Douglas
For the past few decades, both the scientific community and the general public have become increasingly aware that the Earth lives in a shooting gallery of small objects. We classify all of these asteroids and comets, known or unknown, that cross Earth's orbit as near-Earth objects (NEOs). A look at our geologic history tells us that NEOs have collided with Earth in the past, and we expect that they will continue to do so. With thousands of known NEOs crossing the orbit of Earth, there has been significant scientific interest in developing the capability to deflect an NEO from an impacting trajectory. This thesis applies the ideas of Smoothed Particle Hydrodynamics (SPH) theory to the NEO disruption problem. A simulation package was designed that allows efficacy simulation to be integrated into the mission planning and design process. This is done by applying ideas in high-performance computing (HPC) on the computer graphics processing unit (GPU). Rather than prove a concept through large standalone simulations on a supercomputer, a highly parallel structure allows for flexible, target-dependent questions to be resolved. Built around nonclassified data and analysis, this computer package will allow academic institutions to better tackle the issue of NEO mitigation effectiveness.
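The thesis's GPU implementation is not described in the abstract; as a minimal, CPU-only sketch of the SPH building blocks it refers to, the code below evaluates the standard cubic spline kernel and the summation density estimate (the particle cloud and smoothing length are arbitrary test values, not data from the thesis):

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic spline SPH kernel (Monaghan 1992), normalized by 1/(pi*h^3)."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def summation_density(positions, masses, h):
    """Density at each particle via the SPH summation interpolant."""
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        r = np.linalg.norm(positions - positions[i], axis=1)
        rho[i] = np.sum(masses * cubic_spline_kernel(r, h))
    return rho

# Tiny example: a random cloud of equal-mass particles in a unit cube
pos = np.random.rand(200, 3)
rho = summation_density(pos, np.full(200, 1.0 / 200), h=0.2)
print(rho.mean())
```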
AGU Launches Web Site for New Scientific Integrity and Professional Ethics Policy
NASA Astrophysics Data System (ADS)
Townsend, Randy
2013-03-01
AGU's Scientific Integrity and Professional Ethics policy, approved by the AGU Board of Directors and Council in December 2012, is now available online on a new Web site, http://ethics.agu.org. As the Web site states, the policy embodies a "set of guidelines for scientific integrity and professional ethics for the actions of the members and the governance of the Union in its internal activities; in its public persona; and most importantly, in the research and peer review processes of its scientific publications, its communications and outreach, and its scientific meetings."
NASA Astrophysics Data System (ADS)
Tian, Y.; Zheng, Y.; Zheng, C.; Han, F., Sr.
2017-12-01
Physically based and fully distributed integrated hydrological models (IHMs) can quantitatively depict hydrological processes, both surface and subsurface, with sufficient spatial and temporal detail. However, the complexity involved in pre-processing data and setting up models has seriously hindered the wider application of IHMs in scientific research and management practice. This study introduces our design and development of Visual HEIFLOW, hereafter referred to as VHF, a comprehensive graphical data processing and modeling system for integrated hydrological simulation. The current version of VHF has been structured to accommodate an IHM named HEIFLOW (Hydrological-Ecological Integrated watershed-scale FLOW model). HEIFLOW is a model being developed by the authors, which has all the typical elements of physically based and fully distributed IHMs. It is based on GSFLOW, a representative integrated surface water-groundwater model developed by the USGS. HEIFLOW provides several ecological modules that enable simulation of the growth cycle of general vegetation and of particular plants (maize and Populus euphratica). VHF incorporates and streamlines all key steps of integrated modeling and accommodates all types of GIS data necessary for hydrological simulation. It provides a GIS-based data processing framework to prepare an IHM for simulations, and has functionalities to flexibly display and modify model features (e.g., model grids, streams, boundary conditions, observational sites) and their associated data. It enables visualization and various spatio-temporal analyses of all model inputs and outputs at different scales (i.e., computing unit, sub-basin, basin, or user-defined spatial extent). These system features, as well as many others, can significantly reduce the difficulty and time cost of building and using a complex IHM. The case study in the Heihe River Basin demonstrated the applicability of VHF for large-scale integrated SW-GW modeling. Visualization and spatio-temporal analysis of the modeling results by HEIFLOW greatly facilitates understanding of the complicated hydrologic cycle and of the relationships among hydrological and ecological variables in the study area, and provides insights into regional water resources management.
Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2014-12-01
The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.
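The actual CyberShake workflows are managed by Pegasus-WMS and HTCondor; purely to illustrate the produces/consumes structure the abstract describes, here is a schematic Python sketch in which invented stage names are ordered and "run" according to their file dependencies:

```python
# Hypothetical CyberShake-like stages: each produces files consumed by later stages.
STAGES = {
    "rupture_generation": {"needs": [], "makes": ["ruptures.bin"]},
    "sgt_simulation":     {"needs": ["ruptures.bin"], "makes": ["sgt.bin"]},
    "seismogram_synth":   {"needs": ["sgt.bin"], "makes": ["seis.bin"]},
    "hazard_curves":      {"needs": ["seis.bin"], "makes": ["curves.csv"]},
}

def run_in_dependency_order(stages):
    """Order stages by the files they need/make, then 'run' them in that order."""
    produced = set()
    remaining = dict(stages)
    while remaining:
        ready = [s for s, d in remaining.items() if set(d["needs"]) <= produced]
        if not ready:
            raise RuntimeError("cyclic or unsatisfiable dependencies")
        for stage in ready:
            print(f"running {stage}")          # a real workflow system submits a cluster job here
            produced.update(remaining.pop(stage)["makes"])

run_in_dependency_order(STAGES)
```

A workflow manager such as Pegasus adds the pieces this sketch omits: job scheduling on HPC resources, automatic file transfer, retries, and provenance tracking.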
AGU President's Message: Obama Administration's Commitment to Scientific Integrity
NASA Astrophysics Data System (ADS)
McPhaden, Michael J.
2011-01-01
In March 2009, President Barack Obama issued a memorandum on the subject of scientific integrity in which he stated emphatically, "Science and the scientific process must inform and guide decisions of my Administration on a wide range of issues, including improvement of public health, protection of the environment, increased efficiency in the use of energy and other resources, mitigation of the threat of climate change, and protection of national security." The president charged John Holdren, director of the Office of Science and Technology Policy (OSTP), with developing specific recommendations "for ensuring the highest level of integrity in all aspects of the executive branch's involvement with scientific and technological processes." On Friday, 17 December, OSTP released federal department and agency guidelines for implementing the administration's policies on scientific integrity.
NASA/ESA CV-990 airborne simulation of Spacelab
NASA Technical Reports Server (NTRS)
Mulholland, D.; Neel, C.; De Waard, J.; Lovelett, R.; Weaver, L.; Parker, R.
1975-01-01
The paper describes the extensive joint NASA/ESA Spacelab simulation using the NASA CV-990 airborne laboratory. The scientific payload was selected to conduct studies in upper atmospheric physics and infrared astronomy. Two experiment operators from Europe and two from the U.S. were selected to live aboard the aircraft along with a mission manager for a six-day period and operate the experiments on behalf of the principal scientists. The mission was successful and provided extensive data relevant to Spacelab objectives on overall management of a complex international payload; experiment preparation, testing, and integration; training for proxy operation in space; data handling; multiexperimenter use of common experimenter facilities (telescopes); and schedule requirements to prepare for such a Spacelab mission.
VESL: The Virtual Earth System Laboratory for Ice Sheet Modeling and Visualization
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.
2017-12-01
We present the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal. This allows for the dissemination of data, simulation of physical processes, and promotion of climate literacy. The current iteration leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Laboratory and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and we bundle pre/post-processing JavaScript scripts so that they are compatible with the existing ISSM Python/MATLAB API. Researchers using VESL will be able to effectively present their work for public dissemination with little-to-no additional post-processing. Moreover, the portal allows for real-time visualization and editing of models, cloud-based computational simulation, and downloads of relevant data. This allows for faster publication in peer-reviewed journals and adaptation of results for educational applications. Through application of this concept to multiple aspects of the Earth system, VESL is able to broaden data applications in the geosciences and beyond. At this stage, we still seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL. As we plan its expansion, we aim to achieve more rapid communication and presentation of scientific results.
The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan
2016-04-01
The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, and Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning, and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.
Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Matzen, M. Keith
2014-09-16
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.
Scientific integrity in Brazil.
Lins, Liliane; Carvalho, Fernando Martins
2014-09-01
This article focuses on scientific integrity and the identification of predisposing factors to scientific misconduct in Brazil. Brazilian scientific production has increased in the last ten years, but the quality of the articles has decreased. Pressure on researchers and students for increasing scientific production may contribute to scientific misconduct. Cases of misconduct in science have been recently denounced in the country. Brazil has important institutions for controlling ethical and safety aspects of human research, but there is a lack of specific offices to investigate suspected cases of misconduct and policies to deal with scientific dishonesty.
Virtual Reality Simulation for the Operating Room
Gallagher, Anthony G.; Ritter, E Matt; Champion, Howard; Higgins, Gerald; Fried, Marvin P.; Moses, Gerald; Smith, C Daniel; Satava, Richard M.
2005-01-01
Summary Background Data: To inform surgeons about the practical issues to be considered for successful integration of virtual reality simulation into a surgical training program. The learning and practice of minimally invasive surgery (MIS) makes unique demands on surgical training programs. A decade ago Satava proposed virtual reality (VR) surgical simulation as a solution for this problem. Only recently have robust scientific studies supported that vision. Methods: A review of the surgical education, human-factors, and psychology literature to identify important factors which will impinge on the successful integration of VR training into a surgical training program. Results: VR is more likely to be successful if it is systematically integrated into a well-thought-out education and training program which objectively assesses technical skills improvement proximate to the learning experience. Validated performance metrics should be relevant to the surgical task being trained but in general will require trainees to reach an objectively determined proficiency criterion, based on tightly defined metrics, and perform at this level consistently. VR training is more likely to be successful if the training schedule takes place on an interval basis rather than massed into a short period of extensive practice. High-fidelity VR simulations will confer the greatest skills transfer to the in vivo surgical situation, but less expensive VR trainers will also lead to considerably improved skills generalizations. Conclusions: VR for improved performance of MIS is now a reality. However, VR is only a training tool that must be thoughtfully introduced into a surgical training curriculum for it to successfully improve surgical technical skills. PMID:15650649
Art Advancing Science: Filmmaking Leads to Molecular Insights at the Nanoscale.
Reilly, Charles; Ingber, Donald E
2017-12-26
Many have recognized the potential value of facilitating activities that span the art-science interface for the benefit of society; however, there are few examples that demonstrate how pursuit of an artistic agenda can lead to scientific insights. Here, we describe how we set out to produce an entertaining short film depicting the fertilization of the egg by sperm as a parody of a preview for another Star Wars movie to excite the public about science, but ended up developing a simulation tool for multiscale modeling. To produce an aesthetic that communicates mechanical continuity across spatial scales, we developed custom strategies that integrate physics-based animation software from the entertainment industry with molecular dynamics simulation tools, using experimental data from research publications. Using this approach, we were able to depict biological physicality across multiple spatial scales, from how sperm tails move to collective molecular behavior within the axoneme to how the molecular motor, dynein, produces force at the nanometer scale. The dynein simulations, which were validated by replicating results of past simulations and cryo-electron microscopic studies, also predicted a potential mechanism for how ATP hydrolysis drives dynein motion along the microtubule as well as how dynein changes its conformation when it goes through the power stroke. Thus, pursuit of an artistic work led to insights into biology at the nanoscale as well as the development of a highly generalizable modeling and simulation technology that has utility for nanoscience and any other area of scientific investigation that involves analysis of complex multiscale systems.
SIGNUM: A Matlab, TIN-based landscape evolution model
NASA Astrophysics Data System (ADS)
Refice, A.; Giachetta, E.; Capolongo, D.
2012-08-01
Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional code efforts to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift which can be used to simulate changes in base level, thrust and faulting, as well as effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure, which allows the simulated physical processes to be easily modified and upgraded to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for development of new modules and algorithms are proposed.
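SIGNUM itself is a Matlab, TIN-based code; as a language-neutral illustration of just one of the processes listed (linear hillslope diffusion), the following Python sketch advances dz/dt = kappa * laplacian(z) explicitly on a regular grid with periodic boundaries. Grid size, time step, and diffusivity are arbitrary demonstration values, not SIGNUM defaults.

```python
import numpy as np

def hillslope_diffusion_step(z, dx, dt, kappa):
    """One explicit step of linear hillslope diffusion on a regular grid.

    Uses np.roll, so the boundaries are periodic; SIGNUM itself works on a TIN.
    """
    lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
           np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4.0 * z) / dx**2
    return z + dt * kappa * lap

# Toy landscape: a single bump relaxing over time
dx, kappa = 10.0, 1.0
dt = 0.4 * dx**2 / (4.0 * kappa)   # within the explicit stability limit dx**2 / (4*kappa)
z = np.zeros((100, 100))
z[50, 50] = 10.0
for _ in range(500):
    z = hillslope_diffusion_step(z, dx, dt, kappa)
print(z.max())
```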
Gaming science: the "Gamification" of scientific thinking.
Morris, Bradley J; Croker, Steve; Zimmerman, Corinne; Gill, Devin; Romig, Connie
2013-09-09
Science is critically important for advancing economics, health, and social well-being in the twenty-first century. A scientifically literate workforce is one that is well-suited to meet the challenges of an information economy. However, scientific thinking skills do not routinely develop and must be scaffolded via educational and cultural tools. In this paper we outline a rationale for why we believe that video games have the potential to be exploited for gain in science education. The premise we entertain is that several classes of video games can be viewed as a type of cultural tool that is capable of supporting three key elements of scientific literacy: content knowledge, process skills, and understanding the nature of science. We argue that there are three classes of mechanisms through which video games can support scientific thinking. First, there are a number of motivational scaffolds, such as feedback, rewards, and flow states that engage students relative to traditional cultural learning tools. Second, there are a number of cognitive scaffolds, such as simulations and embedded reasoning skills that compensate for the limitations of the individual cognitive system. Third, fully developed scientific thinking requires metacognition, and video games provide metacognitive scaffolding in the form of constrained learning and identity adoption. We conclude by outlining a series of recommendations for integrating games and game elements in science education and provide suggestions for evaluating their effectiveness.
NASA Astrophysics Data System (ADS)
Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.
2015-11-01
We offer a versatile workflow to convert geological models built with the Paradigm GOCAD (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a means of communication among the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
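The paper's GOCAD parsing step is not reproduced here; as a small sketch of the target side of such a conversion, the code below builds a tiny unstructured tetrahedral mesh in Python and writes it to a VTU file, assuming the read/write interface of the meshio package (an external dependency, not the authors' tool). A real workflow would first translate the GOCAD layers and faults into these point and cell arrays, carrying material identifiers along as cell data.

```python
import numpy as np
import meshio  # assumed external dependency providing Mesh/write

# Four points and one tetrahedron standing in for a converted geological volume
points = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
cells = [("tetra", np.array([[0, 1, 2, 3]]))]

# A material/layer identifier attached to the single cell block
mesh = meshio.Mesh(points, cells, cell_data={"material_id": [np.array([1])]})
meshio.write("converted_model.vtu", mesh)  # VTU output usable by downstream simulators
```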
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
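As a generic illustration of the design-of-experiments and response surface methodology mentioned above (not the paper's coaxial jet data; the design and the synthetic response are invented), this sketch fits a second-order response surface to a two-factor face-centered central composite design by least squares:

```python
import numpy as np

# Hypothetical two-factor face-centered central composite design (coded units)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
              [-1, 0], [1, 0], [0, -1], [0, 1],        # axial (face) points
              [0, 0], [0, 0], [0, 0]], dtype=float)    # center replicates
rng = np.random.default_rng(1)
y = (5 + 2 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 1]
     - 1.2 * X[:, 0]**2 + rng.normal(0, 0.1, len(X)))  # synthetic response

# Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 2))   # fitted surface coefficients, usable to locate an optimum
```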
Strategic Integration: The Practical Politics of Integrated Research in Context
ERIC Educational Resources Information Center
van Kerkhoff, Lorrae
2005-01-01
Designing an integrative research program requires that research leaders negotiate a balance between the scientific interest of research and the practical interests of non-scientific partners. This paper examines the ways integrated research is formally categorised, and analyses the tangible expressions of the practical politics involved in…
Endangered Mangroves in Segara Anakan, Indonesia: Effective and Failed Problem-Solving Policy Advice
NASA Astrophysics Data System (ADS)
Dharmawan, Budi; Böcher, Michael; Krott, Max
2017-09-01
The success of scientific knowledge transfer depends on whether the decision maker can transform the scientific advice into a policy that can be accepted by all involved actors. We use a science-policy interactions model called research-integration-utilization to observe the process of scientific knowledge transfer in the case of endangered mangroves in Segara Anakan, Indonesia. Scientific knowledge is produced within the scientific system (research), science-based solutions to problems are practically utilized by political actors (utilization), and important links between research and utilization must be made (integration). We looked for empirical evidence to test hypotheses about the research-integration-utilization model based on document analysis and expert interviews. Our study finds that the failures in knowledge transfer are caused by the inappropriate use of scientific findings. The district government is expected by presidential decree to use only scientifically sound recommendations as a prerequisite for designing the regulation. However, the district government prefers to implement its own solutions because it believes that it understands the solutions better than the researchers do. In the process of integration, the researchers cannot be involved, since the selection of scientific recommendations here depends fully on the interests of the district government as the powerful ally.
Scientific Integrity: The Need for Government Standards
NASA Astrophysics Data System (ADS)
McPhaden, Michael J.
2010-11-01
The U.S. government makes substantial investments in scientific research that address the nation’s need for accurate and authoritative information to guide federal policy decisions. Therefore, there is a lot at stake in having a consistent and explicit federal policy on scientific integrity to increase transparency and build trust in government science. Scientific integrity is an issue that applies not only to individual scientists working within the federal system but also to government agencies in how they use scientific information to formulate policy. The White House issued a memorandum on scientific integrity in March 2009, and it is regrettable that it has taken so much longer than the 120 days stipulated in the president's memo for the release of recommendations by the Office of Science and Technology Policy (OSTP) (see related news item in this issue). While it is also understandable given the welter of different agencies and organizations that make up the executive branch of the government, AGU urges that these recommendations be finalized and published as soon as possible.
Supporting observation campaigns with high resolution modeling
NASA Astrophysics Data System (ADS)
Klocke, Daniel; Brueck, Matthias; Voigt, Aiko
2017-04-01
High-resolution simulation in support of measurement campaigns offers a promising and emerging way to create large-scale context for small-scale observations of cloud and precipitation processes. As these simulations couple measured small-scale processes with the large-scale circulation, they also help to integrate the modeling and observational research communities and allow for detailed model evaluations against dedicated observations. In connection with the measurement campaign NARVAL (August 2016 and December 2013), simulations with a grid spacing of 2.5 km for the tropical Atlantic region (9000 x 3300 km), with local refinement to 1.2 km for the western part of the domain, were performed using the icosahedral non-hydrostatic (ICON) general circulation model. These simulations are in turn used to drive large-eddy-resolving simulations with the same model for selected days in the High Definition Clouds and Precipitation for advancing Climate Prediction (HD(CP)2) project. The simulations are presented with a focus on selected results showing the benefit for the scientific communities doing atmospheric measurements and numerical modeling of climate and weather. Additionally, an outlook will be given on how similar simulations will support the NAWDEX measurement campaign in the North Atlantic and the AC3 measurement campaign in the Arctic.
77 FR 29361 - Scientific Integrity: Statement of Policy
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-17
... DEPARTMENT OF LABOR Scientific Integrity: Statement of Policy AGENCY: Office of the Secretary... Integrity Policy, originally published April 17, 2012. FOR FURTHER INFORMATION CONTACT: E. Christi Cunningham, Associate Assistant Secretary for Regulatory Policy, U.S. Department of Labor, 200 Constitution...
Quasi-isochronous muon collection channels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ankenbrandt, Charles M.; Neuffer, David; Johnson, Rolland P.
2015-04-26
Intense muon beams have many potential commercial and scientific applications, ranging from low-energy investigations of the basic properties of matter using spin resonance to large energy-frontier muon colliders. However, muons originate from a tertiary process that produces a diffuse swarm. To make useful beams, the swarm must be rapidly captured and cooled before the muons decay. In this STTR project a promising new concept for the collection and cooling of muon beams to increase their intensity and reduce their emittances was investigated, namely, the use of a nearly isochronous helical cooling channel (HCC) to facilitate capture of the muons into RF bunches. The muon beam can then be cooled quickly and coalesced efficiently to optimize the luminosity of a muon collider, or could provide compressed muon beams for other applications. Optimal ways to integrate such a subsystem into the rest of a muon collection and cooling system, for collider and other applications, were developed by analysis and simulation. The application of quasi-isochronous helical cooling channels (QIHCC) for RF capture of muon beams was developed. Innovative design concepts for a channel incorporating straight solenoids, a matching section, and an HCC, including RF and absorber, were developed, and its subsystems were simulated. Additionally, a procedure that uses an HCC to combine bunches for a muon collider was invented and simulated. Difficult design aspects such as matching sections between subsystems and intensity-dependent effects were addressed. The bunch recombination procedure was developed into a complete design with 3-D simulations. Bright muon beams are needed for many commercial and scientific reasons. Potential commercial applications include low-dose radiography, muon catalyzed fusion, and the use of muon beams to screen cargo containers for homeland security. Scientific uses include low energy beams for rare process searches, muon spin resonance applications, muon beams for neutrino factories, and muon colliders as Higgs factories or energy-frontier discovery machines.
USDA-ARS's Scientific Manuscript database
On December 2-4, 2014, the US Environmental Protection Agency convened a public meeting of the FIFRA Scientific Advisory Panel (SAP) to address scientific issues associated with the agency’s “Integrated Endocrine Bioactivity and Exposure-Based Prioritization and Screening” methods. EPA is proposing ...
Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam
2018-03-11
To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with the Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface-based development and/or command-line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction, and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.
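The Pulseq file format and the Pulseq-GPI interface are not reproduced here; the sketch below only illustrates, in plain Python, the underlying idea of describing a pulse sequence as an ordered list of hardware-independent event blocks and exporting it as text. The block fields, units, and the toy gradient-echo-like loop are invented for illustration and do not correspond to the actual Pulseq specification or API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Block:
    """One sequence block: events that play out together on the scanner (hypothetical fields)."""
    rf_flip_deg: float = 0.0       # RF excitation flip angle
    gx_area: float = 0.0           # readout-gradient area (arbitrary units)
    adc_samples: int = 0           # number of ADC samples to acquire
    duration_ms: float = 1.0

@dataclass
class Sequence:
    blocks: List[Block] = field(default_factory=list)

    def add_block(self, block: Block) -> None:
        self.blocks.append(block)

    def write(self, path: str) -> None:
        """Dump the blocks to a simple text file (a stand-in for the real .seq format)."""
        with open(path, "w") as f:
            for i, b in enumerate(self.blocks):
                f.write(f"{i} rf={b.rf_flip_deg} gx={b.gx_area} "
                        f"adc={b.adc_samples} dur={b.duration_ms}\n")

# A toy gradient-echo-like loop: excite, then read out, repeated for a few lines
seq = Sequence()
for _ in range(8):
    seq.add_block(Block(rf_flip_deg=15.0, duration_ms=2.0))
    seq.add_block(Block(gx_area=1.0, adc_samples=256, duration_ms=4.0))
seq.write("demo_sequence.txt")
```

The value of the real Pulseq format is that this kind of hardware-independent description can be interpreted on scanners from different vendors, which is what makes a single design/simulation/reconstruction pipeline like Pulseq-GPI possible.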
Estrada, Mica; Woodcock, Anna; Hernandez, Paul R.; Schultz, P. Wesley
2010-01-01
Students from several ethnic minority groups are underrepresented in the sciences, such that minority students more frequently drop out of the scientific career path than non-minority students. Viewed from a perspective of social influence, this pattern suggests that minority students do not integrate into the scientific community at the same rate as non-minority students. Kelman (1958, 2006) describes a tripartite integration model of social influence (TIMSI) by which a person orients to a social system. To test if this model predicts integration into the scientific community, we conducted analyses of data from a national panel of minority science students. A structural equation model framework showed that self-efficacy (operationalized consistent with Kelman’s ‘rule-orientation’) predicted student intentions to pursue a scientific career. However, when identification as a scientist and internalization of values are added to the model, self-efficacy becomes a poorer predictor of intention. Additional mediation analyses support the conclusion that while having scientific self-efficacy is important, identifying with and endorsing the values of the social system reflect a deeper integration and more durable motivation to persist as a scientist. PMID:21552374
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
NASA Astrophysics Data System (ADS)
Boyce, S. E.; Hanson, R. T.; Henson, W.; Ferguson, I. M.; Schmid, W.; Reimann, T.; Mehl, S.
2017-12-01
The One-Water Hydrologic Flow Model (One-Water) is a MODFLOW-based integrated hydrologic flow model designed for the analysis of a broad range of conjunctive-use and sustainability issues. It was motivated by the need to merge the multiple variants of MODFLOW-2005 into an enhanced, unified version capable of simulating conjunctive use and management, sustainability, and climate-related issues, and of managing the relationships between groundwater, surface water, and land usage. One-Water links the movement and use of groundwater, surface water, and imported water for consumption by agriculture and natural vegetation on the landscape, and for potable and other uses, within a supply-and-demand framework. The first version, released in 2014, was selected by The World Bank Water Resource Software Review in 2016 as one of three recommended simulation programs for conjunctive use and management modeling. One-Water is also being used as the primary simulation engine for FREEWAT, a European Union sponsored open-source water management software environment. The next version of One-Water will include a new surface-water operations module that simulates dynamic reservoir operations and a conduit-flow process for karst aquifers and leaky pipe networks. It will also include enhancements to local grid refinement, and additional features to facilitate easier model updates, faster execution, better error messages, and more integration and cross-communication among the traditional MODFLOW packages. The new structure also facilitates integration into a "self-updating" chain of data streams, simulation, and analysis needed for modern water resource management. By retaining and tracking the water within the hydrosphere, One-Water accounts for "all of the water everywhere and all of the time." This philosophy gives the scientific community more confidence in the water accounting and provides the public a foundation needed to address wider classes of problems. Ultimately, more complex questions are being asked about water resources, requiring tools that more completely answer conjunctive-use management questions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D E; McGraw, J R
2006-02-02
Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.
NASA Astrophysics Data System (ADS)
Strassmann, Kuno M.; Joos, Fortunat
2018-05-01
The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle-climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate-carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle-climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.
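To illustrate the impulse-response-function idea at the heart of BernSCM (with invented weights and timescales, not the model's calibrated parameters), the sketch below represents the airborne fraction of a CO2 pulse as a sum of exponentials and convolves it with an emissions series:

```python
import numpy as np

# Toy impulse response: fraction of an emitted CO2 pulse remaining airborne after t years.
# The weights and timescales are invented for illustration, not BernSCM's calibration.
WEIGHTS = np.array([0.2, 0.3, 0.3, 0.2])
TIMESCALES_YR = np.array([np.inf, 250.0, 30.0, 5.0])   # inf = effectively permanent fraction

def irf(t):
    """Impulse response r(t) as a weighted sum of exponential decays."""
    decay = np.where(np.isinf(TIMESCALES_YR), 1.0, np.exp(-t[:, None] / TIMESCALES_YR))
    return decay @ WEIGHTS

def airborne_perturbation(emissions_gtc, dt=1.0):
    """Convolve annual emissions with the impulse response to get the airborne CO2 perturbation."""
    n = len(emissions_gtc)
    r = irf(np.arange(n) * dt)
    return np.array([np.sum(emissions_gtc[:k + 1] * r[:k + 1][::-1]) * dt
                     for k in range(n)])   # GtC remaining in the atmosphere at each step

emissions = np.full(100, 10.0)             # constant 10 GtC/yr for a century
print(airborne_perturbation(emissions)[-1])
```

The real model couples such substitutes for ocean and land carbon uptake with a heat-budget equation and parametrized nonlinearities, which is what keeps it accurate at decadal time steps while remaining cheap enough for integrated assessment work.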
Continuum-kinetic approach to sheath simulations
NASA Astrophysics Data System (ADS)
Cagas, Petr; Hakim, Ammar; Srinivasan, Bhuvana
2016-10-01
Simulations of sheaths are performed using a novel continuum-kinetic model with collisions including ionization/recombination. A discontinuous Galerkin method is used to directly solve the Boltzmann-Poisson system to obtain a particle distribution function. Direct discretization of the distribution function has advantages of being noise-free compared to particle-in-cell methods. The distribution function, which is available at each node of the configuration space, can be readily used to calculate the collision integrals in order to get ionization and recombination operators. Analytical models are used to obtain the cross-sections as a function of energy. Results will be presented incorporating surface physics with a classical sheath in Hall thruster-relevant geometry. This work was sponsored by the Air Force Office of Scientific Research under Grant Number FA9550-15-1-0193.
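The authors' discontinuous Galerkin discretization is not shown here; as a minimal illustration of how a gridded distribution function can be reduced to fluid moments and a crude ionization source by velocity-space quadrature (the Maxwellian test function and the rate coefficient are made up), consider:

```python
import numpy as np

# Gridded 1D-1V distribution function f(x, v): a drifting Maxwellian as a stand-in
# for the plasma species; velocities are in thermal units.
x = np.linspace(0.0, 1.0, 64)
v = np.linspace(-6.0, 6.0, 128)
dv = v[1] - v[0]
f = np.ones((x.size, 1)) * np.exp(-0.5 * (v - 0.3) ** 2) / np.sqrt(2.0 * np.pi)

def moments(f, v, dv):
    """Number density and mean velocity as velocity-space integrals of f."""
    n = f.sum(axis=1) * dv
    u = (f * v).sum(axis=1) * dv / n
    return n, u

def ionization_source(f, v, dv, rate_of_v):
    """Crude ionization source: integral of f weighted by a velocity-dependent rate."""
    return (f * rate_of_v(v)).sum(axis=1) * dv

n, u = moments(f, v, dv)
src = ionization_source(f, v, dv, lambda v: 1e-3 * np.clip(np.abs(v) - 2.0, 0.0, None))
print(n.mean(), u.mean(), src.mean())
```

Because a continuum method carries f on a grid at every configuration-space node, such integrals are evaluated directly and without particle noise, which is the advantage over particle-in-cell methods that the abstract highlights.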
Land Cover Applications, Landscape Dynamics, and Global Change
Tieszen, Larry L.
2007-01-01
The Land Cover Applications, Landscape Dynamics, and Global Change project at U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) seeks to integrate remote sensing and simulation models to better understand and seek solutions to national and global issues. Modeling processes related to population impacts, natural resource management, climate change, invasive species, land use changes, energy development, and climate mitigation all pose significant scientific opportunities. The project activities use remotely sensed data to support spatial monitoring, provide sensitivity analyses across landscapes and large regions, and make the data and results available on the Internet with data access and distribution, decision support systems, and on-line modeling. Applications support sustainable natural resource use, carbon cycle science, biodiversity conservation, climate change mitigation, and robust simulation modeling approaches that evaluate ecosystem and landscape dynamics.
ISCR Annual Report: Fiscal Year 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, J R
2005-03-03
Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
AGU's Updated Scientific Integrity and Professional Ethics Policy
NASA Astrophysics Data System (ADS)
McPhaden, M. J.
2017-12-01
AGU's mission is to promote discovery in Earth and space science for the benefit of humanity. This mission can only be accomplished if all those engaged in the scientific enterprise uphold the highest standards of scientific integrity and professional ethics. AGU's Scientific Integrity and Professional Ethics Policy provides a set of principles and guidelines for AGU members, staff, volunteers, contractors, and non-members participating in AGU-sponsored programs and activities. The policy has recently been updated to include a new code of conduct that broadens the definition of scientific misconduct to include discrimination, harassment, and bullying. This presentation provides the context for what motivated the updated policy, an outline of the policy itself, and a discussion of how it is being communicated and applied.
Explicit integration with GPU acceleration for large kinetic networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brock, Benjamin; Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37830; Belt, Andrew
2015-12-01
We demonstrate the first implementation of recently-developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve of order 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. This orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.
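The flavor of a fast explicit kinetic update can be conveyed with a toy two-species network; the explicit-asymptotic step below is a schematic stand-in for the algorithms referenced above (and runs on a CPU, not a GPU), with made-up rate constants.

import numpy as np

def asymptotic_step(y, production, loss_rate, dt, stiff_threshold=1.0):
    """One explicit-asymptotic update for dY/dt = F_plus - k*Y.
    Where k*dt is large (stiff), use the asymptotic form; otherwise take a
    plain forward-Euler step. Schematic only."""
    stiff = loss_rate * dt > stiff_threshold
    euler = y + dt * (production - loss_rate * y)
    asymptotic = (y + dt * production) / (1.0 + dt * loss_rate)
    return np.where(stiff, asymptotic, euler)

def toy_network(y):
    """Toy 2-species network A <-> B with made-up rate constants."""
    k_ab, k_ba = 1.0e3, 1.0
    production = np.array([k_ba * y[1], k_ab * y[0]])
    loss_rate = np.array([k_ab, k_ba])
    return production, loss_rate

if __name__ == "__main__":
    y = np.array([1.0, 0.0])          # start with all abundance in species A
    dt, t_end, t = 2.0e-3, 0.1, 0.0
    while t < t_end:
        prod, loss = toy_network(y)
        y = asymptotic_step(y, prod, loss, dt)
        t += dt
    print("final abundances:", y)     # approaches the A <-> B equilibrium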
Discovery informatics in biological and biomedical sciences: research challenges and opportunities.
Honavar, Vasant
2015-01-01
New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze, and construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery or assisting humans in discovery through advances in (i) understanding, formalization, and information-processing accounts of the entire scientific process; (ii) design, development, and evaluation of the computational artifacts (representations, processes) that embody such understanding; and (iii) application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).
A domain-specific compiler for a parallel multiresolution adaptive numerical simulation environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram
This paper describes the design and implementation of a layered domain-specific compiler to support MADNESS---Multiresolution ADaptive Numerical Environment for Scientific Simulation. MADNESS is a high-level software environment for the solution of integral and differential equations in many dimensions, using adaptive and fast harmonic analysis methods with guaranteed precision. MADNESS uses k-d trees to represent spatial functions and implements operators like addition, multiplication, differentiation, and integration on the numerical representation of functions. The MADNESS runtime system provides global namespace support and a task-based execution model including futures. MADNESS is currently deployed on massively parallel supercomputers and has enabled many science advances. Due to the highly irregular and statically unpredictable structure of the k-d trees representing the spatial functions encountered in MADNESS applications, only purely runtime approaches to optimization have previously been implemented in the MADNESS framework. This paper describes a layered domain-specific compiler developed to address some performance bottlenecks in MADNESS. The newly developed static compile-time optimizations, in conjunction with the MADNESS runtime support, enable significant performance improvement for the MADNESS framework.
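Purely as a schematic of the kind of tree-structured function representation on which such operators act, the sketch below holds a toy 1D refinement tree and adds two trees node-wise, refining leaves on the fly. It is not MADNESS's actual multiresolution data structure, API, or compiler output.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Node:
    """A node of a toy 1D refinement tree: either a leaf holding a single
    coefficient for its interval, or an interior node with two children."""
    coeff: Optional[float] = None
    children: Optional[Tuple["Node", "Node"]] = None

    def is_leaf(self) -> bool:
        return self.children is None

def refine(node: Node) -> Node:
    """Split a leaf into two children carrying the same coefficient
    (a crude stand-in for projecting to a finer scale)."""
    return Node(children=(Node(coeff=node.coeff), Node(coeff=node.coeff)))

def add(a: Node, b: Node) -> Node:
    """Recursive node-wise addition of two trees, refining leaves on the fly
    so that the two operands always match in structure."""
    if a.is_leaf() and b.is_leaf():
        return Node(coeff=a.coeff + b.coeff)
    if a.is_leaf():
        a = refine(a)
    if b.is_leaf():
        b = refine(b)
    return Node(children=(add(a.children[0], b.children[0]),
                          add(a.children[1], b.children[1])))

if __name__ == "__main__":
    f = Node(children=(Node(coeff=1.0), Node(coeff=2.0)))  # 1.0 on left half, 2.0 on right half
    g = Node(coeff=0.5)                                    # coarse constant function
    h = add(f, g)
    print(h.children[0].coeff, h.children[1].coeff)        # 1.5 2.5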
Visualization and analysis of vortex-turbine intersections in wind farms.
Shafii, Sohail; Obermaier, Herald; Linn, Rodman; Koo, Eunmo; Hlawitschka, Mario; Garth, Christoph; Hamann, Bernd; Joy, Kenneth I
2013-09-01
Characterizing the interplay between the vortices and forces acting on a wind turbine's blades in a qualitative and quantitative way holds the potential for significantly improving large wind turbine design. This paper introduces an integrated pipeline for highly effective wind and force field analysis and visualization. We extract vortices induced by a turbine's rotation in a wind field, and characterize vortices in conjunction with numerically simulated forces on the blade surfaces as these vortices strike another turbine's blades downstream. The scientifically relevant issue is the relationship between the extracted, approximate locations where vortices strike the blades and the forces acting at those locations. This integrated approach is used to detect and analyze turbulent flow that causes local impact on the wind turbine blade structure. The results that we present are based on analyzing the wind and force field data sets generated by numerical simulations, and allow domain scientists to relate vortex-blade interactions with power output loss in turbines and turbine life expectancy. Our methods have the potential to improve turbine design to save costs related to turbine operation and maintenance.
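One generic way to flag vortical regions in a simulated wind field is the Q-criterion on the velocity-gradient tensor, sketched below. This is an illustration of vortex detection in general, not necessarily the extraction method used in the paper.

import numpy as np

def q_criterion(u, v, w, dx, dy, dz):
    """Q-criterion on a uniform 3D grid: Q = 0.5*(||Omega||^2 - ||S||^2),
    where S and Omega are the symmetric and antisymmetric parts of the
    velocity-gradient tensor. Q > 0 flags rotation-dominated (vortical) cells."""
    comps = (u, v, w)
    spacings = (dx, dy, dz)
    # grad[i][j] = d(u_i)/d(x_j), shape (3, 3, nx, ny, nz)
    grad = np.array([np.gradient(c, *spacings) for c in comps])
    s = 0.5 * (grad + grad.transpose(1, 0, 2, 3, 4))        # strain-rate tensor
    omega = 0.5 * (grad - grad.transpose(1, 0, 2, 3, 4))    # rotation tensor
    return 0.5 * (np.sum(omega ** 2, axis=(0, 1)) - np.sum(s ** 2, axis=(0, 1)))

if __name__ == "__main__":
    # Synthetic test: a solid-body-like vortex around the z axis.
    x = y = z = np.linspace(-1.0, 1.0, 32)
    dx = dy = dz = x[1] - x[0]
    X, Y, _ = np.meshgrid(x, y, z, indexing="ij")
    u, v, w = -Y, X, np.zeros_like(X)
    q = q_criterion(u, v, w, dx, dy, dz)
    print("fraction of cells flagged vortical:", np.mean(q > 0.0))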
Enlightening Students about Dark Matter
NASA Astrophysics Data System (ADS)
Hamilton, Kathleen; Barr, Alex; Eidelman, Dave
2018-01-01
Dark matter pervades the universe. While it is invisible to us, we can detect its influence on matter we can see. To illuminate this concept, we have created an interactive JavaScript program illustrating predictions made by six different models for dark matter distributions in galaxies. Students are able to match the predicted data with actual experimental results, drawn from several astronomy papers discussing dark matter’s impact on galactic rotation curves. Programming each new model requires integration of density equations with parameters determined by nonlinear curve-fitting using MATLAB scripts we developed. Using our JavaScript simulation, students can determine the most plausible dark matter models as well as the average percentage of dark matter lurking in galaxies, areas the scientific community is still actively researching. In that light, we strive to use the most up-to-date and accepted concepts: two of our dark matter models are the pseudo-isothermal halo and Navarro-Frenk-White, and we integrate out to each galaxy’s virial radius. Currently, our simulation includes NGC3198, NGC2403, and our own Milky Way.
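The calculation behind such rotation-curve predictions, integrating a halo density profile to an enclosed mass and circular velocity, can be sketched as follows. The Navarro-Frenk-White parameters are arbitrary illustrative values, not fits to NGC3198, NGC2403, or the Milky Way, and the snippet is independent of the authors' JavaScript and MATLAB code.

import numpy as np

G = 4.30091e-6  # gravitational constant in kpc * (km/s)^2 / M_sun

def nfw_enclosed_mass(r_kpc, rho0, r_s):
    """Mass enclosed within radius r for a Navarro-Frenk-White halo,
    rho(r) = rho0 / [(r/r_s)(1 + r/r_s)^2]; analytic integral."""
    x = r_kpc / r_s
    return 4.0 * np.pi * rho0 * r_s ** 3 * (np.log(1.0 + x) - x / (1.0 + x))

def rotation_velocity(r_kpc, rho0, r_s):
    """Circular velocity v(r) = sqrt(G * M(<r) / r) in km/s."""
    return np.sqrt(G * nfw_enclosed_mass(r_kpc, rho0, r_s) / r_kpc)

if __name__ == "__main__":
    # Arbitrary illustrative halo: characteristic density and scale radius.
    rho0, r_s = 1.0e7, 15.0           # M_sun / kpc^3, kpc
    radii = np.linspace(1.0, 40.0, 9)
    for r, v in zip(radii, rotation_velocity(radii, rho0, r_s)):
        print(f"r = {r:5.1f} kpc  ->  v_circ = {v:6.1f} km/s")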
Cryosphere Science Outreach using the NASA/JPL Virtual Earth System Laboratory
NASA Astrophysics Data System (ADS)
Larour, E. Y.; Cheng, D. L. C.; Quinn, J.; Halkides, D. J.; Perez, G. L.
2016-12-01
Understanding the role of Cryosphere Science within the larger context of Sea Level Rise is both a technical and educational challenge that needs to be addressed if the public at large is to truly understand the implications and consequences of Climate Change. Within this context, we propose a new approach in which scientific tools are used directly inside a mobile/website platform geared towards Education/Outreach. Here, we apply this approach by using the Ice Sheet System Model, a state-of-the-art Cryosphere model developed at NASA and integrated within a Virtual Earth System Laboratory, with the goal of bringing Cryosphere science to K-12 and college-level students. The approach mixes laboratory experiments, interactive classes/lessons on a website, and a simplified interface to a full-fledged instance of ISSM to validate the classes/lessons. This novel approach leverages new insights from the Outreach/Educational community and the interest of new generations in web-based technologies and simulation tools, all of it delivered in a seamlessly integrated web platform, relying on a state-of-the-art climate model and live simulations.
Embedding Scientific Integrity and Ethics into the Scientific Process and Research Data Lifecycle
NASA Astrophysics Data System (ADS)
Gundersen, L. C.
2016-12-01
Predicting climate change, developing resources sustainably, and mitigating natural hazard risk are complex interdisciplinary challenges in the geosciences that require the integration of data and knowledge from disparate disciplines and scales. This kind of interdisciplinary science can only thrive if scientific communities work together and adhere to common standards of scientific integrity, ethics, data management, curation, and sharing. Science and data without integrity and ethics can erode the very fabric of the scientific enterprise and potentially harm society and the planet. Inaccurate risk analyses of natural hazards can lead to poor choices in construction, insurance, and emergency response. Incorrect assessment of mineral resources can bankrupt a company, destroy a local economy, and contaminate an ecosystem. This paper presents key ethics and integrity questions paired with the major components of the research data life cycle. The questions can be used by the researcher during the scientific process to help ensure the integrity and ethics of their research and adherence to sound data management practice. Questions include considerations for open, collaborative science, which is fundamentally changing the responsibility of scientists regarding data sharing and reproducibility. The publication of primary data, methods, models, software, and workflows must become a norm of science. There are also questions that prompt the scientist to think about the benefit of their work to society; ensuring equity, respect, and fairness in working with others; and always striving for honesty, excellence, and transparency.
NeuroManager: a workflow analysis based simulation management engine for computational neuroscience
Stockton, David B.; Santamaria, Fidel
2015-01-01
We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175
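NeuroManager itself is implemented in MATLAB; purely to illustrate the staged-workflow idea described above, the Python sketch below chains a few invented submission stages with time-stamped logging. The stage names and job fields are hypothetical, not NeuroManager's API.

import datetime

def stage_prepare_inputs(job):
    job["prepared"] = True
    return job

def stage_upload(job):
    job["uploaded_to"] = job["resource"]
    return job

def stage_submit(job):
    job["submitted_at"] = datetime.datetime.now().isoformat(timespec="seconds")
    return job

def run_workflow(job, stages):
    """Run a job through an ordered list of stages, logging each step with a
    timestamp (a toy analogue of a staged submission workflow)."""
    for stage in stages:
        job = stage(job)
        print(f"[{datetime.datetime.now():%H:%M:%S}] finished {stage.__name__}")
    return job

if __name__ == "__main__":
    job = {"simulator": "NEURON", "resource": "cluster-A", "params": {"gbar": 0.12}}
    job = run_workflow(job, [stage_prepare_inputs, stage_upload, stage_submit])
    print(job)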
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.
The Nature Index: a general framework for synthesizing knowledge on the state of biodiversity.
Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G; Nybø, Signe
2011-04-22
The magnitude and urgency of the biodiversity crisis is widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the biodiversity crisis. Thus, integrated biodiversity indexes will greatly facilitate information transfer from science toward other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is supposed to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different ways of interpreting the reference states, e.g., optimal situations or minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte-Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide.
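The core scaling-and-aggregation step can be illustrated in a few lines: each indicator value is divided by its expert-set reference state (capped at 1) and the scaled values are combined by a weighted mean. This is a schematic reading of the framework with invented numbers, not the published implementation.

import numpy as np

def scale_indicators(values, references):
    """Scale each indicator by its reference state, capped at 1 so that values
    at or above the reference count as fully 'intact'."""
    return np.minimum(np.asarray(values, float) / np.asarray(references, float), 1.0)

def nature_index(values, references, weights=None):
    """Weighted mean of scaled indicator values over an area or theme."""
    scaled = scale_indicators(values, references)
    if weights is None:
        weights = np.ones_like(scaled)
    weights = np.asarray(weights, float)
    return np.sum(weights * scaled) / np.sum(weights)

if __name__ == "__main__":
    obs = [120.0, 8.0, 0.3]       # invented indicator observations
    ref = [200.0, 10.0, 0.25]     # invented reference (undisturbed) states
    print(f"Nature Index = {nature_index(obs, ref):.2f}")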
In-Situ-measurement of restraining forces during forming of rectangular cups
NASA Astrophysics Data System (ADS)
Singer, M.; Liewald, M.
2016-11-01
This contribution introduces a new method for evaluating the restraining forces during the forming of rectangular cups, with the goal of eliminating the disadvantages of currently established measurement procedures. With this method, forming forces are measured indirectly through the elastic deformation of the die structure caused by the locally varying tribological system. To this end, two sensors that measure the restraining forces during the forming process were integrated into the punch. Furthermore, it was possible to evaluate the effects of different lubricants, showing the time-dependent trend as a function of stroke during the forming of the materials DP600 and DC04. A main advantage of this testing method is that it yields friction-related data directly from the physical deep-drawing process and measures the actual restraining forces at different areas of the deep-drawn part in a single test. Measurement results gained by both sensors were integrated into an LS-Dyna simulation in which the coefficient of friction was treated as a function of time. The simulated and deep-drawn parts were then analysed and compared in specific areas with regard to the locally measured part thickness. Results show an improvement in simulation quality when using locally varying, time-dependent coefficients of friction compared to the commonly used constant values.
Evaluation of the whole body physiologically based pharmacokinetic (WB-PBPK) modeling of drugs.
Munir, Anum; Azam, Shumaila; Fazal, Sahar; Bhatti, A I
2018-08-14
Physiologically based pharmacokinetic (PBPK) modeling is a supporting tool in drug discovery and development. Simulations produced by these models help to save time and aid in examining the effects of different variables on the pharmacokinetics of drugs. For this purpose, Sheila and Peters suggested a PBPK model capable of performing simulations to study a given drug's absorption. There is a need to extend this model to the whole body, covering all the other processes, namely distribution, metabolism, and elimination, besides absorption. The aim of this study is to propose a WB-PBPK model by integrating absorption, distribution, metabolism, and elimination processes with the existing PBPK model. Absorption, distribution, metabolism, and elimination models are designed, integrated with the PBPK model, and validated. For validation purposes, clinical records of a few drugs were collected from the literature. The developed WB-PBPK model is affirmed by comparing the simulations produced by the model against the clinical data retrieved from the literature. It is proposed that the WB-PBPK model may be used in the pharmaceutical industry to create pharmacokinetic profiles of drug candidates for better outcomes, as it is an advanced PBPK model and creates comprehensive PK profiles for drug ADME in concentration-time plots. Copyright © 2018 Elsevier Ltd. All rights reserved.
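The compartmental mass balances underlying such a model can be illustrated with a minimal absorption/elimination sketch. The two-compartment structure and all rate constants below are invented for illustration and are not the authors' WB-PBPK model.

import numpy as np

def simulate_pk(dose_mg, ka, ke, vd_l, t_end_h=24.0, dt_h=0.05):
    """Forward-Euler integration of a minimal absorption/elimination model:
        dA_gut/dt     = -ka * A_gut
        dA_central/dt =  ka * A_gut - ke * A_central
    Returns times [h] and plasma concentration [mg/L]."""
    n = int(t_end_h / dt_h) + 1
    times = np.linspace(0.0, t_end_h, n)
    a_gut, a_central = dose_mg, 0.0
    conc = np.empty(n)
    for i, _ in enumerate(times):
        conc[i] = a_central / vd_l
        a_gut_new = a_gut - dt_h * ka * a_gut
        a_central += dt_h * (ka * a_gut - ke * a_central)
        a_gut = a_gut_new
    return times, conc

if __name__ == "__main__":
    # Invented parameters: oral dose, absorption and elimination rate constants,
    # volume of distribution.
    t, c = simulate_pk(dose_mg=500.0, ka=1.0, ke=0.2, vd_l=40.0)
    peak = np.argmax(c)
    print(f"Cmax ~ {c[peak]:.2f} mg/L at t ~ {t[peak]:.1f} h")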
Christensen, A. J.; Srinivasan, V.; Hart, J. C.; ...
2018-03-17
Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. Lastly, this survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.
Society for Academic Emergency Medicine Statement on Plagiarism.
Asher, Shellie L; Iserson, Kenneth V; Merck, Lisa H
2017-10-01
The integrity of the research enterprise is of the utmost importance for the advancement of safe and effective medical practice for patients and for maintaining the public trust in health care. Academic societies and editors of journals are key participants in guarding scientific integrity. Avoiding and preventing plagiarism helps to preserve the scientific integrity of professional presentations and publications. The Society for Academic Emergency Medicine (SAEM) Ethics Committee discusses current issues in scientific publishing integrity and provides a guideline to avoid plagiarism in SAEM presentations and publications. © 2017 by the Society for Academic Emergency Medicine.
Advanced Technology Training System on Motor-Operated Valves
NASA Technical Reports Server (NTRS)
Wiederholt, Bradley J.; Widjaja, T. Kiki; Yasutake, Joseph Y.; Isoda, Hachiro
1993-01-01
This paper describes how features from the field of Intelligent Tutoring Systems are applied to the Motor-Operated Valve (MOV) Advanced Technology Training System (ATTS). The MOV ATTS is a training system developed at Galaxy Scientific Corporation for the Central Research Institute of Electric Power Industry in Japan and the Electric Power Research Institute in the United States. The MOV ATTS combines traditional computer-based training approaches with system simulation, integrated expert systems, and student and expert modeling. The primary goal of the MOV ATTS is to reduce human errors that occur during MOV overhaul and repair. The MOV ATTS addresses this goal by providing basic operational information of the MOV, simulating MOV operation, providing troubleshooting practice of MOV failures, and tailoring this training to the needs of each individual student. The MOV ATTS integrates multiple expert models (functional and procedural) to provide advice and feedback to students. The integration also provides expert model validation support to developers. Student modeling is supported by two separate student models: one model registers and updates the student's current knowledge of basic MOV information, while another model logs the student's actions and errors during troubleshooting exercises. These two models are used to provide tailored feedback to the student during the MOV course.
NASA Astrophysics Data System (ADS)
Pansing, Craig W.; Hua, Hong; Rolland, Jannick P.
2005-08-01
Head-mounted display (HMD) technologies find a variety of applications in the field of 3D virtual and augmented environments, 3D scientific visualization, as well as wearable displays. While most of the current HMDs use head pose to approximate line of sight, we propose to investigate approaches and designs for integrating eye tracking capability into HMDs from a low-level system design perspective and to explore schemes for optimizing system performance. In this paper, we particularly propose to optimize the illumination scheme, which is a critical component in designing an eye tracking-HMD (ET-HMD) integrated system. An optimal design can improve not only eye tracking accuracy, but also robustness. Using LightTools, we present the simulation of a complete eye illumination and imaging system using an eye model along with multiple near infrared LED (IRLED) illuminators and imaging optics, showing the irradiance variation of the different eye structures. The simulation of dark pupil effects along with multiple 1st-order Purkinje images will be presented. A parametric analysis is performed to investigate the relationships between the IRLED configurations and the irradiance distribution at the eye, and a set of optimal configuration parameters is recommended. The analysis will be further refined by actual eye image acquisition and processing.
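A parametric study of IRLED placement ultimately reduces to summing each source's contribution to the irradiance at points on the eye. The minimal inverse-square/cosine sketch below, with invented geometry and radiant intensities and with refraction and shadowing ignored, illustrates that kind of calculation, not the LightTools simulation itself.

import numpy as np

def irradiance_at_points(points, led_positions, led_normals, intensities_mw_sr):
    """Sum irradiance [mW/mm^2] at surface points from several IR LEDs,
    treating each LED as a point source with a cosine (Lambertian) emission
    pattern: E = I * cos(theta) / d^2."""
    e = np.zeros(len(points))
    for pos, nrm, intensity in zip(led_positions, led_normals, intensities_mw_sr):
        vec = points - pos                        # LED-to-point vectors [mm]
        dist = np.linalg.norm(vec, axis=1)
        cos_theta = np.clip((vec @ nrm) / dist, 0.0, None)
        e += intensity * cos_theta / dist ** 2
    return e

if __name__ == "__main__":
    # Invented geometry: a ring of 4 IRLEDs ~30 mm in front of the cornea,
    # all aimed along +z toward a row of points near the pupil plane.
    angles = np.linspace(0.0, 2.0 * np.pi, 4, endpoint=False)
    leds = np.stack([12.0 * np.cos(angles), 12.0 * np.sin(angles), np.full(4, -30.0)], axis=1)
    normals = np.tile(np.array([0.0, 0.0, 1.0]), (4, 1))
    xs = np.linspace(-3.0, 3.0, 5)
    pts = np.array([[x, 0.0, 0.0] for x in xs])
    e = irradiance_at_points(pts, leds, normals, np.full(4, 10.0))
    for x, val in zip(xs, e):
        print(f"x = {x:+.1f} mm : E = {val:.4f} mW/mm^2")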
[Identification of ecological corridors and its importance by integrating circuit theory].
Song, Li Li; Qin, Ming Zhou
2016-10-01
Landscape connectivity is considered an extraordinarily important factor affecting various ecological processes. The least-cost path (LCP) based on the minimum cumulative resistance model (MCRM) provides an efficient approach to identifying functional connectivity in heterogeneous landscapes and has already been adopted in research on landscape functional connectivity assessment and ecological corridor simulation. The connectivity model based on circuit theory (CMCT) replaces the edges of graph theory with resistors and cost distance with resistance distance to measure functional connectivity in heterogeneous landscapes. By means of the Linkage Mapper tool and the Circuitscape software, a simulated landscape generated with the SIMMAP 2.0 software was taken as the study object in this article, with the aim of exploring how to integrate the MCRM with the CMCT to identify ecological corridors and the relative importance of landscape factors. The results showed that the two models have individual advantages and complement each other. The MCRM could effectively identify least-cost corridors among habitats. The CMCT could effectively identify important landscape factors and pinch points, which have an important influence on landscape connectivity. We also found that the position of a pinch point was not affected by corridor width, which is an obvious advantage in research on identifying the importance of corridors. The integrated method could provide a scientific basis for regional ecological protection planning and ecological corridor design.
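The circuit-theory step of replacing cost distance with resistance distance can be sketched with a graph Laplacian: for a small resistor network, the effective resistance between two nodes follows from the Laplacian pseudo-inverse. The toy landscape graph below is invented for illustration; Circuitscape and Linkage Mapper perform the equivalent computation on full raster grids.

import numpy as np

def resistance_distance(conductance, i, j):
    """Effective resistance between nodes i and j of a resistor network,
    given a symmetric conductance (1/resistance) matrix with zero diagonal."""
    g = np.asarray(conductance, float)
    laplacian = np.diag(g.sum(axis=1)) - g
    l_plus = np.linalg.pinv(laplacian)          # Moore-Penrose pseudo-inverse
    return l_plus[i, i] + l_plus[j, j] - 2.0 * l_plus[i, j]

if __name__ == "__main__":
    # Toy 4-node landscape graph: two parallel paths between habitat patches
    # 0 and 3, one of which passes through a high-resistance "pinch" (node 2).
    g = np.zeros((4, 4))
    edges = {(0, 1): 1.0, (1, 3): 1.0, (0, 2): 0.2, (2, 3): 0.2}
    for (a, b), cond in edges.items():
        g[a, b] = g[b, a] = cond
    print(f"resistance distance 0-3: {resistance_distance(g, 0, 3):.2f}")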
NASA Astrophysics Data System (ADS)
Ortiz, Marco; Wolff, Matthias
2004-10-01
The sustainability of different integrated management regimes for the mangrove ecosystem of the Caeté Estuary (North Brazil) was assessed using a holistic theoretical framework. To demonstrate that the behaviour and trajectory of complex whole systems are not epiphenomenal to the properties of their small parts, a set of conceptual models ranging from more reductionistic to more holistic was formulated. These models integrate the scientific information published to date for this mangrove ecosystem. The sustainability of different management scenarios (forestry and fishery) was assessed. Since the exploitation of mangrove trees is not allowed under Brazilian law, forestry was included only for simulation purposes. The model simulations revealed that sustainability predictions of reductionistic models should not be extrapolated to holistic approaches. Forestry and fishery activities seem to be sustainable only if they are self-damped. The exploitation of the two mangrove species Rhizophora mangle and Avicennia germinans does not appear to be sustainable, thus a rotation harvest is recommended. A similar conclusion holds for the exploitation of invertebrate species. Our results suggest that more studies should focus on the estimation of maximum sustainable yield based on a multispecies approach. Any reference to holistic sustainability based on reductionistic approaches may distort our understanding of complex natural ecosystems.
MODFLOW-OWHM v2: The next generation of fully integrated hydrologic simulation software
NASA Astrophysics Data System (ADS)
Boyce, S. E.; Hanson, R. T.; Ferguson, I. M.; Reimann, T.; Henson, W.; Mehl, S.; Leake, S.; Maddock, T.
2016-12-01
The One-Water Hydrologic Flow Model (One-Water) is a MODFLOW-based integrated hydrologic flow model designed for the analysis of a broad range of conjunctive-use and climate-related issues. One-Water fully links the movement and use of groundwater, surface water, and imported water for consumption by agriculture and natural vegetation on the landscape, and for potable and other uses within a supply-and-demand framework. One-Water includes linkages for deformation-, flow-, and head-dependent flows; additional observation and parameter options for higher-order calibrations; and redesigned code for facilitation of self-updating models and faster simulation run times. The next version of One-Water, currently under development, will include a new surface-water operations module that simulates dynamic reservoir operations, a new sustainability analysis package that facilitates the estimation and simulation of reduced storage depletion and captured discharge, a conduit-flow process for karst aquifers and leaky pipe networks, a soil zone process that adds an enhanced infiltration process, interflow, deep percolation and soil moisture, and a new subsidence and aquifer compaction package. It will also include enhancements to local grid refinement, and additional features to facilitate easier model updates, faster execution, better error messages, and more integration/cross communication between the traditional MODFLOW packages. By retaining and tracking the water within the hydrosphere, One-Water accounts for "all of the water everywhere and all of the time." This philosophy provides more confidence in the water accounting by the scientific community and provides the public a foundation needed to address wider classes of problems. Ultimately, more complex questions are being asked about water resources, so they require a more complete answer about conjunctive-use and climate-related issues.
NASA Astrophysics Data System (ADS)
Boyce, S. E.; Hanson, R. T.
2015-12-01
The One-Water Hydrologic Flow Model (MF-OWHM) is a MODFLOW-based integrated hydrologic flow model that is the most complete version, to date, of the MODFLOW family of hydrologic simulators needed for the analysis of a broad range of conjunctive-use issues. MF-OWHM fully links the movement and use of groundwater, surface water, and imported water for consumption by agriculture and natural vegetation on the landscape, and for potable and other uses within a supply-and-demand framework. MF-OWHM is based on the Farm Process for MODFLOW-2005 combined with Local Grid Refinement, Streamflow Routing, Surface-water Routing Process, Seawater Intrusion, Riparian Evapotranspiration, and the Newton-Raphson solver. MF-OWHM also includes linkages for deformation-, flow-, and head-dependent flows; additional observation and parameter options for higher-order calibrations; and redesigned code for facilitation of self-updating models and faster simulation run times. The next version of MF-OWHM, currently under development, will include a new surface-water operations module that simulates dynamic reservoir operations, the conduit flow process for karst aquifers and leaky pipe networks, a new subsidence and aquifer compaction package, and additional features and enhancements to enable more integration and cross communication between traditional MODFLOW packages. By retaining and tracking the water within the hydrosphere, MF-OWHM accounts for "all of the water everywhere and all of the time." This philosophy provides more confidence in the water accounting by the scientific community and provides the public a foundation needed to address wider classes of problems such as evaluation of conjunctive-use alternatives and sustainability analysis, including potential adaptation and mitigation strategies, and best management practices.
Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.
2017-01-01
Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.
ERIC Educational Resources Information Center
Gutierez, Sally B.
2015-01-01
Scientific literacy has been focused on the construction of students' knowledge to use appropriate and meaningful concepts, critically think, and make balanced, well-informed decisions relevant to their lives. This study presents the effects of integrating socio-scientific issues to enhance the bioethical decision-making skills of biology…
Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model
NASA Astrophysics Data System (ADS)
Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi
The Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information in physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual data from a numerical simulation. The results show that the new SCV is able to describe more complex scientific perceptions.
Center for Nanophase Materials Sciences
NASA Astrophysics Data System (ADS)
Horton, Linda
2002-10-01
The Center for Nanophase Materials Sciences (CNMS) will be a user facility with a strong component of joint, collaborative research. CNMS is being developed, together with the scientific community, with support from DOE's Office of Basic Energy Sciences. The Center will provide a thriving, multidisciplinary environment for research as well as the education of students and postdoctoral scholars. It will be co-located with the Spallation Neutron Source (SNS) and the Joint Institute for Neutron Sciences (JINS). The CNMS will integrate nanoscale research with neutron science, synthesis science, and theory/modeling/simulation, bringing together four areas in which the United States has clear national research and educational needs. The Center's research will be organized under three scientific thrusts: nano-dimensioned "soft" materials (including organic, hybrid, and interfacial nanophases); complex "hard" materials systems (including the crosscutting areas of interfaces and reduced dimensionality that become scientifically critical on the nanoscale); and theory/modeling/simulation. This presentation will summarize the progress towards identification of the specific research focus topics for the Center. Currently proposed topics, based on two workshops with the potential user community, include catalysis, nanomagnetism, synthetic and bio-inspired macromolecular materials, nanophase biomaterials, nanofluidics, optics/photonics, carbon-based nanostructures, collective behavior, nanoscale interface science, virtual synthesis and nanomaterials design, and electronic structure, correlations, and transport. In addition, the proposed 80,000 square foot facility (wet/dry labs, nanofabrication clean rooms, and offices) and the associated technical equipment will be described. The CNMS is scheduled to begin construction in spring, 2003. Initial operations are planned for late in 2004.
ERIC Educational Resources Information Center
Hulshof, Casper D.; de Jong, Ton
2006-01-01
Students encounter many obstacles during scientific discovery learning with computer-based simulations. It is hypothesized that an effective type of support, that does not interfere with the scientific discovery learning process, should be delivered on a "just-in-time" base. This study explores the effect of facilitating access to…
Preparing for joint operation of numerical modelling and observational data in IMPEx
NASA Astrophysics Data System (ADS)
Al-Ubaidi, Tarek; Khodachenko, Maxim; Kallio, Esa; Génot, Vincent; Modolo, Ronan; Hess, Sébastien; Schmidt, Walter; Topf, Florian; Alexeev, Igor; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourrel, Natacha; Penou, Emmanuel; André, Nicolas; Belenkaya, Elena
2013-04-01
The FP7-SPACE project IMPEx (http://impex-fp7.oeaw.ac.at) was established as a result of scientific collaboration between research teams from Austria, Finland, France, and Russia, working on the integration of a set of interactive data mining, analysis and modeling tools in the field of space plasma and planetary physics. The primary goal of the project is, to bridge the gap between spacecraft measurements and modern computational models of the planetary near-by space environments, enabling their joint operation for better understanding of related physical phenomena. The major challenge of IMPEx development consists in the need to connect different types of data sources, in particular numerical simulation results and observational data. To do so, every IMPEx tool must be able to handle both kinds of data in a consistent way. Thus, a considerable part of effort is dedicated to the development of standardized (web service-) interfaces and protocols for communication between the components, as well as a common approach to share user credentials. One of the systems' cornerstones is the specification of a standard for describing and storing the different data products involved that is able to include simulation outputs as well as observational data within a common standard, i.e. Data Model (DM). The IMPEx DM is an extension of the widely used SPASE DM and constitutes the first attempt in the field of space plasma physics worldwide, to create a unified data model that is able to store simulation outputs as well as observational data products in a shared data structure. To meet the requirement of extendibility, i.e. to have a possibility to include new computational models as well as analysis and visualization tools, the IMPEx DM as well as communication protocols have been designed to be as compact as possible and yet general and powerful enough to integrate a wide range of data sets and to allow for simple procedures when attaching new components to the system. A draft version of the IMPEx DM that is being developed as this abstract is written, was presented on several international scientific conferences and recognized by professionals in the field as an important contribution and a sound starting point for the development of a unified approach to work with experimental and computationally modelled scientific data.
New AGU scientific integrity and professional ethics policy available for review
Gundersen, Linda C.
2012-01-01
The AGU Task Force on Scientific Ethics welcomes your review and comments on AGU's new Scientific Integrity and Professional Ethics Policy. The policy has at its heart a code of conduct adopted from the internationally accepted "Singapore Statement," originally created by the Second World Conference on Research Integrity (http://www.singaporestatement.org/), held in 2010. The new policy also encompasses professional and publishing ethics, providing a single source of guidance to AGU members, officers, authors, and editors.
Advanced Integration Matrix Education Outreach
NASA Technical Reports Server (NTRS)
Paul, Heather L.
2004-01-01
The Advanced Integration Matrix (AIM) will design a ground-based test facility for developing revolutionary integrated systems for joint human-robotic missions in order to study and solve systems-level integration issues for exploration missions beyond Low Earth Orbit (LEO). This paper describes development plans for educational outreach activities related to technological and operational integration scenarios similar to the challenges that will be encountered through this project. The education outreach activities will provide hands-on, interactive exercises to allow students of all levels to experience design and operational challenges similar to what NASA deals with every day in performing the integration of complex missions. These experiences will relate to and impact students' everyday lives by demonstrating how their interests in science and engineering can develop into future careers, and reinforcing the concepts of teamwork and conflict resolution. Allowing students to experience and contribute to real-world development, research, and scientific studies of ground-based simulations for complex exploration missions will stimulate interest in the space program, and bring NASA's challenges to the student level. By enhancing existing educational programs and developing innovative activities and presentations, AIM will support NASA's endeavor to "inspire the next generation of explorers...as only NASA can."
Scientific misconduct: also an issue in nursing science?
Fierz, Katharina; Gennaro, Susan; Dierickx, Kris; Van Achterberg, Theo; Morin, Karen H; De Geest, Sabina
2014-07-01
Scientific misconduct (SMC) is an increasing concern in nursing science. This article discusses the prevalence of SMC, risk factors and correlates of scientific misconduct in nursing science, and highlights interventional approaches to foster good scientific conduct. Using the "Fostering Research Integrity in Europe" report of the European Science Foundation as a framework, we reviewed the literature in research integrity promotion. Although little empirical data exist regarding prevalence of scientific misconduct in the field of nursing science, available evidence suggests a similar prevalence as elsewhere. In studies of prospective graduate nurses, 4% to 17% admit data falsification or fabrication, while 8.8% to 26.4% report plagiarizing material. Risk factors for SMC exist at the macro, meso, and micro levels of the research system. Intervention research on preventing scientific misconduct in nursing is limited, yet findings from the wider field of medicine and allied health professions suggest that honor codes, training programs, and clearly communicated misconduct control mechanisms and misconduct consequences improve ethical behavior. Scientific misconduct is a multilevel phenomenon. Interventions to decrease scientific misconduct must therefore target every level of the nursing research systems. Scientific misconduct not only compromises scientific integrity by distorting empirical evidence, but it might endanger patients. Because nurses are involved in clinical research, raising their awareness of scientifically inappropriate behavior is essential. © 2014 Sigma Theta Tau International.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.
Semiconductor hybrid pixel detectors often consist of a pixelated sensor layer bump bonded to a matching pixelated readout integrated circuit (ROIC). The sensor can range from high resistivity Si to III-V materials, whereas a Si CMOS process is typically used to manufacture the ROIC. Independent device physics and electronic design automation (EDA) tools, with significantly different solvers, are used to determine sensor characteristics and to verify functional performance of ROICs, respectively. Some physics solvers provide the capability of transferring data to the EDA tool. However, single pixel transient simulations are either not feasible due to convergence difficulties or are prohibitively long. A simplified sensor model, which includes a current pulse in parallel with the detector equivalent capacitor, is often used; even then, SPICE-type top-level (entire array) simulations range from days to weeks. In order to analyze detector deficiencies for a particular scientific application, accurately defined transient behavioral models of all the functional blocks are required. Furthermore, various simulations, such as transient, noise, Monte Carlo, inter-pixel effects, etc., of the entire array need to be performed within a reasonable time frame without trading off accuracy. The sensor and the analog front-end can be modeled using a real-number modeling language as complex mathematical functions, or detailed data can be saved to text files, for further top-level digital simulations. Parasitically aware digital timing is extracted in a standard delay format (sdf) from the pixel digital back-end layout as well as the periphery of the ROIC. For any given input, detector level worst-case and best-case simulations are performed using a Verilog simulation environment to determine the output. Each top-level transient simulation takes no more than 10-15 minutes. The impact of changing key parameters such as sensor Poissonian shot noise, analog front-end bandwidth, jitter due to clock distribution, etc. can be accurately analyzed to determine ROIC architectural viability and bottlenecks. Hence the impact of the detector parameters on the scientific application can be studied.
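The real-number modeling in the abstract is done in Verilog; as a language-neutral illustration of the same idea, replacing the analog front-end with a fast behavioral mathematical model, the sketch below turns an injected charge into a shaped pulse and a discriminator firing time, exhibiting time-walk. All component values are invented.

import numpy as np

def csa_shaper_response(t_ns, charge_fc, tau_ns=25.0, gain_mv_per_fc=50.0):
    """Behavioral stand-in for a charge-sensitive amplifier plus CR-RC shaper:
    peak-normalized semi-Gaussian pulse scaled by charge and gain [mV]."""
    t = np.clip(t_ns, 0.0, None)
    pulse = (t / tau_ns) * np.exp(1.0 - t / tau_ns)   # peaks at 1 when t = tau
    return gain_mv_per_fc * charge_fc * pulse

def discriminator_firing_time(charge_fc, threshold_mv, dt_ns=0.1, t_max_ns=200.0):
    """Time at which the shaped pulse first crosses threshold, or None."""
    times = np.arange(0.0, t_max_ns, dt_ns)
    v = csa_shaper_response(times, charge_fc)
    above = np.nonzero(v > threshold_mv)[0]
    return times[above[0]] if above.size else None

if __name__ == "__main__":
    for q in (0.5, 2.0, 8.0):   # injected charge in fC (invented values)
        t_fire = discriminator_firing_time(q, threshold_mv=30.0)
        msg = f"{t_fire:.1f} ns" if t_fire is not None else "no hit (below threshold)"
        print(f"charge {q:4.1f} fC -> {msg}")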
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: * Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. * Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. * Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. * Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. * Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. * Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
NOAA draft scientific integrity policy: Comment period open through 20 August
NASA Astrophysics Data System (ADS)
Showstack, Randy
2011-08-01
The National Oceanic and Atmospheric Administration (NOAA) is aiming to finalize its draft scientific integrity policy possibly by the end of the year, Larry Robinson, NOAA assistant secretary for conservation and management, indicated during a 28 July teleconference. The policy “is key to fostering an environment where science is encouraged, nurtured, respected, rewarded, and protected,” Robinson said, adding that the agency's comment period for the draft policy, which was released on 16 June, ends on 20 August. “Science underpins all that NOAA does. This policy is one piece of a broader effort to strengthen NOAA science,” Robinson said, noting that the draft “represents the first ever scientific integrity policy for NOAA. Previously, our policy only addressed research misconduct and focused on external grants. What's new about this policy is that it establishes NOAA's principles for scientific integrity, a scientific code of conduct, and a code of ethics for science supervision and management.”
Teaching Harmonic Motion in Trigonometry: Inductive Inquiry Supported by Physics Simulations
ERIC Educational Resources Information Center
Sokolowski, Andrzej; Rackley, Robin
2011-01-01
In this article, the authors present a lesson whose goal is to utilise a scientific environment to immerse a trigonometry student in the process of mathematical modelling. The scientific environment utilised during this activity is a physics simulation called "Wave on a String" created by the PhET Interactive Simulations Project at…
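A minimal sketch of the kind of sinusoidal model such a lesson targets: sampling the transverse displacement of a point on a driven string, y(x, t) = A sin(2πft − kx). The amplitude, frequency, and wavelength values below are arbitrary placeholders, not settings from the PhET simulation.

```python
import numpy as np

A, f, wavelength = 0.05, 2.0, 1.5      # amplitude (m), frequency (Hz), wavelength (m)
k = 2 * np.pi / wavelength             # wave number

x = 0.75                               # fixed point on the string (m)
t = np.linspace(0.0, 1.0, 11)          # sample times over one second
y = A * np.sin(2 * np.pi * f * t - k * x)

for ti, yi in zip(t, y):
    print(f"t = {ti:4.2f} s   y = {yi:+.3f} m")
```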
The integrated Earth system model version 1: formulation and functionality
Collins, W. D.; Craig, A. P.; Truesdale, J. E.; ...
2015-07-23
The integrated Earth system model (iESM) has been developed as a new tool for projecting the joint human/climate system. The iESM is based upon coupling an integrated assessment model (IAM) and an Earth system model (ESM) into a common modeling infrastructure. IAMs are the primary tool for describing the human–Earth system, including the sources of global greenhouse gases (GHGs) and short-lived species (SLS), land use and land cover change (LULCC), and other resource-related drivers of anthropogenic climate change. ESMs are the primary scientific tools for examining the physical, chemical, and biogeochemical impacts of human-induced changes to the climate system. The iESM project integrates the economic and human-dimension modeling of an IAM and a fully coupled ESM within a single simulation system while maintaining the separability of each model if needed. Both IAM and ESM codes are developed and used by large communities and have been extensively applied in recent national and international climate assessments. By introducing heretofore-omitted feedbacks between natural and societal drivers, we can improve scientific understanding of the human–Earth system dynamics. Potential applications include studies of the interactions and feedbacks leading to the timing, scale, and geographic distribution of emissions trajectories and other human influences, corresponding climate effects, and the subsequent impacts of a changing climate on human and natural systems. This paper describes the formulation, requirements, implementation, testing, and resulting functionality of the first version of the iESM released to the global climate community.
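The following toy loop sketches the general shape of the IAM-ESM coupling described above: the human-system model supplies emissions, the climate model returns warming, and the result feeds back into the next step. Every function, coefficient, and the decadal exchange interval here is invented for illustration; this is not the iESM coupling infrastructure.

```python
import math

def iam_step(warming):
    """Toy IAM: emissions fall as experienced warming rises (a stand-in policy feedback)."""
    baseline = 10.0                              # GtC/yr, hypothetical
    return max(baseline - 1.5 * warming, 0.0)

def esm_step(co2_ppm, emissions):
    """Toy ESM: accumulate a decade of CO2 and return warming from a log response."""
    co2_ppm += emissions * 10 * 0.47             # crude GtC -> ppm conversion, illustrative
    warming = 3.0 * math.log2(co2_ppm / 280.0)   # ~3 K per CO2 doubling
    return co2_ppm, warming

co2, warming = 400.0, 1.0
for year in range(2020, 2080, 10):               # decadal coupling interval (arbitrary)
    emissions = iam_step(warming)                # human drivers -> climate forcing
    co2, warming = esm_step(co2, emissions)      # climate response -> next IAM step
    print(f"{year}s: {emissions:.1f} GtC/yr, {co2:.0f} ppm, {warming:.2f} K above preindustrial")
```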
ERIC Educational Resources Information Center
Estrada, Mica; Woodcock, Anna; Hernandez, Paul R.; Schultz, P. Wesley
2011-01-01
Students from several ethnic minority groups are underrepresented in the sciences, indicating that minority students more frequently drop out of the scientific career path than nonminority students. Viewed from a perspective of social influence, this pattern suggests that minority students do not integrate into the scientific community at the same…
A generative model for scientific concept hierarchies.
Datta, Srayan; Adar, Eytan
2018-01-01
In many scientific disciplines, each new 'product' of research (method, finding, artifact, etc.) is often built upon previous findings, leading to extension and branching of scientific concepts over time. We aim to understand the evolution of scientific concepts by placing them in phylogenetic hierarchies where scientific keyphrases from a large, longitudinal academic corpus are used as a proxy for scientific concepts. These hierarchies exhibit various important properties, including a power-law degree distribution, a power-law component size distribution, the existence of a giant component, and a lower probability of extending an older concept. We present a generative model based on preferential attachment to simulate the graphical and temporal properties of these hierarchies, which helps us understand the underlying process behind scientific concept evolution and may be useful in simulating and predicting scientific evolution.
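A minimal sketch of preferential attachment with an age penalty, in the spirit of the generative model described; the specific attachment rule and parameter values here are illustrative rather than those fitted in the paper.

```python
import random

def grow_hierarchy(n_nodes=200, age_penalty=0.97, seed=42):
    """Each new concept attaches to an existing one with probability proportional
    to degree * age_penalty**age: high-degree concepts attract more extensions,
    while older concepts become less likely to be extended."""
    rng = random.Random(seed)
    parent = {0: None}
    degree = {0: 1}
    for new in range(1, n_nodes):
        weights = [degree[v] * age_penalty ** (new - v) for v in range(new)]
        chosen = rng.choices(range(new), weights=weights, k=1)[0]
        parent[new] = chosen
        degree[chosen] += 1
        degree[new] = 1
    return parent, degree

parent, degree = grow_hierarchy()
hubs = sorted(degree, key=degree.get, reverse=True)[:5]
print("most-extended concepts (node, degree):", [(v, degree[v]) for v in hubs])
```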
NASA Astrophysics Data System (ADS)
Develaki, Maria
2017-11-01
Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.
ERIC Educational Resources Information Center
Robinson, William R.
2000-01-01
Describes a review of research that addresses the effectiveness of simulations in promoting scientific discovery learning and the problems that learners may encounter when using discovery learning. (WRM)
NASA Technical Reports Server (NTRS)
Helmreich, R.; Wilhelm, J.; Tanner, T. A.; Sieber, J. E.; Burgenbauch, S.
1978-01-01
A management study was conducted to specify activities and problems encountered during the development of procedures for documentation and crew training on experiments, as well as during the design, integration, and delivery of a life sciences experiment payload to Johnson Space Center for a 7 day simulation of a Spacelab mission. Conclusions and recommendations to project management for current and future Ames' life sciences projects are included. Broader issues relevant to the conduct of future scientific missions under the constraints imposed by the environment of space are also addressed.
NASA Astrophysics Data System (ADS)
Xu, Zhicheng; Yuan, Bo; Zhang, Fuqiang
2018-06-01
In this paper, a power supply optimization model is proposed. The model takes minimization of fossil energy consumption as its objective, considering the output characteristics of conventional and renewable power sources. The optimal wind-solar capacity ratio in the power supply mix is calculated under various constraints, and the interrelation between conventional power sources and renewable energy is analyzed for systems with a high proportion of renewable energy integration. The model can thus provide scientific guidance for the coordinated and orderly development of renewable energy and conventional power sources.
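A minimal linear-programming sketch of this kind of capacity planning: choose wind and solar capacities plus hourly fossil generation so that demand is met while total fossil output is minimized. The demand profile, capacity factors, and capacity budget below are invented illustrative numbers, not data from the study.

```python
import numpy as np
from scipy.optimize import linprog

demand   = np.array([90, 80, 85, 110, 130, 120, 100, 95], dtype=float)  # MW, 8 sample hours
cf_wind  = np.array([0.5, 0.6, 0.4, 0.3, 0.2, 0.3, 0.5, 0.6])           # wind capacity factors
cf_solar = np.array([0.0, 0.0, 0.2, 0.6, 0.8, 0.6, 0.2, 0.0])           # solar capacity factors
T = len(demand)

# Decision vector: [wind capacity W, solar capacity S, fossil generation f_0..f_{T-1}]
c = np.concatenate(([0.0, 0.0], np.ones(T)))          # objective: total fossil generation

# Hourly demand: cf_w*W + cf_s*S + f_t >= demand_t, written as -(...) <= -demand_t
A_ub = np.zeros((T + 1, 2 + T))
A_ub[:T, 0] = -cf_wind
A_ub[:T, 1] = -cf_solar
A_ub[:T, 2:] = -np.eye(T)
A_ub[T, :2] = 1.0                                      # renewable capacity budget: W + S <= 300 MW
b_ub = np.concatenate((-demand, [300.0]))

res = linprog(c, A_ub=A_ub, b_ub=b_ub)                 # all variables default to >= 0
W, S = res.x[:2]
print(f"wind {W:.0f} MW, solar {S:.0f} MW (wind:solar ratio {W / S:.2f})")
print(f"fossil generation over the sample day: {res.x[2:].sum():.1f} MWh")
```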
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more complex scenarios. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
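As a much-reduced analogue of the wave-propagation kernel at the heart of such platforms, the sketch below advances the 1D scalar wave equation with second-order finite differences. It is only a toy: TS-AWP itself is a 3D anelastic, massively parallel code, and none of the grid or material values here come from TeraShake.

```python
import numpy as np

nx, nt = 400, 800
dx, c = 10.0, 3000.0                       # grid spacing (m), wave speed (m/s)
dt = 0.5 * dx / c                          # time step satisfying the CFL condition
r2 = (c * dt / dx) ** 2

u_prev = np.zeros(nx)                      # displacement at step n-1
u = np.zeros(nx)                           # displacement at step n
u[nx // 2] = 1.0                           # impulsive "source" at the domain center

for _ in range(nt):
    u_next = np.zeros(nx)                  # zero-displacement (rigid) boundaries
    u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next

print("peak displacement after propagation:", float(np.abs(u).max()))
```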
NASA Astrophysics Data System (ADS)
Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.
2007-12-01
Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
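The plain-text attribute-value metadata mentioned above can be illustrated in a few lines of Python; the keys and values here are hypothetical, not the actual PetaSHA schema.

```python
record = """\
source_model = hypothetical_rupture_v1
magnitude = 7.8
min_vs = 500.0
max_freq_hz = 1.0
"""

def parse_metadata(text):
    """Parse 'key = value' lines into a dict of strings."""
    pairs = {}
    for line in text.splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            pairs[key.strip()] = value.strip()
    return pairs

meta = parse_metadata(record)
# A downstream code ingests metadata written by an upstream stage,
# e.g. to set the corner frequency used in seismogram synthesis.
print(f"low-pass corner: {float(meta['max_freq_hz'])} Hz")
```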
The Effectiveness of Scientific Inquiry With/Without Integration of Scientific Reasoning
ERIC Educational Resources Information Center
Chen, Chun-Ting; She, Hsiao-Ching
2015-01-01
This study examines the difference in effectiveness between two scientific inquiry programs-one with an emphasis on scientific reasoning and one without a scientific reasoning component-on students' scientific concepts, scientific concept-dependent reasoning, and scientific inquiry. A mixed-method approach was used in which 115 grade 5…
Simultaneous EEG and MEG source reconstruction in sparse electromagnetic source imaging.
Ding, Lei; Yuan, Han
2013-04-01
Electroencephalography (EEG) and magnetoencephalography (MEG) have different sensitivities to differently configured brain activations, making them complementary in providing independent information for better detection and inverse reconstruction of brain sources. In the present study, we developed an integrative approach, which integrates a novel sparse electromagnetic source imaging method, i.e., variation-based cortical current density (VB-SCCD), together with the combined use of EEG and MEG data in reconstructing complex brain activity. To perform simultaneous analysis of multimodal data, we proposed to normalize EEG and MEG signals according to their individual noise levels to create unit-free measures. Our Monte Carlo simulations demonstrated that this integrative approach is capable of reconstructing complex cortical brain activations (up to 10 simultaneously activated and randomly located sources). Results from experimental data showed that complex brain activations evoked in a face recognition task were successfully reconstructed using the integrative approach, which were consistent with other research findings and validated by independent data from functional magnetic resonance imaging using the same stimulus protocol. Reconstructed cortical brain activations from both simulations and experimental data provided precise source localizations as well as accurate spatial extents of localized sources. In comparison with studies using EEG or MEG alone, the performance of cortical source reconstructions using combined EEG and MEG was significantly improved. We demonstrated that this new sparse ESI methodology with integrated analysis of EEG and MEG data could accurately probe spatiotemporal processes of complex human brain activations. This is promising for noninvasively studying large-scale brain networks of high clinical and scientific significance. Copyright © 2011 Wiley Periodicals, Inc.
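A minimal sketch of the noise-normalization idea: each modality is divided by its own noise level so that EEG (volts) and MEG (teslas) become unit-free and can be stacked into a single inverse problem. The toy lead fields, noise levels, and the ordinary least-squares inverse below are illustrative stand-ins, not the VB-SCCD estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: common sources seen through different gains and noise scales.
n_src = 5
sources = rng.normal(size=n_src)
G_eeg = rng.normal(size=(32, n_src))                  # hypothetical EEG lead field
G_meg = rng.normal(size=(100, n_src)) * 1e-13         # hypothetical MEG lead field
eeg = G_eeg @ sources + rng.normal(scale=1.0, size=32)
meg = G_meg @ sources + rng.normal(scale=1e-13, size=100)

def whiten(data, gain, noise_std):
    """Scale a modality by its noise level to obtain unit-free measurements."""
    return data / noise_std, gain / noise_std

# Noise levels would normally be estimated from pre-stimulus baseline data.
eeg_w, G_eeg_w = whiten(eeg, G_eeg, noise_std=1.0)
meg_w, G_meg_w = whiten(meg, G_meg, noise_std=1e-13)

# Stack the whitened modalities and solve a simple least-squares inverse.
y = np.concatenate([eeg_w, meg_w])
G = np.vstack([G_eeg_w, G_meg_w])
estimate, *_ = np.linalg.lstsq(G, y, rcond=None)
print("true sources:     ", np.round(sources, 2))
print("combined estimate:", np.round(estimate, 2))
```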
X-ray simulation for structural integrity for aerospace components - A case study
NASA Astrophysics Data System (ADS)
Singh, Surendra; Gray, Joseph
2016-02-01
The use of Integrated Computational Materials Engineering (ICME) has rapidly evolved from an emerging technology to an industry standard in materials, manufacturing, chemical, civil, and aerospace engineering. Despite this, recognition of ICME's merits has been somewhat lacking within the NDE community. This is due in part to the makeup of NDE practitioners, who are a diverse but regimented group. More than 80% of NDE experts are trained and certified as NDT Level 3s and auditors in order to perform their daily inspection jobs. These jobs involve detecting an attribute of interest, which may be a defect, a condition, or both, in a material, and they are performed in strict compliance with procedures that have been developed over many years by trial and error, with minimal understanding of the underlying physics and of the interplay between the setup parameters of the NDE methods. It is not in the nature of these trained Level 3 experts to look for alternate, out-of-the-box solutions. Instead, they follow the procedures for compliance as required by regulatory agencies. This approach is time-consuming, subjective, and treated as a bottleneck in today's manufacturing environments. As such, there is a need for new NDE tools that provide rapid, high-quality solutions for studying structural and dimensional integrity in parts at reduced cost. NDE simulations offer such options by shortening NDE technique development time, advancing the scientific understanding of the physics of interaction between interrogating energy and materials, and reducing costs. In this paper, we apply NDE simulation (XRSIM as an example) to simulate X-ray techniques for studying aerospace components. The results show that NDE simulations help: 1) significantly shorten NDE technique development time, 2) train NDE experts by facilitating understanding of the underlying physics, and 3) improve both the capability and the reliability of NDE methods in terms of coverage maps.
Current Driven Instabilities and Anomalous Mobility in Hall-effect Thrusters
NASA Astrophysics Data System (ADS)
Tran, Jonathan; Eckhardt, Daniel; Martin, Robert
2017-10-01
Due to the extreme cost of fully resolving the Debye length and plasma frequency, hybrid plasma simulations using kinetic ions and quasi-steady-state fluid electrons have long been the principal workhorse methodology for Hall-effect thruster (HET) modeling. Plasma turbulence is a promising candidate mechanism for the anomalous electron transport observed in HETs and a basis for developing predictive transport models. In this work, we investigate the implementation of an anomalous electron cross-field transport model for hybrid HET simulations such as HPHall. A theory of current-driven instabilities and the associated anomalous transport in HETs has recently been studied by Lafleur et al., showing collective electron-wave scattering due to large-amplitude azimuthal fluctuations of the electric field. We will further adapt these previous results on related current-driven instabilities to electric-propulsion-relevant mass ratios and conduct a preliminary study of resolving the instability with a modified hybrid (fluid-electron, kinetic-ion) simulation, with the goal of integrating it into established hybrid HET simulations. This work is supported by the Air Force Office of Scientific Research award FA9950-17RQCOR465.
The epistemic integrity of scientific research.
De Winter, Jan; Kosolosky, Laszlo
2013-09-01
We live in a world in which scientific expertise and its epistemic authority become more important. On the other hand, the financial interests in research, which could potentially corrupt science, are increasing. Due to these two tendencies, a concern for the integrity of scientific research becomes increasingly vital. This concern is, however, hollow if we do not have a clear account of research integrity. Therefore, it is important that we explicate this concept. Following Rudolf Carnap's characterization of the task of explication, this means that we should develop a concept that is (1) similar to our common sense notion of research integrity, (2) exact, (3) fruitful, and (4) as simple as possible. Since existing concepts do not meet these four requirements, we develop a new concept in this article. We describe a concept of epistemic integrity that is based on the property of deceptiveness, and argue that this concept does meet Carnap's four requirements of explication. To illustrate and support our claims we use several examples from scientific practice, mainly from biomedical research.
Collaborative research among academia, business, and government
SETAC is a tripartite organization comprised chiefly of three sectors: academia, government, and industry. Collaborative connections within and among these sectors provide the basis for scientific structural integrity. Such interactions generally foster scientific integrity, tra...
NASA Astrophysics Data System (ADS)
Sun, Yun-Ping; Ju, Jiun-Yan; Liang, Yen-Chu
2008-12-01
Since unmanned aerial vehicles (UAVs) bring forth many innovative applications in scientific, civilian, and military fields, the development of UAVs is growing rapidly every year. The on-board autopilot that reliably performs attitude and guidance control is a vital component for out-of-sight flights. However, the control law in an autopilot is designed according to a simplified plant model in which the dynamics of the real hardware are usually not taken into consideration. It is therefore necessary to develop a test-bed that includes real servos for real-time control experiments on prototype autopilots, so-called hardware-in-the-loop (HIL) simulation. In this paper, on the basis of the graphical application software LabVIEW, a real-time HIL simulation system is realized efficiently through the virtual instrumentation approach. The proportional-integral-derivative (PID) controller for the autopilot's pitch-angle control loop is experimentally determined by the classical Ziegler-Nichols tuning rule and exhibits good transient and steady-state response in real-time HIL simulation. The results also clearly show the differences between numerical simulation and real-time HIL simulation. The effectiveness of HIL simulation for UAV autopilot design is thereby confirmed.
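The classical Ziegler-Nichols rules referenced above set the PID gains from an ultimate gain Ku and oscillation period Tu as Kp = 0.6 Ku, Ti = Tu/2, Td = Tu/8. The sketch below applies them to a toy second-order pitch model; the plant, Ku, and Tu are invented for illustration and are not the UAV dynamics from the paper.

```python
# Ziegler-Nichols gains from a (hypothetical) ultimate gain and period.
Ku, Tu = 8.0, 2.0
Kp, Ti, Td = 0.6 * Ku, 0.5 * Tu, 0.125 * Tu
Ki, Kd = Kp / Ti, Kp * Td

dt, setpoint = 0.01, 5.0              # step (s), commanded pitch angle (deg)
theta, theta_dot = 0.0, 0.0           # toy plant state
integral = 0.0
prev_error = setpoint - theta         # avoid a derivative kick on the first step

for _ in range(int(10.0 / dt)):       # 10 s of simulated flight
    error = setpoint - theta
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = Kp * error + Ki * integral + Kd * derivative     # PID control effort
    prev_error = error

    # Invented second-order pitch dynamics: theta'' = -a*theta' - b*theta + k*u
    theta_ddot = -1.5 * theta_dot - 4.0 * theta + 2.0 * u
    theta_dot += theta_ddot * dt      # forward Euler integration of the plant
    theta += theta_dot * dt

print(f"Kp={Kp:.2f}, Ki={Ki:.2f}, Kd={Kd:.2f}; pitch after 10 s = {theta:.2f} deg")
```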
ERIC Educational Resources Information Center
Abdullah, Sopiah; Shariff, Adilah
2008-01-01
The purpose of the study was to investigate the effects of inquiry-based computer simulation with heterogeneous-ability cooperative learning (HACL) and inquiry-based computer simulation with friendship cooperative learning (FCL) on (a) scientific reasoning (SR) and (b) conceptual understanding (CU) among Form Four students in Malaysian Smart…
IEA EBC Annex 66: Definition and simulation of occupant behavior in buildings
Yan, Da; Hong, Tianzhen; Dong, Bing; ...
2017-09-28
More than 30% of the total primary energy in the world is consumed in buildings. It is crucial to reduce building energy consumption in order to preserve energy resources and mitigate global climate change. Building performance simulations have been widely used for the estimation and optimization of building performance, providing reference values for the assessment of building energy consumption and the effects of energy-saving technologies. Among the various factors influencing building energy consumption, occupant behavior has drawn increasing attention. Occupant behavior includes occupant presence, movement, and interaction with building energy devices and systems. However, there are gaps in occupant behavior modeling as different energy modelers have employed varied data and tools to simulate occupant behavior, therefore producing different and incomparable results. Aiming to address these gaps, the International Energy Agency (IEA) Energy in Buildings and Community (EBC) Programme Annex 66 has established a scientific methodological framework for occupant behavior research, including data collection, behavior model representation, modeling and evaluation approaches, and the integration of behavior modeling tools with building performance simulation programs. Annex 66 also includes case studies and application guidelines to assist in building design, operation, and policymaking, using interdisciplinary approaches to reduce energy use in buildings and improve occupant comfort and productivity. This paper highlights the key research issues, methods, and outcomes pertaining to Annex 66, and offers perspectives on future research needs to integrate occupant behavior with the building life cycle.
The Integration of HIV and AIDS as a Socio-Scientific Issue in the Life Sciences Curriculum
ERIC Educational Resources Information Center
Wolff, Eugenie; Mnguni, Lindelani
2015-01-01
The potential of science to transform lives has been highlighted by a number of scholars. This means that critical socio-scientific issues (SSIs) must be integrated into science curricula. Development of context-specific scientific knowledge and twenty-first-century learning skills in science education could be used to address SSIs such as…
NASA Technical Reports Server (NTRS)
Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard
2003-01-01
The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.
Clarifying the Dynamics of the General Circulation: Phillips's 1956 Experiment.
NASA Astrophysics Data System (ADS)
Lewis, John M.
1998-01-01
In the mid-1950s, amid heated debate over the physical mechanisms that controlled the known features of the atmosphere's general circulation, Norman Phillips simulated hemispheric motion on the high-speed computer at the Institute for Advanced Study. A simple energetically consistent model was integrated for a simulated time of approximately 1 month. Analysis of the model results clarified the respective roles of the synoptic-scale eddies (cyclones-anticyclones) and mean meridional circulation in the maintenance of the upper-level westerlies and the surface wind regimes. Furthermore, the modeled cyclones clearly linked surface frontogenesis with the upper-level Charney-Eady wave. In addition to discussing the model results in light of the controversy and ferment that surrounded general circulation theory in the 1940s-1950s, an effort is made to follow Phillips's scientific path to the experiment.
The Quest for Clarity in Research Integrity: A Conceptual Schema.
Shaw, David
2018-03-28
Researchers often refer to "research integrity", "scientific integrity", "research misconduct", "scientific misconduct" and "research ethics". However, they may use some of these terms interchangeably despite conceptual distinctions. The aim of this paper is to clarify what is signified by several key terms related to research integrity, and to suggest clearer conceptual delineation between them. To accomplish this task, it provides a conceptual analysis based upon definitions and general usage of these phrases and categorization of integrity-breaching behaviours in literature and guidelines, including clarification of the different domains and agents involved. In the first part of the analysis, following some initial clarifications, I explore the distinction between internal and external rules of integrity. In the second part, I explore the distinction between integrity and lack of misconduct, before suggesting a recategorisation of different types of integrity breach. I conclude that greater clarity is needed in the debate on research integrity. Distinguishing between scientific and research integrity, reassessing the relative gravity of different misbehaviours in light of this distinction, and recognising all intentional breaches of integrity as misconduct may help to improve guidelines and education.
A GIS-based modeling system for petroleum waste management. Geographical information system.
Chen, Z; Huang, G H; Li, J B
2003-01-01
With an urgent need for effective management of petroleum-contaminated sites, a GIS-aided simulation (GISSIM) system is presented in this study. The GISSIM contains two components: an advanced 3D numerical model and a geographical information system (GIS), which are integrated within a general framework. The modeling component undertakes simulation for the fate of contaminants in subsurface unsaturated and saturated zones. The GIS component is used in three areas throughout the system development and implementation process: (i) managing spatial and non-spatial databases; (ii) linking inputs, model, and outputs; and (iii) providing an interface between the GISSIM and its users. The developed system is applied to a North American case study. Concentrations of benzene, toluene, and xylenes in groundwater under a petroleum-contaminated site are dynamically simulated. Reasonable outputs have been obtained and presented graphically. They provide quantitative and scientific bases for further assessment of site-contamination impacts and risks, as well as decisions on practical remediation actions.
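A highly simplified 1D advection-dispersion sketch of the kind of contaminant transport the modeling component addresses; the velocity, dispersion coefficient, and boundary treatment are illustrative placeholders, not the 3D unsaturated/saturated model in GISSIM.

```python
import numpy as np

nx, L = 200, 200.0                        # grid cells, domain length (m)
dx = L / nx
v, D = 0.5, 0.05                          # pore velocity (m/day), dispersion (m^2/day)
dt = 0.4 * min(dx / v, dx**2 / (2 * D))   # stable step for advection and dispersion

c = np.zeros(nx)
c[0] = 1.0                                # constant-concentration source at the inlet

for _ in range(int(200.0 / dt)):          # simulate roughly 200 days
    adv = -v * (c[1:-1] - c[:-2]) / dx                   # upwind advection (flow in +x)
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2  # central-difference dispersion
    c[1:-1] += dt * (adv + disp)
    c[0] = 1.0                            # hold the source; far boundary stays at zero

front = np.argmax(c < 0.5)                # first cell below half the source concentration
print(f"0.5 relative-concentration front near x = {front * dx:.0f} m after ~200 days")
```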
Explicit integration with GPU acceleration for large kinetic networks
Brock, Benjamin; Belt, Andrew; Billings, Jay Jay; ...
2015-09-15
In this study, we demonstrate the first implementation of recently developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve on the order of 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. In addition, this orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.
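For contrast with the specialized algebraically stabilized explicit algorithms the study implements, the sketch below integrates a trivial three-species network (A → B → C) with plain explicit Euler; the rates are arbitrary and the method is far simpler than the paper's, but it shows the structure of an explicit kinetic update.

```python
import numpy as np

k1, k2 = 5.0, 1.0                     # rate constants (1/s), illustrative values
y = np.array([1.0, 0.0, 0.0])         # abundances of species A, B, C

t, t_end = 0.0, 5.0
dt = 0.02 / max(k1, k2)               # explicit step well below the fastest timescale
while t < t_end:
    flux_ab = k1 * y[0]               # A -> B
    flux_bc = k2 * y[1]               # B -> C
    dydt = np.array([-flux_ab, flux_ab - flux_bc, flux_bc])
    y += dt * dydt                    # forward Euler update
    t += dt

print("final abundances (A, B, C):", np.round(y, 4))
print("total mass conserved:", bool(np.isclose(y.sum(), 1.0)))
```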
UAS-NAS Integrated Human in the Loop: Test Environment Report
NASA Technical Reports Server (NTRS)
Murphy, Jim; Otto, Neil; Jovic, Srba
2015-01-01
The desire and ability to fly Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) is of increasing urgency. The application of unmanned aircraft to national security, defense, scientific, and emergency management missions is driving the critical need for less restrictive access by UAS to the NAS. UAS represent a new capability that will provide a variety of services in the government (public) and commercial (civil) aviation sectors. The growth of this potential industry has not yet been realized due to the lack of a common understanding of what is required to safely operate UAS in the NAS. NASA's UAS Integration in the NAS Project is conducting research in the areas of Separation Assurance/Sense and Avoid Interoperability (SSI), Human Systems Integration (HSI), and Communication to support reducing the barriers to UAS access to the NAS. This research was broken into two research themes, namely UAS Integration and Test Infrastructure. UAS Integration focuses on airspace integration procedures and performance standards to enable UAS integration in the air transportation system, covering Sense and Avoid (SAA) performance standards, command and control performance standards, and human systems integration. The focus of the Test Infrastructure theme was to enable development and validation of airspace integration procedures and performance standards, including the execution of integrated test and evaluation. In support of the integrated test and evaluation efforts, the Project developed an adaptable, scalable, and schedulable relevant test environment incorporating live, virtual, and constructive elements capable of validating concepts and technologies for unmanned aircraft systems to safely operate in the NAS. To accomplish this task, the Project planned to conduct three integrated events: a Human-in-the-Loop simulation and two Flight Test series that integrated key concepts, technologies and/or procedures in a relevant air traffic environment. Each of the integrated events was built on the technical achievements, fidelity and complexity of previous simulations and tests, resulting in research findings that support the development of regulations governing the access of UAS into the NAS. The purpose of this document is to describe how well the system under test was representative
Thorn, Christine Johanna; Bissinger, Kerstin; Thorn, Simon; Bogner, Franz Xaver
2016-01-01
Successful learning is the integration of new knowledge into existing schemes, leading to an integrated and correct scientific conception. By contrast, the co-existence of scientific and alternative conceptions may indicate a fragmented knowledge profile. Every learner is unique and thus carries an individual set of preconceptions before classroom engagement due to prior experiences. Hence, instructors and teachers have to consider the heterogeneous knowledge profiles of their class when teaching. However, determinants of fragmented knowledge profiles are not well understood yet, which may hamper a development of adapted teaching schemes. We used a questionnaire-based approach to assess conceptual knowledge of tree assimilation and wood synthesis surveying 885 students of four educational levels: 6th graders, 10th graders, natural science freshmen and other academic studies freshmen. We analysed the influence of learner's characteristics such as educational level, age and sex on the coexistence of scientific and alternative conceptions. Within all subsamples well-known alternative conceptions regarding tree assimilation and wood synthesis coexisted with correct scientific ones. For example, students describe trees to be living on "soil and sunshine", representing scientific knowledge of photosynthesis mingled with an alternative conception of trees eating like animals. Fragmented knowledge profiles occurred in all subsamples, but our models showed that improved education and age foster knowledge integration. Sex had almost no influence on the existing scientific conceptions and evolution of knowledge integration. Consequently, complex biological issues such as tree assimilation and wood synthesis need specific support e.g. through repeated learning units in class- and seminar-rooms in order to help especially young students to handle and overcome common alternative conceptions and appropriately integrate scientific conceptions into their knowledge profile.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-02
...: Background The Presidential Memorandum on Scientific Integrity dated March 9, 2009, and the Office of Science... the Secretary's directive. The policy covers all Department employees, including political appointees...
Stewardship of Integrity in Scientific Communication.
Albertine, Kurt H
2018-06-14
Integrity in the pursuit of discovery through application of the scientific method and reporting the results is an obligation for each of us as scientists. We cannot let the value of science be diminished because discovering knowledge is vital to understand ourselves and our impacts on the earth. We support the value of science by our stewardship of integrity in the conduct, training, reporting, and proposing of scientific investigation. The players who have these responsibilities are authors, reviewers, editors, and readers. Each role has to be played with vigilance for ethical behavior, including compliance with regulations for protections of study subjects, use of select agents and biohazards, regulations of use of stem cells, resource sharing, posting datasets to public repositories, etc. The positive take-home message is that the scientific community is taking steps in behavior to protect the integrity of science. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
Women scientists' scientific and spiritual ways of knowing
NASA Astrophysics Data System (ADS)
Buffington, Angela Cunningham
While science education aims for literacy regarding scientific knowledge and the work of scientists, the separation of scientific knowing from other knowing may misrepresent the knowing of scientists. The majority of science educators K-university are women. Many of these women are spiritual and integrate their scientific and spiritual ways of knowing. Understanding spiritual women of science would inform science education and serve to advance the scientific reason and spirituality debate. Using interviews and grounded theory, this study explores scientific and spiritual ways of knowing in six women of science who hold strong spiritual commitments and portray science to non-scientists. From various lived experiences, each woman comes to know through a Passive knowing of exposure and attendance, an Engaged knowing of choice, commitment and action, a Mindful/Inner knowing of prayer and meaning, a Relational knowing with others, and an Integrated lifeworld knowing where scientific knowing, spiritual knowing, and other ways of knowing are integrated. Consequences of separating ways of knowing are discussed, as are connections to current research, implications for science education, and ideas for future research. Understanding women scientists' scientific/spiritual ways of knowing may aid science educators in linking academic science to the life-worlds of students.
ERIC Educational Resources Information Center
van Zee, Emily H.; Jansen, Henri; Winograd, Kenneth; Crowl, Michele; Devitt, Adam
2013-01-01
We designed a physics course for prospective elementary and middle school teachers to foster aspects of scientific thinking recommended in reform documents. Because the elementary school curriculum focuses heavily on literacy, we also explicitly integrated physics and literacy learning in this course. By integrating physics and literacy learning,…
Scientific workflows as productivity tools for drug discovery.
Shon, John; Ohkawa, Hitomi; Hammer, Juergen
2008-05-01
Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.
Evaluating lossy data compression on climate simulation data within a large ensemble
Baker, Allison H.; Hammerling, Dorit M.; Mickelson, Sheri A.; ...
2016-12-07
High-resolution Earth system model simulations generate enormous data volumes, and retaining the data from these simulations often strains institutional storage resources. Further, these exceedingly large storage requirements negatively impact science objectives, for example, by forcing reductions in data output frequency, simulation length, or ensemble size. To lessen data volumes from the Community Earth System Model (CESM), we advocate the use of lossy data compression techniques. While lossy data compression does not exactly preserve the original data (as lossless compression does), lossy techniques have an advantage in terms of smaller storage requirements. To preserve the integrity of the scientific simulation data, the effects of lossy data compression on the original data should, at a minimum, not be statistically distinguishable from the natural variability of the climate system, and previous preliminary work with data from CESM has shown this goal to be attainable. However, to ultimately convince climate scientists that it is acceptable to use lossy data compression, we provide climate scientists with access to publicly available climate data that have undergone lossy data compression. In particular, we report on the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which we challenge climate scientists to examine features of the data relevant to their interests, and attempt to identify which of the ensemble members have been compressed and reconstructed. We find that while detecting distinguishing features is certainly possible, the compression effects noticeable in these features are often unimportant or disappear in post-processing analyses. In addition, we perform several analyses that directly compare the original data to the reconstructed data to investigate the preservation, or lack thereof, of specific features critical to climate science. Overall, we conclude that applying lossy data compression to climate simulation data is both advantageous in terms of data reduction and generally acceptable in terms of effects on scientific results.
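A toy numerical illustration of the acceptance criterion described above (compression error should sit well inside the ensemble's natural variability), using simple uniform quantization as a stand-in for the actual lossy compressors applied to CESM output:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "ensemble": members share a base field and differ only by internal variability.
n_members, shape = 8, (64, 128)
base = np.cumsum(rng.normal(size=shape), axis=1)            # smooth-ish base field
ensemble = base + 0.5 * rng.normal(size=(n_members,) + shape)

def quantize(field, n_bits=12):
    """Uniformly quantize a field to 2**n_bits levels over its own range (lossy)."""
    lo, hi = field.min(), field.max()
    levels = 2 ** n_bits - 1
    codes = np.round((field - lo) / (hi - lo) * levels)
    return lo + codes / levels * (hi - lo)

member = ensemble[0]
reconstructed = quantize(member)

compression_error = np.abs(reconstructed - member).max()
natural_variability = ensemble.std(axis=0).mean()
print(f"max compression error: {compression_error:.4f}")
print(f"mean ensemble spread:  {natural_variability:.4f}")
print("error well below natural variability:",
      bool(compression_error < 0.1 * natural_variability))
```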
Evaluating lossy data compression on climate simulation data within a large ensemble
NASA Astrophysics Data System (ADS)
Baker, Allison H.; Hammerling, Dorit M.; Mickelson, Sheri A.; Xu, Haiying; Stolpe, Martin B.; Naveau, Phillipe; Sanderson, Ben; Ebert-Uphoff, Imme; Samarasinghe, Savini; De Simone, Francesco; Carbone, Francesco; Gencarelli, Christian N.; Dennis, John M.; Kay, Jennifer E.; Lindstrom, Peter
2016-12-01
High-resolution Earth system model simulations generate enormous data volumes, and retaining the data from these simulations often strains institutional storage resources. Further, these exceedingly large storage requirements negatively impact science objectives, for example, by forcing reductions in data output frequency, simulation length, or ensemble size. To lessen data volumes from the Community Earth System Model (CESM), we advocate the use of lossy data compression techniques. While lossy data compression does not exactly preserve the original data (as lossless compression does), lossy techniques have an advantage in terms of smaller storage requirements. To preserve the integrity of the scientific simulation data, the effects of lossy data compression on the original data should, at a minimum, not be statistically distinguishable from the natural variability of the climate system, and previous preliminary work with data from CESM has shown this goal to be attainable. However, to ultimately convince climate scientists that it is acceptable to use lossy data compression, we provide climate scientists with access to publicly available climate data that have undergone lossy data compression. In particular, we report on the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which we challenge climate scientists to examine features of the data relevant to their interests, and attempt to identify which of the ensemble members have been compressed and reconstructed. We find that while detecting distinguishing features is certainly possible, the compression effects noticeable in these features are often unimportant or disappear in post-processing analyses. In addition, we perform several analyses that directly compare the original data to the reconstructed data to investigate the preservation, or lack thereof, of specific features critical to climate science. Overall, we conclude that applying lossy data compression to climate simulation data is both advantageous in terms of data reduction and generally acceptable in terms of effects on scientific results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William D.; Craig, Anthony P.; Truesdale, John E.
The integrated Earth System Model (iESM) has been developed as a new tool for projecting the joint human/climate system. The iESM is based upon coupling an Integrated Assessment Model (IAM) and an Earth System Model (ESM) into a common modeling infrastructure. IAMs are the primary tool for describing the human–Earth system, including the sources of global greenhouse gases (GHGs) and short-lived species, land use and land cover change, and other resource-related drivers of anthropogenic climate change. ESMs are the primary scientific tools for examining the physical, chemical, and biogeochemical impacts of human-induced changes to the climate system. The iESM project integrates the economic and human dimension modeling of an IAM and a fully coupled ESM within a single simulation system while maintaining the separability of each model if needed. Both IAM and ESM codes are developed and used by large communities and have been extensively applied in recent national and international climate assessments. By introducing heretofore-omitted feedbacks between natural and societal drivers, we can improve scientific understanding of the human–Earth system dynamics. Potential applications include studies of the interactions and feedbacks leading to the timing, scale, and geographic distribution of emissions trajectories and other human influences, corresponding climate effects, and the subsequent impacts of a changing climate on human and natural systems. This paper describes the formulation, requirements, implementation, testing, and resulting functionality of the first version of the iESM released to the global climate community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. D.; Craig, A. P.; Truesdale, J. E.
The integrated Earth system model (iESM) has been developed as a new tool for projecting the joint human/climate system. The iESM is based upon coupling an integrated assessment model (IAM) and an Earth system model (ESM) into a common modeling infrastructure. IAMs are the primary tool for describing the human–Earth system, including the sources of global greenhouse gases (GHGs) and short-lived species (SLS), land use and land cover change (LULCC), and other resource-related drivers of anthropogenic climate change. ESMs are the primary scientific tools for examining the physical, chemical, and biogeochemical impacts of human-induced changes to the climate system. The iESM project integrates the economic and human-dimension modeling of an IAM and a fully coupled ESM within a single simulation system while maintaining the separability of each model if needed. Both IAM and ESM codes are developed and used by large communities and have been extensively applied in recent national and international climate assessments. By introducing heretofore-omitted feedbacks between natural and societal drivers, we can improve scientific understanding of the human–Earth system dynamics. Potential applications include studies of the interactions and feedbacks leading to the timing, scale, and geographic distribution of emissions trajectories and other human influences, corresponding climate effects, and the subsequent impacts of a changing climate on human and natural systems. This paper describes the formulation, requirements, implementation, testing, and resulting functionality of the first version of the iESM released to the global climate community.
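The coupling strategy described in these abstracts can be pictured with a short schematic sketch. The classes, parameter values, and feedback terms below are hypothetical placeholders, not the iESM coupler or its equations; the point is only the exchange of drivers and climate state between two otherwise separable models.

```python
# Schematic sketch of the coupling idea (not the actual iESM coupler): an IAM
# supplies emissions drivers, an ESM returns a climate state, and a thin driver
# exchanges them once per period while each model remains usable on its own.
class ToyIAM:
    def step(self, year, climate_feedback=None):
        # Hypothetical placeholder: emissions grow slowly, damped by warming.
        warming = 0.0 if climate_feedback is None else climate_feedback["warming"]
        return {"co2_emissions_GtC": 10.0 + 0.05 * (year - 2015) - 0.5 * warming}

class ToyESM:
    def __init__(self):
        self.co2_ppm, self.warming = 400.0, 1.0
    def step(self, drivers):
        # Hypothetical placeholder physics: fixed airborne fraction, crude sensitivity.
        self.co2_ppm += 0.47 * drivers["co2_emissions_GtC"] / 2.12
        self.warming = 3.0 * (self.co2_ppm / 280.0 - 1.0)
        return {"warming": self.warming}

iam, esm, climate = ToyIAM(), ToyESM(), None
for year in range(2015, 2025):
    drivers = iam.step(year, climate_feedback=climate)   # human system -> drivers
    climate = esm.step(drivers)                          # drivers -> climate state
print(f"2024 warming estimate: {climate['warming']:.2f} K (illustrative only)")
```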
Interfacing with in-Situ Data Networks during the Arctic Boreal Vulnerability Experiment (ABoVE)
NASA Astrophysics Data System (ADS)
McInerney, M.; Griffith, P. C.; Duffy, D.; Hoy, E.; Schnase, J. L.; Sinno, S.; Thompson, J. H.
2014-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is designed to improve understanding of the causes and impacts of ecological changes in Arctic/boreal regions, and will integrate field-based studies, modeling, and data from airborne and satellite remote sensing. ABoVE will result in a fuller understanding of ecosystem vulnerability and resilience to environmental change in the Arctic and boreal regions of western North America, and provide scientific information required to develop options for societal responses to the impacts of these changes. The studies sponsored by NASA during ABoVE will be coordinated with research and in-situ monitoring activities being sponsored by a number of national and international partners. The NASA Center for Climate Simulation at the Goddard Space Flight Center has partnered with the NASA Carbon Cycle & Ecosystems Office to create a science cloud designed for this field campaign - the ABoVE Science Cloud (ASC). The ASC combines high performance computing with emerging technologies to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage with integrated data management, and integration of core variables from in-situ networks identified by the ABoVE Science Definition Team. In this talk, we will present the scientific requirements driving the development of the ABoVE Science Cloud, discuss the necessary interfaces, both computational and human, with in-situ monitoring networks, and show examples of how the ASC is being used to meet the needs of the ABoVE campaign.
Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Yier
As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project aim to address all these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduced HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.
Computational Simulations and the Scientific Method
NASA Technical Reports Server (NTRS)
Kleb, Bil; Wood, Bill
2005-01-01
As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long term (10 to +20 year) cybersecurity fundamental basic research and development challenges, strategies and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
NAS technical summaries: Numerical aerodynamic simulation program, March 1991 - February 1992
NASA Technical Reports Server (NTRS)
1992-01-01
NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefiting other supercomputer centers in Government and industry. This report contains selected scientific results from the 1991-92 NAS Operational Year, March 4, 1991 to March 3, 1992, which is the fifth year of operation. During this year, the scientific community was given access to a Cray-2 and a Cray Y-MP. The Cray-2, the first generation supercomputer, has four processors, 256 megawords of central memory, and a total sustained speed of 250 million floating point operations per second. The Cray Y-MP, the second generation supercomputer, has eight processors and a total sustained speed of one billion floating point operations per second. Additional memory was installed this year, doubling capacity from 128 to 256 megawords of solid-state storage-device memory. Because of its higher performance, the Cray Y-MP delivered approximately 77 percent of the total number of supercomputer hours used during this year.
Hartin, Corinne A.; Patel, Pralit L.; Schwarber, Adria; ...
2015-04-01
Simple climate models play an integral role in the policy and scientific communities. They are used for climate mitigation scenarios within integrated assessment models, complex climate model emulation, and uncertainty analyses. Here we describe Hector v1.0, an open source, object-oriented, simple global climate carbon-cycle model. This model runs essentially instantaneously while still representing the most critical global-scale earth system processes. Hector has a three-part main carbon cycle: a one-pool atmosphere, land, and ocean. The model's terrestrial carbon cycle includes primary production and respiration fluxes, accommodating arbitrary geographic divisions into, e.g., ecological biomes or political units. Hector actively solves the inorganic carbon system in the surface ocean, directly calculating air–sea fluxes of carbon and ocean pH. Hector reproduces the global historical trends of atmospheric [CO2], radiative forcing, and surface temperatures. The model simulates all four Representative Concentration Pathways (RCPs) with equivalent rates of change of key variables over time compared to current observations, MAGICC (a well-known simple climate model), and models from the 5th Coupled Model Intercomparison Project. Hector's flexibility, open-source nature, and modular design will facilitate a broad range of research in various areas.
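To make the idea of a reduced-form carbon-cycle/temperature model concrete, here is a minimal sketch in the same spirit. It does not reproduce Hector's actual structure or parameters; the airborne fraction and the cumulative-emissions-to-warming coefficient are illustrative assumptions.

```python
# Minimal sketch of a reduced-form carbon-cycle/temperature model
# (illustrative only; these are not Hector's equations or parameter values).
def run_box_model(emissions_GtC, airborne_fraction=0.45, tcre=1.65e-3):
    """Accumulate emissions into an atmospheric CO2 burden and map cumulative
    emissions to warming with a constant TCRE-like coefficient (K per GtC)."""
    co2_ppm, cumulative, out = 278.0, 0.0, []
    for e in emissions_GtC:
        co2_ppm += airborne_fraction * e / 2.12   # ~2.12 GtC per ppm of CO2
        cumulative += e
        out.append((co2_ppm, tcre * cumulative))
    return out

history = run_box_model([10.0] * 50)              # 50 years of constant emissions
print(f"final: {history[-1][0]:.1f} ppm, {history[-1][1]:.2f} K above preindustrial")
```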
NASA Astrophysics Data System (ADS)
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
2017-12-01
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both space weather scientific and operational communities. CCMC has developed and provided the scientific community with innovative web-based point-of-access tools: the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; Advanced Online Visualization and Analysis tools for more accurate interpretation of model results; Standard Data formats for Simulation Data downloads; and Mobile apps to view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including those that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
NASA Astrophysics Data System (ADS)
Cakir, Mustafa
The primary objective of this case study was to examine prospective secondary science teachers' developing understanding of scientific inquiry and Mendelian genetics. A computer simulation of basic Mendelian inheritance processes (Catlab) was used in combination with small-group discussions and other instructional scaffolds to enhance prospective science teachers' understandings. The theoretical background for this research is derived from a social constructivist perspective. Structuring scientific inquiry as investigation to develop explanations presents a meaningful context for the enhancement of inquiry abilities and understanding of the science content. The context of the study was a teaching and learning course focused on inquiry and technology. Twelve prospective science teachers participated in this study. Multiple data sources included pre- and post-module questionnaires of participants' views of scientific inquiry, pre-posttests of understandings of Mendelian concepts, inquiry project reports, class presentations, process videotapes of participants interacting with the simulation, and semi-structured interviews. Seven selected prospective science teachers participated in in-depth interviews. Findings suggest that while studying important concepts in science, carefully designed inquiry experiences can help prospective science teachers to develop an understanding about the types of questions scientists in that field ask, the methodological and epistemological issues that constrain their pursuit of answers to those questions, and the ways in which they construct and share their explanations. Key findings included prospective teachers' initial limited abilities to create evidence-based arguments, their hesitancy to include inquiry in their future teaching, and the impact of collaboration on thinking. Prior to this experience the prospective teachers held uninformed views of scientific inquiry. After the module, participants demonstrated extended expertise in their understandings of the following aspects of scientific inquiry: (a) the iterative nature of scientific inquiry; (b) the tentativeness of specific knowledge claims; (c) the degree to which scientists rely on empirical data, as well as broader conceptual and metaphysical commitments, to assess models and to direct future inquiries; (d) the need for conceptual consistency; (e) multiple methods of investigations and multiple interpretations of data; and (f) social and cultural aspects of scientific inquiry. This research provided evidence that hypothesis testing can support the integrated acquisition of conceptual and procedural knowledge in science. Participants' conceptual elaborations of Mendelian inheritance were enhanced. There were qualitative changes in the nature of the participants' explanations. Moreover, the average percentage of correct responses improved from 39% on the pretest to 67% on the posttest. Findings also suggest that prospective science teachers' experiences as learners of science in their methods course served as a powerful tool for thinking about the role of inquiry in teaching and learning science. They had mixed views about enacting inquiry in their teaching in the future. All of them stated some kind of general willingness to do so; yet, they also mentioned some reservations and practical considerations about inquiry-based teaching.
NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware
NASA Astrophysics Data System (ADS)
Johnson, V. L.; Teuben, P. J.; Penprase, B. E.
An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
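The two core ingredients the testbed exposes, direct particle-particle force evaluation and a time integrator, can be sketched in a few lines. The following Python leapfrog example is illustrative only; NBodyLab itself drives NEMO tools and, optionally, MD-GRAPE2 hardware rather than this code.

```python
# Minimal sketch of direct particle-particle forces plus leapfrog integration
# (illustrative Python; not the testbed's NEMO- or GRAPE-backed implementation).
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Direct-summation gravitational accelerations with Plummer softening (G = 1)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                                   # vectors to all other bodies
        r2 = (d**2).sum(axis=1) + eps**2
        r2[i] = np.inf                                     # exclude self-interaction
        acc[i] = (mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
    return acc

def leapfrog(pos, vel, mass, dt=1e-2, steps=1000):
    """Kick-drift-kick integration of the equations of motion."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc                              # kick
        pos += dt * vel                                    # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc                              # kick
    return pos, vel

rng = np.random.default_rng(1)
n = 64
pos, vel = rng.normal(size=(n, 3)), rng.normal(scale=0.1, size=(n, 3))
pos, vel = leapfrog(pos, vel, np.full(n, 1.0 / n))
```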
NASA Astrophysics Data System (ADS)
Caplan, B.; Morrison, A.; Moore, J. C.; Berkowitz, A. R.
2017-12-01
Understanding water is central to understanding environmental challenges. Scientists use 'big data' and computational models to develop knowledge about the structure and function of complex systems, and to make predictions about changes in climate, weather, hydrology, and ecology. Large environmental systems-related data sets and simulation models are difficult for high school teachers and students to access and make sense of. Comp Hydro, a collaboration across four states and multiple school districts, integrates computational thinking and data-related science practices into water systems instruction to enhance development of scientific model-based reasoning, through curriculum, assessment and teacher professional development. Comp Hydro addresses the need for 1) teaching materials for using data and physical models of hydrological phenomena, 2) building teachers' and students' comfort or familiarity with data analysis and modeling, and 3) infusing the computational knowledge and practices necessary to model and visualize hydrologic processes into instruction. Comp Hydro teams in Baltimore, MD and Fort Collins, CO are integrating teaching about surface water systems into high school courses focusing on flooding (MD) and surface water reservoirs (CO). This interactive session will highlight the successes and challenges of our physical and simulation models in helping teachers and students develop proficiency with computational thinking about surface water. We also will share insights from comparing teacher-led vs. project-led development of curriculum and our simulations.
Integration of Basic Knowledge Models for the Simulation of Cereal Foods Processing and Properties.
Kristiawan, Magdalena; Kansou, Kamal; Valle, Guy Della
Cereal processing (breadmaking, extrusion, pasting, etc.) covers a range of mechanisms that, despite their diversity, can often be reduced to a succession of two core phenomena: (1) the transition from a divided solid medium (the flour) to a continuous one through hydration, mechanical, biochemical, and thermal actions and (2) the expansion of a continuous matrix toward a porous structure as a result of the growth of bubble nuclei either by yeast fermentation or by water vaporization after a sudden pressure drop. Modeling them is critical for the domain, but can be quite challenging to address with mechanistic approaches relying on partial differential equations. In this chapter we present alternative approaches through basic knowledge models (BKM) that integrate scientific and expert knowledge, and possess operational interest for domain specialists. Using these BKMs, simulations of two cereal foods processes, extrusion and breadmaking, are provided by focusing on the two core phenomena. To support use by non-specialists, these BKMs are implemented as computer tools: a Knowledge-Based System developed for the modeling of the flour mixing operation, or Ludovic®, simulation software for twin-screw extrusion. They can be applied to a wide domain of compositions, provided that the data on product rheological properties are available. Finally, it is stated that the use of such systems can help food engineers to design cereal food products and predict their texture properties.
Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.
2013-12-01
Discovery and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe the resource using the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming traditional scientific datasets, use of a semantic repository, and querying using SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bound or other criteria.
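A minimal sketch of the proposed query-based integration, using the rdflib library; the ex: vocabulary terms and the latitude-range filter are hypothetical stand-ins for the proposed geo-spatial vocabulary, not an existing standard.

```python
# Sketch of registering a dataset description as RDF triples and discovering it
# with a SPARQL query (vocabulary terms below are hypothetical placeholders).
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/geo#")
g = Graph()
g.add((EX.cloudProduct, RDF.type, EX.Dataset))
g.add((EX.cloudProduct, EX.variable, Literal("cloud_fraction")))
g.add((EX.cloudProduct, EX.minLatitude, Literal(25.0)))
g.add((EX.cloudProduct, EX.maxLatitude, Literal(50.0)))

# Discover datasets whose latitude range overlaps a region of interest.
query = """
PREFIX ex: <http://example.org/geo#>
SELECT ?ds ?var WHERE {
  ?ds a ex:Dataset ;
      ex:variable ?var ;
      ex:minLatitude ?lo ;
      ex:maxLatitude ?hi .
  FILTER (?lo <= 45.0 && ?hi >= 30.0)
}
"""
for row in g.query(query):
    print(row.ds, row.var)
```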
78 FR 3009 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
... Committee: Center for Scientific Review Special Emphasis Panel; Behavioral Neuroscience. Date: February 6... . Name of Committee: Molecular, Cellular and Developmental Neuroscience Integrated Review Group... Panel; Member Conflict: Integrative Functional and Cognitive Neurobiology. Date: February 13-14, 2013...
Thomas M. Quigley; Richard W Haynes; Russell T. Graham
1996-01-01
The Integrated Scientific Assessment for Ecosystem Management for the Interior Columbia Basin links landscape, aquatic, terrestrial, social, and economic characterizations to describe biophysical and social systems. Integration was achieved through a framework built around six goals for ecosystem management and three different views of the future. These goals are:...
Scientific Design of a High Contrast Integral Field Spectrograph for the Subaru Telescope
NASA Technical Reports Server (NTRS)
McElwain, Michael W.
2012-01-01
Ground based telescopes equipped with adaptive optics systems and specialized science cameras are now capable of directly detecting extrasolar planets. We present the scientific design for a high contrast integral field spectrograph for the Subaru Telescope. This lenslet based integral field spectrograph will be implemented into the new extreme adaptive optics system at Subaru, called SCExAO.
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
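As a loose analogy (in Python rather than Babel's SIDL), the sketch below shows the kind of scientifically oriented interface contract the abstract describes, with dense multidimensional arrays and complex numbers as first-class types; the interface and class names are hypothetical.

```python
# Analogy only (not Babel's actual SIDL): a language-neutral-style interface
# contract that concrete components would satisfy, with dense arrays in and
# complex-valued arrays out.
from abc import ABC, abstractmethod
import numpy as np

class SpectralSolver(ABC):
    """Hypothetical component interface; names are illustrative only."""
    @abstractmethod
    def solve(self, field: np.ndarray) -> np.ndarray:
        """Map a dense real array to its complex spectrum."""

class FFTSolver(SpectralSolver):
    def solve(self, field: np.ndarray) -> np.ndarray:
        return np.fft.fftn(field)          # complex-valued result

spectrum = FFTSolver().solve(np.random.default_rng(2).normal(size=(8, 8)))
print(spectrum.dtype)                      # complex128
```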
Rest, Kathleen M.; Halpern, Michael H.
2007-01-01
Our nation’s health and prosperity are based on a foundation of independent scientific discovery. Yet in recent years, political interference in federal government science has become widespread, threatening this legacy. We explore the ways science has been misused, the attempts to measure the pervasiveness of this problem, and the effects on our long-term capacity to meet today’s most complex public health challenges. Good government and a functioning democracy require public policy decisions to be informed by independent science. The scientific and public health communities must speak out to defend taxpayer-funded science from political interference. Encouragingly, both the scientific community and Congress are exploring ways to restore scientific integrity to federal policymaking. PMID:17901422
Simulation and Experimentation in an Astronomy Laboratory, Part II
NASA Astrophysics Data System (ADS)
Maloney, F. P.; Maurone, P. A.; Hones, M.
1995-12-01
The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for non-scientist students. We report on a strategy for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. Direct observation of physical phenomena, such as Hooke's Law, begins by using a computer and hardware interface as a data-collection and presentation tool. In this way, the student is encouraged to explore the physical conditions of the experiment and re-discover the fundamentals involved. The hardware frees the student from the tedium of manual data collection and presentation, and permits experimental design which utilizes data that would otherwise be too fleeting, too imprecise, or too voluminous. Computer simulation of astronomical phenomena allows the student to travel in time and space, freed from the vagaries of weather, to re-discover such phenomena as the daily and yearly cycles, the reason for the seasons, the saros, and Kepler's Laws. By integrating the knowledge gained by experimentation and simulation, the student can understand both the scientific concepts and the methods by which they are discovered and explored. Further, students are encouraged to place these discoveries in an historical context, by discovering, for example, the night sky as seen by the survivors of the sinking Titanic, or Halley's comet as depicted on the Bayeux tapestry. We report on the continuing development of these laboratory experiments. Further details and the text for the experiments are available at the following site: http://astro4.ast.vill.edu/ This work is supported by a grant from The Pew Charitable Trusts.
DCMS: A data analytics and management system for molecular simulation.
Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni
Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate a very large number of atoms and intend to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data access, management, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because they lack a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system and experiments using real MS data and workload show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
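The database-centric idea can be sketched with a toy schema and a declarative query. DCMS itself builds on PostgreSQL with custom indexing and co-processor algorithms; the sqlite3 example below is only an illustration of storing trajectory frames in a relational table and querying them with SQL.

```python
# Toy illustration of storing MS trajectory data relationally and running a
# declarative analytical query (not the DCMS schema or its custom indexes).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE atoms (
    frame INTEGER, atom_id INTEGER, x REAL, y REAL, z REAL)""")

# Hypothetical trajectory data: two frames of three atoms.
rows = [(f, i, 0.1 * i + f, 0.2 * i, 0.3 * i) for f in range(2) for i in range(3)]
con.executemany("INSERT INTO atoms VALUES (?, ?, ?, ?, ?)", rows)

# Declarative query: count atoms inside a rectangular region, per frame.
for frame, count in con.execute("""
        SELECT frame, COUNT(*) FROM atoms
        WHERE x BETWEEN 0.0 AND 1.0 AND y BETWEEN 0.0 AND 1.0
        GROUP BY frame"""):
    print(f"frame {frame}: {count} atoms in region")
```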
NASA Technical Reports Server (NTRS)
Adams, Robert B.; LaPointe, Michael; Wilks, Rod; Allen, Brian
2009-01-01
This poster reviews the planning and design for an integrated architecture for characterization, mitigation, scientific evaluation and resource utilization of near earth objects. This includes tracks to observe and characterize the nature of the threat posed by a NEO, and deflect if a significant threat is posed. The observation stack can also be used for a more complete scientific analysis of the NEO.
Visualization and Analysis of Vortex-Turbine Intersections in Wind Farms.
Shafii, Sohail; Obermaier, Harald; Linn, Rodman; Koo, Eunmo; Hlawitschka, Mario; Garth, Christoph; Hamann, Bernd; Joy, Kenneth
2013-02-13
Characterizing the interplay between the vortices and forces acting on a wind turbine's blades in a qualitative and quantitative way holds the potential for significantly improving large wind turbine design. The paper introduces an integrated pipeline for highly effective wind and force field analysis and visualization. We extract vortices induced by a turbine's rotation in a wind field, and characterize vortices in conjunction with numerically simulated forces on the blade surfaces as these vortices strike another turbine's blades downstream. The scientifically relevant issue to be studied is the relationship between the extracted, approximate locations on the blades where vortices strike the blades and the forces that exist in those locations. This integrated approach is used to detect and analyze turbulent flow that causes local impact on the wind turbine blade structure. The results that we present are based on analyzing the wind and force field data sets generated by numerical simulations, and allow domain scientists to relate vortex-blade interactions with power output loss in turbines and turbine life-expectancy. Our methods have the potential to improve turbine design in order to save costs related to turbine operation and maintenance.
VESL: The Virtual Earth Sheet Laboratory for Ice Sheet Modeling and Visualization
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.
2016-12-01
We introduce the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal for dissemination of data, simulation of physical processes, and promotion of climate literacy. The current prototype leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Lab and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundled pre/post-processing JS scripts to be compatible with the existing ISSM Python/Matlab API. Researchers using VESL will be able to effectively present their work for public dissemination with little-to-no additional post-processing. This will allow for faster publication in peer-reviewed journals and adaption of results for educational applications. Through future application of this concept to multiple aspects of the Earth System, VESL has the potential to broaden data applications in the geosciences and beyond. At this stage, we seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL, as we plan its expansion, and aim to achieve more rapid communication and presentation of scientific results.
The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications
NASA Technical Reports Server (NTRS)
Johnston, William E.
2002-01-01
With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.
NASA Astrophysics Data System (ADS)
Haltigin, T.; Hipkin, V.; Picard, M.
2016-12-01
Mars Sample Return (MSR) remains one of the highest priorities of the international planetary science community. While the overall mission architecture required for MSR is relatively well defined, there remain a number of open questions regarding its implementation. In preparing for an eventual MSR campaign, simulating portions of the sample collection mission can provide important insight to address existing knowledge gaps. In 2015 and 2016, the Canadian Space Agency (CSA) led robotic deployments to address a variety of technical, scientific, operational, and educational objectives. Here we report on the results. The deployments were conducted at a field site near Hanksville, UT, USA, chosen to satisfy scientific, technical, and logistical considerations. The geology of the region is dominated by Jurassic-aged sandstones and mudstones, indicative of an ancient sedimentary environment. Moreover, a series of linear topographically inverted features are present, similar to morphologies observed in particular Martian landscapes. On both Earth and Mars, these features are interpreted as lithified and exhumed river channels. A science operations center was established in London, ON, Canada, at Western University. Here, a science team of > 30 students and professionals - unaware of the rover's actual location - were responsible for generating daily science plans, requesting observations, and interpreting downloaded data, all while respecting Mars-realistic flight rules and constraints for power, scheduling, and data. Rover commanding was performed by an engineering team at CSA headquarters in St. Hubert, QC, Canada, while a small out-of-simulation field team was present on-site to ensure safe operations of the rover and to provide data transfers. Between the 2015 and 2016 campaigns, nearly five weeks of operations were conducted. The team successfully collected scientifically-selected samples to address the group objectives, and the rover demonstrated system integration and a variety of navigational techniques. Forward work involves laboratory-based validation of the returned samples to evaluate the efficiency of the in-simulation operational decision-making.
Engineering America's Future in Space: Systems Engineering Innovations for Sustainable Exploration
NASA Technical Reports Server (NTRS)
Dumbacher, Daniel L.; Jones, Carl P.
2008-01-01
The National Aeronautics and Space Administration (NASA) delivers space transportation solutions for America's complex missions, ranging from scientific payloads that expand knowledge, such as the Hubble Space Telescope, to astronauts and lunar rovers destined for voyages to the Moon. Currently, the venerable Space Shuttle, which has been in service since 1981, provides U.S. capability for both crew and cargo to low-Earth orbit to construct the International Space Station, before the Shuttle is retired in 2010, as outlined in the 2006 NASA Strategic Plan. In the next decade, NASA will replace this system with a duo of launch vehicles: the Ares I Crew Launch Vehicle/Orion Crew Exploration Vehicle and the Ares V Cargo Launch Vehicle/Altair Lunar Lander. The goals for this new system include increased safety and reliability, coupled with lower operations costs that promote sustainable space exploration over a multi-decade schedule. This paper will provide details of the in-house systems engineering and vehicle integration work now being performed for the Ares I and planned for the Ares V. It will give an overview of the Ares I system-level test activities, such as the ground vibration testing that will be conducted in the Marshall Center's Dynamic Test Stand to verify the integrated vehicle stack's structural integrity against predictions made by modern modeling and simulation analysis. It also will give information about the work in progress for the Ares I-X developmental test flight planned in 2009 to provide key data before the Ares I Critical Design Review. Activities such as these will help prove and refine mission concepts of operation, while supporting the spectrum of design and development tasks being performed by Marshall's Engineering Directorate, ranging from launch vehicles and lunar rovers to scientific spacecraft and associated experiments. Ultimately, the work performed will lead to the fielding of a robust space transportation solution that will carry international explorers and essential payloads for sustainable scientific discovery beyond planet Earth.
NASA/ESA CV-990 Spacelab Simulation (ASSESS 2)
NASA Technical Reports Server (NTRS)
1977-01-01
Cost effective techniques for addressing management and operational activities on Spacelab were identified and analyzed during a ten day NASA-ESA cooperative mission with payload and flight responsibilities handled by the organization assigned for early Spacelabs. Topics discussed include: (1) management concepts and interface relationships; (2) experiment selection; (3) hardware development; (4) payload integration and checkout; (5) selection and training of mission specialists and payload specialists; (6) mission control center/payload operations control center interactions with ground and flight problems; (7) real time interaction during flight between principal investigators and the mission specialist/payload specialist flight crew; and (8) retrieval of scientific data and its analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramos, Jaime
2012-12-14
To unlock the potential of micro grids, we plan to build, commission, and operate a 5 kW DC PV array and integrate it into the UTPA Engineering building's low-voltage network as a micro grid, and to promote community awareness. Assisted by a solar radiation tracker that provides on-line measurements and analysis for use by the scientific and engineering community, we will write, perform, and operate a set of laboratory experiments and computer simulations supporting Electrical Engineering (graduate and undergraduate) courses on Renewable Energy, as well as Senior Design projects.
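A back-of-the-envelope sketch of the kind of exercise such a laboratory could assign, estimating the energy yield of a 5 kW DC array; the peak-sun-hours and derate values are assumptions, not measurements from the UTPA installation.

```python
# Illustrative energy-yield estimate for a 5 kW DC array (assumed inputs only).
array_kw_dc = 5.0
peak_sun_hours = 5.5          # assumed daily average for south Texas
derate = 0.77                 # assumed inverter/wiring/soiling losses

daily_kwh = array_kw_dc * peak_sun_hours * derate
annual_kwh = daily_kwh * 365
print(f"~{daily_kwh:.1f} kWh/day, ~{annual_kwh:.0f} kWh/year (illustrative)")
```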
NASA Astrophysics Data System (ADS)
Roesch-McNally, G.; Prendeville, H. R.
2017-12-01
A lack of coproduction, the joint production of new technologies or knowledge among technical experts and other groups, is arguably one of the reasons why much scientific information and resulting decision support systems are not very usable. Increasingly, public agencies and academic institutions are emphasizing the importance of coproduction of scientific knowledge and decision support systems in order to facilitate greater engagement between the scientific community and key stakeholder groups. Coproduction has been embraced as a way for the scientific community to develop actionable scientific information that will assist end users in solving real-world problems. Increasing the level of engagement and stakeholder buy-in to the scientific process is increasingly necessary, particularly in the context of growing politicization of science and the scientific process. Coproduction can be an effective way to build trust and can build on and integrate local and traditional knowledge. Employing coproduction strategies may enable the development of more relevant and useful information and decision support tools that address stakeholder challenges at relevant scales. The USDA Northwest Climate Hub has increasingly sought ways to integrate coproduction in the development of both applied research projects and decision support systems. Integrating coproduction, however, within existing institutions is not always simple, given that coproduction is often more focused on process than products and products are, for better or worse, often the primary focus of applied research and tool development projects. The USDA Northwest Climate Hub sought to integrate coproduction into our FY2017 call-for-proposals process. As a result, we have a set of proposals and fledgling projects that fall along the engagement continuum (see Figure 1, attached). We will share the challenges and opportunities that emerged from this purposeful integration of coproduction into the work that we prioritized for funding. This effort highlights strategies for how federal agencies might consider how and whether to codify coproduction tenets into their collaborations and agenda setting.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.
NASA/ESA CV-990 spacelab simulation
NASA Technical Reports Server (NTRS)
1975-01-01
Due to interest in the application of simplified techniques used to conduct airborne science missions at NASA's Ames Research Center, a joint NASA/ESA endeavor was established to conduct an extensive Spacelab simulation using the NASA CV-990 airborne laboratory. The scientific payload was selected to perform studies in upper atmospheric physics and infrared astronomy with principal investigators from France, the Netherlands, England, and several groups from the United States. Communication links between the 'Spacelab' and a ground based mission operations center were limited consistent with Spacelab plans. The mission was successful and provided extensive data relevant to Spacelab objectives on overall management of a complex international payload; experiment preparation, testing, and integration; training for proxy operation in space; data handling; multiexperimenter use of common experimenter facilities (telescopes); multiexperiment operation by experiment operators; selection criteria for Spacelab experiment operators; and schedule requirements to prepare for such a Spacelab mission.
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Schumacher, German
2014-08-01
The Large Synoptic Survey Telescope (LSST) is a complex system of systems with demanding performance and operational requirements. The nature of its scientific goals requires a special Observatory Control System (OCS) and particularly a very specialized automatic Scheduler. The OCS Scheduler is an autonomous software component that drives the survey, selecting the detailed sequence of visits in real time, taking into account multiple science programs, the current external and internal conditions, and the history of observations. We have developed a SysML model for the OCS Scheduler that fits coherently in the OCS and LSST integrated model. We have also developed a prototype of the Scheduler that implements the scheduling algorithms in the simulation environment provided by the Operations Simulator, where the environment and the observatory are modeled with real weather data and detailed kinematics parameters. This paper expands on the Scheduler architecture and the proposed algorithms to achieve the survey goals.
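The scheduling problem can be illustrated with a toy greedy ranking of candidate visits; the fields, weights, and score function below are hypothetical and much simpler than the OCS Scheduler's actual algorithms.

```python
# Toy sketch of ranking candidate visits each time step by a weighted score that
# combines science-program need, current conditions, and slew cost
# (illustrative only; not the OCS Scheduler's algorithms).
from dataclasses import dataclass

@dataclass
class Visit:
    field_id: int
    program_need: float      # 0..1, how far behind this science program is
    sky_brightness: float    # mag/arcsec^2, higher is darker/better
    slew_seconds: float      # time to reach the field from the current pointing

def score(v: Visit) -> float:
    # Illustrative weights only; the real scheduler balances many more factors.
    return 2.0 * v.program_need + 0.1 * v.sky_brightness - 0.02 * v.slew_seconds

candidates = [
    Visit(101, 0.9, 21.0, 30.0),
    Visit(202, 0.4, 21.8, 5.0),
    Visit(303, 0.7, 20.2, 120.0),
]
next_visit = max(candidates, key=score)
print(f"next field: {next_visit.field_id} (score {score(next_visit):.2f})")
```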
A system for environmental model coupling and code reuse: The Great Rivers Project
NASA Astrophysics Data System (ADS)
Eckman, B.; Rice, J.; Treinish, L.; Barford, C.
2008-12-01
As part of the Great Rivers Project, IBM is collaborating with The Nature Conservancy and the Center for Sustainability and the Global Environment (SAGE) at the University of Wisconsin, Madison to build a Modeling Framework and Decision Support System (DSS) designed to help policy makers and a variety of stakeholders (farmers, fish & wildlife managers, hydropower operators, et al.) to assess, come to consensus, and act on land use decisions representing effective compromises between human use and ecosystem preservation/restoration. Initially focused on Brazil's Paraguay-Parana, China's Yangtze, and the Mississippi Basin in the US, the DSS integrates data and models from a wide variety of environmental sectors, including water balance, water quality, carbon balance, crop production, hydropower, and biodiversity. In this presentation we focus on the modeling framework aspect of this project. In our approach to these and other environmental modeling projects, we see a flexible, extensible modeling framework infrastructure for defining and running multi-step analytic simulations as critical. In this framework, we divide monolithic models into atomic components with clearly defined semantics encoded via rich metadata representation. Once models and their semantics and composition rules have been registered with the system by their authors or other experts, non-expert users may construct simulations as workflows of these atomic model components. A model composition engine enforces rules/constraints for composing model components into simulations, to avoid the creation of Frankenmodels, models that execute but produce scientifically invalid results. A common software environment and common representations of data and models are required, as well as an adapter strategy for code written in e.g., Fortran or python, that still enables efficient simulation runs, including parallelization. Since each new simulation, as a new composition of model components, requires calibration of parameters (fudge factors) to produce scientifically valid results, we are also developing an autocalibration engine. Finally, visualization is a key element of this modeling framework strategy, both to convey complex scientific data effectively, and also to enable non-expert users to make full use of the relevant features of the framework. We are developing a visualization environment with a strong data model, to enable visualizations, model results, and data all to be handled similarly.
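The composition-rule idea, rejecting workflows whose data dependencies are unmet, can be sketched in a few lines. The component registry, variable names, and checker below are hypothetical illustrations, not the project's model composition engine.

```python
# Illustrative sketch: each registered component declares the variables it
# consumes and produces, and a simple checker flags workflows with unmet inputs
# (hypothetical registry; not the project's composition engine).
COMPONENTS = {
    "water_balance":   {"needs": {"precip", "temperature"}, "makes": {"runoff"}},
    "water_quality":   {"needs": {"runoff", "land_use"},    "makes": {"nitrate_load"}},
    "crop_production": {"needs": {"precip", "temperature"}, "makes": {"yield"}},
}

def validate(workflow, available):
    """Return unmet inputs for each step, given initially available variables."""
    problems, have = {}, set(available)
    for name in workflow:
        spec = COMPONENTS[name]
        missing = spec["needs"] - have
        if missing:
            problems[name] = missing
        have |= spec["makes"]
    return problems

print(validate(["water_balance", "water_quality"], {"precip", "temperature"}))
# {'water_quality': {'land_use'}} -- the engine would flag this "Frankenmodel"
```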
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
The Modular Modeling System (MMS): A toolbox for water- and environmental-resources management
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.; Hay, L.E.; ,
2005-01-01
The increasing complexity of water- and environmental-resource problems requires modeling approaches that incorporate knowledge from a broad range of scientific and software disciplines. To address this need, the U.S. Geological Survey (USGS) has developed the Modular Modeling System (MMS). MMS is an integrated system of computer software for model development, integration, and application. Its modular design allows a high level of flexibility and adaptability to enable modelers to incorporate their own software into a rich array of built-in models and modeling tools. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A geographic information system (GIS) interface, the USGS GIS Weasel, has been integrated with MMS to enable spatial delineation and characterization of basin and ecosystem features, and to provide objective parameter-estimation methods for models using available digital data. MMS provides optimization and sensitivity-analysis tools to analyze model parameters and evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. MMS has been coupled with the Bureau of Reclamation object-oriented reservoir and river-system modeling framework, RiverWare, to develop models to evaluate and apply optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. This decision support system approach has been developed, tested, and implemented in the Gunnison, Yakima, San Joaquin, Rio Grande, and Truckee River basins of the western United States. MMS is currently being coupled with the U.S. Forest Service model SIMulating Patterns and Processes at Landscape Scales (SIMPPLLE) to assess the effects of alternative vegetation-management strategies on a variety of hydrological and ecological responses. Initial development and testing of the MMS-SIMPPLLE integration is being conducted on the Colorado Plateau region of the western United States.
Operational plans for life science payloads - From experiment selection through postflight reporting
NASA Technical Reports Server (NTRS)
Mccollum, G. W.; Nelson, W. G.; Wells, G. W.
1976-01-01
Key features of operational plans developed in a study of the Space Shuttle-era life science payloads program are presented. The data describe the overall acquisition, staging, and integration of payload elements, as well as program implementation methods and mission support requirements. Five configurations were selected as representative payloads: (a) carry-on laboratories - medical emphasis experiments, (b) mini-laboratories - medical/biology experiments, (c) seven-day dedicated laboratories - medical/biology experiments, (d) 30-day dedicated laboratories - Regenerative Life Support Evaluation (RLSE) with selected life science experiments, and (e) Biomedical Experiments Scientific Satellite (BESS) - extended duration primate (Type I) and small vertebrate (Type II) missions. The recommended operational methods described in the paper are compared to the fundamental data developed in the life science Spacelab Mission Simulation (SMS) test series. Areas assessed include crew training, experiment development and integration, testing, data dissemination, organization interfaces, and principal investigator working relationships.
A low-power RFID integrated circuits for intelligent healthcare systems.
Lee, Shuenn-Yuh; Wang, Liang-Hung; Fang, Qiang
2010-11-01
This paper presents low-power radio-frequency identification (RFID) technology for intelligent healthcare systems. With attention to power-efficient communication in the body sensor network, RF power transfer was estimated and the required low-power ICs, which are important in the development of a healthcare system with miniaturization and system integration, are discussed based on the RFID platform. To analyze the power transfer, this paper adopts a 915-MHz industrial, scientific, and medical RF with a radiation power of 70 mW to estimate the power loss over a 1-m communication distance between an RFID reader (bioinformation node) and a transponder (biosignal acquisition node). The low-power ICs of the transponder will be implemented in the TSMC 0.18-μm CMOS process. The simulation result reveals that the transponder's IC can meet the link budget of the UHF RFID system.
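As a rough illustration of the kind of link-budget estimate described above, the sketch below applies the standard Friis free-space path-loss formula at 915 MHz over a 1-m distance with 70 mW of radiated power; the antenna gains are assumed values for illustration and are not taken from the paper.

```python
# Hedged sketch: free-space link-budget estimate for a 915 MHz RFID link.
# Friis path loss only; antenna gains are assumed, not taken from the paper.
import math

freq_hz = 915e6                    # industrial, scientific, and medical (ISM) band
distance_m = 1.0                   # reader-to-transponder separation
p_tx_dbm = 10 * math.log10(70)     # 70 mW radiated power ~ 18.45 dBm
g_tx_dbi = 0.0                     # assumed reader antenna gain
g_rx_dbi = 0.0                     # assumed transponder antenna gain

wavelength = 3e8 / freq_hz
fspl_db = 20 * math.log10(4 * math.pi * distance_m / wavelength)

p_rx_dbm = p_tx_dbm + g_tx_dbi + g_rx_dbi - fspl_db
print(f"free-space path loss: {fspl_db:.1f} dB")
print(f"received power at 1 m: {p_rx_dbm:.1f} dBm")
```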
NASA Technical Reports Server (NTRS)
Biggerstaff, J. A. (Editor)
1985-01-01
Topics related to physics instrumentation are discussed, taking into account cryostat and electronic development associated with multidetector spectrometer systems, the influence of materials and counting-rate effects on He-3 neutron spectrometry, a data acquisition system for time-resolved muscle experiments, and a sensitive null detector for precise measurements of integral linearity. Other subjects explored are concerned with space instrumentation, computer applications, detectors, instrumentation for high energy physics, instrumentation for nuclear medicine, environmental monitoring and health physics instrumentation, nuclear safeguards and reactor instrumentation, and a 1984 symposium on nuclear power systems. Attention is given to the application of multiprocessors to scientific problems, a large-scale computer facility for computational aerodynamics, a single-board 32-bit computer for the Fastbus, the integration of detector arrays and readout electronics on a single chip, and three-dimensional Monte Carlo simulation of the electron avalanche in a proportional counter.
Resource Materials on Scientific Integrity Issues.
ERIC Educational Resources Information Center
Macrina, Francis L., Ed.; Munro, Cindy L., Ed.
1993-01-01
The annotated bibliography contains 26 citations of books, monographs, and articles that may be useful to faculty and students in courses on scientific integrity. Topics addressed include ethical and legal considerations, fraud, technical writing and publication, intellectual property, notetaking, case study approach, conflict of interest, and…
78 FR 26376 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
...; Bioengineering of Neuroscience, Vision and Low Vision Technologies Study Section. Date: May 30-31, 2013. Time: 8... of Committee: Integrative, Functional and Cognitive Neuroscience Integrated Review Group..., [email protected] . Name of Committee: Center for Scientific Review Special Emphasis Panel; Vision...
75 FR 73114 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-29
... Scientific Review Special Emphasis Panel, Topics in Microbiology. Date: December 28-29, 2010. Time: 8 a.m. to... Committee: Oncology 2--Translational Clinical Integrated Review Group, Developmental Therapeutics Study....gov . Name of Committee: Brain Disorders and Clinical Neuroscience Integrated Review Group, Acute...
Reactor Pressure Vessel Integrity Assessments with the Grizzly Aging Simulation Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Hoffman, William
Grizzly is a simulation tool being developed at Idaho National Laboratory (INL) as part of the US Department of Energy’s Light Water Reactor Sustainability program to provide improved safety assessments of systems, components, and structures in nuclear power plants subjected to age-related degradation. Its goal is to provide an improved scientific basis for decisions surrounding license renewal, which would permit operation of commercial nuclear power plants beyond 60 years. Grizzly is based on INL’s MOOSE framework, which enables multiphysics simulations in a parallel computing environment. It will address a wide variety of aging issues in nuclear power plant systems, components, and structures, modeling both the aging processes and the ability of age-degraded components to perform safely. The reactor pressure vessel (RPV) was chosen as the initial application for Grizzly. Grizzly solves tightly coupled equations of heat conduction and solid mechanics to simulate the global response of the RPV to accident conditions, and uses submodels to represent regions with pre-existing flaws. Domain integrals are used to calculate stress intensity factors on those flaws. A physically based empirical model is used to evaluate material embrittlement, and is used to evaluate whether crack growth would occur. Grizzly can represent the RPV in 2D or 3D, allowing it to evaluate effects that require higher dimensionality models to capture. Work is underway to use lower length scale models of material evolution to inform engineering models of embrittlement. This paper demonstrates an application of Grizzly to RPV failure assessment, and summarizes ongoing work.
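As a hedged illustration of the kind of flaw screening Grizzly performs with domain integrals and embrittlement models, the sketch below uses the textbook edge-crack approximation K_I ≈ 1.12 σ sqrt(π a) and an assumed toughness reduction; all numbers are placeholders, not Grizzly inputs or actual RPV data.

```python
# Hedged sketch: a handbook-style screening estimate of the stress intensity
# factor for a shallow surface flaw under a through-wall thermal stress.
# The 1.12 edge-crack geometry factor and all numbers are illustrative; this
# is not Grizzly's domain-integral calculation and uses no real RPV data.
import math

sigma_mpa = 150.0         # assumed peak tensile stress near the inner wall (MPa)
crack_depth_m = 0.01      # assumed flaw depth (10 mm)
geometry_factor = 1.12    # classic edge-crack correction

k_i = geometry_factor * sigma_mpa * math.sqrt(math.pi * crack_depth_m)  # MPa*sqrt(m)

k_ic = 100.0              # assumed unirradiated fracture toughness, MPa*sqrt(m)
embrittlement_loss = 0.4  # assumed fractional toughness loss from irradiation
k_ic_aged = k_ic * (1.0 - embrittlement_loss)

print(f"K_I        ~ {k_i:.1f} MPa*sqrt(m)")
print(f"K_Ic (aged) ~ {k_ic_aged:.1f} MPa*sqrt(m)")
print("crack growth screened as", "possible" if k_i >= k_ic_aged else "not indicated")
```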
XML-Based Generator of C++ Code for Integration With GUIs
NASA Technical Reports Server (NTRS)
Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard
2003-01-01
An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increases in susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that it stores input data in a structured manner. More importantly, XML allows not just the storage of data but also a description of what each data item is. The XML file thus contains information useful for rendering the data in other applications. The program then generates data structures in the C++ language to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
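A minimal sketch of the XML-to-C++ idea follows: parse an XML input specification and emit a matching C++ struct. The element and attribute names are a hypothetical schema, not the schema used by the actual XML-to-C tool.

```python
# Minimal sketch of generating a C++ data structure from an XML input spec.
# The XML layout (<parameter name=... type=... value=...>) is a hypothetical
# schema, not the schema used by the XML-to-C program described above.
import xml.etree.ElementTree as ET

spec = """
<input name="SimulationInput">
  <parameter name="mesh_size"   type="int"         value="128"/>
  <parameter name="time_step"   type="double"      value="0.001"/>
  <parameter name="output_file" type="std::string" value="&quot;run1.dat&quot;"/>
</input>
"""

root = ET.fromstring(spec)
lines = [f"struct {root.get('name')} {{"]
for p in root.findall("parameter"):
    lines.append(f"    {p.get('type')} {p.get('name')} = {p.get('value')};")
lines.append("};")

print("\n".join(lines))   # C++ source usable by both the simulator and a GUI
```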
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
Integration of Information and Scientific Literacy: Promoting Literacy in Undergraduates
ERIC Educational Resources Information Center
Porter, Jason A.; Wolbach, Kevin C.; Purzycki, Catherine B.; Bowman, Leslie A.; Agbada, Eva; Mostrom, Alison M.
2010-01-01
The Association of College and Research Libraries recommends incorporating information literacy (IL) skills across university and college curricula, for the goal of developing information literate graduates. Congruent with this goal, the Departments of Biological Sciences and Information Science developed an integrated IL and scientific literacy…
Case Studies in Describing Scientific Research Efforts as Linked Data
NASA Astrophysics Data System (ADS)
Gandara, A.; Villanueva-Rosales, N.; Gates, A.
2013-12-01
The Web is growing with numerous scientific resources, prompting increased efforts in information management to consider integration and exchange of scientific resources. Scientists have many options to share scientific resources on the Web; however, existing options provide limited support to scientists in annotating and relating research resources resulting from a scientific research effort. Moreover, there is no systematic approach to documenting scientific research and sharing it on the Web. This research proposes the Collect-Annotate-Refine-Publish (CARP) Methodology as an approach for guiding documentation of scientific research on the Semantic Web as scientific collections. Scientific collections are structured descriptions about scientific research that make scientific results accessible based on context. In addition, scientific collections enhance the Linked Data data space and can be queried by machines. Three case studies were conducted on research efforts at the Cyber-ShARE Research Center of Excellence in order to assess the effectiveness of the methodology to create scientific collections. The case studies exposed the challenges and benefits of leveraging the Semantic Web and Linked Data data space to facilitate access, integration and processing of Web-accessible scientific resources and research documentation. As such, we present the case study findings and lessons learned in documenting scientific research using CARP.
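To illustrate the flavor of the resulting scientific collections, the sketch below emits a few N-Triples linking a fictitious research effort to its dataset and paper; the URIs, predicates, and titles are invented for illustration and are not the vocabulary defined by the CARP methodology.

```python
# Hedged sketch: emitting a few N-Triples that link a research effort to its
# dataset and paper. The namespace, resources, and literal values are invented;
# they are not the vocabulary or data produced by the CARP methodology.

def triple(subject: str, predicate: str, obj: str, literal: bool = False) -> str:
    """Format one N-Triples statement."""
    o = f'"{obj}"' if literal else f"<{obj}>"
    return f"<{subject}> <{predicate}> {o} ."

BASE = "http://example.org/cybershare/"   # hypothetical namespace
DCT = "http://purl.org/dc/terms/"         # Dublin Core terms

triples = [
    triple(BASE + "effort/42", DCT + "title", "Playa-lake recharge study", literal=True),
    triple(BASE + "effort/42", DCT + "hasPart", BASE + "dataset/recharge-2012"),
    triple(BASE + "effort/42", DCT + "hasPart", BASE + "paper/agu-2013"),
    triple(BASE + "dataset/recharge-2012", DCT + "format", "text/csv", literal=True),
]

print("\n".join(triples))   # loadable by any Linked Data / triple-store tool
```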
NASA Astrophysics Data System (ADS)
Mendoza, A. M.; Bakshi, S.; Berrios, D.; Chulaki, A.; Evans, R. M.; Kuznetsova, M. M.; Lee, H.; MacNeice, P. J.; Maddox, M. M.; Mays, M. L.; Mullinix, R. E.; Ngwira, C. M.; Patel, K.; Pulkkinen, A.; Rastaetter, L.; Shim, J.; Taktakishvili, A.; Zheng, Y.
2012-12-01
The Community Coordinated Modeling Center (CCMC) was established to enhance basic solar-terrestrial research and to aid in the development of models for specifying and forecasting conditions in the space environment. In achieving this goal, CCMC has developed and provides a set of innovative tools, ranging from the Integrated Space Weather Analysis (iSWA) web-based dissemination system for space weather information, to the Runs-On-Request System providing access to a unique collection of state-of-the-art solar and space physics models (unmatched anywhere in the world), Advanced Online Visualization and Analysis tools for more accurate interpretation of model results, standard data formats for simulation data downloads, and, most recently, mobile apps (iPhone/Android) that let the scientific community view space weather data anywhere. The number of runs requested and the number of resulting scientific publications and presentations from the research community have not only been an indication of the broad scientific usage of the CCMC and effective participation by space scientists and researchers, but also guarantee active collaboration and coordination among the space weather research community. Arising from the course of CCMC activities, CCMC also supports community-wide model validation challenges and research focus group projects for a broad range of programs, such as the multi-agency National Space Weather Program and NSF's CEDAR (Coupling, Energetics and Dynamics of Atmospheric Regions), GEM (Geospace Environment Modeling), and SHINE (Solar Heliospheric and INterplanetary Environment) programs. In addition to performing research and model development, CCMC also supports space science education by hosting summer students through local universities; through the provision of simulations in support of classroom programs such as the Heliophysics Summer School (with a student research contest) and CCMC Workshops; by training the next generation of junior scientists in space weather forecasting; and by educating the general public about the importance and impacts of space weather effects. Although CCMC is organizationally composed of United States federal agencies, CCMC services are open to members of the international science community, and CCMC encourages interagency and international collaboration. In this poster, we provide an overview of using Community Coordinated Modeling Center (CCMC) tools and services to support worldwide space weather scientific communities and networks.
Schaffranek, Raymond W.
2004-01-01
A numerical model for simulation of surface-water integrated flow and transport in two (horizontal-space) dimensions is documented. The model solves vertically integrated forms of the equations of mass and momentum conservation and solute transport equations for heat, salt, and constituent fluxes. An equation of state for salt balance directly couples solution of the hydrodynamic and transport equations to account for the horizontal density gradient effects of salt concentrations on flow. The model can be used to simulate the hydrodynamics, transport, and water quality of well-mixed bodies of water, such as estuaries, coastal seas, harbors, lakes, rivers, and inland waterways. The finite-difference model can be applied to geographical areas bounded by any combination of closed land or open water boundaries. The simulation program accounts for sources of internal discharges (such as tributary rivers or hydraulic outfalls), tidal flats, islands, dams, and movable flow barriers or sluices. Water-quality computations can treat reactive and (or) conservative constituents simultaneously. Input requirements include bathymetric and topographic data defining land-surface elevations, time-varying water level or flow conditions at open boundaries, and hydraulic coefficients. Optional input includes the geometry of hydraulic barriers and constituent concentrations at open boundaries. Time-dependent water level, flow, and constituent-concentration data are required for model calibration and verification. Model output consists of printed reports and digital files of numerical results in forms suitable for postprocessing by graphical software programs and (or) scientific visualization packages. The model is compatible with most mainframe, workstation, mini- and micro-computer operating systems and FORTRAN compilers. This report defines the mathematical formulation and computational features of the model, explains the solution technique and related model constraints, describes the model framework, documents the type and format of inputs required, and identifies the type and format of output available.
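The following toy sketch conveys the general idea of explicit finite-difference transport, reduced to one dimension with placeholder coefficients; it is a deliberately simplified stand-in for the report's vertically integrated two-dimensional hydrodynamic and transport scheme.

```python
# Toy illustration of explicit finite-difference solute transport in 1D:
# upwind advection plus diffusion with fixed-concentration boundaries. This is
# a deliberately simplified stand-in for the report's vertically integrated
# 2-D hydrodynamic/transport scheme; all coefficients are placeholders.
import numpy as np

nx, dx, dt = 50, 100.0, 10.0       # cells, cell size (m), time step (s)
u, d = 0.2, 5.0                    # flow velocity (m/s), dispersion (m^2/s)

c = np.zeros(nx)
c[0] = 1.0                         # constant-concentration open boundary

for _ in range(500):
    adv = -u * (c[1:-1] - c[:-2]) / dx                    # upwind advection
    dif = d * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2      # diffusion/dispersion
    c[1:-1] += dt * (adv + dif)
    c[-1] = c[-2]                  # simple outflow boundary

print("downstream concentrations:", np.round(c[::10], 3))
```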
Science Classroom Inquiry (SCI) Simulations: A Novel Method to Scaffold Science Learning
Peffer, Melanie E.; Beckler, Matthew L.; Schunn, Christian; Renken, Maggie; Revak, Amanda
2015-01-01
Science education is progressively more focused on employing inquiry-based learning methods in the classroom and increasing scientific literacy among students. However, due to time and resource constraints, many classroom science activities and laboratory experiments focus on simple inquiry, with a step-by-step approach to reach predetermined outcomes. The science classroom inquiry (SCI) simulations were designed to give students real life, authentic science experiences within the confines of a typical classroom. The SCI simulations allow students to engage with a science problem in a meaningful, inquiry-based manner. Three discrete SCI simulations were created as website applications for use with middle school and high school students. For each simulation, students were tasked with solving a scientific problem through investigation and hypothesis testing. After completion of the simulation, 67% of students reported a change in how they perceived authentic science practices, specifically related to the complex and dynamic nature of scientific research and how scientists approach problems. Moreover, 80% of the students who did not report a change in how they viewed the practice of science indicated that the simulation confirmed or strengthened their prior understanding. Additionally, we found a statistically significant positive correlation between students’ self-reported changes in understanding of authentic science practices and the degree to which each simulation benefitted learning. Since SCI simulations were effective in promoting both student learning and student understanding of authentic science practices with both middle and high school students, we propose that SCI simulations are a valuable and versatile technology that can be used to educate and inspire a wide range of science students on the real-world complexities inherent in scientific study. PMID:25786245
NASA Astrophysics Data System (ADS)
Abdul Ghani, B.
2005-09-01
"TEA CO 2 Laser Simulator" has been designed to simulate the dynamic emission processes of the TEA CO 2 laser based on the six-temperature model. The program predicts the behavior of the laser output pulse (power, energy, pulse duration, delay time, FWHM, etc.) depending on the physical and geometrical input parameters (pressure ratio of gas mixture, reflecting area of the output mirror, media length, losses, filling and decay factors, etc.). Program summaryTitle of program: TEA_CO2 Catalogue identifier: ADVW Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADVW Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: P.IV DELL PC Setup: Atomic Energy Commission of Syria, Scientific Services Department, Mathematics and Informatics Division Operating system: MS-Windows 9x, 2000, XP Programming language: Delphi 6.0 No. of lines in distributed program, including test data, etc.: 47 315 No. of bytes in distributed program, including test data, etc.:7 681 109 Distribution format:tar.gz Classification: 15 Laser Physics Nature of the physical problem: "TEA CO 2 Laser Simulator" is a program that predicts the behavior of the laser output pulse by studying the effect of the physical and geometrical input parameters on the characteristics of the output laser pulse. The laser active medium consists of a CO 2-N 2-He gas mixture. Method of solution: Six-temperature model, for the dynamics emission of TEA CO 2 laser, has been adapted in order to predict the parameters of laser output pulses. A simulation of the laser electrical pumping was carried out using two approaches; empirical function equation (8) and differential equation (9). Typical running time: The program's running time mainly depends on both integration interval and step; for a 4 μs period of time and 0.001 μs integration step (defaults values used in the program), the running time will be about 4 seconds. Restrictions on the complexity: Using a very small integration step might leads to stop the program run due to the huge number of calculating points and to a small paging file size of the MS-Windows virtual memory. In such case, it is recommended to enlarge the paging file size to the appropriate size, or to use a bigger value of integration step.
Simulation-Based e-Learning Tools for Science, Engineering, and Technology Education (SimBeLT)
NASA Astrophysics Data System (ADS)
Davis, Doyle V.; Cherner, Y.
2006-12-01
The focus of Project SimBeLT is the research, development, testing, and dissemination of a new type of simulation-based integrated e-learning set of modules for two-year college technical and engineering curricula in the areas of thermodynamics, fluid physics, and fiber optics that can also be used in secondary schools and four-year colleges. A collection of sophisticated virtual labs is the core component of the SimBeLT modules. These labs will be designed to enhance the understanding of technical concepts and underlying fundamental principles of these topics, as well as to master certain performance based skills online. SimBeLT software will help educators to meet the National Science Education Standard that "learning science and technology is something that students do, not something that is done to them". A major component of Project SimBeLT is the development of multi-layered technology-oriented virtual labs that realistically mimic workplace-like environments. Dynamic data exchange between simulations will be implemented and links with instant instructional messages and data handling tools will be realized. A second important goal of Project SimBeLT labs is to bridge technical skills and scientific knowledge by enhancing the teaching and learning of specific scientific or engineering subjects. SimBeLT builds upon research and outcomes of interactive teaching strategies and tools developed through prior NSF funding (http://webphysics.nhctc.edu/compact/index.html) (Project SimBeLT is partially supported by a grant from the National Science Foundation DUE-0603277)
Tappenden, Kelly A
2015-09-01
In 2014, recognizing the need to have a single document to guide scientific decision making at the Academy of Nutrition and Dietetics (Academy), the Council on Research was charged with developing a scientific integrity policy for the organization. From the Council on Research, four members volunteered to lead this workgroup, which reviewed the literature and best practices for scientific integrity from well-respected organizations, including federal funders of research. It became clear that the scope of this document would be quite broad, given the many scientific activities the Academy is involved in, and that it would be unreasonable to set policy for each of these many situations. Therefore, the workgroup set about defining the scope of scientific activities to be covered and envisioned a set of guiding principles, to which policies from every organizational unit of the Academy could be compared to ensure they were in alignment. While many relevant policies exist already, such as the requirement of a signed conflict of interest disclosure for Food & Nutrition Conference & Expo speakers, the Evidence Analysis Library funding policy, and the Academy's sponsorship policy, the scientific integrity principles are unique in that they provide a unifying vision to which future policies can be compared and approved based on their alignment with the principles. The six principles outlined in this article were approved by the full Council on Research in January 2015 and approved by the Academy's Board of Directors in March 2015. This article covers the scope of the principles, presents the principles and existing related resources, and outlines next steps for the Academy to review and revise current policies and create new ones in alignment with these principles. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Modern Scientific Visualization is more than Just Pretty Pictures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E Wes; Rubel, Oliver; Wu, Kesheng
2008-12-05
While the primary product of scientific visualization is images and movies, its primary objective is really scientific insight. Too often, the focus of visualization research is on the product, not the mission. This paper presents two case studies, both of which appeared in previous publications, that focus on using visualization technology to produce insight. The first applies "Query-Driven Visualization" concepts to laser wakefield simulation data to help identify and analyze the process of beam formation. The second uses topological analysis to provide a quantitative basis for (i) understanding the mixing process in hydrodynamic simulations, and (ii) performing comparative analysis of data from two different types of simulations that model hydrodynamic instability.
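A minimal sketch of the query-driven idea follows: evaluate a compound range query first and analyze only the qualifying records; the field names and thresholds are invented, not the beam-selection criteria used in the paper.

```python
# Hedged sketch of "query-driven" data reduction: evaluate a compound range
# query first, then analyze only the qualifying particles. Field names and
# thresholds are invented; they are not the paper's beam-selection criteria.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
particles = {
    "px": rng.normal(0.0, 5.0, n),      # longitudinal momentum (arbitrary units)
    "x":  rng.uniform(0.0, 100.0, n),   # position along the simulation box
}

# Compound query: energetic particles within a spatial window.
mask = (particles["px"] > 12.0) & (particles["x"] > 40.0) & (particles["x"] < 60.0)

selected = particles["px"][mask]
print(f"{mask.sum()} of {n} particles satisfy the query "
      f"({100 * mask.mean():.3f}%)")
hist, edges = np.histogram(selected, bins=10)
print("momentum histogram of the selected subset:", hist)
```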
Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data
NASA Astrophysics Data System (ADS)
Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.
2016-12-01
Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
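The sketch below illustrates the co-location idea with a deliberately simplified integer space-time key, a stand-in for indexing schemes such as the hierarchical triangular mesh; the bin sizes and records are invented.

```python
# Hedged sketch of co-locating two diverse datasets on a shared coarse
# spatio-temporal key, a much simplified stand-in for indexing schemes such
# as the hierarchical triangular mesh. Resolutions and records are invented.
from collections import defaultdict

def key(lat, lon, t_hours, dlat=1.0, dlon=1.0, dt=1.0):
    """Integer (lat-bin, lon-bin, time-bin) key used for affinity and joins."""
    return (int(lat // dlat), int(lon // dlon), int(t_hours // dt))

satellite = [  # (lat, lon, hour, observed brightness temperature)
    (34.2, -101.7, 3.2, 212.0),
    (34.6, -101.1, 3.7, 198.5),
]
model = [      # (lat, lon, hour, simulated cloud-top temperature)
    (34.4, -101.5, 3.5, 205.0),
    (40.0,  -88.0, 3.5, 230.0),
]

index = defaultdict(list)
for lat, lon, hr, value in model:
    index[key(lat, lon, hr)].append(value)

# Conditional, co-located comparison: only where both datasets share a bin.
for lat, lon, hr, obs in satellite:
    for sim in index.get(key(lat, lon, hr), []):
        print(f"bin {key(lat, lon, hr)}: observed {obs} K vs simulated {sim} K")
```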
NASA Astrophysics Data System (ADS)
Benedict-Chambers, Amanda; Kademian, Sylvie M.; Davis, Elizabeth A.; Palincsar, Annemarie Sullivan
2017-10-01
Science education reforms articulate a vision of ambitious science teaching where teachers engage students in sensemaking discussions and emphasise the integration of scientific practices with science content. Learning to teach in this way is complex, and there are few examples of sensemaking discussions in schools where textbook lessons and teacher-directed discussions are the norm. The purpose of this study was to characterise the questioning practices of an experienced teacher who taught a curricular unit enhanced with educative features that emphasised students' engagement in scientific practices integrated with science content. Analyses indicated the teacher asked four types of questions: explication questions, explanation questions, science concept questions, and scientific practice questions, and she used three questioning patterns including: (1) focusing students on scientific practices, which involved a sequence of questions to turn students back to the scientific practice; (2) supporting students in naming observed phenomena, which involved a sequence of questions to help students use scientific language; and (3) guiding students in sensemaking, which involved a sequence of questions to help students learn about scientific practices, describe evidence, and develop explanations. Although many of the discussions in this study were not yet student-centred, they provide an image of a teacher asking specific questions that move students towards reform-oriented instruction. Implications for classroom practice are discussed and recommendations for future research are provided.
Flux-driven turbulence GDB simulations of the IWL Alcator C-Mod L-mode edge compared with experiment
NASA Astrophysics Data System (ADS)
Francisquez, Manaure; Zhu, Ben; Rogers, Barrett
2017-10-01
Prior to predicting confinement regime transitions in tokamaks, one may need an accurate description of L-mode profiles and turbulence properties. These features determine the heat-flux width upon which wall integrity depends, a topic of major interest for research in support of ITER. To this end our work uses the GDB model to simulate the Alcator C-Mod edge and contributes support for its use in studying critical edge phenomena in current and future tokamaks. We carried out 3D electromagnetic flux-driven two-fluid turbulence simulations of inner wall limited (IWL) C-Mod shots spanning closed and open flux surfaces. These simulations are compared with gas puff imaging (GPI) and mirror Langmuir probe (MLP) data, examining global features and statistical properties of turbulent dynamics. GDB reproduces important qualitative aspects of the C-Mod edge regarding global density and temperature profiles, within reasonable margins, and though the statistics of the simulated turbulence follow similar quantitative trends, questions remain about the code's difficulty in exactly predicting quantities like the autocorrelation time. A proposed breakpoint in the near-SOL pressure, and the posited separation between drift and ballooning dynamics that it represents, are also examined. This work was supported by DOE-SC-0010508. This research used resources of the National Energy Research Scientific Computing Center (NERSC).
An adaptable XML based approach for scientific data management and integration
NASA Astrophysics Data System (ADS)
Wang, Fusheng; Thiel, Florian; Furrer, Daniel; Vergara-Niedermayr, Cristobal; Qin, Chen; Hackenberg, Georg; Bourgue, Pierre-Emmanuel; Kaltschmidt, David; Wang, Mo
2008-03-01
The increased complexity of scientific research poses new challenges for scientific data management. Meanwhile, scientific collaboration, which relies on integrating and sharing data from distributed institutions, is becoming increasingly important. We developed SciPort, a Web-based platform for scientific data management and integration built on a central-server-based distributed architecture, in which researchers can easily collect, publish, and share their complex scientific data across multiple institutions. SciPort provides an XML-based general approach to modeling complex scientific data by representing them as XML documents. The documents capture not only hierarchically structured data, but also images and raw data through references. In addition, SciPort provides an XML-based hierarchical organization of the overall data space to make quick browsing convenient. To provide generality, schemas and hierarchies are customizable with XML-based definitions, so it is possible to quickly adapt the system to different applications. While each institution can manage documents on a Local SciPort Server independently, selected documents can be published to a Central Server to form a global view of shared data across all sites. By storing documents in a native XML database, SciPort provides high schema extensibility and supports comprehensive queries through XQuery. By providing a unified and effective means for data modeling, data access, and customization with XML, SciPort offers a flexible and powerful platform for sharing scientific data among research communities, and it has been successfully used in both biomedical research and clinical trials.
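A minimal sketch of the document-oriented approach follows: structured fields plus references to images and raw data, queried in memory. The element names are illustrative only and are not SciPort's actual schema; SciPort itself stores such documents in a native XML database and queries them with XQuery.

```python
# Hedged sketch of the XML document approach: structured fields plus
# references to external images/raw data, followed by a simple query.
# Element names are illustrative (not SciPort's schema); a real deployment
# would store the documents in a native XML database and use XQuery.
import xml.etree.ElementTree as ET

doc = ET.Element("experiment", id="exp-001", site="local")
ET.SubElement(doc, "title").text = "Tumor volume pilot study"      # invented example
ET.SubElement(doc, "measurement", name="volume", unit="mm3").text = "412"
ET.SubElement(doc, "imageRef", href="images/scan_0412.dcm")        # reference only
ET.SubElement(doc, "rawDataRef", href="raw/readings_0412.csv")     # reference only

print(ET.tostring(doc, encoding="unicode"))

# The kind of query a Central Server might run over many published documents:
published = [doc]
large = [d.get("id") for d in published
         if float(d.find("measurement[@name='volume']").text) > 400]
print("experiments with volume > 400 mm3:", large)
```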
2011-09-30
and easy to apply in large-scale physical-biogeochemical simulations. We also collaborate with Dr. Curt Mobley at Sequoia Scientific for the second...we are collaborating with Dr. Curtis Mobley of Sequoia Scientific on improving the link between the radiative transfer model (EcoLight) within the
Computer Series, 52: Scientific Exploration with a Microcomputer: Simulations for Nonscientists.
ERIC Educational Resources Information Center
Whisnant, David M.
1984-01-01
Describes two simulations, written for Apple II microcomputers, focusing on scientific methodology. The first is based on the tendency of colloidal iron in high concentrations to stick to fish gills and cause breathing difficulties. The second, modeled after the dioxin controversy, examines a hypothetical chemical thought to cause cancer. (JN)
A Perspective on Coupled Multiscale Simulation and Validation in Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. P. Short; D. Gaston; C. R. Stanek
2014-01-01
The field of nuclear materials encompasses numerous opportunities to address and ultimately solve longstanding industrial problems by improving the fundamental understanding of materials through the integration of experiments with multiscale modeling and high-performance simulation. A particularly noteworthy example is an ongoing study of axial power distortions in a nuclear reactor induced by corrosion deposits, known as CRUD (Chalk River unidentified deposits). We describe how progress is being made toward achieving scientific advances and technological solutions on two fronts. Specifically, the study of thermal conductivity of CRUD phases has augmented missing data as well as revealed new mechanisms. Additionally, the development of a multiscale simulation framework shows potential for the validation of a new capability to predict the power distribution of a reactor, in effect direct evidence of technological impact. The material- and system-level challenges identified in the study of CRUD are similar to other well-known vexing problems in nuclear materials, such as irradiation accelerated corrosion, stress corrosion cracking, and void swelling; they all involve connecting materials science fundamentals at the atomistic- and mesoscales to technology challenges at the macroscale.
Development of Hydro-Informatic Modelling System and its Application
NASA Astrophysics Data System (ADS)
Wang, Z.; Liu, C.; Zheng, H.; Zhang, L.; Wu, X.
2009-12-01
The understanding of the hydrological cycle is at the core of hydrology and is the scientific basis of water resources management. Meanwhile, simulation of the hydrological cycle has long been regarded as an important tool for the assessment, utilization, and protection of water resources. In this paper, a new tool named the Hydro-Informatic Modelling System (HIMS) is developed and introduced with case studies in the Yellow River Basin in China and 331 catchments in Australia. The case studies show that HIMS can be employed as an integrated platform for hydrological simulation in different regions. HIMS is a modular framework of hydrological models designed for different uses, such as flood forecasting, water resources planning, and evaluating the hydrological impacts of climate change and human activities. What makes HIMS unique is its flexibility in providing alternative modules for simulating the hydrological cycle, which successfully overcomes difficulties with the availability of input data, the uncertainty of parameters, and differences in rainfall-runoff processes. The modular structure of HIMS also makes it possible for users to develop new hydrological models.
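The "alternative modules" idea can be sketched with two interchangeable toy runoff formulations behind one interface; both functions are invented stand-ins, not actual HIMS process modules.

```python
# Hedged sketch of interchangeable runoff modules behind one interface,
# selected per application. Both formulations are toy stand-ins, not HIMS code.

def runoff_coefficient(precip_mm, coeff=0.35):
    """Simplest option: a fixed fraction of rainfall becomes runoff."""
    return coeff * precip_mm

def runoff_bucket(precip_mm, storage_mm, capacity_mm=50.0):
    """Saturation-excess option: runoff only once a soil bucket overflows."""
    storage_mm += precip_mm
    excess = max(0.0, storage_mm - capacity_mm)
    return excess, min(storage_mm, capacity_mm)

rain = [5.0, 20.0, 40.0, 0.0, 15.0]          # daily precipitation (mm)

# Option A: coefficient module (stateless)
print("coefficient module:", [round(runoff_coefficient(p), 1) for p in rain])

# Option B: bucket module (carries soil-moisture state between days)
storage, series = 30.0, []
for p in rain:
    q, storage = runoff_bucket(p, storage)
    series.append(round(q, 1))
print("bucket module:     ", series)
```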
A Virtual Aluminum Reduction Cell
NASA Astrophysics Data System (ADS)
Zhang, Hongliang; Zhou, Chenn Q.; Wu, Bing; Li, Jie
2013-11-01
The most important component in the aluminum industry is the aluminum reduction cell; it has attracted considerable interest and resources for research to improve its productivity and energy efficiency. The current study focused on the integration of numerical simulation data and virtual reality technology to create a scientifically and practically realistic virtual aluminum reduction cell by presenting complex cell structures and physical-chemical phenomena. The multiphysical field simulation models were first built and solved in ANSYS software (ANSYS Inc., Canonsburg, PA, USA). Then, the methodology of combining the simulation results with virtual reality was introduced, and a virtual aluminum reduction cell was created. The demonstration showed that a computer-based world could be created in which people who are not analysis experts can see the detailed cell structure in a context that they can understand easily. With the application of the virtual aluminum reduction cell, even people who are familiar with aluminum reduction cell operations can gain insights that make it possible to understand the root causes of observed problems and plan design changes in much less time.
NASA Astrophysics Data System (ADS)
Barrett, K.
2017-12-01
Scientific integrity is the hallmark of any assessment and is a paramount consideration in the Intergovernmental Panel on Climate Change (IPCC) assessment process. Procedures are in place for rigorous scientific review and to quantify confidence levels and uncertainty in the communication of key findings. However, the IPCC is unique in that its reports are formally accepted by governments through consensus agreement. This presentation will present the unique requirements of the IPCC intergovernmental assessment and discuss the advantages and challenges of its approach.
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
ERIC Educational Resources Information Center
Pallant, Amy; Lee, Hee-Sun
2015-01-01
Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation…
NASA Astrophysics Data System (ADS)
Donà, G.; Faletra, M.
2015-09-01
This paper presents the TT&C performance simulator toolkit developed internally at Thales Alenia Space Italia (TAS-I) to support the design of TT&C subsystems for space exploration and scientific satellites. The simulator has a modular architecture and has been designed using a model-based approach using standard engineering tools such as MATLAB/SIMULINK and mission analysis tools (e.g. STK). The simulator is easily reconfigurable to fit different types of satellites, different mission requirements and different scenarios parameters. This paper provides a brief description of the simulator architecture together with two examples of applications used to demonstrate some of the simulator’s capabilities.
NASA Astrophysics Data System (ADS)
Huismann, Tyler D.
Due to the rapidly expanding role of electric propulsion (EP) devices, it is important to evaluate their integration with other spacecraft systems. Specifically, EP device plumes can play a major role in spacecraft integration, and as such, accurate characterization of plume structure bears on mission success. This dissertation addresses issues related to accurate prediction of plume structure in a particular type of EP device, a Hall thruster. This is done in two ways: first, by coupling current plume simulation models with current models that simulate a Hall thruster's internal plasma behavior; second, by improving plume simulation models and thereby increasing physical fidelity. These methods are assessed by comparing simulated results to experimental measurements. Assessment indicates the two methods improve plume modeling capabilities significantly: using far-field ion current density as a metric, these approaches used in conjunction improve agreement with measurements by a factor of 2.5, as compared to previous methods. Based on comparison to experimental measurements, recent computational work on discharge chamber modeling has been largely successful in predicting properties of internal thruster plasmas. This model can provide detailed information on plasma properties at a variety of locations. Frequently, experimental data are not available at many locations of interest to computational models. In the absence of experimental data, there are limited alternatives for scientifically determining the plasma properties that are necessary as inputs to plume simulations. Therefore, this dissertation focuses on coupling current models that simulate internal thruster plasma behavior with plume simulation models. Further, recent experimental work on atom-ion interactions has provided a better understanding of particle collisions within plasmas. This experimental work is used to update collision models in a current plume simulation code. Previous versions of the code assume an unknown dependence between particles' pre-collision velocities and post-collision scattering angles. This dissertation focuses on updating several of these types of collisions by assuming a curve fit based on the measurements of atom-ion interactions, such that previously unknown angular dependences are well characterized.
75 FR 53325 - Proposed Scientific Integrity Policy of the Department of the Interior
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-31
... September 20, 2010. ADDRESSES: Send comments to: [email protected]ios.doi.gov . FOR FURTHER INFORMATION... scientific products, or on documents compiled and translated from scientific products, to ensure that agency... involving inventorying, monitoring, experimentation, study, research, modeling, and scientific assessment...
[A practical guide for writing an original scientific article].
Rosenberg, Jacob; Burcharth, Jakob; Pommergaard, Hans-Christian
2014-06-30
Writing scientific articles is an integrated part of being a medical doctor at academic institutions, and the demand for publishing scientific work has increased during recent years. The discipline of writing scientific articles can be troublesome and complicated, especially for young inexperienced researchers. This article is a guide to structuring and writing an original scientific article.
ERIC Educational Resources Information Center
Gormally, Cara; Brickman, Peggy; Lutz, Mary
2012-01-01
Life sciences faculty agree that developing scientific literacy is an integral part of undergraduate education and report that they teach these skills. However, few measures of scientific literacy are available to assess students' proficiency in using scientific literacy skills to solve scenarios in and beyond the undergraduate biology classroom.…
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
ERIC Educational Resources Information Center
Zhang, Danhui; Campbell, Todd
2012-01-01
This study examines the effectiveness of the Integrated Experiential Learning Curriculum (IELC) in China. This curriculum was developed to engage Chinese elementary students in science to cultivate a scientifically literate society by focusing science instruction on practical applications of scientific knowledge. Cornerstones of the approach…
USDA-ARS?s Scientific Manuscript database
Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of scientific data between information resources difficu...
ERIC Educational Resources Information Center
Benedict-Chambers, Amanda; Kademian, Sylvie M.; Davis, Elizabeth A.; Palincsar, Annemarie Sullivan
2017-01-01
Science education reforms articulate a vision of ambitious science teaching where teachers engage students in sensemaking discussions and emphasise the integration of scientific practices with science content. Learning to teach in this way is complex, and there are few examples of sensemaking discussions in schools where textbook lessons and…
Neurocardiology: Therapeutic Implications for Cardiovascular Disease
Goldstein, David S.
2016-01-01
SUMMARY The term “neurocardiology” refers to physiologic and pathophysiological interplays of the nervous and cardiovascular systems. This selective review provides an update about cardiovascular therapeutic implications of neurocardiology, with emphasis on disorders involving primary or secondary abnormalities of catecholamine systems. Concepts of scientific integrative medicine help understand these disorders. Scientific integrative medicine is not a treatment method or discipline but a way of thinking that applies systems concepts to acute and chronic disorders of regulation. Some of these concepts include stability by negative feedback regulation, multiple effectors, effector sharing, instability by positive feedback loops, allostasis, and allostatic load. Scientific integrative medicine builds on systems biology but is also distinct in several ways. A large variety of drugs and non-drug treatments are now available or under study for neurocardiologic disorders in which catecholamine systems are hyperfunctional or hypofunctional. The future of therapeutics in neurocardiology is not so much in new curative drugs as in applying scientific integrative medical ideas that take into account concurrent chronic degenerative disorders and interactions of multiple drug and non-drug treatments with each other and with those disorders. PMID:21108771
Phipps, Eric T.; D'Elia, Marta; Edwards, Harold C.; ...
2017-04-18
In this study, quantifying simulation uncertainties is a critical component of rigorous predictive simulation. A key component of this is forward propagation of uncertainties in simulation input data to output quantities of interest. Typical approaches involve repeated sampling of the simulation over the uncertain input data, and can require numerous samples when accurately propagating uncertainties from large numbers of sources. Often simulation processes from sample to sample are similar and much of the data generated from each sample evaluation could be reused. We explore a new method for implementing sampling methods that simultaneously propagates groups of samples together in an embedded fashion, which we call embedded ensemble propagation. We show how this approach takes advantage of properties of modern computer architectures to improve performance by enabling reuse between samples, reducing memory bandwidth requirements, improving memory access patterns, improving opportunities for fine-grained parallelization, and reducing communication costs. We describe a software technique for implementing embedded ensemble propagation based on the use of C++ templates and describe its integration with various scientific computing libraries within Trilinos. We demonstrate improved performance, portability and scalability for the approach applied to the simulation of partial differential equations on a variety of CPU, GPU, and accelerator architectures, including up to 131,072 cores on a Cray XK7 (Titan).
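A rough numpy analogue of embedded ensemble propagation is sketched below: every array carries a trailing ensemble axis so that one sweep advances all samples together instead of re-running the solver sample by sample. The Jacobi-style update is a toy kernel, not the paper's PDE solver, and the real implementation uses C++ templates over an ensemble scalar type within Trilinos.

```python
# Hedged numpy analogue of embedded ensemble propagation: arrays carry a
# trailing ensemble axis, so one sweep advances all samples at once and
# amortizes memory traffic. The explicit diffusion update is a toy kernel,
# not the paper's PDE solver; Trilinos uses C++ templates over ensemble scalars.
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_samples, n_steps = 200, 32, 400

kappa = rng.uniform(0.1, 0.3, n_samples)      # uncertain diffusivity per sample
u = np.zeros((n_cells, n_samples))
u[0, :] = 1.0                                  # fixed boundary value

for _ in range(n_steps):
    # One update advances all 32 samples together (ensemble axis is vectorized).
    u[1:-1, :] += kappa * (u[2:, :] - 2 * u[1:-1, :] + u[:-2, :])

qoi = u[10, :]                                 # quantity of interest per sample
print(f"mean QoI: {qoi.mean():.4f}  std: {qoi.std():.4f}")
```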
Study of physiological responses to acute carbon monoxide exposure with a human patient simulator.
Cesari, Whitney A; Caruso, Dominique M; Zyka, Enela L; Schroff, Stuart T; Evans, Charles H; Hyatt, Jon-Philippe K
2006-12-01
Human patient simulators are widely used to train health professionals and students in a clinical setting, but they also can be used to enhance physiology education in a laboratory setting. Our course incorporates the human patient simulator for experiential learning in which undergraduate university juniors and seniors are instructed to design, conduct, and present (orally and in written form) their project testing physiological adaptation to an extreme environment. This article is a student report on the physiological response to acute carbon monoxide exposure in a simulated healthy adult male and a coal miner and represents how 1) human patient simulators can be used in a nonclinical way for experiential hypothesis testing; 2) students can transition from traditional textbook learning to practical application of their knowledge; and 3) student-initiated group investigation drives critical thought. While the course instructors remain available for consultation throughout the project, the relatively unstructured framework of the assignment drives the students to create an experiment independently, troubleshoot problems, and interpret the results. The only stipulation of the project is that the students must generate an experiment that is physiologically realistic and that requires them to search out and incorporate appropriate data from primary scientific literature. In this context, the human patient simulator is a viable educational tool for teaching integrative physiology in a laboratory environment by bridging textual information with experiential investigation.
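A back-of-the-envelope sketch of the kind of model a student team might compare against the simulator is shown below: a simple first-order approach to an assumed equilibrium carboxyhemoglobin level. It is not the Coburn-Forster-Kucera equation, and every number is an assumed placeholder rather than a measurement.

```python
# Hedged sketch: a first-order approximation of carboxyhemoglobin (COHb)
# build-up during a constant CO exposure -- the sort of back-of-the-envelope
# model a student team might test against the patient simulator. This is NOT
# the Coburn-Forster-Kucera equation; the equilibrium level and time constant
# are assumed placeholder values, not measurements from the course project.
import math

cohb_baseline = 0.8       # % COHb before exposure (assumed non-smoker)
cohb_equilibrium = 35.0   # % COHb the assumed exposure would reach at steady state
tau_min = 180.0           # assumed uptake time constant (minutes)

for t in (0, 30, 60, 120, 240):
    cohb = cohb_equilibrium + (cohb_baseline - cohb_equilibrium) * math.exp(-t / tau_min)
    print(f"t = {t:3d} min  COHb ~ {cohb:4.1f} %")
```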
Teaching Scientific Communication Skills in Science Studies: Does It Make a Difference?
ERIC Educational Resources Information Center
Spektor-Levy, Ornit; Eylon, Bat-Sheva; Scherz, Zahava
2009-01-01
This study explores the impact of "Scientific Communication" (SC) skills instruction on students' performances in scientific literacy assessment tasks. We present a general model for skills instruction, characterized by explicit and spiral instruction, integration into content learning, practice in several scientific topics, and application of…
NASA Astrophysics Data System (ADS)
Zimmerman, Timothy David
2005-11-01
Students and citizens need to apply science to important issues every day. Yet the design of science curricula that foster integration of science and everyday decisions is not well understood. For example, can curricula be designed that help learners apply scientific reasons for choosing only environmentally sustainable seafood for dinner? Learners must develop integrated understandings of scientific principles, prior experiences, and current decisions in order to comprehend how everyday decisions impact environmental resources. In order to investigate how such integrated understandings can be promoted within school science classes, research was conducted with an inquiry-oriented curriculum that utilizes technology and a visit to an informal learning environment (aquarium) to promote the integration of scientific principles (adaptation) with environmental stewardship. This research used a knowledge integration approach to teaching and learning that provided a framework for promoting the application of science to environmental issues. Marine biology, often forsaken in classrooms for terrestrial biology, served as the scientific context for the curriculum. The curriculum design incorporated a three-phase pedagogical strategy and new technology tools to help students integrate knowledge and experiences across the classroom and aquarium learning environments. The research design and assessment protocols included comparisons among and within student populations using two versions of the curriculum: an issue-based version and a principle-based version. These inquiry curricula were tested with sophomore biology students attending a marine-focused academy within a coastal California high school. Pretest-posttest outcomes were compared between and within the curricular treatments. Additionally, comparisons were made between the inquiry groups and seniors in an Advanced Placement biology course who attend the same high school. Results indicate that the inquiry curricula enabled students to integrate and apply knowledge of evolutionary biology to real-world environmental stewardship issues. Over the course of the curriculum, students' ideas became more scientifically normative and tended to focus around concepts of natural selection. Students using the inquiry curricula outperformed the Advanced Placement biology students on several measures, including knowledge of evolutionary biology. These results have implications for designing science curricula that seek to promote the application of science to environmental stewardship and integrate formal and informal learning environments.
The electrical ground support equipment for the ExoMars 2016 DREAMS scientific instrument
NASA Astrophysics Data System (ADS)
Molfese, C.; Schipani, P.; Marty, L.; Esposito, F.; D'Orsi, S.; Mannetta, M.; Debei, S.; Bettanini, C.; Aboudan, A.; Colombatti, G.; Mugnuolo, R.; Marchetti, E.; Pirrotta, S.
2014-08-01
This paper describes the Electrical Ground Support Equipment (EGSE) of the Dust characterization, Risk assessment, and Environment Analyser on the Martian Surface (DREAMS) scientific instrument, an autonomous surface payload package to be accommodated on the Entry, Descent and Landing Module (EDM) of the ExoMars 2016 European Space Agency (ESA) mission. DREAMS will perform several kinds of measurements, such as solar irradiance with different optical detectors in the UVA band (315-400 nm), the NIR band (700-1100 nm), and in "total luminosity" (200-1100 nm). It will also measure environmental parameters such as the intensity of the electric field, temperature, pressure, humidity, and the speed and direction of the wind. The EGSE is built to control the instrument and manage the data acquisition before the integration of DREAMS within the Entry, Descent and Landing Module (EDM), and then, after the integration, to retrieve data from the EDM Central Checkout System (CCS). Finally, it will also support data management during mission operations. The EGSE is based on commercial off-the-shelf components and runs custom software. It provides the power supply and simulates the spacecraft, allowing the exchange of commands and telemetry according to the protocol defined by the spacecraft prime contractor. This paper describes the architecture of the system, as well as its functionalities for testing the DREAMS instrument during all development activities before the ExoMars 2016 launch.
2010-01-01
Background: The ability to write clearly and effectively is of central importance to the scientific enterprise. Encouraged by the success of simulation environments in other biomedical sciences, we developed WriteSim TCExam, an open-source, Web-based, textual simulation environment for teaching effective writing techniques to novice researchers. We shortlisted and modified an existing open-source application, TCExam, to serve as a textual simulation environment. After testing usability internally in our team, we conducted formal field usability studies with novice researchers. These were followed by formal surveys with researchers fitting the roles of administrators and users (novice researchers). Results: The development process was guided by feedback from usability tests within our research team. Online surveys and formal studies, involving members of the Research on Research group and selected novice researchers, show that the application is user-friendly. Additionally, it has been used to train 25 novice researchers in scientific writing to date and has generated encouraging results. Conclusion: WriteSim TCExam is the first Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews by students and educators have been positive, a formal study is needed to measure its benefits in comparison to standard instructional methods. PMID:20509946
77 FR 54920 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
Excerpt: notice of closed meetings of the Integrative, Functional and Cognitive Neuroscience Integrated Review Group, including the Sensorimotor Integration Study Section (date: October 2).
75 FR 52009 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-24
Excerpt: notice of closed meetings of the Integrative, Functional and Cognitive Neuroscience Integrated Review Group, including the Sensorimotor Integration Study Section (date: October 5) and the Cognitive Neuroscience Study Section.
ERIC Educational Resources Information Center
Lawless, Kimberly A.; Brown, Scott W.
2015-01-01
GlobalEd 2 (GE2) is a set of technology-mediated, problem-based learning (PBL) simulations for middle-grade students that capitalises on the multidisciplinary nature of the social sciences as an expanded curricular space for students to learn and apply scientific literacies and concepts, while simultaneously enriching their understanding of…
Computational physics and applied mathematics capability review June 8-10, 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Stephen R
2010-01-01
Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the Laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities are included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections), as follows. Theme 1: Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the Laboratory. Theme 2: Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution of partial differential equations (broadly defined) in a variety of settings, including particle transport, solvers, and plasma physics. Theme 3: Monte Carlo - Monte Carlo was invented at Los Alamos. This theme discusses these vitally important methods and their application in everything from particle transport, to condensed matter theory, to biology. Theme 4: Molecular Dynamics - This theme describes the widespread use of molecular dynamics for a variety of important applications, including nuclear energy, materials science, and biological modeling. Theme 5: Discrete Event Simulation - The technical scope of this theme represents a class of complex system evolutions governed by the action of discrete events. Examples include network, communication, vehicle traffic, and epidemiology modeling. Theme 6: Integrated Codes - This theme discusses integrated applications (comprised of all of the supporting science represented in Themes 1-5) that are of strategic importance to the Laboratory and the nation. The Laboratory has approximately 10 million source lines of code across over 100 such strategically important applications. Of these themes, four will be reviewed during the 2010 review cycle: Themes 1, 2, 3, and 6. Because these reviews occur every three years, Themes 4 and 5 will be reviewed in 2013, along with Theme 6 (which will be reviewed during each review, owing to this theme's role as an integrator of the supporting science represented by the other five themes).
Yearly written status reports will be provided to the CPAM Committee Chair during off-cycle years.
NASA Astrophysics Data System (ADS)
McPhaden, Michael; Leinen, Margaret; McEntee, Christine; Townsend, Randy; Williams, Billy
2016-04-01
The American Geophysical Union, a scientific society of 62,000 members worldwide, has established a set of scientific integrity and professional ethics guidelines for the actions of its members, for the governance of the union in its internal activities, and for the operations and participation in its publications and scientific meetings. This presentation will provide an overview of the Ethics program at AGU, highlighting the reasons for its establishment, the process of dealing with ethical breaches, the number and types of cases considered, how AGU helps educate its members on ethics issues, and the rapidly evolving efforts at AGU to address issues related to the emerging field of GeoEthics. The presentation will also cover the most recent AGU Ethics program focus on the role for AGU and other scientific societies in addressing sexual harassment, and AGU's work to provide additional program strength in this area.
Dealing with scientific integrity issues: the Spanish experience.
Puigdomènech, Pere
2014-02-01
Integrity has been an important matter of concern for the scientific community as it affects the basis of its activities. Most countries with significant scientific activity have dealt with this problem by different means, including drafting specific legal or soft-law regulations and appointing standing or ad hoc committees that take care of these questions. This has also been the case in Spain. After the period of transition from dictatorship to a democratic regime, and particularly after entry into the European Union, scientific activity has increased in the country. As could be expected, problems of misconduct have appeared and different institutions have been dealing with these matters. One of the best examples is that of the Consejo Superior de Investigaciones Cientificas (CSIC), the largest institution devoted to scientific research belonging to the Spanish Government. The experience of the CSIC's Ethics Committee in dealing with conflicts related to scientific practices is discussed here.
Conflict of interest, tailored science, and responsibility of scientific institutions and journals.
Ruff, Kathleen; Mirabelli, Dario
2014-11-01
Recent revelations have raised concerns on how conflicts of interest may involve even leading scientists and prestigious institutions and lead to bias in reporting and assessing scientific evidence. These have highlighted the need for action to safeguard scientific integrity and public health. The Italian Epidemiology Association has declared that the "biased and deliberately tailored use of the scientific evidence" by scientists with a conflict of interest serves to delay needed measures to prevent harm to public health from a polluting Italian steel plant's continuing chemical emissions. In France, unresolved concerns over conflict of interest forced the Centre for Research in Epidemiology and Public Health to cancel its imminent appointment of a prominent scientist as its Director. These negative events demonstrate the necessity for scientific institutions and journals to implement rigorous measures regarding conflict of interest and the safeguarding of scientific integrity and public health.
ERIC Educational Resources Information Center
Bromme, Rainer; Scharrer, Lisa; Stadtler, Marc; Hömberg, Johanna; Torspecken, Ronja
2015-01-01
Scientific texts are a genre in which adherence to specific discourse conventions allows for conclusions on the scientific integrity of the information and thus on its validity. This study examines whether genre-typical features of scientific discourse influence how laypeople handle conflicting science-based knowledge claims. In two experiments…
NASA Astrophysics Data System (ADS)
Mitrofanova, O. V.; Ivlev, O. A.; Urtenov, D. S.
2018-03-01
Hydrodynamics and heat exchange in the elements of the thermal-hydraulic tracts of new-generation ship nuclear reactors were numerically simulated in this work. Parts of the coolant circuit in the collector and piping systems with geometries that may lead to the generation of stable large-scale vortexes, causing a wide range of acoustic oscillations of the coolant, were selected as modeling objects. The purpose of the research is to develop principles of physical and mathematical modeling for the scientific substantiation of optimal layout solutions that ensure an enhanced operational life of new-generation icebreaker nuclear power installations with integral-type reactors.
Applications of Pharmacometrics in the Clinical Development and Pharmacotherapy of Anti-Infectives
Trivedi, Ashit; Lee, Richard E; Meibohm, Bernd
2013-01-01
With the increased emergence of anti-infective resistance in recent years, much focus has been drawn to the development of new anti-infectives and the optimization of treatment regimens and combination therapies for established antimicrobials. In this context, the field of pharmacometrics, using quantitative numerical modeling and simulation techniques, has in recent years emerged as an invaluable tool in the pharmaceutical industry, academia, and regulatory agencies to facilitate the integration of preclinical and clinical development data and to provide a scientifically based framework for rational dosage regimen design and treatment optimization. This review highlights the usefulness of pharmacometric analyses in anti-infective drug development and applied pharmacotherapy with select examples. PMID:23473593
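Illustrative note (not drawn from the review above): pharmacometric analyses are typically built on structural pharmacokinetic models such as the standard one-compartment model with first-order absorption and elimination. The C++ sketch below evaluates that textbook relation; all parameter values are arbitrary examples, not data from the review.

    #include <cmath>
    #include <cstdio>

    // One-compartment model, first-order absorption and elimination:
    // C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
    double concentration(double dose_mg, double F, double ka, double ke,
                         double V_L, double t_h) {
        return F * dose_mg * ka / (V_L * (ka - ke)) *
               (std::exp(-ke * t_h) - std::exp(-ka * t_h));
    }

    int main() {
        const double dose = 500.0;  // mg, single oral dose (example value)
        const double F    = 0.8;    // bioavailability (example value)
        const double ka   = 1.0;    // 1/h, absorption rate constant (example value)
        const double ke   = 0.2;    // 1/h, elimination rate constant (example value)
        const double V    = 30.0;   // L, volume of distribution (example value)

        // Concentration-time profile over the first 12 hours after dosing.
        for (double t = 0.0; t <= 12.0; t += 2.0)
            std::printf("t = %4.1f h  C = %6.3f mg/L\n", t,
                        concentration(dose, F, ka, ke, V, t));
        return 0;
    }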
NAS Technical Summaries, March 1993 - February 1994
NASA Technical Reports Server (NTRS)
1995-01-01
NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefitting other supercomputer centers in government and industry. The 1993-94 operational year concluded with 448 high-speed processor projects and 95 parallel projects representing NASA, the Department of Defense, other government agencies, private industry, and universities. This document provides a glimpse at some of the significant scientific results for the year.
NASA's Hybrid Reality Lab: One Giant Leap for Full Dive
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2017-01-01
This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA's GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR, the integration of consumer VR headsets with the Active Response Gravity Offload System, and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.
[Problems of world outlook and methodology of science integration in biological studies].
Khododova, Iu D
1981-01-01
Problems of world outlook and the methodology of natural-science knowledge are considered, based on an analysis of tendencies in the development of the membrane theory of cell processes and the use of principles of biological membrane functioning to solve scientific and applied problems in different branches of chemistry and biology. The notion of scientific knowledge integration is defined as the interpenetration of approaches, methods, and ideas of different branches of knowledge and the enrichment of their content on this basis, resulting in the augmentation of knowledge in each field taken separately. These processes are accompanied by the appearance of new branches of knowledge - sciences "at the junction" - and their subsequent differentiation. The analysis of some gnoseological situations shows that the integration of sciences contributes to the coordination and partial agreement of the thinking styles of different specialists and places demands on the scientist as an individual, requiring, in particular, high professional mobility. Problems of the organization of scientific activity are considered, which bring the social sciences into the integration processes. The role of philosophy in the integration processes is emphasized.
Development of integrative bioethics in the Mediterranean area of South-East Europe.
Kukoč, Mislav
2012-11-01
With regard to its origin, foundation, and development, bioethics is a relatively new discipline and scientific and theoretical field, in which different and even contradictory definitional models and methodological patterns of its formation and application meet. In some philosophical orientations, bioethics is considered a sub-discipline of applied ethics as a traditional philosophical discipline. Yet in biomedical and other sciences, bioethics is designated a specialist scientific discipline, or a sort of new medical ethics. The concept of integrative bioethics as an interdisciplinary, scholarly, and pluriperspectivistic area goes beyond such one-sided determinations, both philosophical and scientistic, and intends to integrate the philosophical approach to bioethics with its particular scientific contents, as well as different cultural dimensions and perspectives. This concept of integrative bioethics has gradually developed at philosophical and interdisciplinary conferences and institutions on the "bioethical islands" of the Croatian Mediterranean. In this paper, the author follows the formation, development, and prospects of integrative bioethics in the wider region of the Mediterranean and Southeast Europe.
NASA Astrophysics Data System (ADS)
Orgel, Csilla; Kereszturi, Ákos; Váczi, Tamás; Groemer, Gernot; Sattler, Birgit
2014-02-01
Between 15 and 25 April 2011, in the framework of the PolAres programme of the Austrian Space Forum, a five-day field test of the Aouda.X spacesuit simulator was conducted at the Rio Tinto Mars-analogue site in southern Spain. The field crew was supported by a full-scale Mission Control Center (MCC) in Innsbruck, Austria. The field telemetry data were relayed to the MCC, enabling a Remote Science Support (RSS) team to study field data in near-real time and adjust the flight planning in a flexible manner. We report on the experiences in the fields of robotics, geophysics (Ground Penetrating Radar), and geology, as well as life sciences, in a simulated spaceflight operational environment. Extravehicular Activity (EVA) maps had been prepared using Google Earth and aerial images. The Rio Tinto mining area offers an excellent location for Mars analogue simulations. It is recognised as a terrestrial Mars analogue site because of the presence of jarosite and related sulphates, which have been identified by the NASA Mars Exploration Rover "Opportunity" in the El Capitan region of Meridiani Planum on Mars. The acidic, high ferric-sulphate content water of Rio Tinto is also considered a possible analogue in astrobiology for the analysis of ferric-sulphate-related biochemical pathways and the biomarkers they produce. During our Mars simulation, 18 different types of soil and rock samples were collected by the spacesuit tester. The Raman results confirm the presence of the expected minerals, such as jarosite, different Fe oxides and oxy-hydroxides, pyrite, and complex Mg and Ca sulphates. Eight science experiments were conducted in the field. In this contribution we first list the important findings from the management and realisation of the tests, together with a first summary of the scientific results. Based on these experiences, suggestions for future analogue work are also summarised. We finish with recommendations for future field missions, including the preparation of the experiments, communication and data transfer - as an aid to the planning of future simulations.
The INTErnational Gamma Ray Astrophysics Laboratory: INTEGRAL Highlights
NASA Astrophysics Data System (ADS)
Ubertini, Pietro; Bazzano, Angela
2014-04-01
The INTEGRAL Space Observatory was selected as the second medium-size mission (M2) of ESA's Horizon 2000 vision programme. INTEGRAL is the first high angular and spectral resolution hard X-ray and soft γ-ray observatory, with a wide-band spectral response ranging from 3 keV up to 10 MeV. This capability is supplemented by an unprecedented sensitivity, enhanced by the 3-day orbit allowing long and uninterrupted observations over a very wide field of view (up to ~1000 square degrees to zero response) and sub-ms time resolution. Part of the observatory's success is due to its capability to link the high-energy sky with the lower energy band. The complementarity and synergy with pointed soft X-ray missions such as XMM-Newton and Chandra, and more recently with NuSTAR, is a strategic feature to link the "thermal" and "non-thermal" Universe observed at higher energies by space missions such as Fermi and AGILE and by ground-based TeV observatories sensitive to extremely high energies. INTEGRAL was launched on 17 October 2002 from the Baikonur Cosmodrome (Kazakhstan) aboard a Proton rocket, as part of the Russian contribution to the mission, and has successfully spent almost 11 years in orbit. In view of its successful science outcome, the ESA Science Programme Committee has recently approved its scientific operations until the end of 2016. To date, the spacecraft, ground segment, and scientific payload are in an excellent state of health, and INTEGRAL is continuing its scientific operations, originally planned for a 5-year technical design and nominal scientific operation plan. This paper summarizes the current INTEGRAL scientific achievements and future prospects, with particular regard to the high-energy domain.
NASA Astrophysics Data System (ADS)
Yenni, Rita; Hernani, Widodo, Ari
2017-05-01
The study aims to determine the improvement in students' science literacy skills in the content and competency aspects of science through the use of Integrated Science teaching materials based on Socio-scientific Issues (SSI) for an environmental pollution theme. The method used in the study is a quasi-experiment with a nonequivalent pretest and posttest control group design. The students of the experimental class used the SSI-based teaching materials, whereas the students of the control class were still using the usual textbooks. The results of this study showed a significant difference between the N-gain values of the experimental class and the control class, which also occurred for every indicator of the content and competency aspects of science. This result indicates that using SSI-based Integrated Science teaching materials can improve the content and competency aspects of science, and that such materials can be used as an alternative in the teaching of Integrated Science.
Brall, Caroline; Maeckelberghe, Els; Porz, Rouven; Makhoul, Jihad; Schröder-Bäck, Peter
2017-01-01
Research ethics has gained renewed importance due to the changing scientific landscape and increasing demands and competition in the academic field. These changes are further exacerbated by scarce(r) resources in some countries on the one hand and advances in genomics on the other. In this paper, we highlight the current challenges these changes pose to scientific integrity. To mark key developments in research ethics, we distinguish between what we call research ethics 1.0 and research ethics 2.0. Whereas research ethics 1.0 focuses on individual integrity and informed consent, research ethics 2.0 entails social scientific integrity within the broader perspective of a research network. This research network can be regarded as a network of responsibilities in which every stakeholder involved has to jointly meet the ethical challenges posed to research. PMID:28288472
ERIC Educational Resources Information Center
Tekkumru-Kisa, Miray; Stein, Mary Kay; Schunn, Christian
2015-01-01
Many countries, including the United States, emphasize the importance of developing students' scientific habits of mind and their capacity to think deeply about scientific ideas in an integrated fashion. Recent science education policies in the United States portray a related vision of science teaching and learning that is meant to guide the…
ERIC Educational Resources Information Center
Fortino, Carol; Gerretson, Helen; Button, Linda J.; Johnson, Sharon
The professional development program Using Literacy Integration for Communicating Scientifically (ULINCS) is a joint program of the University of Northern Colorado and Adams Twelve Five Star School District. It had been noted that the increased emphasis on literacy skills was leading educators to place less emphasis on science. The ULINCS project…
NASA Technical Reports Server (NTRS)
Frazier, Donald O.
2000-01-01
Technically, the field of integrated optics using organic/polymer materials as a new means of information processing has emerged as vitally important to optical computers, optical switching, optical communications, the defense industry, and other applications. The goal is to replace conventional electronic integrated circuits and wires with equivalent miniaturized optical integrated circuits and fibers, offering larger bandwidths, more compactness and reliability, immunity to electromagnetic interference, and lower cost. From the Code E perspective, this research area represents an opportunity to marry "front-line" education in science and technology with national scientific and technological interests while maximizing the utilization of human resources. This can be achieved by the development of untapped resources for scientific research - such as minorities, women, and universities traditionally uninvolved in scientific research.
Desai, Sapan S; Shortell, Cynthia K
2011-09-01
Conflicts of interest may exist at all levels of the medical publication process. Ensuring the integrity of scientific scholarship involves protecting editorial independence, promoting the use of scientific arbitration boards, promoting transparency throughout all stages of publication, and protecting the relationship between the publisher and its editors through an effective legal framework. It is incumbent upon the publisher, editors, authors, and readers to ensure that the highest standards of scientific scholarship are upheld. Doing so will help reduce fraud and misrepresentation in medical research and increase the trustworthiness of landmark findings in science. Copyright © 2011 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Teaching Scientific Ethics Using the Example of Hendrik Schon
ERIC Educational Resources Information Center
Feldman, Bernard J.
2012-01-01
It has been almost 10 years since one of the greatest frauds in the history of physics was uncovered, namely, the case of Hendrik Schon. This case provides a wonderful opportunity to discuss scientific integrity and scientific misconduct with both undergraduate and graduate science students. This article explains the scientific data at the heart…
The Units Ontology: a tool for integrating units of measurement in science
Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert
2012-01-01
Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization cater for the reporting, exchange, processing, reproducibility, and integration of quantitative measurements. Ontologies are a means of facilitating the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurement. PMID:23060432
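Illustrative note (not the Units Ontology itself or its API): the point that units render meaning to numerical data and make measurements safely combinable can be shown with a few lines of C++ in which a unit label travels with each value and quantities with mismatched units are rejected.

    #include <iostream>
    #include <stdexcept>
    #include <string>

    // A value carries its unit; the unit string could hold a standardized
    // term label or identifier (the labels below are plain-text examples).
    struct Quantity {
        double value;
        std::string unit;
    };

    Quantity add(const Quantity& a, const Quantity& b) {
        if (a.unit != b.unit)
            throw std::runtime_error("unit mismatch: " + a.unit + " vs " + b.unit);
        return {a.value + b.value, a.unit};
    }

    int main() {
        Quantity t1{36.6, "degree Celsius"};
        Quantity t2{0.4, "degree Celsius"};
        Quantity mass{70.0, "kilogram"};

        std::cout << add(t1, t2).value << " " << t1.unit << "\n";  // 37 degree Celsius
        try {
            add(t1, mass);  // rejected: incompatible units
        } catch (const std::exception& e) {
            std::cout << "rejected: " << e.what() << "\n";
        }
        return 0;
    }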
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
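Illustrative note (not code from the HP ontology or its SPARQL service): one of the infiltration methods mentioned above, Green-Ampt, expresses infiltration capacity as f = K(1 + psi*dtheta/F). The C++ sketch below evaluates that relation to show the kind of process equation the ontology is intended to describe; the parameter values are arbitrary examples.

    #include <cstdio>

    // Green-Ampt infiltration capacity: f = K * (1 + psi * dtheta / F),
    // where K is saturated hydraulic conductivity, psi the wetting-front
    // suction head, dtheta the soil moisture deficit, and F the cumulative
    // infiltration depth.
    double green_ampt_rate(double K, double psi, double dtheta, double F) {
        return K * (1.0 + psi * dtheta / F);
    }

    int main() {
        const double K      = 1.09;  // cm/h  (example value)
        const double psi    = 11.0;  // cm    (example value)
        const double dtheta = 0.30;  // -     (example value)

        // Infiltration capacity falls as cumulative infiltration F grows.
        for (double F = 0.5; F <= 5.0; F += 0.5)
            std::printf("F = %4.1f cm   f = %5.2f cm/h\n",
                        F, green_ampt_rate(K, psi, dtheta, F));
        return 0;
    }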
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Hsu, Ying-Shao; Wu, Hsin-Kai
2016-01-01
We investigated the impact of an augmented reality (AR) versus interactive simulation (IS) activity incorporated in a computer learning environment to facilitate students' learning of a socio-scientific issue (SSI) on nuclear power plants and radiation pollution. We employed a quasi-experimental research design. Two classes (a total of 45…
Code of Federal Regulations, 2010 CFR
2010-01-01
Excerpt: advice to the Administration on a wide range of issues, including improvement of public health and protection of the environment; where scientific and technological information is developed and used, there should be transparency in the preparation, identification, and use of scientific and technological findings and conclusions.
Modeling and Simulation in Healthcare Future Directions
2010-07-13
Presentation excerpt: evidence-based medicine is the scientific method as applied to medicine ("the evidence is the science"); accepting evidence-based medicine means accepting the current scientific method, followed by the provocative question of whether the scientific method is dead ("not necessarily").
Collaborative Group Learning Approaches for Teaching Comparative Planetology
NASA Astrophysics Data System (ADS)
Slater, S. J.; Slater, T. F.
2013-12-01
Modern science education reform documents propose that the teaching of contemporary students should focus on doing science, rather than simply memorizing science. Duschl, Schweingruber, and Shouse (2007) eloquently argue for four science proficiencies for students. Students should: (i) know, use, and interpret scientific explanations of the natural world; (ii) generate and evaluate scientific evidence and explanations; (iii) understand the nature and development of scientific knowledge; and (iv) participate productively in scientific practices and discourse. In response, scholars with the CAPER Center for Astronomy & Physics Education Research are creating and field-testing two separate instructional approaches. The first of these is a series of computer-mediated inquiry learning experiences for non-science-majoring undergraduates based upon an inquiry-oriented teaching approach framed by the notion of backwards faded-scaffolding as an overarching theme for instruction. Backwards faded-scaffolding is a strategy in which the conventional and rigidly linear scientific method is turned on its head: students are first taught how to create conclusions based on evidence, then how experimental design creates evidence, and only at the end are they introduced to the most challenging part of inquiry - inventing scientifically appropriate questions. Planetary science databases and virtual environments used by students to conduct scientific investigations include the NASA and JPL Solar System Simulator and Eyes on the Solar System as well as the USGS Moon and Mars Global GIS Viewers. The second approach is known widely as the Lecture-Tutorial approach. Lecture-Tutorials are self-contained, collaborative group activities. The materials are designed specifically to be easily integrated into the lecture course and directly address the needs of busy and heavily loaded teaching faculty for effective, student-centered, classroom-ready materials that do not require a drastic course revision for implementation. Students are asked to reason about difficult concepts, while working in pairs, and to discuss their ideas openly. Extensive evaluation results consistently suggest that both the backwards faded-scaffolding and the Lecture-Tutorials approaches are successful at engaging students in self-directed scientific discourse as measured by the Views on Scientific Inquiry (VOSI) instrument, as well as increasing their knowledge of science as measured by the Test Of Astronomy STandards (TOAST).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Frank, Randy; Fulcomer, Sam
Scientific visualization is the transformation of abstract information into images, and it plays an integral role in the scientific process by facilitating insight into observed or simulated phenomena. Visualization as a discipline spans many research areas, from computer science to cognitive psychology and even art. Yet the most successful visualization applications are created when close synergistic interactions with domain scientists are part of the algorithmic design and implementation process, leading to visual representations with clear scientific meaning. Visualization is used to explore, to debug, to gain understanding, and as an analysis tool. Visualization is literally everywhere--images are present in this report, on television, on the web, in books and magazines--and the common theme is the ability to present information visually so that it is rapidly assimilated by human observers and transformed into understanding or insight. As an indispensable part of a modern science laboratory, visualization is akin to the biologist's microscope or the electrical engineer's oscilloscope. Whereas the microscope is limited to small specimens or the use of optics to focus light, the power of scientific visualization is virtually limitless: visualization provides the means to examine data at galactic or atomic scales, or at any size in between. Unlike the traditional scientific tools for visual inspection, visualization offers the means to "see the unseeable." Trends in demographics or changes in levels of atmospheric CO2 as a function of greenhouse gas emissions are familiar examples of such unseeable phenomena. Over time, visualization techniques evolve in response to scientific need. Each scientific discipline has its "own language," verbal and visual, used for communication. The visual language for depicting electrical circuits is much different from the visual language for depicting theoretical molecules or trends in the stock market. There is no "one visualization tool" that can serve as a panacea for all science disciplines. Instead, visualization researchers work hand in hand with domain scientists as part of the scientific research process to define, create, adapt, and refine software that "speaks the visual language" of each scientific domain.
NASA Astrophysics Data System (ADS)
McPhaden, Michael; Davidson, Eric; McEntee, Christine; Williams, Billy
2017-04-01
The American Geophysical Union (AGU), a scientific society of 62,000 members worldwide, has established a set of scientific integrity and professional ethics guidelines for the actions of its members, for the governance of the union in its internal activities, and for the operations and participation in its publications and scientific meetings. More recently AGU has undertaken strategies and actions to help address the issue of harassment in the sciences and other work climate issues. This presentation will provide an overview of the role of scientific societies in helping to address these important issues, as well as specific strategies and actions underway at AGU and other societies. Progress to date and remaining challenges of this effort will be discussed, including AGU's work to provide additional program strength in this area.
NASA Astrophysics Data System (ADS)
Pujiastuti, E.; Mashuri
2017-04-01
Not all mathematics teachers in Junior High School (JHS) can design and create teaching aids, especially teaching aids designed for use in learning through a scientific approach. The problem: how can integrated and sustainable training be conducted so that JHS mathematics teachers, especially in Semarang, can design and create teaching aids that support a scientific approach? The purpose of this study was to find a form of integrated and continuous training that enables JHS mathematics teachers to design and create such teaching aids. This article is based on research with a qualitative approach. Through trials of the resulting training model, Focus Group Discussions (FGD), interviews, and triangulation, the results of the research were: (1) an integrated and sustainable training model was produced through which JHS mathematics teachers can design and create teaching aids that support a scientific approach; (2) the training combined the provision of material with a workshop; (3) mentoring took place in the classroom; and (4) consultation continued afterward. Our advice: (1) the trainer should be skilled, and (2) the training can be held during the holidays, with the accompanying assistance completed before the holiday season ends.
Integrating Data and Networks: Human Factors
NASA Astrophysics Data System (ADS)
Chen, R. S.
2012-12-01
The development of technical linkages and interoperability between scientific networks is a necessary but not sufficient step towards integrated use and application of networked data and information for scientific and societal benefit. A range of "human factors" must also be addressed to ensure the long-term integration, sustainability, and utility of both the interoperable networks themselves and the scientific data and information to which they provide access. These human factors encompass the behavior of both individual humans and human institutions, and include system governance, a common framework for intellectual property rights and data sharing, consensus on terminology, metadata, and quality control processes, agreement on key system metrics and milestones, the compatibility of "business models" in the short and long term, harmonization of incentives for cooperation, and minimization of disincentives. Experience with several national and international initiatives and research programs such as the International Polar Year, the Group on Earth Observations, the NASA Earth Observing Data and Information System, the U.S. National Spatial Data Infrastructure, the Global Earthquake Model, and the United Nations Spatial Data Infrastructure provide a range of lessons regarding these human factors. Ongoing changes in science, technology, institutions, relationships, and even culture are creating both opportunities and challenges for expanded interoperability of scientific networks and significant improvement in data integration to advance science and the use of scientific data and information to achieve benefits for society as a whole.
NASA Astrophysics Data System (ADS)
Sezen-Barrie, Asli; Moore, Joel; Roig, Cara E.
2015-08-01
Drawn from the norms and rules of their fields, scientists use a variety of practices, such as asking questions and arguing based on evidence, to engage in research that will contribute to our understanding of Earth and beyond. In this study, we explore how preservice teachers learn to teach scientific practices while teaching plate tectonic theory. In particular, our aim is to observe which scientific practices preservice teachers use while teaching an earth science unit, how they integrate these practices into their lessons, and what challenges they face during their first time teaching an earth science content area integrated with scientific practices. The study is designed as a qualitative, exploratory case study of seven preservice teachers while they were learning to teach plate tectonic theory to a group of middle school students. The data were derived from the video records and artifacts of the preservice teachers' learning and teaching processes as well as written reflections on the teaching. Intertextual discourse analysis was used to understand which scientific practices preservice teachers chose to integrate into their teaching experience. Our results showed that preservice teachers chose to focus on four aspects of scientific practices: (1) employing historical understanding of how the theory emerged, (2) encouraging the use of evidence to build up a theory, (3) observation and interpretation of data maps, and (4) collaborative practices in building up the theory. For each of these practices, we also looked at the common challenges faced by preservice teachers by using constant comparative analysis. We observed the practices that preservice teachers decided to use and the challenges they faced, which appeared to be shaped by their personal histories as learners. Therefore, in order to strengthen preservice teachers' backgrounds, college courses should be arranged to teach important scientific ideas through scientific practices. In addition, such practices should also reflect the authentic practices of earth scientists, such as the use of the historical record and the differentiation of observation from interpretation.
DoSSiER: Database of scientific simulation and experimental results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Hans; Yarba, Julia; Genser, Krzystof
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests of simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this paper, we describe the functionality and the current status of the various components of DoSSiER, as well as the technology choices we made.
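Illustrative note (the record structure and field names below are invented, not the actual DoSSiER schema or endpoint): a repository record returned in JSON by such a web service can be consumed programmatically in a few lines, here using the nlohmann/json library for C++.

    #include <iostream>
    #include <string>
    #include <nlohmann/json.hpp>  // https://github.com/nlohmann/json

    int main() {
        // Hypothetical validation record -- NOT the actual DoSSiER schema.
        const std::string payload = R"json({
            "test":       "example validation test",
            "simulation": "example simulation engine",
            "points":     [[1.0, 0.012], [2.0, 0.010], [3.0, 0.008]]
        })json";

        nlohmann::json rec = nlohmann::json::parse(payload);

        std::cout << rec["test"].get<std::string>() << " / "
                  << rec["simulation"].get<std::string>() << "\n";
        for (const auto& p : rec["points"])
            std::cout << "  x = " << p[0].get<double>()
                      << ", y = " << p[1].get<double>() << "\n";
        return 0;
    }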
78 FR 54664 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
Excerpt: notice of closed meetings of the Integrative, Functional and Cognitive Neuroscience Integrated Review Group, including the Sensorimotor Integration Study Section and the Neuroendocrinology, Neuroimmunology, Rhythms and Sleep Study Section.
MicroCameras and Photometers (MCP) on board the TARANIS satellite
NASA Astrophysics Data System (ADS)
Farges, T.; Hébert, P.; Le Mer-Dachard, F.; Ravel, K.; Gaillac, S.
2017-12-01
TARANIS (Tool for the Analysis of Radiations from lightNing and Sprites) is a CNES microsatellite. Its main objective is to study impulsive transfers of energy between the Earth's atmosphere and the space environment. It will be sun-synchronous at an altitude of 700 km. It will be launched in 2019 for at least 2 years. Its payload is composed of several electromagnetic instruments covering different wavelength ranges (from gamma rays to radio waves, including optical). The TARANIS instruments are currently in the calibration and qualification phase. The purpose here is to present the MicroCameras and Photometers (MCP) design, to show its performance after its recent characterization, and finally to discuss the scientific objectives and how we intend to address them with the MCP observations. The MicroCameras, developed by Sodern, are dedicated to the spatial description of TLEs and their parent lightning. They are able to differentiate sprites from lightning thanks to two narrow bands ([757-767 nm] and [772-782 nm]) that provide simultaneous pairs of images of an event. Simulation results of the differentiation method will be shown. After calibration and tests, the MicroCameras have now been delivered to CNES for integration on the payload. The Photometers, developed by Bertin Technologies, will provide temporal measurements and spectral characteristics of TLEs and lightning. They are key instruments because of their capability to detect TLEs on board and then switch all the instruments of the scientific payload into their high-resolution acquisition mode. The Photometers use four spectral bands ([170-260 nm], [332-342 nm], [757-767 nm] and [600-900 nm]) and have the same field of view as the cameras. The remote-controlled parameters of the on-board TLE detection algorithm have been tuned before launch using the electronic board and simulated or real event waveforms. After calibration, the Photometers are now going through environmental tests. They will be delivered to CNES for integration on the payload in September 2017.
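Illustrative note (a hypothetical sketch, not the TARANIS/MCP flight algorithm, which the abstract does not describe): an on-board trigger with remotely tunable parameters can be as simple as flagging any photometer sample that exceeds a running baseline by a configurable threshold, as in the C++ fragment below; the window length and threshold stand in for the remote-controlled parameters mentioned above.

    #include <cstddef>
    #include <cstdio>
    #include <deque>
    #include <numeric>
    #include <vector>

    // Hypothetical trigger: a sample fires when it exceeds the mean of the
    // previous `window` samples by more than `threshold`. Both values play
    // the role of remotely tunable detection parameters.
    struct TriggerDetector {
        std::size_t window;
        double threshold;
        std::deque<double> history;

        bool push(double sample) {
            bool fired = false;
            if (history.size() == window) {
                double baseline =
                    std::accumulate(history.begin(), history.end(), 0.0) / window;
                fired = (sample - baseline) > threshold;
            }
            history.push_back(sample);
            if (history.size() > window) history.pop_front();
            return fired;
        }
    };

    int main() {
        TriggerDetector det{8, 5.0, {}};
        std::vector<double> waveform(40, 1.0);
        waveform[25] = 20.0;  // simulated transient in one photometer band

        for (std::size_t i = 0; i < waveform.size(); ++i)
            if (det.push(waveform[i]))
                std::printf("trigger at sample %zu -> switch payload to high-resolution mode\n", i);
        return 0;
    }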
Integrated Science Assessment (ISA) for Carbon Monoxide ...
EPA announced the availability of the final report, Integrated Science Assessment (ISA) for Carbon Monoxide (CO). This report is EPA’s latest evaluation of the scientific literature on the potential human health and welfare effects associated with ambient exposures to CO. The development of this document is part of the Agency's periodic review of the national ambient air quality standards (NAAQS) for CO. The recently completed CO ISA and supplementary annexes, in conjunction with additional technical and policy assessments developed by EPA’s Office of Air and Radiation, will provide the scientific basis to inform EPA decisions related to the review of the current CO NAAQS. The integrated Plan for Review of the National Ambient Air Quality Standards for Carbon Monoxide (U.S. EPA, 2008, 193995) identifies key policy-relevant questions that provide a framework for this assessment of the scientific evidence. These questions frame the entire review of the NAAQS for CO and thus are informed by both science and policy considerations. The ISA organizes, presents, and integrates the scientific evidence which is considered along with findings from risk analyses and policy considerations to help the U.S. Environmental Protection Agency (EPA) address these questions during the NAAQS review.
77 FR 50703 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
Excerpt: notice of closed meetings at the National Institutes of Health Center for Scientific Review, including the Bioengineering Sciences & Technologies Integrated Review Group (Nanotechnology Study Section) and a Center for Scientific Review Special Emphasis Panel (Dermatology).
NASA Astrophysics Data System (ADS)
Plasson, Ph.
2006-11-01
LESIA, in close cooperation with CNES, DLR and IWF, is responsible for the tests and validation of the CoRoT instrument digital process unit, which is made up of the BEX and DPU assembly. The main part of the work consisted of validating the DPU software and testing the BEX/DPU coupling. This work took more than two years because of the central role of the software under test and its technical complexity. The first task in the validation process was to carry out the acceptance tests of the DPU software. These tests consisted of checking each of the 325 requirements identified in the URD (User Requirements Document) and were run in a configuration using the DPU coupled to a BEX simulator. During the acceptance tests, all the transversal functionalities of the DPU software, such as TC/TM management, state-machine management, BEX driving, system monitoring and the maintenance functions, were checked in depth. The functionalities associated with the seismology and exoplanetology processing, such as the loading of window and mask descriptors or the configuration of the service execution parameters, were also exhaustively tested. After the DPU software had been validated against the user requirements using a BEX simulator, the next step was to couple the DPU and the BEX in order to check that the assembled unit worked correctly and met the performance requirements. These tests were conducted in two phases: the first devoted to the functional aspects and interface tests, the second to the performance aspects. The performance tests were based on the scientific services of the DPU software and on full images representative of a realistic sky as inputs. They also relied on a reference set of windows and parameters provided by the scientific team that was representative, in terms of load and complexity, of the set that could be used during the observation mode of the CoRoT instrument. These tests were run in a configuration using either a BCC simulator or a real BCC coupled to a video simulator to feed the BEX/DPU unit. The validation of the scientific algorithms was conducted in parallel with the BEX/DPU coupling tests. The objective of this phase was to check that the algorithms implemented in the scientific services of the DPU software conformed to those specified in the URD and that the numerical precision obtained matched expectations. Forty test cases were defined, covering coarse and fine angular error measurement, rejection of bright pixels, subtraction of the offset and sky background, the photometry algorithms, SAA handling and reference image management. For each test case, the LESIA scientific team produced by simulation, using the instrument model, the dynamic data files and parameter sets used to feed the DPU on the one hand, and a model of the onboard software on the other. These data files correspond to FITS images (black windows, star windows, offset windows) containing varying levels of disturbance, making it possible to test the DPU software in dynamic mode over durations of up to 48 hours. To perform the test and validation activities of the CoRoT instrument digital process unit, a set of software testing tools was developed by LESIA (Software Ground Support Equipment, hereafter "SGSE").
Thanks to their versatility and modularity, these software testing tools were used throughout the integration, test and validation activities of the instrument and of its subsystems CoRoTCase and CoRoTCam. The CoRoT SGSE were specified, designed and developed by LESIA. The objective was to have a software system allowing the users (the onboard-software validation team, the instrument integration team, etc.) to remotely control and monitor the whole instrument or a single subsystem, such as the DPU coupled to a BEX simulator or the BEX/DPU unit coupled to a BCC simulator. The idea was to be able to interact in real time with the system under test by driving the various EGSE, but also to run test procedures implemented as scripts organized into libraries, to record telemetry and housekeeping data in a database, and to carry out post-mortem analyses.
GPU accelerated Monte Carlo simulation of Brownian motors dynamics with CUDA
NASA Astrophysics Data System (ADS)
Spiechowicz, J.; Kostur, M.; Machura, L.
2015-06-01
This work presents an updated and extended guide to properly accelerating the Monte Carlo integration of stochastic differential equations with commonly available NVIDIA Graphics Processing Units using the CUDA programming environment. We outline the general aspects of scientific computing on graphics cards and demonstrate them with two models of the well-known phenomenon of noise-induced transport of Brownian motors in periodic structures. As sources of fluctuations in the considered systems we selected the three most commonly occurring noises: Gaussian white noise, white Poissonian noise and the dichotomous process, also known as a random telegraph signal. A detailed discussion of various aspects of the applied numerical schemes is also presented. The measured speedup can be of the astonishing order of about 3000 when compared to a typical CPU. This number significantly expands the range of problems solvable by stochastic simulations, allowing even interactive research in some cases.
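As a minimal CPU-side illustration of the kind of computation such a CUDA code parallelizes over many independent trajectories, the C sketch below integrates an overdamped Brownian particle in a tilted periodic potential with Gaussian white noise using an Euler-Maruyama step; the potential, parameters and noise intensity are illustrative assumptions, not the models studied in the paper.

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

/* Overdamped Brownian motor sketch: dx = (-V'(x) + F) dt + sqrt(2 D dt) * N(0,1),
 * with V(x) = sin(x).  Potential, parameters and noise are illustrative
 * assumptions; a CUDA version would run one trajectory per GPU thread. */
static double gauss(void)                      /* Box-Muller standard normal */
{
    double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(6.283185307179586 * u2);
}

int main(void)
{
    const int ntraj = 1024, nsteps = 100000;   /* trajectories and time steps */
    const double dt = 1e-3, D = 0.5, F = 0.1;  /* step size, noise intensity, bias force */
    double mean_v = 0.0;

    for (int k = 0; k < ntraj; ++k) {
        double x = 0.0;
        for (int i = 0; i < nsteps; ++i)
            x += (-cos(x) + F) * dt + sqrt(2.0 * D * dt) * gauss();  /* -V'(x) = -cos(x) */
        mean_v += x / (nsteps * dt);           /* time-averaged velocity of this trajectory */
    }
    printf("ensemble-averaged velocity: %g\n", mean_v / ntraj);
    return 0;
}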
Preparing for in situ processing on upcoming leading-edge supercomputers
Kress, James; Churchill, Randy Michael; Klasky, Scott; ...
2016-10-01
High performance computing applications are producing increasingly large amounts of data and placing enormous stress on current capabilities for traditional post hoc visualization techniques. Because of the growing compute and I/O imbalance, data reductions, including in situ visualization, are required. These reduced data are used for analysis and visualization in a variety of different ways. Many of the visualization and analysis requirements are known a priori, but when they are not, scientists depend on the reduced data to accurately represent the simulation in post hoc analysis. The contribution of this paper is a description of the directions we are pursuing to assist a large-scale fusion simulation code succeed on the next generation of supercomputers. These directions include the role of in situ processing for performing data reductions, as well as the tradeoffs between data size and data integrity within the context of complex operations in a typical scientific workflow.
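As a toy illustration of the in situ data reduction discussed above, the C sketch below condenses a full field into a small statistical summary (range, mean and histogram) so that only a handful of numbers per time step need to be written; the structure, names and bin count are assumptions and do not represent the fusion code's actual workflow.

#include <float.h>
#include <stddef.h>

#define NBINS 32

typedef struct {
    double min, max, mean;
    unsigned long hist[NBINS];
} FieldSummary;

/* Reduce a field to a compact summary in situ, so only ~NBINS + 3 numbers
 * per time step are written instead of the full array.  Two passes: one
 * for the range and mean, one to fill the histogram.  Illustrative sketch. */
FieldSummary summarize(const double *field, size_t n)
{
    FieldSummary s = { DBL_MAX, -DBL_MAX, 0.0, {0} };
    for (size_t i = 0; i < n; ++i) {
        if (field[i] < s.min) s.min = field[i];
        if (field[i] > s.max) s.max = field[i];
        s.mean += field[i];
    }
    s.mean /= (double)n;
    double width = (s.max > s.min) ? (s.max - s.min) / NBINS : 1.0;
    for (size_t i = 0; i < n; ++i) {
        size_t b = (size_t)((field[i] - s.min) / width);
        if (b >= NBINS) b = NBINS - 1;   /* clamp the maximum value into the last bin */
        s.hist[b]++;
    }
    return s;
}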
Cheung, Luthur Siu-Lun; Kanwar, Manu; Ostermeier, Marc; Konstantopoulos, Konstantinos
2012-01-01
Nonantibody scaffolds such as designed ankyrin repeat proteins (DARPins) can be rapidly engineered to detect diverse target proteins with high specificity and offer an attractive alternative to antibodies. Using molecular simulations, we predicted that the binding interface between DARPin off7 and its ligand (maltose binding protein; MBP) is characterized by a hot-spot motif in which binding energy is largely concentrated on a few amino acids. To experimentally test this prediction, we fused MBP to a transmembrane domain to properly orient the protein into a polymer-cushioned lipid bilayer, and characterized its interaction with off7 using force spectroscopy. Using this, to our knowledge, novel technique along with surface plasmon resonance, we validated the simulation predictions and characterized the effects of select mutations on the kinetics of the off7-MBP interaction. Our integrated approach offers scientific insights on how the engineered protein interacts with the target molecule. PMID:22325262
Challenges of the Cassini Test Bed Simulating the Saturnian Environment
NASA Technical Reports Server (NTRS)
Hernandez, Juan C.; Badaruddin, Kareem S.
2007-01-01
The Cassini-Huygens mission is a joint NASA and European Space Agency (ESA) mission to collect scientific data on the Saturnian system and is managed by the Jet Propulsion Laboratory (JPL). After arriving in Saturn orbit and releasing ESA's Huygens probe for a highly successful descent and landing mission on Saturn's moon Titan, the Cassini orbiter continues on its tour of Saturn, its satellites, and the Saturnian environment. JPL's Cassini Integrated Test Laboratory (ITL) is a dedicated high-fidelity test bed that verifies and validates command sequences and flight software before upload to the Cassini spacecraft. The ITL provides artificial stimuli that allow a highly accurate hardware-in-the-loop test bed model that tests the operation of the Cassini spacecraft on the ground. This enables accurate prediction and recreation of mission events and flight software and hardware behavior. As we discovered more about the Saturnian environment, a combination of creative test methods and simulation changes was necessary to simulate the harmful effect that the optical and physical environment has on the pointing performance of Cassini. This paper presents the challenges experienced and overcome in the endeavor to simulate and test the post Saturn Orbit Insertion (SOI) and Probe Relay tour phase of the Cassini mission.
Multi-threaded ATLAS simulation on Intel Knights Landing processors
NASA Astrophysics Data System (ADS)
Farrell, Steven; Calafiura, Paolo; Leggett, Charles; Tsulaia, Vakhtang; Dotti, Andrea; ATLAS Collaboration
2017-10-01
The Knights Landing (KNL) release of the Intel Many Integrated Core (MIC) Xeon Phi line of processors is a potential game changer for HEP computing. With 72 cores and deep vector registers, the KNL cards promise significant performance benefits for highly parallel, compute-heavy applications. Cori, the newest supercomputer at the National Energy Research Scientific Computing Center (NERSC), was delivered to its users in two phases, with the first phase online at the end of 2015 and the second online at the end of 2016. Cori Phase 2 is based on the KNL architecture and contains over 9000 compute nodes with 96 GB DDR4 memory. ATLAS simulation with the multithreaded Athena Framework (AthenaMT) is a good potential use case for the KNL architecture and supercomputers like Cori. ATLAS simulation jobs have a high ratio of CPU computation to disk I/O and have been shown to scale well in multi-threading and across many nodes. In this paper we will give an overview of the ATLAS simulation application with details on its multi-threaded design. Then, we will present a performance analysis of the application on KNL devices and compare it to a traditional x86 platform to demonstrate the capabilities of the architecture and evaluate the benefits of utilizing KNL platforms like Cori for ATLAS production.
Climate and atmosphere simulator for experiments on ecological systems in changing environments.
Verdier, Bruno; Jouanneau, Isabelle; Simonnet, Benoit; Rabin, Christian; Van Dooren, Tom J M; Delpierre, Nicolas; Clobert, Jean; Abbadie, Luc; Ferrière, Régis; Le Galliard, Jean-François
2014-01-01
Grand challenges in global change research and environmental science raise the need for replicated experiments on ecosystems subjected to controlled changes in multiple environmental factors. We designed and developed the Ecolab as a variable climate and atmosphere simulator for multifactor experimentation on natural or artificial ecosystems. The Ecolab integrates atmosphere conditioning technology optimized for accuracy and reliability. The centerpiece is a highly contained, 13-m³ chamber to host communities of aquatic and terrestrial species and control climate (temperature, humidity, rainfall, irradiance) and atmosphere conditions (O2 and CO2 concentrations). Temperature in the atmosphere and in the water or soil column can be controlled independently of each other. All climatic and atmospheric variables can be programmed to follow dynamical trajectories and simulate gradual as well as step changes. We demonstrate the Ecolab's capacity to simulate a broad range of atmospheric and climatic conditions, their diurnal and seasonal variations, and to support the growth of a model terrestrial plant in two contrasting climate scenarios. The adaptability of the Ecolab design makes it possible to study interactions between variable climate-atmosphere factors and biotic disturbances. Developed as an open-access, multichamber platform, this equipment is available to the international scientific community for exploring interactions and feedbacks between ecological and climate systems.
NASA Guidelines for Promoting Scientific and Research Integrity
NASA Technical Reports Server (NTRS)
Kaminski, Amy P.; Neogi, Natasha A.
2017-01-01
This guidebook provides an overarching summary of existing policies, activities, and guiding principles for scientific and research integrity with which NASA's workforce and affiliates must conform. This document addresses NASA's obligations as both a research institution and as a funder of research, NASA's use of federal advisory committees, NASA's public communication of research results, and professional development of NASA's workforce. This guidebook is intended to provide a single resource for NASA researchers, NASA research program administrators and project managers, external entities who do or might receive funding from NASA for research or technical projects, evaluators of NASA research proposals, NASA advisory committee members, NASA communications specialists, and members of the general public so that they can understand NASA's commitment to and expectations for scientific and research integrity across the agency.
Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators
Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew
2014-01-01
Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessors to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
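As a hedged illustration of the loop-level annotation approach, the C sketch below applies an OpenACC parallel-loop pragma to a generic explicit 2D reaction-diffusion update; the grid size, variable names and the simple cubic reaction term are stand-ins and are not the authors' cardiac action potential model or their measured configurations.

/* Illustrative 2D explicit reaction-diffusion step with an OpenACC
 * parallel-loop pragma; compile with an OpenACC compiler (otherwise the
 * pragma is ignored and the loops run serially).  Not the authors' code. */
#define NX 512
#define NY 512

void step(const float *restrict u, float *restrict unew,
          float dt, float dx, float dcoef)
{
    /* One explicit step of du/dt = dcoef * laplacian(u) + u*(1 - u*u);
       the cubic term is a stand-in for cardiac action-potential kinetics.
       Boundary cells are left untouched. */
    #pragma acc parallel loop collapse(2) copyin(u[0:NX*NY]) copy(unew[0:NX*NY])
    for (int i = 1; i < NX - 1; ++i) {
        for (int j = 1; j < NY - 1; ++j) {
            float c   = u[i*NY + j];
            float lap = (u[(i+1)*NY + j] + u[(i-1)*NY + j]
                       + u[i*NY + j + 1] + u[i*NY + j - 1] - 4.0f * c) / (dx * dx);
            unew[i*NY + j] = c + dt * (dcoef * lap + c * (1.0f - c * c));
        }
    }
}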
Interactive, Online, Adsorption Lab to Support Discovery of the Scientific Process
NASA Astrophysics Data System (ADS)
Carroll, K. C.; Ulery, A. L.; Chamberlin, B.; Dettmer, A.
2014-12-01
Science students require more than methods practice in lab activities; they must gain an understanding of the application of the scientific process through lab work. Large classes, time constraints, and funding may limit student access to science labs, denying students access to the types of experiential learning needed to motivate and develop new scientists. Interactive, discovery-based computer simulations and virtual labs provide an alternative, low-risk opportunity for learners to engage in lab processes and activities. Students can conduct experiments, collect data, draw conclusions, and even abort a session. We have developed an online virtual lab, through which students can interactively develop as scientists as they learn about scientific concepts, lab equipment, and proper lab techniques. Our first lab topic is adsorption of chemicals to soil, but the methodology is transferrable to other topics. In addition to learning the specific procedures involved in each lab, the online activities will prompt exploration and practice in key scientific and mathematical concepts, such as unit conversion, significant digits, assessing risks, evaluating bias, and assessing quantity and quality of data. These labs are not designed to replace traditional lab instruction, but to supplement instruction on challenging or particularly time-consuming concepts. To complement classroom instruction, students can engage in a lab experience outside the lab and over a shorter time period than often required with real-world adsorption studies. More importantly, students can reflect, discuss, review, and even fail at their lab experience as part of the process to see why natural processes and scientific approaches work the way they do. Our Media Productions team has completed a series of online digital labs available at virtuallabs.nmsu.edu and scienceofsoil.com, and these virtual labs are being integrated into coursework to evaluate changes in student learning.
Ecosystem functioning is enveloped by hydrometeorological variability.
Pappas, Christoforos; Mahecha, Miguel D; Frank, David C; Babst, Flurin; Koutsoyiannis, Demetris
2017-09-01
Terrestrial ecosystem processes, and the associated vegetation carbon dynamics, respond differently to hydrometeorological variability across timescales, and so does our scientific understanding of the underlying mechanisms. Long-term variability of the terrestrial carbon cycle is not yet well constrained and the resulting climate-biosphere feedbacks are highly uncertain. Here we present a comprehensive overview of hydrometeorological and ecosystem variability from hourly to decadal timescales integrating multiple in situ and remote-sensing datasets characterizing extra-tropical forest sites. We find that ecosystem variability at all sites is confined within a hydrometeorological envelope across sites and timescales. Furthermore, ecosystem variability demonstrates long-term persistence, highlighting ecological memory and slow ecosystem recovery rates after disturbances. However, simulation results with state-of-the-art process-based models do not reflect this long-term persistent behaviour in ecosystem functioning. Accordingly, we develop a cross-time-scale stochastic framework that captures hydrometeorological and ecosystem variability. Our analysis offers a perspective for terrestrial ecosystem modelling and paves the way for new model-data integration opportunities in Earth system sciences.
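Long-term persistence is commonly diagnosed by how the variability of time-averaged values decays with the averaging scale; the C sketch below computes that scale dependence (a climacogram-style summary) for a placeholder series, as a generic illustration rather than the authors' stochastic framework.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Standard deviation of the k-scale aggregated (averaged) series.  For
 * long-term persistent processes it decays like k^(H-1) with Hurst
 * coefficient H > 0.5, i.e. more slowly than the k^(-0.5) of white noise.
 * Illustrative sketch only; not the authors' stochastic framework. */
static double aggregated_sd(const double *x, int n, int k)
{
    int m = n / k;
    double mean = 0.0, var = 0.0, *avg = malloc(m * sizeof *avg);
    for (int i = 0; i < m; ++i) {
        double s = 0.0;
        for (int j = 0; j < k; ++j) s += x[i * k + j];
        avg[i] = s / k;
        mean += avg[i];
    }
    mean /= m;
    for (int i = 0; i < m; ++i) var += (avg[i] - mean) * (avg[i] - mean);
    free(avg);
    return sqrt(var / (m - 1));
}

int main(void)
{
    enum { N = 4096 };
    double x[N];
    for (int i = 0; i < N; ++i)                 /* placeholder series: white noise */
        x[i] = (double)rand() / RAND_MAX - 0.5;
    for (int k = 1; k <= 256; k *= 2)           /* climacogram: sd versus averaging scale */
        printf("scale %4d  sd %.4f\n", k, aggregated_sd(x, N, k));
    return 0;
}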
NASA Technical Reports Server (NTRS)
1993-01-01
A description is given of each of the following Langley research and test facilities: 0.3-Meter Transonic Cryogenic Tunnel, 7-by 10-Foot High Speed Tunnel, 8-Foot Transonic Pressure Tunnel, 13-Inch Magnetic Suspension & Balance System, 14-by 22-Foot Subsonic Tunnel, 16-Foot Transonic Tunnel, 16-by 24-Inch Water Tunnel, 20-Foot Vertical Spin Tunnel, 30-by 60-Foot Wind Tunnel, Advanced Civil Transport Simulator (ACTS), Advanced Technology Research Laboratory, Aerospace Controls Research Laboratory (ACRL), Aerothermal Loads Complex, Aircraft Landing Dynamics Facility (ALDF), Avionics Integration Research Laboratory, Basic Aerodynamics Research Tunnel (BART), Compact Range Test Facility, Differential Maneuvering Simulator (DMS), Enhanced/Synthetic Vision & Spatial Displays Laboratory, Experimental Test Range (ETR) Flight Research Facility, General Aviation Simulator (GAS), High Intensity Radiated Fields Facility, Human Engineering Methods Laboratory, Hypersonic Facilities Complex, Impact Dynamics Research Facility, Jet Noise Laboratory & Anechoic Jet Facility, Light Alloy Laboratory, Low Frequency Antenna Test Facility, Low Turbulence Pressure Tunnel, Mechanics of Metals Laboratory, National Transonic Facility (NTF), NDE Research Laboratory, Polymers & Composites Laboratory, Pyrotechnic Test Facility, Quiet Flow Facility, Robotics Facilities, Scientific Visualization System, Scramjet Test Complex, Space Materials Research Laboratory, Space Simulation & Environmental Test Complex, Structural Dynamics Research Laboratory, Structural Dynamics Test Beds, Structures & Materials Research Laboratory, Supersonic Low Disturbance Pilot Tunnel, Thermal Acoustic Fatigue Apparatus (TAFA), Transonic Dynamics Tunnel (TDT), Transport Systems Research Vehicle, Unitary Plan Wind Tunnel, and the Visual Motion Simulator (VMS).
Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth
This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.
The NASA Space Radiation Research Program
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.
2006-01-01
We present a comprehensive overview of the NASA Space Radiation Research Program. This program combines basic research on the mechanisms of radiobiological action relevant to improving knowledge of the risks of cancer, central nervous system and other possible degenerative tissue effects, and acute radiation syndromes from space radiation. The keystones of the NASA Program are five NASA Specialized Centers of Research (NSCOR) investigating space radiation risks. Other research is carried out through peer-reviewed individual investigations and in collaboration with the US Department of Energy's Low-Dose Research Program. The Space Radiation Research Program has established the Risk Assessment Project to integrate data from the NSCORs and other peer-reviewed research into quantitative projection models, with the goals of steering research toward data and scientific breakthroughs that will reduce the uncertainties in current risk projections and of developing the scientific knowledge needed for future individual risk assessment approaches and biological countermeasure assessments or design. The NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory was created by the Program to simulate space radiation on the ground in support of the above research programs. New results from NSRL will be described.
What does «integrative medicine» provide to daily scientific clinical care?
Bataller-Sifre, R; Bataller-Alberola, A
2015-11-01
Integrative medicine is an ambitious and noble-minded attempt to address the shortcomings of the current public health systems in our Western societies, which are restricted by the limited time available, especially in outpatient clinics. Integrative medicine also does not exclude useful therapies that have been tested over the centuries (from China, India, etc.) or certain resources that do not achieve the desired level of scientific credibility but that offer some therapeutic support in specific cases (homeopathy, acupuncture, etc.), although these still require a scientific approach. Finally, the resource of botanical products (phytotherapy) constitutes a wide range of possibilities on which universities can (and do) make progress by providing drug brands for these products through the use of the scientific method and evidence-based medical criteria. This approach will help avoid the irrationality of the daily struggle between conventional scientific medicine (which we apply to the immense majority of patients) and the other diagnostic-therapeutic «guidelines» (natural medicine, alternative medicine, complementary medicine, patient-focused medicine and others). Copyright © 2015. Published by Elsevier España, S.L.U.
Gendermetrics.NET: a novel software for analyzing the gender representation in scientific authoring.
Bendels, Michael H K; Brüggmann, Dörthe; Schöffel, Norman; Groneberg, David A
2016-01-01
Imbalances in female career promotion are believed to be strong in the field of academic science. A primary parameter to analyze gender inequalities is the gender authoring in scientific publications. Since the presently available data on gender distribution is largely limited to underpowered studies, we here develop a new approach to analyze authors' genders in large bibliometric databases. A SQL-Server based multiuser software suite was developed that serves as an integrative tool for analyzing bibliometric data with a special emphasis on gender and topographical analysis. The presented system allows seamless integration, inspection, modification, evaluation and visualization of bibliometric data. By providing an adaptive and almost fully automatic integration and analysis process, the inter-individual variability of analysis is kept at a low level. Depending on the scientific question, the system enables the user to perform a scientometric analysis including its visualization within a short period of time. In summary, a new software suite for analyzing gender representations in scientific articles was established. The system is suitable for the comparative analysis of scientific structures on the level of continents, countries, cities, city regions, institutions, research fields and journals.
77 FR 28886 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-16
..., Functional and Cognitive Neuroscience Integrated Review Group; Sensorimotor Integration Study Section. Date... and Developmental Neuroscience Integrated Review Group; Neural Oxidative Metabolism and Death Study...: Molecular, Cellular and Developmental Neuroscience Integrated Review Group; Neurodifferentiation, Plasticity...
NASA Technical Reports Server (NTRS)
Dumbacher, Daniel L.; Lyles, Garry M.; McConnaughey, Paul
2008-01-01
Over the past 50 years, the National Aeronautics and Space Administration (NASA) has delivered space transportation solutions for America's complex missions, ranging from scientific payloads that expand knowledge, such as the Hubble Space Telescope, to astronauts and lunar rovers destined for voyages to the Moon. Currently, the venerable Space Shuttle, which has been in service since 1981, provides the United States' (U.S.) capability for both crew and heavy cargo to low-Earth orbit to construct the International Space Station, before the Shuttle is retired in 2010. In the next decade, NASA will replace this system with a duo of launch vehicles: the Ares I Crew Launch Vehicle and the Ares V Cargo Launch Vehicle (Figure 1). The goals for this new system include increased safety and reliability coupled with lower operations costs that promote sustainable space exploration for decades to come. The Ares I will loft the Orion Crew Exploration Vehicle, while the heavy-lift Ares V will carry the Altair Lunar Lander and the equipment and supplies needed to construct a lunar outpost for a new generation of human and robotic space pioneers. This paper will provide details of the in-house systems engineering and vehicle integration work now being performed for the Ares I and planned for the Ares V. It will give an overview of the Ares I system-level test activities, such as the ground vibration testing that will be conducted in the Marshall Center's Dynamic Test Stand to verify the integrated vehicle stack's structural integrity and to validate computer modeling and simulation (Figure 2), as well as the main propulsion test article analysis to be conducted in the Static Test Stand. These activities also will help prove and refine mission concepts of operation, while supporting the spectrum of design and development work being performed by Marshall's Engineering Directorate, ranging from launch vehicles and lunar rovers to scientific spacecraft and associated experiments. Ultimately, fielding a robust space transportation solution that will carry international explorers and essential payloads will pave the way for a new century of scientific discovery beyond planet Earth.
Mathematical and Scientific Foundations for an Integrative Engineering Curriculum.
ERIC Educational Resources Information Center
Carr, Robin; And Others
1995-01-01
Describes the Mathematical and Scientific Foundations of Engineering curriculum which emphasizes the mathematical and scientific concepts common to all engineering fields. Scientists and engineers together devised topics and experiments that emphasize the relevance of theory to real-world applications. Presents material efficiently while building…
NASA Astrophysics Data System (ADS)
Barbosa, A.; Robertson, W. H.
2013-12-01
In 2012, the National Research Council (NRC) of the National Academies reported that one of the major issues associated with the development of climate change curricula was the lack of interdisciplinary materials that also promoted a correlation between science standards and content. Therefore, in order to respond to this need, our group has developed an interdisciplinary climate change curriculum whose fundamental basis is alignment with the guidelines presented by the Next Generation Science Standards (NGSS) and those presented by the international document entitled The Earth Charter. In this regard, while the alignment with NGSS disciplinary core ideas, crosscutting concepts and student expectations was intended to fulfill the need for climate change curriculum activities directly associated with the appropriate set of NGSS guidelines, the alignment with The Earth Charter document was intended to reinforce the need for the integration of sociological, philosophical and intercultural analysis of the theme 'climate change'. Additionally, our curriculum was also developed as part of a collaborative project between climate scientists and engineers who are responsible for the development of a Regional Arctic Simulation Model (RASM). Hence, another important curriculum component was the feedback, suggestions and reviews provided by these professionals, who have also contributed to the scientific accuracy of these pedagogical materials by facilitating the integration of datasets and visualizations developed by RASM. Furthermore, our group has developed the climate change curriculum for two types of audience: high school and early undergraduate students. Each curriculum unit is divided into modules, and each module contains a set of lesson plans. The topics selected to compose each unit and module were designated according to surveys conducted with the scientists and engineers involved in the development of the climate change simulation model. Within each module, we provide a description of the general topic being addressed, the appropriate grade levels, students' required prior knowledge, the corresponding NGSS topics, disciplinary core ideas and student performance expectations, the purpose of the activities, and lesson plan activities. Each lesson plan activity is composed of the following: an introductory text that explains the topic, a description of the activities (classroom tasks and optional classroom activities), the time frame, materials, assessment, additional readings and online resources (scientific journals, online simulation models, and books). Each module presents activities and discussions that incorporate historical, philosophical, sociological and/or scientific perspectives on the topics being addressed. Moreover, the activities and lesson plans composing our curriculum can be used either individually or together, according to the teacher and topic of interest, while each unit can also be used as a full-semester course.
Taylor, Kimberly A.; Short, A.
2009-01-01
Integrating science into resource management activities is a goal of the CALFED Bay-Delta Program, a multi-agency effort to address water supply reliability, ecological condition, drinking water quality, and levees in the Sacramento-San Joaquin Delta of northern California. Under CALFED, many different strategies were used to integrate science, including interaction between the research and management communities, public dialogues about scientific work, and peer review. This paper explores ways science was (and was not) integrated into CALFED's management actions and decision systems through three narratives describing different patterns of scientific integration and application in CALFED. Though a collaborative process and certain organizational conditions may be necessary for developing new understandings of the system of interest, we find that those factors are not sufficient for translating that knowledge into management actions and decision systems. We suggest that the application of knowledge may be facilitated or hindered by (1) differences in the objectives, approaches, and cultures of scientists operating in the research community and those operating in the management community and (2) other factors external to the collaborative process and organization.
Goldstein, David S
2013-10-01
This review presents concepts of scientific integrative medicine and relates them to the physiology of catecholamine systems and to the pathophysiology of catecholamine-related disorders. The applications to catecholamine systems exemplify how scientific integrative medicine links systems biology with integrative physiology. Concepts of scientific integrative medicine include (i) negative feedback regulation, maintaining stability of the body's monitored variables; (ii) homeostats, which compare information about monitored variables with algorithms for responding; (iii) multiple effectors, enabling compensatory activation of alternative effectors and primitive specificity of stress response patterns; (iv) effector sharing, accounting for interactions among homeostats and phenomena such as hyperglycemia attending gastrointestinal bleeding and hyponatremia attending congestive heart failure; (v) stress, applying a definition as a state rather than as an environmental stimulus or stereotyped response; (vi) distress, using a noncircular definition that does not presume pathology; (vii) allostasis, corresponding to adaptive plasticity of feedback-regulated systems; and (viii) allostatic load, explaining chronic degenerative diseases in terms of effects of cumulative wear and tear. From computer models one can predict mathematically the effects of stress and allostatic load on the transition from wellness to symptomatic disease. The review describes acute and chronic clinical disorders involving catecholamine systems-especially Parkinson disease-and how these concepts relate to pathophysiology, early detection, and treatment and prevention strategies in the post-genome era. Published 2013. Compr Physiol 3:1569-1610, 2013.
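As an illustrative toy, not one of Goldstein's models, the C sketch below simulates a single negative-feedback homeostat in which an effector driven by the error from a setpoint counteracts a period of sustained stress, while a crude "allostatic load" accumulates with effector activity.

#include <stdio.h>

/* Minimal homeostat sketch: a monitored variable y is pulled toward a
 * setpoint by an effector driven by the error signal; sustained stress
 * shifts the input, and "allostatic load" is accumulated as the running
 * cost of effector activity.  Parameters and functional form are
 * illustrative assumptions only. */
int main(void)
{
    const double setpoint = 1.0, gain = 0.8, dt = 0.1;
    double y = 1.0, load = 0.0;

    for (int t = 0; t < 400; ++t) {
        double stress   = (t >= 100 && t < 300) ? 0.5 : 0.0;   /* external perturbation */
        double error    = setpoint - y;
        double effector = gain * error;                         /* negative feedback     */
        y    += dt * (effector + stress);
        load += dt * (effector > 0 ? effector : -effector);     /* wear-and-tear proxy   */
        if (t % 50 == 0)
            printf("t=%3d  y=%.3f  allostatic load=%.2f\n", t, y, load);
    }
    return 0;
}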
NASA Astrophysics Data System (ADS)
Jiang, Xikai; Li, Jiyuan; Zhao, Xujun; Qin, Jian; Karpeev, Dmitry; Hernandez-Ortiz, Juan; de Pablo, Juan J.; Heinonen, Olle
2016-08-01
Large classes of materials systems in physics and engineering are governed by magnetic and electrostatic interactions. Continuum or mesoscale descriptions of such systems can be cast in terms of integral equations, whose direct computational evaluation requires O(N2) operations, where N is the number of unknowns. Such a scaling, which arises from the many-body nature of the relevant Green's function, has precluded wide-spread adoption of integral methods for solution of large-scale scientific and engineering problems. In this work, a parallel computational approach is presented that relies on using scalable open source libraries and utilizes a kernel-independent Fast Multipole Method (FMM) to evaluate the integrals in O(N) operations, with O(N) memory cost, thereby substantially improving the scalability and efficiency of computational integral methods. We demonstrate the accuracy, efficiency, and scalability of our approach in the context of two examples. In the first, we solve a boundary value problem for a ferroelectric/ferromagnetic volume in free space. In the second, we solve an electrostatic problem involving polarizable dielectric bodies in an unbounded dielectric medium. The results from these test cases show that our proposed parallel approach, which is built on a kernel-independent FMM, can enable highly efficient and accurate simulations and allow for considerable flexibility in a broad range of applications.
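For context, the C sketch below shows the direct O(N^2) evaluation that such methods avoid: a brute-force pairwise sum for a Coulomb-like kernel, which a kernel-independent FMM approximates to controlled accuracy in O(N) work by clustering far-field interactions; the kernel, units and data layout are illustrative assumptions, not the paper's implementation.

#include <math.h>
#include <stdio.h>

/* Direct O(N^2) evaluation of phi_i = sum_{j != i} q_j / |r_i - r_j|.
 * This is the brute-force baseline; a kernel-independent FMM obtains the
 * same sums to controlled accuracy in O(N) work by grouping distant
 * sources.  Illustrative sketch only. */
static void direct_potential(int n, const double *x, const double *y,
                             const double *z, const double *q, double *phi)
{
    for (int i = 0; i < n; ++i) {
        double s = 0.0;
        for (int j = 0; j < n; ++j) {
            if (j == i) continue;
            double dx = x[i] - x[j], dy = y[i] - y[j], dz = z[i] - z[j];
            s += q[j] / sqrt(dx * dx + dy * dy + dz * dz);
        }
        phi[i] = s;
    }
}

int main(void)
{
    double x[3] = {0, 1, 0}, y[3] = {0, 0, 1}, z[3] = {0, 0, 0};
    double q[3] = {1, -1, 1}, phi[3];
    direct_potential(3, x, y, z, q, phi);
    for (int i = 0; i < 3; ++i) printf("phi[%d] = %f\n", i, phi[i]);
    return 0;
}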
Improving the result of forecasting using reservoir and surface network simulation
NASA Astrophysics Data System (ADS)
Hendri, R. S.; Winarta, J.
2018-01-01
This study aimed to obtain more representative results in production forecasting using integrated simulation of the pipeline gathering system of the X field. There are 5 main scenarios, which consist of production forecasts for the existing condition, workover, and infill drilling. The best development scenario is then determined. The method of this study couples a reservoir simulator with a pipeline simulator, a so-called integrated reservoir and surface network simulation. Well data from the reservoir simulator were integrated with the pipeline network simulator to construct a new schedule, which was the input for the whole simulation procedure. The well design was produced with a well modeling simulator and then exported into the pipeline simulator. In the stand-alone case, the reservoir prediction depends on a minimum value of tubing head pressure (THP) for each well, and the pressure drop in the gathering network is not necessarily calculated. The same scenarios were also run for the single-reservoir simulation. The integrated simulation produces results closer to the actual condition of the reservoir, as confirmed by the THP profiles, which differ between the two methods. The difference between the integrated simulation and the single-model simulation is 6-9%. The aim of solving the back-pressure problem in the pipeline gathering system of the X field is achieved.
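One common way to couple a reservoir model to a surface network is a fixed-point iteration on well rate and tubing head pressure; the C sketch below illustrates that idea with hypothetical stand-in functions and does not represent the simulators or field data used in the study.

#include <math.h>
#include <stdio.h>

/* Hypothetical stand-ins: the reservoir inflow relation gives a rate for a
 * given tubing head pressure (THP), and the gathering network returns the
 * THP required to deliver that rate.  Real simulators replace both. */
static double reservoir_rate(double thp)  { return 1000.0 - 2.0 * thp; }
static double network_thp(double rate)    { return 50.0 + 0.05 * rate; }

int main(void)
{
    double thp = 100.0, rate = 0.0;               /* initial guess */
    for (int it = 0; it < 50; ++it) {             /* fixed-point coupling loop */
        double new_rate = reservoir_rate(thp);
        double new_thp  = network_thp(new_rate);
        if (fabs(new_thp - thp) < 1e-6 && fabs(new_rate - rate) < 1e-6)
            break;
        rate = new_rate;
        thp  = 0.5 * (thp + new_thp);             /* relaxation for stability */
    }
    printf("converged operating point: rate=%.1f  THP=%.1f\n", rate, thp);
    return 0;
}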
77 FR 511 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-05
... . Name of Committee: Genes, Genomes, and Genetics Integrated Review Group, Molecular Genetics B Study...: Digestive, Kidney and Urological Systems Integrated Review Group, Clinical, Integrative and Molecular...
NASA Astrophysics Data System (ADS)
Brinkhuis, D.; Peart, L.
2012-12-01
Scientific discourse generally takes place in appropriate journals, using the language and conventions of science. That's fine, as long as the discourse remains in scientific circles. It is only outside those circles that the rules and techniques of engaging social media tools gain importance. A young generation of scientists is eager to share their experiences by using social media, but is this effective? And how can we better integrate all outreach and media channels to engage general audiences? How can Facebook, Twitter, Skype and YouTube be used as synergy tools in scientific storytelling? Case: during IODP Expedition 342 (June-July 2012) onboard the scientific drillship JOIDES Resolution, an onboard educator and a videographer worked non-stop for two months on an integrated outreach plan that tried and tested the limits of all social media tools available to interact with an international public while at sea. The results are spectacular!
Integrated Science Assessment (ISA) of Ozone and Related ...
EPA announced the availability of the final report, Integrated Science Assessment of Ozone and Related Photochemical Oxidants. This document represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific bases for EPA’s decision regarding the adequacy of the current national ambient air quality standards for ozone to protect human health, public welfare, and the environment. Critical evaluation and integration of the evidence on health and environmental effects of ozone to provide scientific support for the review of the NAAQS for ozone.
NASA Astrophysics Data System (ADS)
Gheorghe, Gh. Ion; Popan, Gheorghe
2013-10-01
This paper presents, as a national premiere and in the author's original conception, the technological and cross-border value chain of the science and engineering of multi-integrative Mechatronics-Integronics-Adaptronics, as a high-tech vector supporting development, for the viability and sustainability of a new intelligent and competitive labour market.
IRIS Toxicological Review of Ethylene Glycol Mono-Butyl ...
EPA has conducted a peer review of the scientific basis supporting the human health hazard and dose-response assessment of ethylene glycol monobutyl ether that will appear on the Integrated Risk Information System (IRIS) database. EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of propionaldehyde that will appear on the Integrated Risk Information System (IRIS) database.
NASA Astrophysics Data System (ADS)
Seamon, E.; Gessler, P. E.; Flathers, E.
2015-12-01
The creation and use of large amounts of data in scientific investigations has become common practice. Data collection and analysis for large scientific computing efforts are not only increasing in volume and number; the methods and analysis procedures are also evolving toward greater complexity (Bell, 2009, Clarke, 2009, Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011, Turner, 2014, Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge this gap between the above perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) increasing growth in the volume and amount of data, 2) a growing data-intensive science base that has challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap. Namely, systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive science-based communications. Our research efforts attempt to address this gap by developing a modular technology framework for data science integration efforts, with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, interactive Python, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bioclimatic variation as they relate to Pacific Northwest ecosystem regions. Our preliminary results, using historical NetCDF climate data for calibration purposes across the inland Pacific Northwest region (Abatzoglou, Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods for bioclimatic indicators.
Institute for scientific computing research;fiscal year 1999 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D
2000-03-28
Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.
75 FR 52764 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-27
... Cognitive Neuroscience Integrated Review Group, Neurobiology of Motivated Behavior Study Section. Date....gov . Name of Committee: Integrative, Functional and Cognitive Neuroscience Integrated Review Group...- 9664. [email protected] . Name of Committee: Integrative, Functional and Cognitive Neuroscience...
Prototyping an online wetland ecosystem services model using open model sharing standards
Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.
2011-01-01
Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from variant sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with Open Geospatial Consortium (OGC) Web Processing Services (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
NASA Astrophysics Data System (ADS)
Klug Boonstra, S. L.; Swann, J.; Manfredi, L.; Zippay, A.; Boonstra, D.
2014-12-01
The Next Generation Science Standards (NGSS) brought many dynamic opportunities and capabilities to the K-12 science classroom, especially with the inclusion of engineering. Using science as a context to help students engage in the engineering practices and engineering disciplinary core ideas is an essential step to students' understanding of how science drives engineering and how engineering enables science. Real-world examples and applications are critical for students to see how these disciplines are integrated. Furthermore, the interface of science and engineering raises the level of science understanding and facilitates higher-order thinking skills through relevant experiences. Astrobiobound! is designed for the NGSS (Next Generation Science Standards) and CCSS (Common Core State Standards). Students also practice and build 21st Century Skills. Astrobiobound! helps students see how science and systems engineering are integrated to achieve a focused scientific goal. Students engage in the engineering design process to design a space mission, which requires them to balance the return of their science data against engineering limitations such as power, mass and budget. Risk factors also play a role during this simulation and add to the excitement and authenticity. Astrobiobound! presents the authentic first stages of the NASA mission design process. This simulation mirrors the NASA process, in which the science goals, type of mission, and instruments to return the data required to meet the mission goals are proposed within the mission budget before any of the construction phase of engineering can begin. NASA scientists and engineers were consulted in the development of this activity as an authentic simulation of their mission proposal process.
Towards an integrated forecasting system for fisheries on habitat-bound stocks
NASA Astrophysics Data System (ADS)
Christensen, A.; Butenschön, M.; Gürkan, Z.; Allen, I. J.
2013-03-01
First results of a coupled modelling and forecasting system for fisheries on habitat-bound stocks are presented. The system currently consists of three mathematically fundamentally different model subsystems coupled offline: POLCOMS, providing the physical environment, implemented in the domain of the north-west European shelf; the SPAM model, which describes sandeel stocks in the North Sea; and the third component, the SLAM model, which connects POLCOMS and SPAM by computing the physical-biological interaction. Our main lesson from coupling the model subsystems is that well-defined and generic model interfaces are very important for a successful and extendable coupled model framework. The integrated approach, simulating ecosystem dynamics from physics to fish, allows for analysis of the pathways in the ecosystem to investigate the propagation of changes in the ocean climate and to quantify the impacts on the higher trophic level, in this case the sandeel population, demonstrated here on the basis of hindcast data. The coupled forecasting system is tested on some typical scientific questions arising in spatial fish stock management and marine spatial planning, including determination of local and basin-scale maximum sustainable yield, stock connectivity and source/sink structure. Our simulations indicate that sandeel stocks are currently exploited close to the maximum sustainable yield, even though periodic overfishing seems to have occurred, but large uncertainty is associated with determining the stock's maximum sustainable yield due to stock-inherent dynamics and climatic variability. Our statistical ensemble simulations indicate that the predictive horizon set by climate interannual variability is 2-6 yr, after which only an asymptotic probability distribution of stock properties, like biomass, is predictable.
78 FR 2681 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-14
...-435-1212, [email protected] . Name of Committee: Immunology Integrated Review Group; Innate Immunity... Scientific Review Special Emphasis Panel; Member Conflicts: Pain and Hearing Date: February 12-13, 2013. Time... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Radiation Oncology. Date...
Bridge over troubled waters: A Synthesis Session to connect scientific and decision making sectors
Lack of access to relevant scientific data has limited decision makers from incorporating scientific information into their management and policy schemes. Yet, there is increasing interest among decision makers and scientists to integrate coastal and marine science into the polic...
NASA Astrophysics Data System (ADS)
Lin, T.; Lin, Z.; Lim, S.
2017-12-01
We present an integrated modeling framework to simulate groundwater level change under the dramatic increase of hydraulic fracturing water use in the Bakken Shale oil production area. The framework combines an agent-based model (ABM) with the Fox Hills-Hell Creek (FH-HC) groundwater model. In developing the ABM, institution theory is used to model the regulation policies of the North Dakota State Water Commission, while evolutionary programming and cognitive maps are used to model the social structure that emerges from the behavior of competing individual water businesses. Evolutionary programming allows individuals to select an appropriate strategy when applying annually for potential water use permits, whereas cognitive maps endow agents with the ability and willingness to compete for more water sales. All agents have their own influence boundaries that inhibit competitive behavior toward neighbors but not toward non-neighbors. The decision-making process is constructed and parameterized with both quantitative and qualitative information, i.e., empirical water use data and knowledge gained from surveys with stakeholders. By linking institution theory, evolutionary programming, and cognitive maps, our approach captures more of the complexity of the real decision-making process and offers a new exploration in modeling the dynamics of coupled human and natural systems. After integrating the ABM with the FH-HC model, drought and limited-water-accessibility scenarios are simulated to predict future FH-HC groundwater level changes. The integrated modeling framework of the ABM and the FH-HC model can be used to support scientifically sound policies for water allocation and management.
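A heavily simplified sketch of the evolutionary-programming idea described above follows. All names, units, and rules are invented for illustration; the actual ABM's strategy space, payoff, and institutional constraints are far richer.

import random

class WaterAgent:
    """Toy agent whose 'strategy' is the volume it requests in annual permits."""
    def __init__(self):
        self.request = random.uniform(10.0, 100.0)   # invented units

    @staticmethod
    def payoff(request, demand):
        return min(request, demand)                  # water actually sold

    def evolve(self, demand):
        """Evolutionary-programming style update: mutate, keep if not worse."""
        mutant = max(0.0, self.request + random.gauss(0.0, 5.0))
        if self.payoff(mutant, demand) >= self.payoff(self.request, demand):
            self.request = mutant

agents = [WaterAgent() for _ in range(20)]
for year in range(30):
    demand = random.uniform(20.0, 80.0)              # that year's fracturing demand
    for agent in agents:
        agent.evolve(demand)

print(sorted(round(a.request, 1) for a in agents)[:5])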
Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit
NASA Astrophysics Data System (ADS)
Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.
2013-12-01
Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools for space physics post-processing. Building on work from the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets and automated tools for data comparison with NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration we hope to add readers for additional model types, as well as any other tools the scientific community deems necessary. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.
Cazzaniga, Paolo; Nobile, Marco S.; Besozzi, Daniela; Bellini, Matteo; Mauri, Giancarlo
2014-01-01
The introduction of general-purpose Graphics Processing Units (GPUs) is boosting scientific applications in Bioinformatics, Systems Biology, and Computational Biology. In these fields, the use of high-performance computing solutions is motivated by the need to perform large numbers of in silico analyses to study the behavior of biological systems under different conditions, which requires computing power that usually exceeds the capability of standard desktop computers. In this work we present coagSODA, a CUDA-powered computational tool purposely developed for the analysis of a large mechanistic model of the blood coagulation cascade (BCC), defined according to both mass-action kinetics and Hill functions. coagSODA allows the execution of parallel simulations of the dynamics of the BCC by automatically deriving the system of ordinary differential equations and then exploiting the numerical integration algorithm LSODA. We present the biological results achieved with a massive exploration of perturbed conditions of the BCC, carried out with one-dimensional and two-dimensional parameter sweep analyses, and show that GPU-accelerated parallel simulations of this model can increase the computational performance up to a 181× speedup compared to the corresponding sequential simulations. PMID:25025072
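For readers unfamiliar with LSODA-based integration of mass-action models, the following minimal sketch uses SciPy's odeint (which wraps the LSODA integrator) on a toy two-reaction system. It only illustrates the kind of numerical step coagSODA parallelizes; the species, rate constants, and model size are invented and unrelated to the BCC model.

import numpy as np
from scipy.integrate import odeint  # odeint wraps the LSODA integrator

# Toy mass-action system: A + B -> C (k1), C -> A + B (k2); constants invented.
k1, k2 = 0.5, 0.1

def rhs(y, t):
    a, b, c = y
    v1 = k1 * a * b          # forward mass-action rate
    v2 = k2 * c              # reverse rate
    return [-v1 + v2, -v1 + v2, v1 - v2]

t = np.linspace(0.0, 50.0, 501)
y0 = [1.0, 0.8, 0.0]                  # initial concentrations
y = odeint(rhs, y0, t)                # LSODA switches stiff/non-stiff automatically
print(y[-1])                          # concentrations at t = 50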
NASA Astrophysics Data System (ADS)
Corona, Thomas
The Karlsruhe Tritium Neutrino (KATRIN) experiment is a tritium beta-decay experiment designed to make a direct, model-independent measurement of the electron neutrino mass. The experimental apparatus employs strong (O(T)) magnetostatic and (O(10^5 V/m)) electrostatic fields in regions of ultra-high (O(10^-11 mbar)) vacuum in order to obtain precise measurements of the electron energy spectrum near the endpoint of tritium beta decay. The electrostatic fields in KATRIN are formed by multiscale electrode geometries, necessitating the development of high-performance field simulation software. To this end, we present a Boundary Element Method (BEM) with analytic boundary integral terms in conjunction with the Robin Hood linear algebraic solver, a nonstationary successive subspace correction (SSC) method. We describe an implementation of these techniques for high-performance computing environments in the software KEMField, along with the geometry modeling and discretization software KGeoBag. We detail the application of KEMField and KGeoBag to KATRIN's spectrometer and detector sections, and demonstrate its use in furthering several of KATRIN's scientific goals. Finally, we present the results of a measurement designed to probe the electrostatic profile of KATRIN's main spectrometer in comparison to simulated results.
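As a rough intuition for the kind of iterative solver named here, the sketch below implements a generic greedy residual-targeted relaxation for a dense linear system, loosely in the spirit of charge-redistribution methods; it is not the Robin Hood algorithm as implemented in KEMField, and the test matrix is invented.

import numpy as np

def greedy_relaxation(A, b, tol=1e-10, max_iter=100000):
    """Solve A x = b by repeatedly correcting the unknown with the largest
    residual (a Gauss-Southwell style greedy sweep)."""
    x = np.zeros(len(b))
    r = b - A @ x
    for _ in range(max_iter):
        i = np.argmax(np.abs(r))
        if abs(r[i]) < tol:
            break
        dx = r[i] / A[i, i]      # correction that zeroes residual component i
        x[i] += dx
        r -= dx * A[:, i]        # update every residual affected by x[i]
    return x

# Small diagonally dominant system standing in for a discretized BEM matrix
A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.7],
              [0.5, 0.7, 5.0]])
b = np.array([1.0, 2.0, 3.0])
print(greedy_relaxation(A, b))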
Improving risk communication through interactive training in communication skills
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D.A.; White, R.K.
1990-01-01
This paper describes a workshop in communication and public speaking skills recently conducted for a group of public officials whose responsibilities include presenting risk information at public meetings associated with hazardous waste sites. We detail the development and execution of the 2 1/2 day workshop, including the development and integration of a 45-minute video of a simulated public meeting used to illustrate examples of good and bad communication behaviors. The workshop uses a mock public meeting video, participatory video exercises, role-playing, an instructor, and a resource text. This interactive approach to teaching communication skills can help sensitize scientists to the public's understanding of risk and improve scientists' confidence and effectiveness in communicating scientific information. 10 refs., 1 fig.
Forming of science teacher thinking through integrated laboratory exercises
NASA Astrophysics Data System (ADS)
Horváthová, Daniela; Rakovská, Mária; Zelenický, Ľubomír
2017-01-01
Within the three-semester optional course Science we have included in the curriculum the subject Science Practicum, consisting of laboratory exercises from complementary natural scientific disciplines whose content exceeds the boundaries of any single scientific discipline (physics, biology, …). The paper presents the structure and selected samples of laboratory exercises from the physics part of the Science Practicum, in which we have integrated knowledge of physics and biology at the secondary grammar school level. When planning the exercises we proceeded from those areas of the mentioned disciplines in which integration of knowledge can be applied appropriately and where measurement methods are used. We have focused on the integration of knowledge about human sensory organs (eye, ear), dolphins and bats (spatial orientation), and bees (the ommatidia of the compound eye), and on their modelling. The laboratory exercises are designed to motivate future teachers of natural scientific subjects to work independently with the specialized literature of the mentioned natural sciences and with ICT.
NASA Astrophysics Data System (ADS)
Meyer, Hanna; Authmann, Christian; Dreber, Niels; Hess, Bastian; Kellner, Klaus; Morgenthal, Theunis; Nauss, Thomas; Seeger, Bernhard; Tsvuura, Zivanai; Wiegand, Kerstin
2017-04-01
Bush encroachment is a syndrome of land degradation that occurs in many savannas, including those of southern Africa. The increase in density, cover, or biomass of woody vegetation often has negative effects on a range of ecosystem functions and services, and these effects are hardly reversible. However, despite its importance, neither the causes of bush encroachment nor the consequences of different resource management strategies to combat or mitigate related shifts in savanna states are fully understood. The project "IDESSA" (An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas) aims to improve the understanding of the complex interplay between land use, climate patterns, and vegetation dynamics and to implement an integrative monitoring and decision-support system for the sustainable management of different savanna types. For this purpose, IDESSA follows an innovative approach that integrates local knowledge, botanical surveys, remote-sensing and machine-learning based time series of atmospheric and land-cover dynamics, spatially explicit simulation modeling, and analytical database management. The integration of these heterogeneous data will be implemented in a user-oriented database infrastructure and scientific workflow system. Accessible via web-based interfaces, this database and analysis system will allow scientists to manage and analyze monitoring data and scenario computations, and will allow stakeholders (e.g., land users and policy makers) to retrieve current ecosystem information and seasonal outlooks. We present the concept of the project and show preliminary results of the realization steps towards the integrative savanna management and decision-support system.
NASA Astrophysics Data System (ADS)
Bower, Peter; Liddicoat, Joseph; Dittrick, Diane; Maenza-Gmelch, Terryanne; Kelsey, Ryan
2013-04-01
According to the Environmental Protection Agency, there are presently over half a million brownfields in the United States, but this number includes only sites for which an Environmental Site Assessment has been conducted. The actual number of brownfields certainly runs into the millions and constitutes one of the major environmental issues confronting all communities today. Taught in part online for more than a decade in environmental science courses at over a dozen colleges, universities, and high schools in the United States, Brownfield Action (BA) is an interactive, web-based simulation that combines scientific expertise, constructivist education philosophy, and multimedia to advance the teaching of environmental science (Bower et al., 2011). In the online simulation and in the classroom, students form geotechnical consulting companies, conduct environmental site assessment investigations, and work collaboratively to solve a problem in environmental forensics. The BA model contains interdisciplinary scientific and social information that is integrated within a digital learning environment that encourages students to construct their knowledge as they learn by doing. As such, the approach improves the depth and coherence of students' understanding of the course material. Like real-world environmental consultants, students are required to develop and apply expertise from a wide range of fields, including environmental science and engineering as well as journalism, medicine, public health, law, civics, economics, and business management. The overall objective is for students to gain an unprecedented appreciation of the complexity, ambiguity, and risk involved in any environmental issue or crisis.
Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment
NASA Technical Reports Server (NTRS)
Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.
2007-01-01
Space science models are an essential component of an integrated data environment. They are indispensable tools for making effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata and a user-friendly library of model output analysis routines that can be called from any language that supports calling C. The CCMC is developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.
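The data-interpolation capability mentioned above can be pictured with a small, generic example: sampling a gridded model field along an observation track so the model can be compared point-by-point with measurements. The grid, variable names, and trajectory are hypothetical; this is not CCMC code.

import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical gridded model output: a density field on a (time, x, y) grid
t_grid = np.linspace(0.0, 3600.0, 61)
x_grid = np.linspace(-10.0, 10.0, 41)
y_grid = np.linspace(-10.0, 10.0, 41)
density = np.random.rand(len(t_grid), len(x_grid), len(y_grid))

sample = RegularGridInterpolator((t_grid, x_grid, y_grid), density)

# Spacecraft trajectory: positions at the observation times
obs_t = np.array([100.0, 850.0, 1700.0, 2900.0])
obs_x = np.array([-4.0, -1.5, 2.0, 6.5])
obs_y = np.array([3.0, 0.5, -2.0, -5.0])

model_along_track = sample(np.column_stack([obs_t, obs_x, obs_y]))
print(model_along_track)   # model values at the observation times and locations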
NASA Astrophysics Data System (ADS)
Steward, David R.; Peterson, Jeffrey M.; Yang, Xiaoying; Bulatewicz, Tom; Herrera-Rodriguez, Mauricio; Mao, Dazhi; Hendricks, Nathan
2009-05-01
An integrated foundation is presented to study the impacts of external forcings on irrigated agricultural systems. Individually, models are presented that simulate groundwater hydrogeology and econometric farm level crop choices and irrigated water use. The natural association between groundwater wells and agricultural parcels is employed to couple these models using geographic information science technology and open modeling interface protocols. This approach is used to study the collective action problem of the common pool. Three different policies (existing, regulation, and incentive based) are studied in the semiarid grasslands overlying the Ogallala Aquifer in the central United States. Results show that while regulation using the prior appropriation doctrine and incentives using a water buy-back program may each achieve the same level of water savings across the study region, each policy has a different impact on spatial patterns of groundwater declines and farm level economic activity. This represents the first time that groundwater and econometric models of irrigated agriculture have been integrated at the well-parcel level and provides methods for scientific investigation of this coupled natural-human system. Results are useful for science to inform decision making and public policy debate.
NASA Technical Reports Server (NTRS)
Jedlovec, Gary; Srikishen, Jayanthi; Edwards, Rita; Cross, David; Welch, Jon; Smith, Matt
2013-01-01
The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of "big data" available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describe a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin-bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD FirePro W600 video card with 6 mini DisplayPort connections. Six mini DisplayPort-to-dual-DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. SAGE provides a common environment, or framework, enabling its users to access, display, and share a variety of data-intensive information, including digital-cinema animations, high-resolution images, high-definition video-teleconferences, presentation slides, documents, spreadsheets, and laptop screens. SAGE is cross-platform, community-driven, open-source visualization and collaboration middleware that utilizes shared national and international cyberinfrastructure for the advancement of scientific research and education.
Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B
2011-09-01
Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters and decisions impact the statistical properties of the features, since such a characterization helps to evaluate the conclusions of the analysis as a whole. We present a new topological framework that, in a single pass, extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a post-processing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, and conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features over time. Our system provides a linked-view interface to explore the time evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
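A toy illustration of the threshold-parameterized feature definitions discussed above (not the authors' merge-tree data structure): connected components of a superlevel set of a 1D scalar field, recomputed for different threshold choices. Field values and thresholds are invented.

import numpy as np

def superlevel_features(field, threshold):
    """Return connected runs of indices where field >= threshold; each run is
    one candidate 'feature' whose extent depends on the threshold parameter."""
    features, current = [], []
    for i, value in enumerate(field):
        if value >= threshold:
            current.append(i)
        elif current:
            features.append(current)
            current = []
    if current:
        features.append(current)
    return features

field = np.array([0.1, 0.7, 0.9, 0.4, 0.2, 0.8, 0.85, 0.3])
for threshold in (0.5, 0.75):      # the parameter choice changes the feature set
    comps = superlevel_features(field, threshold)
    print(threshold, [(c[0], c[-1], float(field[c].max())) for c in comps])

A merge tree generalizes this by encoding, in a single pass, how such components appear and merge as the threshold is swept, so the feature set for any threshold can be recovered in post-processing.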
Assimilation of high resolution satellite imagery into the 3D-CMCC forest ecosystem model
NASA Astrophysics Data System (ADS)
Natali, S.; Collalti, A.; Candini, A.; Della Vecchia, A.; Valentini, R.
2012-04-01
The use of satellite observations for accurate monitoring of the terrestrial biosphere has been pursued since the very early stages of remote sensing applications. The possibility of observing the ground surface at different wavelengths and with different observation modes (namely active and passive observations) has given the scientific community an invaluable tool for observing wide areas at a resolution down to the single tree. On the other hand, the continuous development of forest ecosystem models has made it possible to simulate complex ("natural") forest scenarios to evaluate forest status, forest growth, and future dynamics. Both remote sensing and modelling approaches to forest assessment have advantages and disadvantages that could be overcome by adopting an integrated approach. In the framework of the European Space Agency project KLAUS, high-resolution optical satellite data have been integrated and assimilated into a forest ecosystem model (named 3D-CMCC) specifically developed for multi-species, multi-age forests. 3D-CMCC can simulate forest areas with different forest layers, with trees of different ages at the same point. Moreover, the model can simulate management activities in the forest, thus evaluating the carbon stock evolution that follows a specific management scheme. The model has been modified to include satellite data at 10 m resolution, permitting the use of directly measured information and adding to the model the real phenological cycle of each simulated point. Satellite images were collected by the JAXA ALOS AVNIR-2 sensor. The integration scheme identifies a spatial domain in which each pixel is characterised by a forest structure (species, ages, soil parameters), meteo-climatological parameters, and Leaf Area Index estimated from satellite. The resulting software package (3D-CMCC-SAT) is built around 3D-CMCC: 2D/3D input datasets are processed by iterating over each point of the analysed domain to create a set of monthly and yearly output maps. The integrated approach has been tested on the Parco Nazionale dei Monti Sibillini, Italy. The high correlation between observed and computed data is statistically meaningful, and hence the model can be deemed a good predictor both at high resolution and for short simulation periods. Moreover, coupling high-resolution satellite data with field information as input has shown that these data can be used in 3D-CMCC Forest Model runs. These data can also be used to simulate the main physiological processes at regional scale and to produce reliable output, in good accordance with measured and literature data, to better investigate forest growth, dynamics, and carbon stock.
76 FR 55400 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
...: Integrative, Functional and Cognitive Neuroscience Integrated Review Group; Auditory System Study Section... Neuroscience Integrated Review Group; Neurotoxicology and Alcohol Study Section. Date: October 13, 2011. Time... Disorders and Clinical Neuroscience Integrated Review Group; Developmental Brain Disorders Study Section...
76 FR 55402 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
... Committee: Integrative, Functional and Cognitive Neuroscience Integrated Review Group, Neurobiology [email protected] . Name of Committee: Integrative, Functional and Cognitive Neuroscience Integrated Review Group, Mechanisms of Sensory, Perceptual, and Cognitive Processes Study Section. Date: October 11-12...
78 FR 26642 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
..., Functional and Cognitive Neuroscience Integrated Review Group; Somatosensory and Chemosensory Systems Study..., [email protected] . Name of Committee: Integrative, Functional and Cognitive Neuroscience Integrated... personal privacy. Name of Committee: Brain Disorders and Clinical Neuroscience Integrated Review Group...
78 FR 27247 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-09
... personal privacy. Name of Committee: Integrative, Functional and Cognitive Neuroscience Integrated [email protected] . Name of Committee: Brain Disorders and Clinical Neuroscience Integrated Review Group... Neuroscience Integrated Review Group; Neurotransporters, Receptors, and Calcium Signaling Study Section. Date...
77 FR 57571 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-18
...: Genes, Genomes, and Genetics Integrated Review Group; Genomics, Computational Biology and Technology... Reproductive Sciences Integrated Review Group; Cellular, Molecular and Integrative Reproduction Study Section...: Immunology Integrated Review Group; Cellular and Molecular Immunology--B Study Section. [[Page 57572
78 FR 26378 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
..., Genomes, and Genetics Integrated Review Group; Prokaryotic Cell and Molecular Biology Study Section. Date..., Kidney and Urological Systems Integrated Review Group; Clinical, Integrative and Molecular... Respiratory Sciences Integrated Review Group; Lung Cellular, Molecular, and Immunobiology Study Section. Date...
Tilden, Samuel J
2010-12-01
Following its determination of a finding of scientific misconduct, the Office of Research Integrity (ORI) will seek redress for any injury sustained. Several remedies, both administrative and statutory, may be available depending on the strength of the evidentiary findings of the misconduct investigation. Pursuant to federal regulations, administrative remedies are primarily remedial in nature and designed to protect the integrity of the affected research program, whereas statutory remedies, including civil fines and criminal penalties, are designed to deter and punish wrongdoers. This commentary discusses the available administrative and statutory remedies in the context of a specific case, that of former University of Vermont nutrition researcher Eric Poehlman, and supplies a possible rationale for the legal result.
Integrated Science Assessment (ISA) for Sulfur Oxides ...
EPA announced the availability of the final report, Integrated Science Assessment (ISA) for Sulfur Oxides – Health Criteria. This report represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific basis for EPA's decision regarding whether the current standard for oxides of sulfur (SO2) sufficiently protects public health. The Integrated Plan for Review of the Primary NAAQS for SOx (U.S. EPA, 2007) identifies key policy-relevant questions that provide a framework for this review of the scientific evidence. These questions frame the entire review of the NAAQS and thus are informed by both science and policy considerations. The ISA organizes and presents the scientific evidence such that, when considered along with findings from risk analyses and policy considerations, it will help the EPA address these questions in completing the NAAQS review.
ERIC Educational Resources Information Center
Kern, Cynthia Lee
2013-01-01
Scientific inscriptions--graphs, diagrams, and data--and argumentation are integral to generating and communicating scientific understanding. Scientific inscriptions and argumentation are also important to learning science. However, previous research has indicated that learners struggle to understand and learn science content represented in…
Investigating the Impact of Automated Feedback on Students' Scientific Argumentation
ERIC Educational Resources Information Center
Zhu, Mengxiao; Lee, Hee-Sun; Wang, Ting; Liu, Ou Lydia; Belur, Vinetha; Pallant, Amy
2017-01-01
This study investigates the role of automated scoring and feedback in supporting students' construction of written scientific arguments while learning about factors that affect climate change in the classroom. The automated scoring and feedback technology was integrated into an online module. Students' written scientific argumentation occurred…
Scientific Notation Watercolor
ERIC Educational Resources Information Center
Linford, Kyle; Oltman, Kathleen; Daisey, Peggy
2016-01-01
(Purpose) The purpose of this paper is to describe visual literacy, an adapted version of Visual Thinking Strategy (VTS), and an art-integrated middle school mathematics lesson about scientific notation. The intent of this lesson was to provide students with a real life use of scientific notation and exponents, and to motivate them to apply their…
Scientific Research: Commodities or Commons?
ERIC Educational Resources Information Center
Vermeir, Koen
2013-01-01
Truth is for sale today, some critics claim. The increased commodification of science corrupts it, scientific fraud is rampant and the age-old trust in science is shattered. This cynical view, although gaining in prominence, does not explain very well the surprising motivation and integrity that is still central to the scientific life. Although…
78 FR 52552 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-23
... & Technologies Integrated Review Group, Nanotechnology Study Section. Date: September 26-27, 2013. Time: 8:00 a.m... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict...
Teaching the Anatomy of a Scientific Journal Article
ERIC Educational Resources Information Center
Schinske, Jeffrey N.; Clayman, Karen; Busch, Allison K.; Tanner, Kimberly D.
2008-01-01
To promote inquiry-based learning, the authors integrate the anatomy of a scientific journal article into their secondary science curriculum. In this article, they present three classroom activities used to teach students about the function and format of scientific journal articles. The first focuses on journal article figures, the second on…
The CompreHensive collaborativE Framework (CHEF)
NASA Astrophysics Data System (ADS)
Knoop, P. A.; Hardin, J.; Killeen, T.; Middleton, D.
2002-12-01
Data integration, publication, and archiving have become important considerations in most fields of science as experiments and models increase in complexity, and the collaborations necessary to conduct the research grow broader. The development of well thought out strategies and standards for such data handling, however, only goes part way in supporting the scientific process. A primary driving force for such efforts is the need of scientists to access and work with data in a timely, reasonable, and often collaborative fashion. Internet-based collaborative environments are one way to help complete this picture, linking scientists to the data they seek and to one another (e.g., Towards a Robust, Agile, and Comprehensive Information Infrastructure for the Geosciences: A Strategic Plan For High Performance Simulation, NCAR, 2000, http://www.ncar.ucar.edu/Director/plan.pdf). The CompreHensive collaborativE Framework (CHEF, http://chefproject.org) is a generic, extensible, web-based, open-source environment for collaboration. CHEF's goal is to provide the basic building blocks from which a community can assemble a collaborative environment that fits their needs. The design of CHEF has been influenced by our experience developing the Space Physics and Aeronomy Research Collaboratory (SPARC, http://www.si.umich.edu/SPARC), which provides integrated access to a wide variety of heterogeneous data sources, including community-standardized data bases. The design has also been heavily influenced by our involvement with an effort to extract and codify the broad underlying technical and social elements that lead to successful collaboratories (http://www.scienceofcollaboratories.org). A collaborative environment is in itself also not the complete answer to data handling, rather, it provides a facilitating environment in which community efforts to integrate, publish, archive, and share data using standard formats and practices can be taken advantage of by the end-users, the scientists. We present examples of how CHEF and its predecessors are utilized in a wide variety of scientific communities, including engineering, chemistry, and the geosciences. In particular, we focus on CHEF's utilization by the earthquake engineering community, whose Network for Earthquake Engineering Simulation (NEES, http://www.nees.org) involves a community effort to develop data standards and practices. In this context NEES is using CHEF as the "integration" environment in which to place the "tools" that bring together scientists and data; this includes data browsers, meta-data search engines, real-time and archival data viewers, etc. By developing these tools within the CHEF framework and exposing the community-developed data standards to the framework, they automatically gain the features, functionality, and capabilities offered by the collaborative environment. We also explore how a collaborative environment, in conjunction with community developed standards and practices for data integration, publishing, and archiving, could benefit the ocean science community.
Sutton, Abigail M; Rudd, Murray A
2016-10-01
The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economical processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through such investments, such as capacity building and specialized platforms for knowledge integration.
NASA Astrophysics Data System (ADS)
Carmack, Gay Lynn Dickinson
2000-10-01
This two-part quasi-experimental repeated measures study examined whether computer-simulated experiments (CSE) have an effect on the problem solving skills of high school biology students in a school-within-a-school magnet program. Specifically, the study identified episodes in a simulation sequence where problem solving skills improved. In the Fall academic semester, experimental group students (n = 30) were exposed to two simulations: CaseIt! and EVOLVE!. Control group students participated in an internet research project and a paper Hardy-Weinberg activity. In the Spring academic semester, experimental group students were exposed to three simulations: Genetics Construction Kit, CaseIt!, and EVOLVE!. Spring control group students participated in a Drosophila lab, an internet research project, and Advanced Placement lab 8. Results indicate that the Fall and Spring experimental groups experienced significant gains in scientific problem solving after the second simulation in the sequence. These gains were independent of the simulation sequence and of the amount of time spent on the simulations, and they were significantly greater than control group scores in the Fall. The Spring control group significantly outscored all other study groups on both pretest measures. Even so, the Spring experimental group's problem solving performance caught up to the Spring control group's performance after the third simulation. There were no significant differences between control and experimental groups on content achievement. Results indicate that CSE are as effective as traditional laboratories in promoting scientific problem solving and are a useful tool for improving students' scientific problem solving skills. Moreover, retention of problem solving skills is enhanced by utilizing more than one simulation.
NASA Astrophysics Data System (ADS)
Grizans, Jurijs; Vanags, Janis
2010-01-01
The aim of this paper is to analyse possibilities for integrating the method of ecologically oriented independent scientific research into the study process. In order to achieve this aim, the following research methods were used: analysis of the conceptual guidelines for the development of environmentally oriented entrepreneurship, interpretation of experts' evaluations of ecologically oriented management, analysis of the results of students' ecologically oriented independent scientific research, and monographic and logically constructive methods. The results of the study make it possible to draw conclusions and to develop conceptual recommendations on how to introduce future economics and business professionals to the theoretical and practical aspects of ecologically oriented management during the study process.
SHIWA Services for Workflow Creation and Sharing in Hydrometeorology
NASA Astrophysics Data System (ADS)
Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely
2014-05-01
Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows can require significant effort, so it is not reasonable to expect researchers to learn new workflow systems in order to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository, and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution, and monitoring through a graphical user interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal descriptions of workflows and workflow engines, plus the executables and data needed to execute them, and offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository and either invokes locally or remotely pre-deployed workflow engines or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages the certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios, and how the Hydrometeorology research community runs simulations on SSP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dahlburg, Jill; Corones, James; Batchelor, Donald
Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world's energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.
ERIC Educational Resources Information Center
Brekke, John S.
2014-01-01
There are two purposes to this article. The first is to update the science of social work framework. The second is to use recent discussions on the nature of realist science and on social work science to propose a definition of social work as an integrative scientific discipline that complements its definition as a profession.
77 FR 1699 - Center for Scientific Review: Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
... of Committee: Risk, Prevention and Health Behavior Integrated Review Group; Social Psychology..., (Virtual Meeting). Contact Person: Gagan Pandya, Ph.D., Scientific Review Officer, National Institutes of...
75 FR 27351 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-14
... Committee: Biology of Development and Aging Integrated Review Group; Development--2 Study Section. Date...: Center for Scientific Review Special Emphasis Panel; ARRA: Developmental Brain Disorders Competitive...
UAS-NAS Flight Test Series 3: Test Environment Report
NASA Technical Reports Server (NTRS)
Hoang, Ty; Murphy, Jim; Otto, Neil
2016-01-01
The desire and ability to fly Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) is of increasing urgency. The application of unmanned aircraft to perform national security, defense, scientific, and emergency management missions is driving the critical need for less restrictive access by UAS to the NAS. UAS represent a new capability that will provide a variety of services in the government (public) and commercial (civil) aviation sectors. The growth of this potential industry has not yet been realized due to the lack of a common understanding of what is required to safely operate UAS in the NAS. NASA's UAS Integration in the NAS Project is conducting research in the areas of Separation Assurance/Sense and Avoid Interoperability (SSI), Human Systems Integration (HSI), Communications (Comm), and Certification to support reducing the barriers to UAS access to the NAS. This research is broken into two research themes, namely UAS Integration and Test Infrastructure. UAS Integration focuses on airspace integration procedures and performance standards to enable UAS integration in the air transportation system, covering Detect and Avoid (DAA) performance standards, command and control performance standards, and human systems integration. The focus of Test Infrastructure is to enable development and validation of airspace integration procedures and performance standards, including integrated test and evaluation. In support of the integrated test and evaluation efforts, the Project will develop an adaptable, scalable, and schedulable relevant test environment capable of evaluating concepts and technologies that allow unmanned aircraft systems to safely operate in the NAS. To accomplish this task, the Project is conducting a series of human-in-the-loop (HITL) and flight test activities that integrate key concepts, technologies, and/or procedures in a relevant air traffic environment. Each of the integrated events builds on the technical achievements, fidelity, and complexity of the previous tests and technical simulations, resulting in research findings that support the development of regulations governing the access of UAS into the NAS. The integrated events started with two initial flight tests used to develop and test early integrations and components of the test environment. Test subjects and a relevant test environment were brought in for the integrated HITL (IHITL) conducted in 2014. The IHITL collected data to evaluate the effectiveness of DAA Well Clear (DWC) algorithms and the acceptability of UAS concepts integrated into the NAS. The first integrated flight test (and the subject of this report) followed the IHITL by replacing the simulation components with live aircraft. The Project finishes the integrated events with a final flight test, to be conducted in 2016, that provides the researchers with an opportunity to collect DWC and Collision Avoidance (CA) interoperability data during flight encounters.
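A toy illustration of a detect-and-avoid style separation check follows; the threshold values are invented and do not reflect the DAA Well Clear definitions evaluated in these tests.

import math

# Invented example thresholds; the actual DWC definition used in testing differs.
HORIZ_LIMIT_FT = 4000.0
VERT_LIMIT_FT = 450.0

def loss_of_separation(own, intruder):
    """own/intruder: (x_ft, y_ft, alt_ft). Flag an encounter when both the
    horizontal and vertical separation fall inside the example thresholds."""
    horizontal = math.hypot(own[0] - intruder[0], own[1] - intruder[1])
    vertical = abs(own[2] - intruder[2])
    return horizontal < HORIZ_LIMIT_FT and vertical < VERT_LIMIT_FT

print(loss_of_separation((0.0, 0.0, 5000.0), (3000.0, 1000.0, 5200.0)))  # True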
Schmoldt, D.L.; Peterson, D.L.; Keane, R.E.; Lenihan, J.M.; McKenzie, D.; Weise, D.R.; Sandberg, D.V.
1999-01-01
A team of fire scientists and resource managers convened 17-19 April 1996 in Seattle, Washington, to assess the effects of fire disturbance on ecosystems. Objectives of this workshop were to develop scientific recommendations for future fire research and management activities. These recommendations included a series of numerically ranked scientific and managerial questions and responses focusing on (1) links among fire effects, fuels, and climate; (2) fire as a large-scale disturbance; (3) fire-effects modeling structures; and (4) managerial concerns, applications, and decision support. At the present time, understanding of fire effects and the ability to extrapolate fire-effects knowledge to large spatial scales are limited, because most data have been collected at small spatial scales for specific applications. Although we clearly need more large-scale fire-effects data, it will be more expedient to concentrate efforts on improving and linking existing models that simulate fire effects in a georeferenced format while integrating empirical data as they become available. A significant component of this effort should be improved communication between modelers and managers to develop modeling tools to use in a planning context. Another component of this modeling effort should improve our ability to predict the interactions of fire and potential climatic change at very large spatial scales. The priority issues and approaches described here provide a template for fire science and fire management programs in the next decade and beyond.
Hurricanes and the U.S. Gulf Coast: Science and Sustainable Rebuilding
NASA Astrophysics Data System (ADS)
2006-06-01
The knowledge available among AGU members provides scientific expertise on nearly all of the physical environment of the dynamic Gulf Coast ecosystem complex. Intelligently rebuilding features such as fisheries, oil fields, seaports, farms, and wetlands after hurricanes Katrina and Rita will require "a well-constructed collaborative effort to maximize the role of science in decisions made about the rebuilding," wrote Charles Groat, former director of the U.S. Geological Survey, in a news article published in Eos that stimulated an AGU meeting of experts. As a step toward developing a scientific basis for safer communities along the Florida-Alabama-Mississippi-Louisiana-Texas coastline, AGU convened an interdisciplinary 'Conference of Experts' on 11-12 January 2006 to discuss what we, as Earth and space scientists, know about the present and projected environment in New Orleans and the Gulf Coast areas affected by the hurricanes of 2005. Twenty scientists, all experts in fields of science relevant to the Gulf Coast, met to consider ideas for a coordinated effort to integrate science into the decision-making processes necessary for the area's sustainable rebirth. Political, economic, and social issues were intentionally not discussed. Nevertheless, it was recognized that these issues are intertwined with science and are of paramount importance. This report contains a summary of the discussion and is intended to be helpful in providing scientific understanding useful in the redevelopment of the affected area.
Building Cognition: The Construction of Computational Representations for Scientific Discovery.
Chandrasekharan, Sanjay; Nersessian, Nancy J
2015-11-01
Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account. Copyright © 2014 Cognitive Science Society, Inc.
Cholera and the Scientific Method.
ERIC Educational Resources Information Center
Cronin, Jim
1993-01-01
Describes an approach to teaching the scientific method where an outbreak of cholera within the school is simulated. Students act like epidemiologists in an attempt to track down the source of the contamination. (PR)
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…
Effects of Students' Prior Knowledge on Scientific Reasoning in Density.
ERIC Educational Resources Information Center
Yang, Il-Ho; Kwon, Yong-Ju; Kim, Young-Shin; Jang, Myoung-Duk; Jeong, Jin-Woo; Park, Kuk-Tae
2002-01-01
Investigates the effects of students' prior knowledge on the scientific reasoning processes of performing the task of controlling variables with computer simulation and identifies a number of problems that students encounter in scientific discovery. Involves (n=27) 5th grade students and (n=33) 7th grade students. Indicates that students' prior…
76 FR 573 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-05
..., [email protected] . Name of Committee: Biology of Development and Aging Integrated Review Group... Committee: Biology of Development and Aging Integrated Review Group, Development--1 Study Section. Date..., Metabolism, Nutrition and Reproductive Sciences Integrated Review Group, Integrative Physiology of Obesity...
78 FR 55752 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-11
... Epidemiology Integrated Review Group, Behavioral Genetics and Epidemiology Study Section. Date: October 8, 2013... Committee: Oncology 1-Basic Translational Integrated Review Group, Cancer Genetics Study Section. Date... Biology Integrated Review Group, Molecular and Integrative Signal Transduction Study Section. Date...
Design of FastQuery: How to Generalize Indexing and Querying System for Scientific Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jerry; Wu, Kesheng
2011-04-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit are critical for facilitating interactive exploration of large datasets. These technologies rely on adding auxiliary information to existing datasets to accelerate query processing. To use these indices, we need to match the relational data model used by the indexing systems with the array data model used by most scientific data, and to provide an efficient input and output layer for reading and writing the indices. In this work, we present a flexible design that can be easily applied to most scientific data formats. We demonstrate this flexibility by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using simulation data from the particle accelerator and climate simulation communities. To demonstrate the effectiveness of the new design, we also present a detailed performance study using both synthetic and real scientific workloads.
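To make the kind of query concrete, the sketch below is a plain h5py/NumPy range selection over an HDF5 array — the sort of selective predicate that an index-based system like FastQuery answers from precomputed bitmaps instead of a full scan. This is an illustrative assumption, not FastQuery code; the file name and dataset paths are hypothetical.

```python
# Illustrative sketch only: a plain h5py/NumPy range query of the kind that
# index-based systems accelerate. File and dataset names are hypothetical;
# FastBit/FastQuery itself is not called here.
import h5py
import numpy as np

def select_particles(filename, energy_min, x_max):
    """Return indices of particles with energy > energy_min and x < x_max."""
    with h5py.File(filename, "r") as f:
        energy = f["/particles/energy"][...]   # 1D array, one value per particle
        x = f["/particles/x"][...]
        # Without an index, every record is scanned; an index answers the same
        # predicate by combining precomputed bitmaps instead.
        mask = (energy > energy_min) & (x < x_max)
        return np.nonzero(mask)[0]

if __name__ == "__main__":
    hits = select_particles("accelerator_step_0042.h5", energy_min=1.0e6, x_max=0.02)
    print(f"{hits.size} particles satisfy the query")
```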
Can beaches survive climate change?
Vitousek, Sean; Barnard, Patrick L.; Limber, Patrick W.
2017-01-01
Anthropogenic climate change is driving sea level rise, leading to numerous impacts on the coastal zone, such as increased coastal flooding, beach erosion, cliff failure, saltwater intrusion in aquifers, and groundwater inundation. Many beaches around the world are currently experiencing chronic erosion as a result of gradual, present-day rates of sea level rise (about 3 mm/year) and human-driven restrictions in sand supply (e.g., harbor dredging and river damming). Accelerated sea level rise threatens to worsen coastal erosion and challenge the very existence of natural beaches throughout the world. Understanding and predicting the rates of sea level rise and coastal erosion depend on integrating data on natural systems with computer simulations. Although many computer modeling approaches are available to simulate shoreline change, few are capable of making the reliable long-term predictions needed for full adaptation or enhanced resilience. Recent advancements have allowed convincing decadal- to centennial-scale predictions of shoreline evolution. For example, along 500 km of the Southern California coast, a new model featuring data assimilation predicts that up to 67% of beaches may completely erode by 2100 without large-scale human interventions. In spite of recent advancements, coastal evolution models must continue to improve in their theoretical framework, quantification of accuracy and uncertainty, computational efficiency, predictive capability, and integration with observed data, in order to meet the scientific and engineering challenges produced by a changing climate.
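As a minimal illustration of what "featuring data assimilation" means in practice, the sketch below blends a simple constant-retreat shoreline forecast with occasional survey observations using a scalar Kalman filter. It is not the model discussed in the abstract; the linear-retreat assumption, error variances, and numbers are purely illustrative.

```python
# Minimal, generic sketch of assimilating shoreline observations into a simple
# recession model with a scalar Kalman filter. NOT the model from the abstract;
# the linear-retreat assumption and all numbers are illustrative.
import numpy as np

def assimilate_shoreline(y0, retreat_rate, obs, obs_var=25.0, model_var=4.0):
    """Forecast shoreline position each year and correct it with observations.

    y0           initial cross-shore position (m)
    retreat_rate assumed chronic retreat (m/yr); negative = erosion
    obs          dict {year_index: observed position (m)}
    """
    y, p = y0, 1.0          # state estimate and its variance
    history = []
    for t in range(1, 81):  # e.g. 80 years of projection
        y += retreat_rate   # forecast step: persistence of the trend
        p += model_var      # model error grows the uncertainty
        if t in obs:        # analysis step: blend forecast with observation
            k = p / (p + obs_var)        # Kalman gain
            y += k * (obs[t] - y)
            p *= (1.0 - k)
        history.append((t, y, p))
    return history

traj = assimilate_shoreline(y0=0.0, retreat_rate=-0.5, obs={5: -2.0, 10: -6.5})
print(traj[-1])   # (year, position estimate, variance) at the end of the run
```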
Integration of Information and Scientific Literacy: Promoting Literacy in Undergraduates
Wolbach, Kevin C.; Purzycki, Catherine B.; Bowman, Leslie A.; Agbada, Eva; Mostrom, Alison M.
2010-01-01
The Association of College and Research Libraries recommends incorporating information literacy (IL) skills across university and college curricula, for the goal of developing information literate graduates. Congruent with this goal, the Departments of Biological Sciences and Information Science developed an integrated IL and scientific literacy (SL) exercise for use in a first-year biology course. Students were provided the opportunity to access, retrieve, analyze, and evaluate primary scientific literature. By the completion of this project, student responses improved concerning knowledge and relevance of IL and SL skills. This project exposes students to IL and SL early in their undergraduate experience, preparing them for future academic advancement. PMID:21123700
Understanding as Integration of Heterogeneous Representations
NASA Astrophysics Data System (ADS)
Martínez, Sergio F.
2014-03-01
The search for understanding is a major aim of science. Traditionally, understanding has been undervalued in the philosophy of science because of its psychological underpinnings; nowadays, however, it is widely recognized that epistemology cannot be divorced from psychology as sharply as traditional epistemology required. This removes the main obstacle to giving scientific understanding due attention in the philosophy of science. My aim in this paper is to describe an account of scientific understanding as an emergent feature of our mastering of different (causal) explanatory frameworks, a mastering that takes place through scientific practices. Different practices lead to different kinds of representations, and such representations are often heterogeneous. The integration of these representations constitutes understanding.
77 FR 30540 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-23
... Committee: Center for Scientific Review Special Emphasis Panel; Drug Discovery for the Nervous System...: Digestive, Kidney and Urological Systems Integrated Review Group; Kidney Molecular Biology and Genitourinary...
Johnson, R Jeremy
2014-01-01
HIV protease has served as a model protein for understanding protein structure, enzyme kinetics, structure-based drug design, and protein evolution. Inhibitors of HIV protease are also an essential part of effective HIV/AIDS treatment and have provided great societal benefits. The broad applications for HIV protease and its inhibitors make it a perfect framework for integrating foundational topics in biochemistry around a big-picture scientific and societal issue. Herein, I describe a series of classroom exercises that integrate foundational topics in biochemistry around the structure, biology, and therapeutic inhibition of HIV protease. These exercises center on foundational topics in biochemistry including thermodynamics, acid/base properties, protein structure, ligand binding, and enzymatic catalysis. The exercises also incorporate regular student practice of scientific skills including analysis of primary literature, evaluation of scientific data, and presentation of technical scientific arguments. Through the exercises, students also gain experience accessing computational biochemical resources such as the Protein Data Bank, Proteopedia, and protein visualization software. As these HIV-centered exercises cover foundational topics common to all first-semester biochemistry courses, these exercises should appeal to a broad audience of undergraduate students and should be readily integrated into a variety of teaching styles and classroom sizes. © 2014 The International Union of Biochemistry and Molecular Biology.
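To show how the thermodynamics and ligand-binding topics connect in a classroom exercise of this kind, the short sketch below converts an inhibition constant into a binding free energy via ΔG = RT ln Ki. It is a generic illustration; the 1 nM Ki is a round number chosen for the example, not a value taken from the article.

```python
# Illustrative calculation linking ligand binding to thermodynamics:
# binding free energy from an inhibition constant, dG = R * T * ln(Ki).
# The Ki value below is a round illustrative number, not data from the article.
import math

R = 8.314          # J/(mol*K), gas constant
T = 298.15         # K, standard temperature

def binding_free_energy(ki_molar):
    """Return dG of binding in kJ/mol for a given Ki (mol/L)."""
    return R * T * math.log(ki_molar) / 1000.0

# A tight-binding protease inhibitor with Ki ~ 1 nM:
print(f"dG = {binding_free_energy(1e-9):.1f} kJ/mol")   # about -51.4 kJ/mol
```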
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
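The sketch below shows the kind of thin wrapper script a Jenkins job might invoke to run CellProfiler headless on one plate of images; Jenkins then records the exit code as the build result. The paths, environment-variable names, and exact command-line flags are assumptions made for illustration (consult the CellProfiler documentation for the current options); this is not code from the article.

```python
# Sketch of a wrapper a Jenkins job might invoke to run CellProfiler headless
# on one plate of HCS images. Paths, flags, and environment variables are
# assumptions for illustration, not taken from the article.
import os
import subprocess
import sys

def run_cellprofiler(pipeline, image_dir, output_dir):
    os.makedirs(output_dir, exist_ok=True)
    cmd = [
        "cellprofiler",
        "-c", "-r",              # commonly: headless mode (-c) and run pipeline (-r)
        "-p", pipeline,          # .cppipe pipeline exported from the desktop client
        "-i", image_dir,
        "-o", output_dir,
    ]
    return subprocess.run(cmd, check=False).returncode

if __name__ == "__main__":
    # Jenkins typically exposes build parameters as environment variables.
    rc = run_cellprofiler(
        pipeline=os.environ.get("PIPELINE", "pipelines/nuclei_count.cppipe"),
        image_dir=os.environ.get("PLATE_DIR", "/data/hcs/plate_001"),
        output_dir=os.environ.get("OUT_DIR", "results/plate_001"),
    )
    sys.exit(rc)   # a non-zero exit code marks the Jenkins build as failed
```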
NASA Astrophysics Data System (ADS)
Shao, Hongbing
Software testing of scientific software systems often suffers from the test oracle problem, i.e., a lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types, and testing of ADDA suffers from the test oracle problem. In this thesis work, I established a testing framework for scientific software systems and evaluated this framework using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, which validated ADDA for use with homogeneous sphere scatterers. I then used an experimental result obtained for light scattering by a homogeneous sphere to further validate the use of ADDA with sphere scatterers; ADDA produced light scattering simulations comparable to the experimentally measured result. I then used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo-oracles, experimental results, and metamorphic testing techniques, to test scientific software systems that suffer from the test oracle problem. Each of these techniques is necessary and contributes to the testing of the software under test.
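A minimal sketch of a metamorphic test in the spirit described above is shown below. The `run_adda` wrapper is hypothetical (it is not ADDA's real interface), and the metamorphic relation chosen for illustration is that a homogeneous sphere's simulated extinction cross-section must not depend on the orientation parameter; the test checks agreement between runs without ever knowing the "true" answer.

```python
# Sketch of a metamorphic test. `run_adda` is a hypothetical wrapper, assumed
# to invoke the simulation and return an extinction cross-section; it is NOT
# the real ADDA interface. Relation: a homogeneous sphere's scattering does
# not depend on its orientation.
import math
import pytest

def run_adda(shape, size_um, refractive_index, orientation_deg):
    """Hypothetical wrapper that would invoke the solver and parse its output."""
    raise NotImplementedError("replace with a call to the actual simulation")

@pytest.mark.parametrize("angle", [30.0, 90.0, 180.0])
def test_sphere_orientation_invariance(angle):
    base = run_adda("sphere", size_um=1.0, refractive_index=1.5 + 0.01j,
                    orientation_deg=0.0)
    rotated = run_adda("sphere", size_um=1.0, refractive_index=1.5 + 0.01j,
                       orientation_deg=angle)
    # Metamorphic relation: the two outputs must agree, no oracle needed.
    assert math.isclose(base, rotated, rel_tol=1e-6)
```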
VOLCWORKS: A suite for optimization of hazards mapping
NASA Astrophysics Data System (ADS)
Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.
2012-04-01
Making hazards maps is a process linking basic science, applied science, and engineering for the benefit of society. The methodologies for constructing hazards maps have evolved enormously, together with the tools that allow forecasting the behavior of the materials produced by different eruptive processes. However, in spite of the development of tools and the evolution of methodologies, the utility of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integrating the different simulation tools for the different processes at a single volcano is a challenge; solving it requires software that combines processing, simulation, and visualization techniques with suitable data structures, supporting the construction process from the integration of geological data and simulations through to the simplification of the output into a hazards/scenario map. Scientific visualization is a powerful tool for exploring and gaining insight into complex data from instruments and simulations. The workflow from data collection, quality control, and preparation for simulations through to a clear visual presentation is usually disconnected: in most cases a different application is used for each step, because the available tools were either not built for this specific problem or were developed by research groups for particular tasks in isolation. In volcanology, owing to the complexity of the phenomena, groups typically examine only one aspect at a time: ash dispersal, laharic flows, pyroclastic flows, lava flows, or ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially for communicating results to the end users: decision makers and planners. To solve this problem and connect the different parts of the workflow, we are developing the suite VOLCWORKS, whose guiding principle is a flexible-implementation architecture that allows rapid development of software to the extent required by the needs at hand, including calculations, routines, or algorithms, both new and obtained by redesigning software already available in the volcanological community, and that especially allows new knowledge, models, or software to be transferred into software modules. The design is a component-oriented platform that allows particular solutions (routines, simulations, etc.) to be incorporated and concatenated to integrate or highlight information. The platform includes a graphical interface capable of working in different visual environments, which can be tailored to the particular work of different types of users (researchers, lecturers, students, etc.). The platform aims to integrate the simulation and visualization phases, incorporating proven tools that are currently isolated. VOLCWORKS can be used under different operating systems (Windows, Linux, and Mac OS) and adapts to the context of use automatically and at runtime, both in the tasks and their sequence and in the utilization of hardware resources (CPU, GPU, special monitors, etc.). The application can run on a laptop or in a virtual reality room with access to supercomputers.
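To illustrate the general idea of a component-oriented platform in which simulation modules can be registered and concatenated, the sketch below implements a small plugin registry and a workflow runner. It is a generic illustration under stated assumptions, not VOLCWORKS code; the module names and the shared-state convention are invented for the example.

```python
# Generic sketch (not VOLCWORKS code) of a component-oriented registry in which
# simulation components register themselves and can be chained into a workflow.
# Component names and the shared-state convention are illustrative assumptions.
from typing import Callable, Dict, Iterable

REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def register(name: str):
    """Decorator that adds a simulation component to the registry."""
    def wrap(func: Callable[[dict], dict]) -> Callable[[dict], dict]:
        REGISTRY[name] = func
        return func
    return wrap

@register("ash_dispersal")
def ash_dispersal(state: dict) -> dict:
    state.setdefault("layers", []).append("ash_isopach")        # placeholder output layer
    return state

@register("lahar_flow")
def lahar_flow(state: dict) -> dict:
    state.setdefault("layers", []).append("lahar_inundation")   # placeholder output layer
    return state

def run_workflow(steps: Iterable[str], state: dict | None = None) -> dict:
    """Concatenate registered components, passing a shared dict of map layers."""
    state = state or {"layers": []}
    for name in steps:
        state = REGISTRY[name](state)
    return state

print(run_workflow(["ash_dispersal", "lahar_flow"]))
```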
78 FR 68462 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Brain Injury and... Methodologies Integrated Review Group; Biomedical Computing and Health Informatics Study Section. Date: December...
77 FR 14028 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Biological Chemistry and...-1323, [email protected] . Name of Committee: Biology of Development and Aging Integrated Review...
Fostering Scientific Reasoning in Education--Meta-Analytic Evidence from Intervention Studies
ERIC Educational Resources Information Center
Engelmann, Katharina; Neuhaus, Birgit J.; Fischer, Frank
2016-01-01
Scientific reasoning skills are not just for researchers, they are also increasingly relevant for making informed decisions in our everyday lives. How can these skills be facilitated? The current state of research on supporting scientific reasoning includes intervention studies but lacks an integrated analysis of the approaches to foster…
ERIC Educational Resources Information Center
Koffman, Bess G.; Kreutz, Karl J.; Trenbath, Kim
2017-01-01
We present a strategy for using scientific argumentation in an early undergraduate laboratory course to teach disciplinary writing practices and to promote critical thinking, knowledge transformation, and understanding of the scientific method. The approach combines targeted writing instruction; data analysis and interpretation; formulation of a…
ERIC Educational Resources Information Center
Gultepe, Nejla; Kilic, Ziya
2015-01-01
This study was conducted in order to determine the differences in integrated scientific process skills (designing experiments, forming data tables, drawing graphs, graph interpretation, determining the variables and hypothesizing, changing and controlling variables) of students (n = 17) who were taught with an approach based on scientific…